Created April 3, 2016 06:29
Classification week3 - decision trees
step 1: start with an empty tree
step 2: select a feature to split the data on
For each split of the tree:
step 3: if there is nothing more to split on, make predictions
step 4: otherwise, go to step 2 and continue recursing on this split
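The recursive steps above can be sketched as a greedy tree builder. This is a minimal illustration, not the course's reference code; the dict-based node layout, the `build_tree`/`split_error` names, and the use of classification error as the split criterion (introduced below in these notes) are my assumptions.

```python
from collections import Counter

def majority_class(labels):
    # At each node, predict the majority class of the data reaching it.
    return Counter(labels).most_common(1)[0][0]

def split_error(data, labels, feature):
    # Classification error after splitting on `feature`: within each
    # branch predict the majority class and count the mistakes.
    mistakes = 0
    for value in set(row[feature] for row in data):
        branch = [y for row, y in zip(data, labels) if row[feature] == value]
        mistakes += len(branch) - max(Counter(branch).values())
    return mistakes / len(labels)

def build_tree(data, labels, features):
    # Stopping conditions: all labels agree, or no features left to split on.
    if len(set(labels)) == 1 or not features:
        return {"leaf": True, "prediction": majority_class(labels)}
    # Step 2: greedily pick the feature with the lowest classification error.
    best = min(features, key=lambda f: split_error(data, labels, f))
    children = {}
    for value in set(row[best] for row in data):
        subset = [(row, y) for row, y in zip(data, labels) if row[best] == value]
        sub_rows, sub_labels = zip(*subset)
        # Steps 3-4: recurse on each split with the remaining features.
        children[value] = build_tree(list(sub_rows), list(sub_labels),
                                     [f for f in features if f != best])
    return {"leaf": False, "feature": best, "children": children}
```

On the hypothetical loan data used in the lectures, `build_tree` would split on whichever feature (e.g. credit) best separates safe from risky loans.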
Feature split learning = decision stump learning
Which split is better: credit or term?
step 1: y_hat = class of the majority of data at the node
step 2: calculate the classification error of predicting y_hat for this data
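The two steps above give a way to score candidate splits. A small sketch, with hypothetical safe/risky loan labels: each branch predicts its majority class (y_hat), and the split's score is the fraction of examples that prediction gets wrong.

```python
from collections import Counter

def classification_error(branches):
    # Step 1: in each branch, y_hat = the branch's majority class.
    # Step 2: error = total mistakes of those predictions / total examples.
    total = sum(len(b) for b in branches)
    mistakes = sum(len(b) - max(Counter(b).values()) for b in branches)
    return mistakes / total

# Hypothetical data: splitting on credit separates the classes perfectly,
# splitting on term leaves each branch half safe, half risky.
credit_split = [["safe", "safe"], ["risky", "risky"]]
term_split = [["safe", "risky"], ["safe", "risky"]]
print(classification_error(credit_split))  # 0.0
print(classification_error(term_split))    # 0.5
```

The split with the lower classification error (credit, here) is the better decision stump.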
2 stopping conditions: (1) all data at the node agree on one class, or (2) no more features to split on
thresholds for continuous inputs
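For a continuous input, the usual approach is to turn it into a binary split at some threshold t (v < t vs. v >= t) and pick the t with the lowest classification error; a standard choice of candidates is the midpoints between consecutive sorted values. A sketch under those assumptions, with a hypothetical income feature:

```python
from collections import Counter

def threshold_error(values, labels, t):
    # Classification error of splitting the continuous feature at t:
    # each side predicts its own majority class.
    left = [y for v, y in zip(values, labels) if v < t]
    right = [y for v, y in zip(values, labels) if v >= t]
    mistakes = sum(len(side) - max(Counter(side).values())
                   for side in (left, right) if side)
    return mistakes / len(labels)

def best_threshold(values, labels):
    # Candidate thresholds: midpoints between consecutive sorted values.
    vs = sorted(set(values))
    candidates = [(a + b) / 2 for a, b in zip(vs, vs[1:])]
    return min(candidates, key=lambda t: threshold_error(values, labels, t))

incomes = [20, 30, 60, 80]  # hypothetical continuous feature
labels = ["risky", "risky", "safe", "safe"]
print(best_threshold(incomes, labels))  # 45.0
```

Here t = 45.0 splits the data perfectly (error 0), so it wins over the other midpoints.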