@ducnh1022
Created April 3, 2016 06:29
Classification week 3: decision trees
step 1: start with an empty tree
step 2: select a feature to split the data on
for each split of the tree:
step 3: if there is nothing more to do, make predictions
step 4: otherwise, go to step 2 and continue to recurse on this split
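The recursive steps above can be sketched roughly as follows; this is a minimal illustration assuming binary labels and categorical features (the function names and the dict-based tree representation are my own, not from the course):

```python
from collections import Counter

def majority_class(labels):
    # y hat = class of the majority of data at this node
    return Counter(labels).most_common(1)[0][0]

def split_error(data, labels, feature):
    # classification error if we split on `feature` and predict
    # the majority class inside each branch
    mistakes = 0
    for value in set(row[feature] for row in data):
        branch = [y for row, y in zip(data, labels) if row[feature] == value]
        mistakes += len(branch) - Counter(branch).most_common(1)[0][1]
    return mistakes / len(labels)

def build_tree(data, labels, features):
    # stopping conditions: all data agree on one class, or no features left
    if len(set(labels)) == 1 or not features:
        return {"prediction": majority_class(labels)}
    # step 2: select the feature with the lowest classification error
    feature = min(features, key=lambda f: split_error(data, labels, f))
    node = {"feature": feature, "children": {}}
    remaining = [f for f in features if f != feature]
    for value in set(row[feature] for row in data):
        pairs = [(row, y) for row, y in zip(data, labels) if row[feature] == value]
        sub_data = [row for row, _ in pairs]
        sub_labels = [y for _, y in pairs]
        # step 4: recurse on this split
        node["children"][value] = build_tree(sub_data, sub_labels, remaining)
    return node
```

Each leaf stores the majority-class prediction; internal nodes store the chosen feature and one child per feature value.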
Feature split learning = decision stump learning
which is better: split on credit or split on term?
step 1: y hat = class of the majority of data at the node
step 2: calculate the classification error of predicting y hat for this data
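Steps 1-2 can be sketched like this for comparing candidate splits; a minimal sketch in which the feature names ("credit", "term") and the data layout are illustrative assumptions:

```python
from collections import Counter

def classification_error(labels):
    # step 1: y hat = majority class; step 2: error of predicting y hat
    if not labels:
        return 0.0
    majority_count = Counter(labels).most_common(1)[0][1]
    return 1 - majority_count / len(labels)

def split_error(data, labels, feature):
    # weighted classification error over the branches of a candidate split
    total = 0.0
    for value in set(row[feature] for row in data):
        branch = [y for row, y in zip(data, labels) if row[feature] == value]
        total += classification_error(branch) * len(branch)
    return total / len(labels)
```

To decide between credit and term, compute `split_error` for each and pick the feature with the lower error.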
2 stopping conditions: (1) all data at the node agree on one class, or (2) there are no more features to split on
thresholds for continuous inputs
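For a continuous input, one common approach is to try the midpoints between consecutive sorted values and keep the threshold with the lowest classification error. A sketch under that assumption (the function name and data are illustrative):

```python
from collections import Counter

def branch_mistakes(labels):
    # number of mistakes when predicting the majority class of the branch
    if not labels:
        return 0
    return len(labels) - Counter(labels).most_common(1)[0][1]

def best_threshold(values, labels):
    # try midpoints between consecutive distinct sorted values;
    # return the threshold with the lowest classification error
    pairs = sorted(zip(values, labels))
    best_t, best_mistakes = None, float("inf")
    for (a, _), (b, _) in zip(pairs, pairs[1:]):
        if a == b:
            continue  # no threshold fits between identical values
        t = (a + b) / 2
        left = [y for v, y in pairs if v < t]
        right = [y for v, y in pairs if v >= t]
        mistakes = branch_mistakes(left) + branch_mistakes(right)
        if mistakes < best_mistakes:
            best_t, best_mistakes = t, mistakes
    return best_t, best_mistakes / len(labels)
```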