Overfitting is one problem that can explain why your model no longer generalizes well.
Overfitting happens when a learning process over-optimizes training-set error at the cost of test-set error. Training and test sets can perform equally well in cross-validation, and when that happens simply because the two sets are very close in their characteristics, it may not be a huge problem. Decision trees illustrate the risk well: they can learn a training set at such a high level of granularity that they overfit easily. Allowing a decision tree to split to an arbitrarily granular degree is exactly the behavior that makes this model prone to learning every point extremely well, to the point of perfect classification on the training data, i.e., overfitting.
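That splitting behavior is easy to see in code. Here is a minimal sketch, assuming scikit-learn is available; the synthetic dataset, the label-noise level (`flip_y`), and the `max_depth` cap are illustrative assumptions rather than anything prescribed above:

```python
# Minimal sketch of decision-tree overfitting (assumes scikit-learn).
# Dataset, noise level, and depth cap are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data with 10% label noise, so a perfect fit cannot generalize.
X, y = make_classification(n_samples=1000, n_features=20,
                           flip_y=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Unconstrained: the tree keeps splitting until every training point is
# classified perfectly, the granular behavior described above.
deep = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
print("deep tree    train:", deep.score(X_train, y_train))  # ~1.0
print("deep tree    test: ", deep.score(X_test, y_test))    # noticeably lower

# Capping depth forces coarser splits and shrinks the train/test gap.
shallow = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
print("shallow tree train:", shallow.score(X_train, y_train))
print("shallow tree test: ", shallow.score(X_test, y_test))
```

Typically the unconstrained tree scores near 1.0 on its training data while its held-out score sits well below that; the depth-capped tree gives up some training accuracy in exchange for a smaller gap.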
I recommend the following steps to avoid overfitting:
- Use a test set that is held out from the training set, or at least different enough from it that a gap in error rates is easy to see; the sketch below shows one way to surface that gap.
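One way to act on that step, again assuming scikit-learn: score the model on its own training data and on cross-validation folds it never saw, and treat a large gap between the two numbers as the signal of overfitting. The dataset and fold count here are illustrative assumptions:

```python
# Sketch: expose the train/held-out gap with cross-validation
# (assumes scikit-learn; dataset and fold count are illustrative).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20,
                           flip_y=0.1, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X, y)
train_acc = tree.score(X, y)  # scored on the very points it memorized

# Each fold is fit and scored on disjoint data, so the averaged score
# approximates performance on points unlike the training set.
cv_acc = cross_val_score(tree, X, y, cv=5).mean()

print(f"training accuracy:        {train_acc:.3f}")  # near 1.0
print(f"cross-validated accuracy: {cv_acc:.3f}")     # the honest estimate
```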