Why Overfitting is More Dangerous than Just Poor Accuracy, Part I - Predictive Analytics Times - machine learning & data science news
Why Overfitting is More Dangerous than Just Poor Accuracy, Part I

Arguably, the most important safeguard in building predictive models is complexity regularization to avoid overfitting the data. When models are overfit, their accuracy is lower on new data that wasn't seen during training, so when these models are deployed, they disappoint, sometimes even leading decision makers to believe that predictive modeling "doesn't work". Overfitting, however, is thankfully a well-known problem, and every algorithm has ways to avoid it. CART® and C5 trees use pruning to remove branches that are prone to overfitting; CHAID trees require that splits be statistically significant before adding complexity to the tree. Neural networks use
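The train-versus-deployment gap described above is easy to reproduce with a toy experiment (not from the article; the data and model choices here are illustrative assumptions): fit a low-degree and a high-degree polynomial to noisy data whose true relationship is linear, and compare training error against error on held-out data. The flexible model "wins" on the training set by fitting the noise, which is exactly the failure that pruning and significance tests guard against in trees.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    """Noisy observations of a truly linear relationship y = 2x + 1."""
    x = np.linspace(0.0, 1.0, n)
    y = 2.0 * x + 1.0 + rng.normal(scale=0.3, size=n)
    return x, y

x_train, y_train = make_data(10)   # small training sample
x_test, y_test = make_data(50)     # held-out data from the same process

def fit_and_score(degree):
    """Least-squares polynomial fit; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

# Degree 1 matches the true complexity; degree 9 nearly interpolates
# the 10 training points, memorizing the noise.
simple_train, simple_test = fit_and_score(1)
overfit_train, overfit_test = fit_and_score(9)

print(f"degree 1: train MSE {simple_train:.4f}, test MSE {simple_test:.4f}")
print(f"degree 9: train MSE {overfit_train:.4f}, test MSE {overfit_test:.4f}")
```

The flexible model's training error is near zero while its held-out error blows up; the simple model's two errors stay close together. That gap between training and test performance is the practical signature of overfitting, regardless of which algorithm produced the model.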
