Overfitting

Definition - What does Overfitting mean?

In statistics and machine learning, overfitting occurs when a model captures the noise in its training data rather than the underlying trend. Overfitting is typically the result of an overly complex model with too many parameters relative to the amount of data. A model that is overfitted is inaccurate on new data because the trend it has learned does not reflect the reality of the data.

Techopedia explains Overfitting

An overfitted model is a model with a trend line that reflects the errors (noise) in the data it was trained on, instead of accurately predicting unseen data. This is easiest to see with a graph of data points and a trend line: an overfitted model shows a jagged curve that swings up and down to pass through every point, while a properly fitted model shows a smooth curve or a straight regression line.

The main problem with overfitting is that the model has effectively memorized the existing data points rather than learned the underlying pattern, so its predictions for unseen data points are unreliable.
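This memorization effect can be sketched numerically. The example below is a minimal illustration using hypothetical synthetic data: noisy samples from a simple linear trend are fitted with a degree-1 polynomial (a properly fitted line) and a degree-9 polynomial (an overfitted curve). The flexible model achieves a lower error on the points it was trained on, but typically does worse on held-out points it has never seen.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from an underlying linear trend y = 2x.
x_train = np.linspace(0, 1, 20)
y_train = 2 * x_train + rng.normal(scale=0.3, size=x_train.size)

# Held-out points interleaved with the training points.
x_test = np.linspace(0.025, 0.975, 20)
y_test = 2 * x_test + rng.normal(scale=0.3, size=x_test.size)

def fit_and_errors(degree):
    # Least-squares polynomial fit of the given degree,
    # returning mean squared error on train and test sets.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

simple_train, simple_test = fit_and_errors(1)   # properly fitted line
complex_train, complex_test = fit_and_errors(9) # overfitted curve

# The flexible model memorizes training noise: its training error is
# lower, but the gap to its test error reveals poor generalization.
print(f"degree 1: train={simple_train:.3f} test={simple_test:.3f}")
print(f"degree 9: train={complex_train:.3f} test={complex_test:.3f}")
```

The degree-9 curve can bend through nearly every noisy training point, which is exactly the "trend line that reflects the errors in the data" described above.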

Overfitting typically results from an excessive number of model parameters relative to the number of training points, or from training a model for too long. There are a number of techniques that machine learning researchers can use to mitigate overfitting, including cross-validation, regularization, early stopping, pruning, Bayesian priors, dropout and model comparison.
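As a rough sketch of one of these techniques, regularization, the example below fits the same kind of noisy data with a flexible degree-9 polynomial two ways: by ordinary least squares, and by ridge regression, which adds an L2 penalty on the coefficients. The data and the penalty strength `lam` are illustrative assumptions, not fixed values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy samples from the underlying linear trend y = 2x.
x = np.linspace(0, 1, 20)
y = 2 * x + rng.normal(scale=0.3, size=x.size)

degree = 9
X = np.vander(x, degree + 1)  # polynomial feature matrix

# Ordinary least squares: free to chase every noisy point.
w_plain = np.linalg.lstsq(X, y, rcond=None)[0]

# Ridge regression: the closed-form solution
#   w = (X^T X + lam * I)^{-1} X^T y
# shrinks the coefficients, flattening the wiggly overfitted curve.
lam = 1e-2
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(degree + 1), X.T @ y)

print("plain coefficient norm:", np.linalg.norm(w_plain))
print("ridge coefficient norm:", np.linalg.norm(w_ridge))
```

The penalty trades a small increase in training error for much smaller coefficients, which generally yields a smoother curve that tracks the trend instead of the noise.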