4 Overfitting
Overfitting refers to the phenomenon
where a neural network models the
training data very well but fails when it
sees new data from the same problem
domain.
Overfitting is caused by noise in the
training data, which the neural network
picks up during training and learns as if it
were an underlying concept of the data.
Common regularization techniques:
1. Dropout (randomly shuts down some units
to prevent co-adaptation)
2. DropConnect (randomly drops individual
connections between units)
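The two techniques above can be sketched with plain NumPy. This is a minimal illustration, not any particular library's API: `dropout` masks whole units (activations), while `drop_connect` masks individual weights; both use "inverted" scaling by 1/(1-p) so the expected value is unchanged at training time.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p,
    scale the survivors by 1/(1-p) so E[output] == input."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

def drop_connect(W, p=0.5, training=True, rng=None):
    """DropConnect: zero individual weights (connections) with
    probability p instead of whole units."""
    if not training or p == 0.0:
        return W
    rng = rng or np.random.default_rng(0)
    mask = rng.random(W.shape) >= p
    return W * mask / (1.0 - p)
```

At test time both functions are identity maps (`training=False`); the random masking is applied only during training, which is what prevents units from co-adapting.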
Bias – training data error
Variance – test data error
Underfitting – the model is not learning
enough (high bias)
Overfitting – the model is too flexible and
fits noise in the training data (high variance)
Good balance – a trade-off between overfitting
and underfitting (regularize the model)
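The bias-variance trade-off above can be seen in a small NumPy experiment (an illustrative sketch with made-up data): fitting noisy samples of a sine curve with polynomials of increasing degree, the training error always shrinks as the model gets more flexible, while a too-simple model underfits and a too-complex one overfits the noise.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, x.shape)  # noisy training data

x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)  # clean targets for the test error

for deg in (1, 3, 12):  # underfit, balanced, overfit
    coefs = np.polyfit(x, y, deg)
    train_err = np.mean((np.polyval(coefs, x) - y) ** 2)
    test_err = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    print(f"degree {deg:2d}: train MSE {train_err:.3f}, test MSE {test_err:.3f}")
```

The degree-1 fit has high error on both sets (high bias); the degree-12 fit drives the training error lowest but chases the noise (high variance); an intermediate degree balances the two, which is exactly what regularization aims for.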
12/10/2024