Day-4 🧵: Avoid overfitting with the EarlyStopping callback! 🚀
@PyTorchLightnin The EarlyStopping callback automatically stops training once it detects that the monitored metric (for example, validation accuracy) has stopped improving. ⚡️
1/3
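A minimal sketch of the basic usage (assuming your LightningModule logs a "val_loss" metric with self.log):

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import EarlyStopping

# Stop once "val_loss" has not improved for 3 consecutive validation checks
early_stop = EarlyStopping(monitor="val_loss", mode="min", patience=3)
trainer = Trainer(callbacks=[early_stop])
# trainer.fit(model)  # model: your LightningModule that logs "val_loss"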
It also provides additional parameters that stop training at extreme points (see the sketch after this tweet):
1️⃣ stopping_threshold
Stops training immediately once the monitored metric reaches this threshold.
2️⃣ divergence_threshold
Stops training as soon as the monitored metric becomes worse than this threshold.
2/3
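A sketch using both thresholds together (assuming a logged "val_acc" metric; the 0.95 and 0.5 values are just illustrative):

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import EarlyStopping

# Stop early once val_acc is "good enough", or bail out if it clearly diverges
early_stop = EarlyStopping(
    monitor="val_acc",
    mode="max",
    stopping_threshold=0.95,   # metric reached the target -> stop training
    divergence_threshold=0.5,  # metric worse than this -> stop training
)
trainer = Trainer(callbacks=[early_stop])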
3️⃣ check_finite
Stops training if the monitored metric becomes NaN or infinite.
4️⃣ check_on_train_epoch_end
Runs the early stopping check at the end of the training epoch instead of at the end of validation.
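A sketch combining both options (assuming the model logs a "train_loss" metric during training):

from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import EarlyStopping

early_stop = EarlyStopping(
    monitor="train_loss",
    mode="min",
    check_finite=True,              # stop if train_loss becomes NaN or infinite
    check_on_train_epoch_end=True,  # run the check after each training epoch
)
trainer = Trainer(callbacks=[early_stop])
3/3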