Sometimes validation loss < training loss. Ever wondered why? 1/5
The most common reason is regularization (e.g., dropout), since it applies during training, but not during validation & testing. If we add the regularization loss to the validation loss, things look much different. 2/5
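A rough sketch of that comparison (not from the original thread; PyTorch, the toy model, the weight_decay value, and the loader name are all assumptions for illustration): dropout is switched off in eval mode, and the L2 penalty that weight decay effectively adds during training can be added back onto the validation loss so the two numbers are comparable.

```python
import torch
import torch.nn as nn

# Hypothetical model and L2 coefficient, for illustration only.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Dropout(0.5), nn.Linear(64, 1))
criterion = nn.MSELoss()
weight_decay = 1e-4  # assumed to match the coefficient used during training

def l2_penalty(m):
    # Sum of squared weights: the term that L2 regularization adds to the
    # training objective but that a plain validation loss leaves out.
    return sum((p ** 2).sum() for p in m.parameters())

@torch.no_grad()
def regularized_val_loss(m, loader):
    m.eval()  # dropout switches to inference behavior here
    total, count = 0.0, 0
    for x, y in loader:
        loss = criterion(m(x), y) + weight_decay * l2_penalty(m)
        total += loss.item() * len(x)
        count += len(x)
    return total / count
```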
Oh and the training loss is measured *during* each epoch, while the validation loss is measured *after* each epoch, so on average the training loss is measured ½ an epoch earlier. If we shift it by ½ an epoch to the left (where it should be), things again look much different. 3/5
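A minimal sketch of that shift (matplotlib assumed; the loss arrays below are placeholders for your own per-epoch history): plotting the training curve at epoch − 0.5 lines it up with the validation loss measured at the end of each epoch.

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder histories; substitute the per-epoch losses from your own run.
train_losses = np.array([0.90, 0.55, 0.40, 0.33, 0.29])
val_losses = np.array([0.70, 0.50, 0.41, 0.36, 0.34])
epochs = np.arange(1, len(train_losses) + 1)

# The training loss is averaged over each epoch, i.e. measured roughly half
# an epoch before the validation loss, so shift it half an epoch to the left.
plt.plot(epochs - 0.5, train_losses, "o-", label="training loss (shifted ½ epoch)")
plt.plot(epochs, val_losses, "s-", label="validation loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.show()
```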
Or perhaps the val set is easier than the training set! This can happen by chance if the val set is too small, or if it wasn't properly sampled (e.g., too many easy classes). Or the train set leaked into the val set. Or you are using data augmentation during training. 4/5
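One hedged sketch of the sampling point (scikit-learn assumed; the dataset here is synthetic, not from the thread): a stratified split keeps class proportions identical in train and val, and splitting before any preprocessing is fit or augmentation is applied helps avoid train-to-val leakage.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a real dataset.
X, y = make_classification(n_samples=1000, n_classes=3, n_informative=5,
                           random_state=42)

# stratify=y keeps the class mix identical in both splits, so the validation
# set is not accidentally "easier" than the training set. Split *before*
# fitting preprocessing or applying augmentation to avoid leaks.
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)
```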
Even if the val loss is close to the train loss, your model may still be overfitting. Account for the regularization loss when comparing, shift the train loss by half an epoch, and make sure the val set is large, sampled from the same distribution as train, without leaks. 5/5 🦎