
Episode 10 - Learning Curves

Josiah Wang

Summary:

  • Learning curves
    • Visualise the loss as a function of the number of training epochs (see the first sketch after this summary)
    • Useful for deciding when to stop training and for debugging your machine learning algorithm
  • Training loss: Loss computed on the training set
  • Validation loss: Loss computed on the validation/dev set
  • High training and validation loss: Underfitting
  • Low training loss but high validation loss: Overfitting
  • Low training and validation loss: Just right!
  • NOTE (not discussed in video):
    • There are also learning curves that plot the loss/performance against the training-set size rather than the number of epochs (see the second sketch after this summary). Don’t confuse these two types of learning curve!
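
A minimal sketch of the epoch-based learning curve described above, assuming Python with matplotlib (neither is named in the video). The per-epoch loss values are synthetic stand-ins so the example runs on its own; in a real run they would be recorded from your training and validation loops.

```python
# A sketch only: plot training and validation loss per epoch with matplotlib.
import matplotlib.pyplot as plt

num_epochs = 20
train_losses, val_losses = [], []

for epoch in range(1, num_epochs + 1):
    # In a real run, compute these from your training loop and validation set.
    # The formulas below are synthetic stand-ins that mimic an overfitting run:
    # training loss keeps falling, validation loss starts rising after epoch 10.
    train_losses.append(1.0 / epoch)
    val_losses.append(1.0 / epoch + 0.03 * max(0, epoch - 10))

plt.plot(range(1, num_epochs + 1), train_losses, label="Training loss")
plt.plot(range(1, num_epochs + 1), val_losses, label="Validation loss")
plt.xlabel("Epoch")
plt.ylabel("Loss")
plt.title("Learning curve (loss vs. number of epochs)")
plt.legend()
plt.show()
```

The shape of the two curves is what you read off the plot: both losses high suggests underfitting, training loss low but validation loss high suggests overfitting, and the point where the validation loss stops improving is a reasonable place to stop training.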
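
For the second kind of learning curve mentioned in the note (performance against training-set size), here is a minimal sketch using scikit-learn's learning_curve; the digits dataset and logistic-regression model are assumptions made purely for illustration.

```python
# A sketch only: performance vs. training-set size with scikit-learn.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = load_digits(return_X_y=True)

# learning_curve fits the model on increasing fractions of the data and
# cross-validates each fit; the scores here are classification accuracy.
train_sizes, train_scores, val_scores = learning_curve(
    LogisticRegression(max_iter=2000),
    X,
    y,
    train_sizes=np.linspace(0.1, 1.0, 5),
    cv=5,
)

plt.plot(train_sizes, train_scores.mean(axis=1), label="Training accuracy")
plt.plot(train_sizes, val_scores.mean(axis=1), label="Validation accuracy")
plt.xlabel("Number of training examples")
plt.ylabel("Accuracy")
plt.title("Learning curve (performance vs. dataset size)")
plt.legend()
plt.show()
```

Note that this plot shows a score rather than a loss, so higher is better, but the same reading applies: a large gap between the training and validation curves that does not close as more data is added points to overfitting, while both curves plateauing at a poor score points to underfitting.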