Regularisation Techniques: Neural Networks 101
By Egor Howell, Dec 2023


How to avoid overfitting whilst training your neural network

Image credit: neural network icons created by Vectors Tank on Flaticon (https://www.flaticon.com/free-icons/neural-network).
  1. Background
  2. What is Overfitting?
  3. Lasso (L1) and Ridge (L2) Regularisation
  4. Early Stopping
  5. Dropout
  6. Other Methods
  7. Summary

So far in this Neural Networks 101 series, we have discussed two ways to improve the performance of neural networks: hyperparameter tuning and faster gradient descent optimisers. You can check out those posts below:

There is one other set of techniques that aids performance: regularisation. It helps prevent the model from overfitting to the training dataset, leading to more accurate and consistent predictions.

In this article, we will cover a range of methods for regularising your neural network and show how to apply them in PyTorch!
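As a quick preview, here is a minimal sketch of how two of the techniques covered below, dropout and L2 (ridge) regularisation, typically appear in PyTorch. The layer sizes, dropout probability, and weight_decay value are illustrative choices, not ones taken from the article:

```python
import torch
import torch.nn as nn

# A small feed-forward network with a dropout layer between the hidden
# and output layers (dropout is one of the techniques covered later).
model = nn.Sequential(
    nn.Linear(10, 64),
    nn.ReLU(),
    nn.Dropout(p=0.2),   # randomly zeroes 20% of activations during training
    nn.Linear(64, 1),
)

# L2 (ridge) regularisation is commonly applied in PyTorch through the
# optimiser's weight_decay argument, which penalises large weights.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
```

Each of the sections below looks at one of these techniques (and a few others) in more detail.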

If you are enjoying this article, make sure to subscribe to my YouTube Channel!

Click on the link for video tutorials that teach you core data science concepts in a digestible manner!

Let’s quickly recap what we mean by overfitting in machine learning and statistics.
