Regularization

  • Published: 16 Oct 2024
  • Regularization is one way of reducing the total error by automatically finding a good trade-off between bias and variance. To do so, we simply add another term to the loss function that penalizes the complexity of the model. The influence of this additional term is controlled by a scalar, which is an example of a hyperparameter. In this video, we discuss the two most prominent methods, ridge and lasso; a brief illustrative sketch follows below.
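The following is a minimal Python sketch, assuming scikit-learn and purely synthetic data; the `alpha` argument of Ridge and Lasso plays the role of the scalar hyperparameter mentioned in the description, and the data and alpha values are arbitrary illustrative choices.

```python
# Minimal sketch: compare plain least squares, ridge, and lasso on synthetic data.
# (Illustrative only; data, alpha values, and feature count are arbitrary choices.)
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
# Only the first two features actually influence the target.
true_w = np.array([3.0, -2.0, 0, 0, 0, 0, 0, 0, 0, 0])
y = X @ true_w + rng.normal(scale=0.5, size=100)

for model in (LinearRegression(), Ridge(alpha=1.0), Lasso(alpha=0.1)):
    model.fit(X, y)
    print(type(model).__name__, np.round(model.coef_, 2))
```

Ridge shrinks all coefficients toward zero, while lasso tends to set many of them exactly to zero, which is why it is often used for feature selection; increasing `alpha` strengthens the penalty in both cases.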
