Supervised Machine Learning Review

  • Published: Sep 8, 2024
  • A lecture that reviews ideas from supervised machine learning that are relevant for understanding deep neural networks. Includes the statistical machine learning framework, principles for selecting loss functions, and the bias-variance tradeoff. The lecture ends with the surprising double-descent behavior that neural networks can perform well even when highly overparameterized.
    This lecture is from Northeastern University's CS 7150 Summer 2020 class on Deep Learning, taught by Paul Hand.
    The notes are available at: khoury.northeas...

Comments • 2

  • @CRTagadiya · 3 years ago

    What if we don't have Gaussian noise? Could we use MLE with any noise distribution? Does Gaussian noise just give an easy form because of the exponential term?
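    A short sketch of the point behind this question: MLE works for any assumed noise distribution, and each choice induces a corresponding loss. Under Gaussian noise the negative log-likelihood reduces (up to constants) to squared error; under Laplace noise it reduces to absolute error. The toy data and grid search below are illustrative assumptions, not from the lecture.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 1-D regression data (hypothetical): y = 2x + noise
    x = rng.normal(size=100)
    y = 2.0 * x + rng.normal(scale=0.5, size=100)

    def gaussian_nll(w):
        # NLL under y ~ N(w*x, sigma^2) is, up to additive/multiplicative
        # constants, the squared-error loss.
        return np.sum((y - w * x) ** 2)

    def laplace_nll(w):
        # NLL under Laplace noise is, up to constants, the absolute-error loss.
        return np.sum(np.abs(y - w * x))

    # Grid search for the MLE slope under each noise model.
    ws = np.linspace(0.0, 4.0, 401)
    w_gauss = ws[np.argmin([gaussian_nll(w) for w in ws])]
    w_laplace = ws[np.argmin([laplace_nll(w) for w in ws])]

    # Both estimators recover a slope near the true value of 2, but they
    # correspond to different loss functions (MSE vs. MAE).
    print(w_gauss, w_laplace)
    ```

    So yes: if you believe the noise follows some other distribution, minimizing its negative log-likelihood gives you the matching loss; the Gaussian is just the case where the algebra collapses to least squares.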

  • @CRTagadiya · 3 years ago

    If we already have an idea about the error distribution, could we use it?