Adam Optimizer from scratch | Gradient descent made better | Foundations for ML [Lecture 26]

  • Published: 28 Jan 2025

Comments • 8

  • @prashlovessamosa
    @prashlovessamosa 7 days ago

    thanks for creating this series

  • @thulasi39
    @thulasi39 2 days ago

    Thanks for creating the series. Do we have more videos on optimization coming? Also, programming hasn't been covered so far; just curious, when will it be covered?

  • @datagigs5478
    @datagigs5478 5 days ago

    I noticed that in the introduction lecture, it was mentioned that three lectures would be uploaded each week, but it seems there’s now a longer gap between them.

  • @MeftahSuchok
    @MeftahSuchok 12 days ago

    Great content! I am also a mechanical engineering grad but gradually getting interested in ML!

  • @adebolarahman9885
    @adebolarahman9885 13 days ago

    Thank you for this!

  • @buildingforbillions3735
    @buildingforbillions3735 9 days ago

    Please share the link to the code ... where can I get it?
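
While the official code link doesn't appear in this thread, the Adam update the lecture title refers to can be sketched from scratch in a few lines. This is an illustrative sketch, not the lecturer's code: the function name `adam_step`, the hyperparameter defaults, and the toy x² objective are all assumptions, though the update rule itself follows the standard Adam formulation (Kingma & Ba, 2015).

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient (m)
    and the squared gradient (v), with bias correction for the warm-up phase."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (momentum term)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias correction (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 starting from x = 5.
theta = np.array([5.0])
m, v = np.zeros_like(theta), np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta                          # gradient of x^2
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.1)
print(theta)  # theta ends up near the minimum at 0
```

Note how the effective step size is roughly `lr * sign(grad)` early on (because `m_hat / sqrt(v_hat)` is close to ±1), which is what makes Adam robust to the scale of the gradients.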

  • @praveenkoushik9741
    @praveenkoushik9741 8 days ago

    What is the difference between the paid course available on your website and the same free one on RUclips?
    The lectures are the same on both platforms. The only difference I can observe is the certification. Is that all?

    • @vizuara
      @vizuara  6 days ago

      You will have access to:
      1) Handwritten lecture notes
      2) Assignments
      3) Private Discord community
      4) Lecture videos, and
      5) Certificate on course completion