Batch Normalization - Part 1: Why BN, Internal Covariate Shift, BN Intro

  • Published: 8 Sep 2024

Comments • 14

  • @aneesarom 2 days ago

    Bro, start making videos once again, your explanations are very good.

  • @KhozemaPython 3 months ago +1

    Excellent explanation. Will not forget the concept now till going to the grave.

  • @anantmohan3158 1 year ago +2

    Very nicely explained. The concepts were very clear. Thank you for creating a playlist on these conceptual topics. Will complete all 3 videos. Thank you..!

  • @Theo-cn2cy 5 months ago +1

    Fantastic video! Clear explanations on a tough topic (at least for me since I'm a novice in this field). Thank you!

    • @MLForNerds 5 months ago

      Glad it was helpful!

  • @ayinlashehulukman2841 7 months ago +1

    Thanks for doing this video, the concept was well explained. You just got a new subscriber.

  • @andresvega4916 5 months ago +1

    Really good explanation, nice video!

  • @jamesbedichek6106 1 year ago +1

    Good job! This was well done.

  • @akashnayak3752 7 months ago +1

    So good! Thanks a lot.

  • @resurrection355 8 months ago +1

    Keep up the good work!

  • @shredder-31 6 months ago +1

    Great Video ❤

  • @muthukamalan.m6316 11 months ago

    Could you please show this in a CNN as well (batch norm and layer norm)?
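
    For readers with the same question: a minimal NumPy sketch (my own illustration, not taken from the video) of how batch norm and layer norm reduce over different axes of a CNN feature map of shape `(N, C, H, W)`. The variable names (`x_bn`, `x_ln`) and the example shapes are arbitrary choices for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy CNN feature map: N=4 samples, C=3 channels, 8x8 spatial grid.
    x = rng.normal(loc=3.0, scale=2.0, size=(4, 3, 8, 8))
    eps = 1e-5  # small constant for numerical stability

    # Batch norm: one mean/variance per CHANNEL, computed across N, H, W.
    bn_mean = x.mean(axis=(0, 2, 3), keepdims=True)  # shape (1, C, 1, 1)
    bn_var = x.var(axis=(0, 2, 3), keepdims=True)
    x_bn = (x - bn_mean) / np.sqrt(bn_var + eps)

    # Layer norm: one mean/variance per SAMPLE, computed across C, H, W.
    ln_mean = x.mean(axis=(1, 2, 3), keepdims=True)  # shape (N, 1, 1, 1)
    ln_var = x.var(axis=(1, 2, 3), keepdims=True)
    x_ln = (x - ln_mean) / np.sqrt(ln_var + eps)

    # After batch norm, each channel is ~zero-mean across the batch;
    # after layer norm, each sample is ~zero-mean across its own features.
    print(np.allclose(x_bn.mean(axis=(0, 2, 3)), 0, atol=1e-6))
    print(np.allclose(x_ln.mean(axis=(1, 2, 3)), 0, atol=1e-6))
    ```

    The practical upshot: batch norm's statistics depend on the batch (so it behaves differently at small batch sizes and at inference, where running averages are used), while layer norm's statistics are per-sample and batch-independent.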