Why Batch Normalization (batchnorm) Works

  • Published: Feb 9, 2025

Comments • 14

  • @Lewlcat_
@Lewlcat_ 1 year ago +2

    Simple, short, and easy to understand, thank you!

  • @iacobsorina6924
@iacobsorina6924 1 year ago +1

    Short and clear explanation. Thanks!

  • @iiVEVO
@iiVEVO 1 year ago +3

    Thanks for this vid. I'm currently looking at kernel functions, in particular radial basis functions. It would be nice if there was a good visual intuition for this function, maybe in comparison to the linear kernel function, and its application to deep learning.

    • @datamlistic
@datamlistic  1 year ago +1

      Noted, thanks for the suggestion! Most likely I'll create a short series, as I did for object detection. :)

  • @marcinstrzesak346
@marcinstrzesak346 1 year ago

    Very good video, clearly explained, thank you.

  • @rodrigo100kk
@rodrigo100kk 8 months ago +1

    Interesting! Very good video! But why normalize the data in batches and not beforehand, in preprocessing?

    • @datamlistic
@datamlistic  8 months ago

      Well, input data should be normalized in preprocessing. Hopefully I didn't say otherwise in the video. However, you can't normalize the data coming into a certain layer beforehand, since you don't know what the network weights will be at any given step during training. (A sketch illustrating this follows after the comments.)

    • @rodrigo100kk
@rodrigo100kk 8 months ago +1

      @@datamlistic Great explanation! I got it, thank you so much.

    • @datamlistic
@datamlistic  8 months ago

      @@rodrigo100kk Glad I could help! :)

    • @rodrigo100kk
@rodrigo100kk 8 months ago

      @@datamlistic Your channel is pretty awesome. Keep up the great work!

    • @datamlistic
@datamlistic  8 months ago

      @@rodrigo100kk Thank you! I will! :)
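
To make the explanation in the replies above concrete, here is a minimal NumPy sketch (the two-layer network, its shapes, and all names are illustrative assumptions, not taken from the video). The input x can be normalized once in preprocessing, but the hidden activations h depend on W1, which is updated at every training step, so their statistics must be recomputed from each mini-batch on the fly:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical two-layer network; the weights change every optimizer step.
    W1 = rng.normal(size=(4, 8))
    W2 = rng.normal(size=(8, 3))

    def batchnorm(h, gamma, beta, eps=1e-5):
        # Normalize each feature using statistics of the CURRENT mini-batch,
        # since the distribution of h shifts as W1 is updated.
        mu = h.mean(axis=0)                # per-feature mean over the batch
        var = h.var(axis=0)                # per-feature variance over the batch
        h_hat = (h - mu) / np.sqrt(var + eps)
        return gamma * h_hat + beta        # learnable scale and shift

    gamma, beta = np.ones(8), np.zeros(8)

    x = rng.normal(size=(32, 4))   # inputs: CAN be normalized once in preprocessing
    h = x @ W1                     # hidden activations: depend on the current W1...
    h = batchnorm(h, gamma, beta)  # ...so they are normalized per batch at run time
    y = h @ W2

At inference time there may be no batch to draw statistics from, which is why standard batchnorm also keeps running averages of mu and var during training; that detail is omitted here to keep the sketch short.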