Backpropagation in Neural Network with an Example By hand - TensorFlow Tutorial

  • Published: 18 Nov 2024

Comments • 30

  • @torgath5088
    @torgath5088 11 months ago +6

    You are one of the very few who actually show the bias update; most videos are purely about the weight update.

    • @Koolac
      @Koolac  11 months ago +1

      So happy to hear that. Glad you liked the video.
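The bias update praised above can be sketched in a few lines. This is an illustrative single-neuron example (the names `w`, `b`, `lr` and the numbers are mine, not taken from the video): the bias gets its own gradient through the same chain rule as the weight, with the derivative of the pre-activation with respect to the bias simply equal to 1.

```python
def backprop_step(x, y_true, w, b, lr=0.1):
    """One gradient-descent step for a linear neuron with squared-error loss.

    Illustrative sketch, not the video's exact network or numbers.
    """
    # Forward pass: y_pred = w*x + b, loss L = (y_pred - y_true)^2
    y_pred = w * x + b
    # Backward pass: dL/dy_pred = 2 * (y_pred - y_true)
    grad_y = 2.0 * (y_pred - y_true)
    grad_w = grad_y * x    # dL/dw = dL/dy * dy/dw, with dy/dw = x
    grad_b = grad_y * 1.0  # dL/db = dL/dy * dy/db, with dy/db = 1
    # Update both parameters; the bias update is the often-omitted step
    return w - lr * grad_w, b - lr * grad_b

w, b = backprop_step(x=1.0, y_true=2.0, w=0.5, b=0.0)
```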

  • @davinchocamaron646
    @davinchocamaron646 18 days ago

    The best explanation on backpropagation, greetings from Colombia!

  • @ArturFejklowicz
    @ArturFejklowicz 7 months ago +5

    Thank you, after hours of searching this is the first video that properly explains backpropagation.

    • @Koolac
      @Koolac  5 months ago

      You're welcome.
      Glad you liked the video.

    • @mohammadosman4102
      @mohammadosman4102 5 months ago

      Think about it this way: a bias is also a weight. Just add an additional input neuron whose value always equals 1, and voilà! The bias is just a weight.
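The trick described in this comment can be verified in a few lines. A sketch (the vectors `x`, `w` and the bias value are illustrative): append a constant 1 to the input and fold the bias into the weight vector, and the dot product reproduces `w·x + b` exactly.

```python
import numpy as np

# Illustrative inputs, weights, and bias (not the video's numbers)
x = np.array([0.5, -1.0])
w = np.array([0.2, 0.4])
b = 0.3

# Standard form: w·x + b
out_standard = w @ x + b

# Augmented form: treat b as the weight of an extra always-1 input
x_aug = np.append(x, 1.0)   # input gains a constant-1 "neuron"
w_aug = np.append(w, b)     # weight vector absorbs the bias
out_augmented = w_aug @ x_aug
```

With this formulation, the bias update falls out of the ordinary weight-gradient rule for free, which is exactly the point of the comment.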

  • @tejasvinnarayan2887
    @tejasvinnarayan2887 1 year ago +4

    Thank you so much, Koolac, for such a clear and simple explanation of a complex problem!

    • @Koolac
      @Koolac  1 year ago

      Happy to hear that. Thank you so much for your comment and support.

  • @emreyaln7780
    @emreyaln7780 10 months ago +1

    The best explanation I have seen so far, thanks!

  • @xiaofengliu5724
    @xiaofengliu5724 4 months ago +2

    This is the best video to explain backpropagation!

    • @Koolac
      @Koolac  4 months ago

      So happy to hear that.
      It's nice of you.

  • @alamgirqazi1
    @alamgirqazi1 3 months ago +1

    Extremely underrated video.

    • @Koolac
      @Koolac  3 months ago

      I appreciate that.

  • @stephanedibo8167
    @stephanedibo8167 1 year ago

    This course is a true blessing

  • @ivanmounde5046
    @ivanmounde5046 1 year ago +5

    There is a small issue with the input of neuron 2: it uses w3 instead of w2. Otherwise everything is in order.

  • @dilipgyawali1776
    @dilipgyawali1776 3 months ago

    Best tutorial I have found, thank you very much.

  • @SergioEanX
    @SergioEanX 2 months ago

    Excellent example and explanation! Thanks a lot!

  • @Ziv-rv9xb
    @Ziv-rv9xb 6 months ago +1

    The only explanation I understood, thanks!

    • @Koolac
      @Koolac  5 months ago +1

      Glad to hear that.
      Many thanks for your feedback.

  • @khameelmustapha
    @khameelmustapha 1 year ago

    This is such a brilliant explanation.

  • @madankhatri7727
    @madankhatri7727 10 months ago

    Nice explanation.

  • @guillermosainzzarate5110
    @guillermosainzzarate5110 10 months ago

    Thank you!!

  • @propertyhans
    @propertyhans 2 months ago

    Great video, subscribed!

  • @JaHaHa7205
    @JaHaHa7205 28 days ago

    Great!!!

  • @manthiwjs
    @manthiwjs 1 year ago +1

    Well explained. Thank you. :)

    • @Koolac
      @Koolac  1 year ago

      You're welcome. It's nice of you. Many thanks for your feedback and support.

  • @tejasvinnarayan2887
    @tejasvinnarayan2887 1 year ago +1

    How do you include activation functions in backward propagation?

    • @Koolac
      @Koolac  1 year ago +1

      I've done so in the video as well. I even calculated ReLU as an example.
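The question above has a compact answer: an activation function contributes its derivative as one more factor in the chain rule. A sketch with illustrative numbers (not the video's exact example), using ReLU, whose derivative is 1 where the pre-activation is positive and 0 elsewhere:

```python
def relu(z):
    return max(0.0, z)

def relu_grad(z):
    # Derivative of ReLU: 1 for z > 0, 0 otherwise
    return 1.0 if z > 0 else 0.0

# Forward pass: z = w*x + b, a = relu(z), loss L = (a - y)^2
x, y = 2.0, 1.0
w, b = 0.5, -0.25
z = w * x + b   # pre-activation
a = relu(z)     # activation

# Backward pass: dL/dw = dL/da * da/dz * dz/dw
grad_a = 2.0 * (a - y)
grad_z = grad_a * relu_grad(z)  # the activation inserts da/dz here
grad_w = grad_z * x
grad_b = grad_z * 1.0
```

Any differentiable activation slots into the same position: only the `relu_grad` factor changes.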

  • @evanhadi6395
    @evanhadi6395 8 months ago

    Isn't the true value supposed to be 3 and not 2? Or maybe I'm wrong?

  • @johndowling1861
    @johndowling1861 3 months ago

    My only criticism is that you talk too quickly and the terms are not updated between slides, i.e. it would have been helpful to label h1/h2/h3 on the neurons so that we can follow along. Keeping those in your head while trying to learn the math put me over the top; I kept losing track of what h1 meant.
    God, I take that back; this is a mess of derivatives without an explanation of what the partial derivatives mean at each point in time.