You are one of the very few who actually shows the bias update; most videos are purely about the weight update.
So happy to hear that. Glad you liked the video.
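For anyone reading along, here is a minimal sketch of the point made above: in gradient descent the bias gets its own gradient and update, just like the weights. The single sigmoid neuron, input values, and learning rate below are my own assumptions for illustration, not the exact example from the video.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.0])   # example input (assumed)
w = np.array([0.2, 0.4])    # weights
b = 0.1                     # bias
y_true = 1.0                # target
lr = 0.1                    # learning rate

# forward pass
z = np.dot(w, x) + b
y_pred = sigmoid(z)
loss = 0.5 * (y_pred - y_true) ** 2

# backward pass: chain rule through the loss and the sigmoid
dL_dy = y_pred - y_true
dy_dz = y_pred * (1 - y_pred)
dL_dz = dL_dy * dy_dz

dL_dw = dL_dz * x     # dz/dw_i = x_i
dL_db = dL_dz * 1.0   # dz/db = 1, so the bias gradient is just dL_dz

# gradient-descent updates: the bias is updated the same way as the weights
w -= lr * dL_dw
b -= lr * dL_db
```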
The best explanation on backpropagation, greetings from Colombia!
Thank you. After hours of searching, this is the first video that properly explains backpropagation.
You're welcome.
Glad you liked the video.
Think about it this way: the bias is also a weight. Just add an additional neuron whose value is always equal to 1.
Voilà! The bias is just a weight.
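A quick sketch of that trick, with made-up weights and inputs just for illustration: append a constant 1 to the input and the bias to the weight vector, and the pre-activation comes out the same.

```python
import numpy as np

x = np.array([0.5, -1.0])   # assumed inputs
w = np.array([0.2, 0.4])    # assumed weights
b = 0.1                     # bias

# usual pre-activation: w·x + b
z_with_bias = np.dot(w, x) + b

# same thing with the bias treated as an extra weight on an input fixed at 1
x_aug = np.append(x, 1.0)   # [x1, x2, 1]
w_aug = np.append(w, b)     # [w1, w2, b]
z_as_weight = np.dot(w_aug, x_aug)

assert np.isclose(z_with_bias, z_as_weight)
```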
Thank you so much, Koolac, for such a clear and simple explanation of a complex problem!
Happy to hear that. Thank you so much for your comment and support.
The best explanation I have ever seen so far, thanks.
This is the best video to explain backpropagation!
So happy to hear that.
It's nice of you.
Extremely underrated video.
I appreciate that.
This course is a true blessing
There is a small issue with the input of neuron 2: it has taken w3 instead of w2. Otherwise everything is in order.
The best tutorial I have ever found, thank you very much.
Excellent example and explanation! Thanks a lot!
The only explanation I understood, Thanks
Glad to hear that.
Many thanks for your feedback.
This is such a brilliant explanation.
nice explanation
Thank you!!
Great video, subscribed.
Great!!!
Well explained. Thank you. :)
You're welcome. It's nice of you. Many thanks for your feedback and support.
How do you include activation functions in backward propagation?
I've done so in the video as well. I even calculated ReLU as an example.
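For readers with the same question, here is a small sketch of how an activation function enters the backward pass, using ReLU as mentioned in the reply above. The layer values and incoming gradient are assumptions for the example only, not the numbers used in the video.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def relu_grad(z):
    # derivative of ReLU: 1 where z > 0, 0 otherwise
    return (z > 0).astype(float)

z = np.array([-0.3, 0.8])       # pre-activations of a hidden layer (assumed)
a = relu(z)                     # activations

dL_da = np.array([0.1, -0.2])   # gradient flowing back from the next layer (assumed)

# chain rule: multiply the incoming gradient by the activation's derivative at z
dL_dz = dL_da * relu_grad(z)
```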
Isn't the true value supposed to be 3 and not 2? Or maybe I'm wrong?
My only criticism is that you talk too quickly and the terms are not updated between slides, i.e. it would have been helpful to label the neurons h1/h2/h3 so that we can follow along. Keeping those in my head while trying to learn the math put me over the top; I kept losing track of what h1 meant.
God, I take that back. This is a mess of derivatives without an explanation of what the partial derivatives mean at each point in time.