Dude, I can't thank you enough. You should teach teachers how to teach. Feels like I could never get a degree if it weren't for Indians. Every helpful video I can find, it's always you guys. Bless you.
💀
The details and clarity are so good 👍
He doesn't know what he's doing.
Very worst
I started watching this video muted. I was thinking that he explained how RNNs work 😉
Finally, someone who actually explains how to backpropagate!
Those videos are priceless if you are learning AI.
First we need to see it in biology
@@johncharalambous2488 not psychology logically
@@johncharalambous2488 yes, it's just a theory, no need for demotivation or motivation, just knowledge
@@johncharalambous2488 well, what do you mean by chill
Yes, you are a genius. Your explanation is the best one on backward propagation I have ever come across. Thanks.
OMG! you are amazing!
Wow, I have never seen teaching made that simple and easy! Now I understand the basics of neural networks! Especially the hidden layer's processing. Thank you very, very much!
I was looking for something like this for a long time. Finally someone knows exactly what we need to know and provides us something different from the others. Thank you, thank you.
Honestly one of the best explanations out there for backpropagation, and for how the errors affect the network in general
Explained all that stuff so easily. Other videos on YouTube, trying to make the concept more intuitive, made it more difficult to grasp.
thank you so much!!
This video is the best I've seen on YouTube. Whiners would benefit from using the pause button and going back as many times as necessary. All the information is there and is thoroughly explained. It's a complicated subject and it's unrealistic to "get it" right away.
*Give this man a medal*
Great work.
Going step by step through the back propagation was super helpful. Didn't miss a beat! Great explanation.
This guy taught me what my teachers couldn't in a whole f*cking semester, and that too within 13 minutes. Tons of respect to you, man!! May God bless you.
This video is just what I needed! You're one of the only people... if not the only one... that went through the process with actual examples of inputs and outputs so that I could check my code each step of the way. THANKYOUUUU
BEST EXPLANATION! I've been trying to get a hold of the idea and never really grasped it until I saw this!
It's incredible. He was the first one who actually showed me each step with examples and not just formulas no one can read
Crystal Music exactly
@@Sh1r449 Are you still active in artificial intelligence ?
I heartily appreciate the effort that you put into this video, especially those solved examples. Thanks!
Thanks, I have been struggling with backpropagation for 2 days, and you helped me within 12.44 min
This is the best explanation I have ever seen for backpropagation.
Thanks for your explanation! This is the most quantitative example of backpropagation I have been searching for a long time. It truly helps me understand the mathematics behind this algorithm.
Immersive, intuitive, answering the exact questions that come up in my mind before I even ask, completely on my train of thought. Appreciated it, dude! The only imperfection is I couldn't follow your accent very well, but you know what, amazingly, even though I couldn't completely understand what you're saying, I understood what you're saying. You know how neurons work; you know even more how human brains work. Brilliant!
It is epic! I know a lot of people have appreciated your work, I just wanna add a big THANK YOU.
Naveen thank you, the example is a great description of how a backpropagation perceptron works. Among the many videos I have seen today, this is the best one.
Ohh Man!!!!
I couldn't appreciate you more than others already have in the comments below. You are a hidden treasure. Hope that Google shows your videos in the first result.
You have put a lot of effort into making this video & it is to the point!!
God bless you and keep posting! :)
Brilliantly, Simply and Clearly explained. Good job!
Finally, I understand something about backpropagation. Thank you so much for your mathematical explanation.
Finally got someone who explained well, very very awesome video
Thank You So much. Crisp and Clear explanation. I was not able to understand this concept earlier but you made it crystal clear for me. Great job!!!!
Once we get used to your pace and accent, this is a very useful tutorial walking us through each step. I like your style, many thanks for your patience.
Mr. Naveen Kumar
Excellent job, could you add
1 - recursive neural network
2 - convolutional neural network
3 - TensorFlow
and anything related to NN you think is necessary.
Thanks very much Sir
Good Luck
Robert
Thank you so much for clearing everyone’s mind by doing a specific example. Beautifully done. That’s awesome.
Great tutorial m8. The one thing that I could not find a clear explanation of was updating the weights and you did an excellent job at explaining it. Thanks for the video.
I really want to have your notebook. You explained so well.
Perfect numerical example for my exam tomorrow. Thanks, buddy
Now I am in your position
@@beletetekle5593 Now I am in your position
@@notaladeen4156 best wishes
same position dude
You made really good content. I have seen many animated illustrations and other articles as well, but this is by far the best. Very easy to understand. An adequate mixture of formulae and numerical values to teach the content. Thank you so much for this wonderful work.
Full concept made clear with a working example.... awesome video sir
Thank you very much bro, you explained updating the weights in a neural network really really well !!!🙏🏽🙏🏽🙏🏽
Awesome video, it was all numbers and formulas until I watched this. Everything fell into place. Thanks!
Excellent Naveen. Thank you so much for explaining this.. You explained this so clearly even where the renowned institutes failed to do so.. I was searching for this explanation for so long.. Thank you so very much..........
Best video on the net for backpropagation. Much better than all those big names from Harvard, Stanford, MIT etc.
I like this kind of teaching. Nowadays nobody wants to write, and the chances of mistakes are so high. Practice makes a person perfect and writing makes a person exact.
Numerical examples are the best way to explain the concept. Thanks for posting. Just a small point w.r.t. the total error: the generic formula is the average of the squared losses across all neurons in the output layer. The calculation is correct here as there are 2 neurons in the output layer.
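As a minimal sketch of the point above, here is a tiny Python check with made-up target/output values (not the numbers from the video), assuming the per-neuron error is the usual ½(target − output)² as the surrounding comments suggest: with exactly two output neurons, the video-style sum of half squared errors and the averaged ("generic") form give the same total.

```python
# Hypothetical targets and outputs for a two-neuron output layer; not the video's numbers.
targets = [0.01, 0.99]
outputs = [0.75, 0.77]

# Video-style total error: sum of half squared errors over the output neurons.
e_total_sum = sum(0.5 * (t - o) ** 2 for t, o in zip(targets, outputs))

# "Generic" form from the comment above: average of the squared losses.
n = len(targets)
e_total_mean = sum((t - o) ** 2 for t, o in zip(targets, outputs)) / n

# With n == 2 the 1/n factor equals the 1/2 in each term, so both give the same number.
print(e_total_sum, e_total_mean)
```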
Best explanation of backward propagation.. Thank you.
This is awesome! After going through a series of backpropagation videos, someone actually explains the concept with a clear example. Thumbs up!
Thanks for the video! The concept is clearly explained and simplified enough to understand. There could be improvement in the delivery.
One of the best explanations so far
Dude, you are a PURE GEM. Can't thank you enough.
Brilliantly done! I’m very grateful for this work that you are sharing in this wonderful video you have made. Thank you
Really superb! Even after doing a lot of courses, I could not understand. But after your explanation, I understood. Thanks a lot
Even though I had a hard time understanding your accent, you gave me the idea and I got your point, thanks
I had to leave a Video tutorial by IIT faculty and tried this video, I understood the concept much better here.
Thanks. Extremely practical way of understanding how the neural networks work. Good job
very nice explanation. Superb.
You just kept saying "this", "this" and "this". It was hard catching up 😩 but thank you. Cleared most of my confusion in the first part
10/10....
Thanks a lot...
Absolutely clear. I have been looking for such a video for a long time.
Best example I have ever seen on this topic. I cannot wait to thank you. You are the Star*
Thank you for this nice example. It's a good resource for explaining to students the fundamentals of forward and back propagation. I found your presentation, including your English, quite clear.
I have started learning NN. This is a very good video, with an explanation of the algorithm. Thank you
Thanks for such a wonderful explanation of backpropagation in a neural network with an example. It makes the concept of backpropagation very clear.
Thank you so much for this! I finally understood how backpropagation works!
You are an excellent teacher sir, thank you.
This undoubtedly is a real quality video.
Liked your explanation of the concept. As some people have mentioned, you could improve the narration by using a script and pacing better. But I am not complaining. The explanation itself covers the ground-level calculations that many others lack, and the written material is very helpful.
At 9:27, (dEtotal/dOutH1) = (dE1/dOutH1) + (dE2/dOutH1). Since E2 also contains the term OutH1, coming from y2 = w7*OutH1 + w8*OutH2 + b2, there would be a similar calculation for that term as well.
Thank you!
I agree with you, my dear friend
Yes, that is a mistake. dE2/dOutH1 and dE2/dOutH2 confuse people in this video.
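To make the chain-rule point discussed in this thread concrete, here is a small Python sketch. The numbers and the layer equations (y1 = w5*OutH1 + w6*OutH2 + b2 alongside the y2 = w7*OutH1 + w8*OutH2 + b2 quoted above, sigmoid outputs, and E_i = ½(target_i − out_i)²) are hypothetical placeholders in the spirit of the video's notation, not its actual values. It checks that dEtotal/dOutH1 = dE1/dOutH1 + dE2/dOutH1 by comparing the analytic sum against a finite-difference estimate.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical values, not the ones from the video: hidden activations, the weights
# and bias feeding the two output neurons, and the two targets.
outH1, outH2 = 0.59, 0.60
w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55
b2 = 0.60
t1, t2 = 0.01, 0.99

def total_error(outH1, outH2):
    # Forward pass from the hidden activations to the total error.
    y1 = w5 * outH1 + w6 * outH2 + b2          # net input of output neuron 1
    y2 = w7 * outH1 + w8 * outH2 + b2          # net input of output neuron 2
    outO1, outO2 = sigmoid(y1), sigmoid(y2)
    return 0.5 * (t1 - outO1) ** 2 + 0.5 * (t2 - outO2) ** 2   # E1 + E2

# Analytic derivative: BOTH error terms contribute, because OutH1 feeds y1 and y2.
y1 = w5 * outH1 + w6 * outH2 + b2
y2 = w7 * outH1 + w8 * outH2 + b2
outO1, outO2 = sigmoid(y1), sigmoid(y2)
dE1_dOutH1 = (outO1 - t1) * outO1 * (1 - outO1) * w5
dE2_dOutH1 = (outO2 - t2) * outO2 * (1 - outO2) * w7
analytic = dE1_dOutH1 + dE2_dOutH1

# Finite-difference estimate of dEtotal/dOutH1 for comparison.
eps = 1e-6
numeric = (total_error(outH1 + eps, outH2) - total_error(outH1 - eps, outH2)) / (2 * eps)

print(analytic, numeric)   # the two values agree to several decimal places
```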
Good effort...thanks for explaining the concept mathematically
SO nicely and clearly explained, many thanks Sir!!
Fantastic video, really clear. I searched a lot for something as clear as your video! Thanks
Excellent Explanation! Thank you for all the effort that you have put in making this video!
At last, I understood the concept of Back Propagation after wasting lots of time.
Excellent job, hats off, really you are a genius.
Keep it up.
He copied it from a website but this is easier to digest.
Thanks Naveen. I now understand clearly how the machine is processing.
Finally a great video ❤❤❤❤❤
Finally Understood Backpropagation
Thank You
The best Neural network explanation video. Thank you so much :)
A really good explanation to understand back propagation
Very well explained, thank you for putting in so much effort sir
So I finally understood how the algorithm works!
Thank you so much! Keep up the good work.
Bro, can you tell me why out H1 and H1 are different? In the lectures of Andrew Ng, out H1 and H1 are the same.
@@krishnanarwani8105 No, even in Andrew Ng's course they are not the same; there's a slight difference between them. H1 is calculated as the summation of the products of the previous activations with the weights, i.e. H1 = x1*w1 + x2*w2 + ... and so on.
While out H1, i.e. the activation of H1, is calculated by passing that result (H1) through the sigmoid function.
So, activation of H1 or out H1 = sigmoid(H1)
So, both are different. Hope it helps!
@@shamazafar8127 yes you are right brother
@@krishnanarwani8105 ya i am a girl btw! :D
@@shamazafar8127 ok😂
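A tiny Python sketch of the distinction explained in the replies above, with made-up inputs and weights (no bias term, exactly as in the reply): H1 is the weighted sum coming from the previous layer, and out H1 is that sum passed through the sigmoid, so they are two different numbers.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical inputs and weights; not the numbers used in the video.
x1, x2 = 0.10, 0.50
w1, w2 = 0.20, 0.30

H1 = x1 * w1 + x2 * w2   # net input: weighted sum of the previous layer's values
outH1 = sigmoid(H1)      # activation of H1: the net input passed through the sigmoid

print(H1, outH1)         # 0.17 vs ~0.542, so H1 and out H1 are clearly not the same
```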
This was a great example, even the math was explained and how you substitute the values to go forwards and backwards. Great tutorial!!
Very detailed and clear explanation. Thank you for your work!
God gave me the opportunity to see your video; it's very useful. Keep rocking
Dear Naveen, thank you very much! I thank you for explaining every step. Just pen and paper with each step. Great
Nicely done! Thanks for explaining this in detail!
Thank you so much! Your video helped me clarify the backpropagation algorithm. Simple and elegant explanation.
This video is priceless ! cant thank you enough
Best and simple explanation ever. Thank you so much dude ❤️
Thank you so much for this video! I have a test next week on this stuff and was stressing out because I didn't understand it, but after watching this video I feel so confident and prepared!
He is teaching from the book by Tariq Rashid.. just search for it on libgen.. it is a great book!
Sure thanks for letting me know!
Best video about backpropagation out there
Totally agree
Great explanation sir, one thing worth mentioning. If you use the squared error function to determine your error, you will run into issues when you try to optimize your parameters: you will have multiple local minima and no convergence to a single optimal value. You should probably use -(1/n) Σ [target*log(output) + (1 - target)*log(1 - output)] instead.
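For reference, a minimal Python sketch of the cross-entropy loss this comment suggests, with made-up target/output values; it assumes the network's outputs come from a sigmoid and lie strictly between 0 and 1 so the logarithms are defined, and it is not the loss the video itself uses.

```python
import math

# Hypothetical 0/1 targets and sigmoid outputs in (0, 1); not the video's numbers.
targets = [0.0, 1.0]
outputs = [0.25, 0.80]
n = len(targets)

# Binary cross-entropy: -(1/n) * sum( t*log(o) + (1 - t)*log(1 - o) )
bce = -sum(t * math.log(o) + (1 - t) * math.log(1 - o)
           for t, o in zip(targets, outputs)) / n

print(bce)
```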
What I was looking for is just an example. Other videos only have theory. And I finally found one that has an example. Jesus.
Thanks Naveen for the video. After searching a lot, I finally got the information. Thank you.
Fantastic video. Thanks for your time and effort. You made my day.
Thank you very much for this wonderful explanation!
I wish I had seen it 6 months ago! This is just perfect :(
Excellent video explaining the backpropagation network. Congratulations on the effort undertaken...
Thanks Naveen. It helped me a lot. Keep up this good work. Thanks again for sharing.
Thank you very much sir. Very helpful and made easy. Sending love
wow someone who actually explained it properly!
I'm in love with his pedagogy ....
Dude .... thank you for this explanation.
Thank you very much sir, great video and explanation
Very well explained. Thank you for your video
Wow, you explained very well. Thanks. I got the point
Thank you. Simple and effective.
Thank you so much sir, your video helped a lot. I don't find such fine material anywhere. Keep it up, good job 🖒
You are too good bro.. I am new to this machine learning. In fact I watched a few videos on it.. but couldn't get a clear picture of it. After watching your video.. I understood the concept. The paper-and-pen model you teach.. awesome. Would love to see more videos of yours.
This is one of the amazing videos. Although I am not a very technical or statistical person, this video cleared up a few concepts for me about the neural network - the backpropagation and feedforward parts, and how to calculate the errors and update the weights to reach the target value. I am hoping to see how to adjust the biases and variances to get efficiency in my model. Please explain those as well.