The most intuitive and beautiful explanation that I have seen for backpropagation.
Thank you very much Mihaela. I am actually blushing.
Excellent example for back prop and chain rule!
This explanation was insane! What a level of abstraction, pal. Thanks for the effort. I'm a mechanical engineer and I'm beginning a career change to software development. And let me tell you, this analogy just flew like a laminar, non-viscous fluid toward me, hehe. Also, your channel looks great. Subscribed. Greetings, friend. Great work. 🤙
Many thanks Andrés! I started life as a mechanical engineer too. I'm happy to hear it flowed :)
@@BrandonRohrer Hahah, cool. That encourages me even more. My undergrad thesis was on observer-based nonlinear control, and it seems to have a lot to do with this. I expect to get started with Python next month and try ML some time soon. 🤙 ... hey, and thanks for correcting the verb 😆 hehe. Regards.
Modern schools focus too much on arithmetic and not enough on math notation. Thank you for this vid; it helped me a lot.
You're a great teacher, Brandon. I joined your course based on these openly available videos. Great work, buddy!
Thanks John! I'm happy they've been so helpful.
Very good, thank you!
Just one constructive criticism: the illustration (variable names) at 4:45 should have been visible all the time (in small format) for people with bad memory like me. ;)
Agreed, I like to see the formulas a lot. ;)
I immensely support this
This is underrated; I find this very useful. I am not a smart student at college, but this explanation is excellent.
The MIT video talked about how a small change in weight_2 (y in your video) will affect the result of the ML model. I was confused because of my lack of understanding of calculus, but you gave a proper real-life example.
Thanks for the knowledge.
Thank you Alexander. I'm really happy to hear it.
Wow, thanks Brandon... Backpropagation is a difficult subject... great to have such a clear, analogy-based resource that ties things back to tangible concepts like shower head flow :D Very cool!!
Edit: And this is the best explanation of the chain rule !!
You are freaking awesome, man!!!
Unbelievable how you make it so easy & intuitive.
I hope they teach like this in classes.
Hats off to you.
OK, so you are saying that when I make an adjustment to the valves, after it back-propagates, the actual sensitivity increments on the valves also change?
Long-awaited Brandon Rohrer video!!!!
Perfect! Very good explanation of back prop.
I understand everything up to 10:50. If we only have an error, y' - y, how can we define dx, dy, dm and dh? Do we start out with a random delta for all of them and run it twice? And if so, what is dy? dy of the random delta, dy of the error, or y' - y?
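In case it helps another reader, here is a minimal sketch of one way the deltas fall out of the error alone via the chain rule, with no random deltas and no second run needed. The variable names follow the video, but the target, measurement, sensitivities and learning rate below are made-up values, not the ones from the video.

# Hedged sketch: error plus sensitivities -> all the deltas
y_target = 7.0                 # y', the flow you want at the shower head
y_actual = 9.0                 # y, the flow you actually measured
dE_dy = y_actual - y_target    # slope of E = 0.5 * (y - y')**2 with respect to y

# Hypothetical sensitivities (local slopes) at the current settings
dy_dm = 0.5                    # how y changes when m changes
dm_dh = 2.0                    # how m changes when h changes
dh_dx = 0.25                   # how h changes when the valve position x changes

# The chain rule walks the error back one step at a time
dE_dm = dE_dy * dy_dm
dE_dh = dE_dm * dm_dh
dE_dx = dE_dh * dh_dx

x = 4.0                        # current valve position (made up)
learning_rate = 0.1
x = x - learning_rate * dE_dx  # nudge the valve against the gradient
print(dE_dx, x)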
Love it! Only one minor criticism on the didactics: Brandon, you SAY that the shower handle position goes from 1 to 10, but in the presentation it goes from 1 to 9. It is absolutely not relevant to understanding, but such minor flaws often disturb inexperienced learners. Just a thought.
No offense, but that's more of an OCD issue, since neither 9 nor 10 was used to illustrate the point.
Brandon: Nicely explained. Thx!
Hi Brandon, a very good explanation of backpropagation. There's a small glitch at 16:33 where you bring up temperature, but I assume shower head water flow was meant.
Oops! Of course you are correct. Good catch. I've corrected it in the transcript.
Hey Brandon,
have you written any books?
If yes, please share a link, and if not, please write one for us.
Thanks Hassan :)
I haven't written a book yet unless you count this as a meandering list of half-finished chapters: e2eml.school
At 16:33 you said temperature, but we were adjusting the flow rate.
Are the sensitivities constant in this example? If not, why did we calculate them at the beginning?
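One hedged way to picture this (not from the video): if the pipes respond nonlinearly, a sensitivity is only a local slope, so you re-estimate it at whatever setting you are currently at, for example with a small finite-difference nudge. The pipe() function and the numbers below are purely hypothetical.

def pipe(x):
    # Hypothetical nonlinear valve-to-flow relationship, just for illustration
    return x ** 0.5

def sensitivity(f, x, eps=1e-3):
    # Local slope df/dx at the current operating point (finite difference)
    return (f(x + eps) - f(x - eps)) / (2 * eps)

print(sensitivity(pipe, 4.0))   # slope near x = 4 (about 0.25)
print(sensitivity(pipe, 9.0))   # a different slope near x = 9 (about 0.17), so not constant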
Great visualization with the pipe filter! I would really like a backprop through time in rnns video as well if you are interested!
Thank you! RNN and CNN backprop examples are on my roadmap as well.
What's not clear is where the iterations are. I would like to have seen at least 2 iterations worked through with the real-life numbers you started with.
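For anyone else wondering, here is a rough sketch of where the loop would sit. The numbers are made up rather than the ones from the video: measure the flow, compute the error, back-propagate it through an assumed sensitivity, nudge the valve, and repeat.

# Toy system: flow y depends linearly on valve position x (hypothetical)
def shower(x):
    return 2.0 * x               # so the sensitivity dy/dx is 2

y_target = 7.0
x = 1.0                          # starting valve position (made up)
dy_dx = 2.0                      # assumed known or estimated sensitivity
learning_rate = 0.1

for step in range(2):            # two iterations, as requested
    y = shower(x)                # forward pass: what flow do we get?
    dE_dy = y - y_target         # slope of E = 0.5 * (y - y_target)**2
    dE_dx = dE_dy * dy_dx        # backward pass: push the error onto the valve
    x = x - learning_rate * dE_dx
    print(step, round(y, 3), round(x, 3))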
Wow! You have the best videos.
What does this look like in a diagram? It's hard to visualize the positioning.
You could say it more simply for those who are only a little familiar with the technology: machine learning backpropagation is the equivalent of the industry's 50-year-old PID controller.
Nice to see you back with an amazing video. I was wondering if the video courses (on e2e) you have created can be used for teaching purposes?
Hi Mayank. Thank you. And absolutely yes! I encourage teachers to use my materials however they can. There are group discounts for the paid content if that's the direction you'd like to take your class.
I understand the purpose of all this, but can someone explain it to me with a concrete application? I'm struggling to apply the "curly d" with real numbers.
BTW, it's an incredibly great job you just did.
Thanks! Yes, you can absolutely follow along with applying this in a neural network in End-to-End Machine Learning Course 312: end-to-end-machine-learning.teachable.com/p/write-a-neural-network-framework
Hi Brandon. Huge fan. Planning to buy your whole bundle when I get time to look it over. Can you recommend a starting route? Where should I start? I think someone should understand machine learning before deep learning, so I think I should start with ML before DL. Any help? Thanks for your efforts.
Thanks bekonyn! I'm very happy to hear it. There isn't a strict order to the courses, except for 312 and 313, basics and advanced neural networks. Other than those two, you can work through the courses in any order. Where there are soft pre-requisites or supplementary material I call it out. But you can feel free to let your curiosity direct you.
If you are feeling a little lost then feel free to start at 171 and proceed in numerical order. Enjoy!
Thank you so much Brandon
So it’s a feedback control system with the control parameters in a matrix.
Is it possible to use this one as an intro, and then have a follow-up video that is slightly more technical, with Python?
The videos describing the Python implementation of this are part of the neural network fundamentals in End-to-End Machine Learning Course 312: end-to-end-machine-learning.teachable.com/p/write-a-neural-network-framework
Please explain to me how we get d(y)/d(x) = 1/4 at 7:02 instead of 1/2, as calculated at 6:06? I have broken my head over this.
The main valve handle’s illustration not being symmetric was kinda twitching me out. 😅
But this video is very good. I’ll share this with students.
Ha, back after a very long time! Please upload regularly; I'm waiting for your videos.
Great... just great.
interesting analogy
Best video on backprop I've seen so far, and I've watched dozens. However, this video still fell short for me. The first half was great: you used real numbers so I could see and understand. In the second half you completely abandoned using numbers and only used letters. It's easy for me to understand when you write 8 divided by 4, but I don't understand h/x. Please stick to numbers for those of us who don't work with algebraic notation on a daily basis. Having both is OK, but you lost me when you dropped the numbers completely.
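To put made-up numbers back on the letters (these values are not from the video): h/x is shorthand for "how much h changes per unit change in x", and the chain rule just multiplies those ratios together.

# Hypothetical slopes, only to attach numbers to the letters
dh_dx = 2.0    # h changes by 2 when x changes by 1
dm_dh = 0.5    # m changes by 0.5 when h changes by 1
dy_dm = 3.0    # y changes by 3 when m changes by 1

# Chain rule: dy/dx = dy/dm * dm/dh * dh/dx
dy_dx = dy_dm * dm_dh * dh_dx
print(dy_dx)   # 3.0, so y changes by about 3 when the valve x changes by 1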
Perfect video
Better to check with code...
thanks
Using variable names made gaining the intuition unnecessarily more difficult than it needed to be, at least for me. I constantly had to go back: wait, what was x, what was y, m, h, w, and so on, and it broke the flow. I would have preferred using the verbose but understandable version, and only once the intuition is there, putting labels on it.
Great video nonetheless!
I feel useless
My head is gonna explode
Thx!!!
You always explain things in an impressive way. Great work. Can you please give me your email ID?
Way too much for a beginner's tutorial. Most BP intro tutorials are done with simple truth tables.
First