How Backpropagation Works

  • Published: 20 Oct 2024

Comments • 62

  • @PotatoMan1491 · 1 month ago · +1

    Excellent example for back prop and chain rule!

  • @astaconteazamaiputin · 4 years ago · +1

    The most intuitive and beautiful explanation that I have seen for backpropagation.

    • @BrandonRohrer · 4 years ago

      Thank you very much Mihaela. I am actually blushing.

  • @andresroca9736 · 3 years ago · +5

    This explanation was insane! What a level of abstraction, pal. Thanks for the effort. I'm a mechanical engineer and I'm beginning a career change to software development. And let me tell you that this analogy just flew like a laminar non-viscous fluid towards me hehe. Also your channel looks great. Subscribed. Greetings, friend. Good work. 🤙

    • @BrandonRohrer · 3 years ago · +1

      Many thanks Andrés! I started life as a mechanical engineer too. I'm happy to hear it flowed :)

    • @andresroca9736 · 3 years ago · +2

      @BrandonRohrer Hahah, cool. That encourages me even more. My undergrad thesis was on observer-based nonlinear control and it seems to have a lot to do with this. I expect to get started with Python next month and try ML some time soon. 🤙 ... hey, and thanks for correcting the verb 😆 hehe. Greetings.

  • @ViralKiller · 1 year ago

    OK so you are saying that when I make an adjustment to the valves, after it back propagates, the actual sensitivity increments on the valves also change?
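
    In general, yes. The sensitivities are derivatives evaluated at the current valve settings and the current error, so after the valves are adjusted, the next backward pass computes new sensitivity values. A minimal sketch of that idea, assuming a single sigmoid unit and made-up numbers (not the video's):

        import math

        def sigmoid(z):
            return 1.0 / (1.0 + math.exp(-z))

        x = 1.5
        for w in (0.2, 0.8, 1.4):          # three different "valve" settings
            y = sigmoid(w * x)
            dy_dw = y * (1.0 - y) * x      # sensitivity of the output to the valve
            print(f"w = {w}: sensitivity dy/dw = {dy_dw:.3f}")
        # The printed sensitivity is different for each w, so after an update
        # the sensitivities do change.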

  • @satellite964 · 5 years ago · +1

    Modern schools focus too much on arithmetic and not enough on math notation. Thank you for this vid, helped me a lot.

  • @leonardjohnny67 · 4 years ago · +1

    You're a great teacher Brandon. I joined your course based on these openly available videos. Great work buddy!

    • @BrandonRohrer · 4 years ago

      Thanks John! I'm happy they've been so helpful.

  • @windar2390 · 5 years ago · +28

    very good, thank you!
    just one constructive criticism: the illustration (variable-names) at 4:45 should have been visible all the time (in small format) for people with bad memory like me. ;)

    • @Actanonverba01 · 5 years ago

      Agreed, I like to see the formulas a lot. ;)

    • @romanavr · 4 months ago

      I immensely support this

  • @mapperid · 3 years ago · +1

    This is underrated; I find it very useful. I am not a smart student at college, but this explanation is excellent.
    The MIT video talked about how a small change in weight_2 (y in your video) affects the result of the ML model. I was confused because of my lack of understanding of calculus, but you give a proper real-life example.
    Thanks for the knowledge

    • @BrandonRohrer · 3 years ago

      Thank you Alexander. I'm really happy to hear it.

  • @BinaryReader · 5 years ago · +14

    Wow, thanks Brandon... Backpropagation is a difficult subject... great to have such a clear, analogy-based resource that ties things back to tangible concepts like shower head flow :D Very cool!!
    Edit: And this is the best explanation of the chain rule!!

  • @mjar3799 · 5 years ago · +2

    You are freaking awesome, man!!!
    Unbelievable how you make it so easy & intuitive.
    Hope they teach like this in classes.
    My hat is off to you.

  • @Icenri · 5 years ago · +5

    Long awaited Brandon Rohrer video!!!!

  • @dr.michaelr.alvers17 · 5 years ago · +3

    Love it! Only one minor criticism on didactics: Brandon, you SAY that the shower handle position goes from 1 to 10, but in the presentation it goes from 1 to 9. It is absolutely not relevant to understanding, but such minor flaws often disturb inexperienced learners. Just a thought.

    • @ntt2k · 4 years ago

      No offense, but that's more of an OCD issue .. since neither 9 nor 10 was used to illustrate the point

  • @jjolla6391 · 4 years ago · +2

    What's not clear is where the iterations are. I would have liked to see at least two iterations worked through with the real-life numbers you started with.
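
    A rough sketch of what two iterations could look like, in Python, with hypothetical numbers (one weight w, input x, a target, and a squared error) rather than the video's actual values:

        # Two iterations of: forward pass -> gradient via chain rule -> weight update.
        x, target = 3.0, 6.0
        w, lr = 0.5, 0.02

        # Iteration 1
        y = w * x                         # 0.5 * 3 = 1.5
        dE_dw = 2 * (y - target) * x      # 2 * (1.5 - 6) * 3 = -27.0
        w = w - lr * dE_dw                # 0.5 - 0.02 * (-27.0) = 1.04

        # Iteration 2: same formulas, re-evaluated at the new weight
        y = w * x                         # 1.04 * 3 = 3.12
        dE_dw = 2 * (y - target) * x      # 2 * (3.12 - 6) * 3 = -17.28
        w = w - lr * dE_dw                # 1.04 - 0.02 * (-17.28) = 1.3856
        print(w)                          # about 1.3856, creeping toward the ideal 2.0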

  • @louis9116 · 2 years ago

    Are the sensitivities constant in this example? If not, why did we calculate them at the beginning?

  • @mayankpj · 4 years ago · +2

    Nice to see you back with a very amazing video. I was wondering if the video courses (on e2e) you have created can be used for teaching purposes?

    • @BrandonRohrer · 4 years ago · +1

      Hi Mayank. Thank you. And absolutely yes! I encourage teachers to use my materials however they can. There are group discounts for the paid content if that's the direction you'd like to take your class.

  • @hassanshahzad3922 · 3 years ago · +1

    Hey Brandon,
    have you written any book?
    If yes, please share a link, and if not, please write one for us.

    • @BrandonRohrer · 3 years ago · +1

      Thanks Hassan :)
      I haven't written a book yet unless you count this as a meandering list of half-finished chapters: e2eml.school

  • @rp9720 · 4 years ago

    Brandon: Nicely explained. Thx!

  • @pgathogo · 5 years ago · +2

    Perfect! Very good explanation of back prop.

  • @berknoyan7594 · 5 years ago · +1

    Hi Brandon. Huge fan. Planning to buy your whole bundle when I get time to look. Can you recommend a starting route? Where should I start? I think someone should understand machine learning before deep learning, so I think I should start with ML before DL. Any help? Thanks for your efforts.

    • @BrandonRohrer · 5 years ago · +1

      Thanks bekonyn! I'm very happy to hear it. There isn't a strict order to the courses, except for 312 and 313, basics and advanced neural networks. Other than those two, you can work through the courses in any order. Where there are soft pre-requisites or supplementary material I call it out. But you can feel free to let your curiosity direct you.
      If you are feeling a little lost then feel free to start at 171 and proceed in numerical order. Enjoy!

  • @brotherlui5956 · 5 years ago · +3

    Hi Brandon, a very good explanation of backpropagation. There's a small glitch at 16:33 where you bring up temperature, but I assume shower head water flow was meant.

    • @BrandonRohrer · 5 years ago · +2

      Oops! Of course you are correct. Good catch. I've corrected it in the transcript.

  • @techwizpc4484 · 3 years ago

    What does this look like in a diagram? It's hard to visualize the positioning.

  • @connor-shorten · 5 years ago · +2

    Great visualization with the pipe filter! I would really like a backprop-through-time in RNNs video as well, if you are interested!

    • @BrandonRohrer · 5 years ago · +1

      Thank you! RNN and CNN backprop examples are on my roadmap as well.

  • @shubhpatni2123 · 5 years ago · +1

    wow! you have the best videos

  • @Bjarkediedrage · 2 years ago

    I understand everything up to 10:50. If we only have an Error - yPrime, how can we define dx, dy, dm and dh? Do we start out with a random delta for all of them and run it twice? And if so, what is dy? dy of the random delta, or dy of the error? Or y' - y?
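
    For what it's worth, the local deltas aren't random: each one is the analytic derivative of the known function in its stage, and the chain rule multiplies them together from the error back to each weight. A minimal sketch, assuming a hypothetical two-stage model (these variable names are placeholders, not necessarily the video's m, h, w):

        # Two stages: m = w * x, then y = v * m, with error E = (y - y_prime)**2.
        x, y_prime = 2.0, 10.0
        w, v = 1.0, 3.0

        # Forward pass
        m = w * x
        y = v * m

        # Backward pass: every factor is a derivative of a known function, not a guess.
        dE_dy = 2 * (y - y_prime)        # from E = (y - y_prime)**2
        dy_dm = v                        # from y = v * m
        dm_dw = x                        # from m = w * x
        dE_dw = dE_dy * dy_dm * dm_dw    # chain rule: how E responds to w
        dE_dv = dE_dy * m                # chain rule: how E responds to v
        print(dE_dw, dE_dv)              # -48.0 -16.0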

  • @piyushmajgawali1611 · 4 years ago · +1

    16:33 you said temperature, but we were adjusting the flow rate.

  • @ichinosevoid9034 · 5 years ago · +1

    I understand the purpose of all this, but can someone explain it to me with a concrete application? I'm struggling to apply the "curly d" with real numbers.
    BTW, it's an incredibly great job you just did.

    • @BrandonRohrer · 5 years ago · +1

      Thanks! Yes, you can absolutely follow along with applying this in a neural network in End-to-End Machine Learning Course 312: end-to-end-machine-learning.teachable.com/p/write-a-neural-network-framework

  • @joy2000cyber · 3 years ago

    So it’s a feedback control system with the control parameters in a matrix.

  • @alphacharith · 5 years ago · +2

    Thank you so much Brandon

  • @UrGuru · 4 years ago

    Great..Just Great

  • @silberlinie · 5 years ago

    You can say it more simply, for those who are only a little familiar with technology: machine learning backpropagation is the equivalent of the industry's 50-year-old PID controller.

  • @joe_hoeller_chicago · 5 years ago · +1

    Is it possible to use this one as an intro, and then have a follow-up video that is slightly more technical with python?

    • @BrandonRohrer · 5 years ago · +1

      The videos describing the Python implementation of this are part of the neural network fundamentals in End-to-End Machine Learning Course 312: end-to-end-machine-learning.teachable.com/p/write-a-neural-network-framework

  • @bryanbischof4351 · 4 years ago

    The main valve handle’s illustration not being symmetric was kinda twitching me out. 😅
    But this video is very good. I’ll share this with students.

  • @inradiusspace · 4 years ago

    Please explain how we get d(y)/d(x) = 1/4 at 7:02, instead of 1/2 as calculated at 6:06? I have broken my head over this.

  • @LucyRockprincess · 4 years ago

    interesting analogy

  • @patrickryckman3867 · 4 years ago

    Best video on backprop I've seen so far, and I've watched dozens. However, this video still fell short for me. The first half was great: you used real numbers so I could see and understand. In the second half you completely abandoned numbers and only used letters. It's easy for me to understand when you write 8 divided by 4, but I don't understand h/x. Please stick to numbers for those of us who don't work with algebraic notation on a daily basis. Having both is OK, but you lost me when you dropped the numbers completely.

  • @Bjarkediedrage · 2 years ago

    Using variable names made gaining the intuition more difficult than it needed to be, for me. I constantly had to go back: wait, what was x, what was y, m, h, w, and so on, and it broke the flow. I would have preferred the verbose but understandable version, and only once the intuition is there, then put labels on it.
    Great video nonetheless!

  • @adiyogi-thefirstguru5144 · 5 years ago · +1

    Ha, after a very long time! Please upload regularly; I'm waiting for your videos.

  • @pptmtz · 5 months ago

    thanks

  • @ahmedelsabagh6990 · 5 years ago

    Perfect video

  • @ehanzhang7844 · 1 year ago

    Thx!!!

  • @bonniewilson9709 · 1 year ago

    Better to check it with code...

  • @chenyifa · 5 years ago · +3

    I feel useless

  • @vasiliansotirov6976 · 3 years ago

    My head is gonna explode

  • @sumanpreetkaur774 · 5 years ago · +1

    You always explain things in an impressive way. Great work. Can you please provide me your email ID?

  • @chrischoir3594 · 4 years ago

    Way too much for a beginner's tutorial. Most backprop intro tutorials are done with simple truth tables.

  • @manuelaljibesrosas664 · 5 years ago · +4

    First