Backpropagation calculus | Chapter 4, Deep learning

  • Published: 27 Sep 2024

Comments • 2.1K

  • @3blue1brown
    @3blue1brown  7 years ago +1889

    Two things worth adding here:
    1) In other resources and in implementations, you'd typically see these formulas in some more compact vectorized form, which carries with it the extra mental burden to parse the Hadamard product and to think through why the transpose of the weight matrix is used, but the underlying substance is all the same.
    2) Backpropagation is really one instance of a more general technique called "reverse mode differentiation" to compute derivatives of functions represented in some kind of directed graph form.
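To make point 1 concrete, here is a minimal NumPy sketch of the vectorized form the comment describes. The layer sizes and values are hypothetical; it assumes the video's squared-error cost C = sum((a - y)^2) and sigmoid activations. The Hadamard (elementwise) product pairs each error term with sigma'(z), and the transpose W.T is what routes each layer's error back to the layer before it:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(weights, biases, x, y):
    """One backward pass; returns gradients of C = sum((a - y)^2)."""
    # Forward pass, caching pre-activations z and activations a.
    a, activations, zs = x, [x], []
    for W, b in zip(weights, biases):
        z = W @ a + b
        zs.append(z)
        a = sigmoid(z)
        activations.append(a)
    # Output-layer error: dC/da = 2(a - y), Hadamard product with sigma'(z).
    sp_last = sigmoid(zs[-1]) * (1 - sigmoid(zs[-1]))
    delta = 2 * (a - y) * sp_last
    grads_W, grads_b = [], []
    # Walk backwards; W.T carries each layer's error to the previous layer.
    for l in range(len(weights) - 1, -1, -1):
        grads_W.insert(0, np.outer(delta, activations[l]))
        grads_b.insert(0, delta)
        if l > 0:
            sp = sigmoid(zs[l - 1]) * (1 - sigmoid(zs[l - 1]))
            delta = (weights[l].T @ delta) * sp
    return grads_W, grads_b
```

Checking these gradients against a finite-difference approximation of the cost is a good way to convince yourself the transposes and elementwise products are in the right places.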

    • @iamunknownperiod3355
      @iamunknownperiod3355 7 years ago +18

      You should probably change the thumbnail. The snapshot of the variables with indices (which I didn't know were indices at the time) and subscripts almost deterred me from watching this although it really wasn't that complicated.

    • @TalathRhunen
      @TalathRhunen 7 years ago +26

      I will probably be a TA for a lecture course on Deep Neural Networks again next semester, and I will recommend this series to the students (we did it in a very math-heavy way this year and it was a bit too much for some of them, even though it's a lecture for master's students)

    • @sebaitor
      @sebaitor 7 years ago +8

      I was hoping you'd explain in either of these 2 vids on backprop why the Hadamard product and transposing are used, what a waste :(

    • @polychats5990
      @polychats5990 7 years ago +5

      Amazing video, I think you did a really good job of making it as easy to understand as possible while also not simplifying things too much.

    • @bgoggin88
      @bgoggin88 7 years ago +2

      Sirius Black, what you could do is download the R package "deepnet" and poke around at the source code. It's written in base R, so you can follow it around. This is how I learned, and IMHO the best way to learn.

  • @cineblazer
    @cineblazer 3 years ago +1494

    Dear Grant,
    A year ago, I decided I wanted to learn Machine Learning and how to use it to make cool stuff. I was struggling with some of the concepts, so I went to RUclips and re-discovered this series on your channel.
    Out of all the courses I've tried and all the hours of other content I've sat through, your videos stand out like a ray of sunshine. I just got my first full-time job as a Machine Learning Engineer, and I can confidently say it would never have happened without this series.
    Your channel may have affected the course of my life more than almost any other. Thanks for all your hard work!

    • @maruferieldelcarmen9573
      @maruferieldelcarmen9573 2 years ago +298

      You could say that this channel had the largest nudge to your activation value

    • @cineblazer
      @cineblazer 2 years ago +112

      @@maruferieldelcarmen9573 The partial derivative of Grant's videos with respect to my career is off the charts!

    • @souls.7033
      @souls.7033 2 years ago +13

      @@maruferieldelcarmen9573 get out 😂

    • @souls.7033
      @souls.7033 2 years ago +6

      @@cineblazer Also, I just saw your comment from 11 months ago; it's amazing to see your development! Keep it up!!!

    • @khai7151
      @khai7151 2 years ago +6

      Congrats on your job. I was wondering: when you finished Andrew Ng's ML course, what additional steps did you take, and how long, to become a full-fledged ML engineer?
      Thanks in advance

  • @kslm2687
    @kslm2687 6 years ago +2987

    “The definition of genius is taking the complex and making it simple.”
    - Albert Einstein
    You are genius.

    • @jean-francoiskener6036
      @jean-francoiskener6036 4 years ago +45

      I thought he said "You don't understand something well until you can explain it in a simple way"

    • @fractal5764
      @fractal5764 4 years ago +10

      That's not the definition of genius

    • @ericayllon7497
      @ericayllon7497 4 years ago +3

      @@jean-francoiskener6036 Yes, it's a quote that appeared on this YouTube channel

    • @Djorgal
      @Djorgal 4 years ago +99

      "More quotes are attributed to me than I could possibly have said during my entire life." - Albert Einstein

    • @shawnjames3242
      @shawnjames3242 4 years ago +3

      @@Djorgal Did he actually say that?

  • @hiqwertyhi
    @hiqwertyhi 7 years ago +936

    It's not that no-one else makes top-notch math/cs videos, it's that this guy makes it CLICK.

    • @ravenn2631
      @ravenn2631 5 years ago +15

      hiqwertyhi It rivals even the website BetterExplained. People like this teach me how to teach.

    • @vgdevi5167
      @vgdevi5167 1 year ago

      Hello, I'm impressed by the way he explained this topic too, but I'm looking for more great-quality resources (YouTube channels, books) on deep learning, and also on math and computer science in general. What do you recommend?

  • @noahkupinsky1418
    @noahkupinsky1418 5 years ago +1035

    Hey for all of you getting discouraged because you don’t understand this - that was me last year. I went and taught myself derivatives and came back to try again and suddenly I understand everything. It’s such an amazing feeling to see that kind of work pay off. Don’t give up kiddos

    • @kg3217
      @kg3217 2 years ago +6

      Thanks for the nice words 🙂

    • @angelbythewings
      @angelbythewings 2 years ago +15

      studied this 3 years ago in college and it all makes sense to me now

    • @xbutterguy4x
      @xbutterguy4x 2 years ago +8

      Yup. I tried to watch this series a year ago and make my own neural network which turned out to be disappointing. A semester into college and some passion for calculus is all it took for me to mostly understand this series!

    • @sukhresswarun
      @sukhresswarun 2 years ago +2

      Same here, man. I saw this video a year ago, but only now do I understand it fully. Keep commenting

    • @oskarwallberg4566
      @oskarwallberg4566 2 years ago +7

      I would say it's recommended to have taken calculus 2 (for partial derivatives and the Jacobian) and linear algebra (for matrix and vector multiplication). Otherwise, just looking up the mentioned things is also fine. But it might take time to build up intuition for the math.

  • @thomasclark8922
    @thomasclark8922 1 year ago +2029

    This series was my first introduction to Machine Learning 3 years ago. I now work full-time as an AIML Scientist, my life is forever changed. Thank you.

    • @envadeh
      @envadeh 1 year ago +30

      How hard was it? I'm trying to code my own neural network from scratch; there seem to be so few resources for that. And how do I even make myself unique?

    • @thomasclark8922
      @thomasclark8922 1 year ago +360

      @@envadeh Learn everything; use the Feynman technique: if you can't explain how a machine learns to someone who knows nothing about it, keep filling in the gaps. Formal education is great, but honestly more of a waste of time than not. Teach yourself, learn how to learn, and then keep learning.
      I audited Andrew Ng's Deep Learning Specialization from Coursera, had some formal education, and taught myself everything I could get my hands on, from theory to application, from the underlying math to the practical programming. Understand the importance of data. DATA IS KING. Watch interviews with industry leaders; understand the big turning points and continued development within the last two decades of AIML (you'll figure out what they are with time).
      It takes 10,000 hours to become an expert. I'm about 4,500 in, but all it took was a little bit of work every single day. Make learning a habit. Trust yourself; believe in your ability to become who you want to be.
      "It doesn't matter if you read two research papers in a week; what matters is if you read two research papers a week for a year, now you've read 100 papers" - Andrew Ng
      (Don't 'read' research papers, watch synopses! Smarter, not harder! There's so much free information, you could probably use a GPT model to teach you what you don't know!)
      Good luck, and I believe in you! :)

    • @nczioox1116
      @nczioox1116 1 year ago +4

      Did you need a CS or math degree to get into the field?

    • @thomasclark8922
      @thomasclark8922 1 year ago +69

      @@nczioox1116 "Need" is a strong word; it just depends on what kind of work you want to do and who your employer is. People used to go to college because that was the only place you could learn these difficult subjects, but now it's just an archaic way of putting you in debt, since you can learn these things online for free, UNLESS you want to work for an employer where you need the degree to be recognized.
      If you are self-motivated and can teach yourself these subjects, seriously consider your options before assuming that spending 4 years of your life and $100k+ is necessary.
      I have an Electrical Engineering degree, but out of the 40+ classes I had to take for it, only 2 had any sort of impact on my daily job now. It all depends on the context.
      Good luck, and I believe in you! :)

    • @nczioox1116
      @nczioox1116 1 year ago +14

      @@thomasclark8922 Thank you! I have a mechanical engineering degree. I'm in the process of teaching myself machine learning concepts and doing some projects. Lots of job postings I've seen in the field seem to require a bachelor's or master's in CS, math, or neuroscience. Of course, these seem to be for larger companies, so maybe smaller companies take a more holistic approach

  • @yashjindal9822
    @yashjindal9822 1 year ago +88

    I just started out with my ML career. This entire series made me feel as if I knew it all along. Thank you Grant
    I will return to this comment to share my professional progress😊

    • @Mayank-lf2ym
      @Mayank-lf2ym 8 months ago +13

      Now it's time to return to tell your progress

    • @azibekk
      @azibekk 5 months ago

      Any update?

    • @rmmaccount
      @rmmaccount 4 months ago

      we are waiting

  • @hutc22222222
    @hutc22222222 1 year ago +214

    The work you do making high-level math accessible to anyone wishing to learn a variety of new topics is no small feat. You manage to explain everything so clearly, making me want to start learning math again, reminding me of and introducing me to beautiful aspects of math. You deserve more than a 'thank you' :)

  • @SaifUlIslam-db1nu
    @SaifUlIslam-db1nu 5 years ago +331

    It has taken me about 3-4 days' worth of time to understand all four of these lectures, lectures which total no more than 1 hour and 30 minutes.
    And I feel proud.

    • @debajyotimajumder2656
      @debajyotimajumder2656 4 years ago +12

      you should get the t-shirt-merch from 3b1b's description site, the shirt says "pause and ponder"

    • @danielcampelo2
      @danielcampelo2 4 years ago +12

      Same here. Took my time to hear all explanations. This last video is by far more complex than the previous ones, yet still very well explained.

    • @polycreativity
      @polycreativity 4 years ago +18

      I'm attempting to implement it from scratch in C# with no matrix math library or anything so I can get a feel for the nuts and bolts. This is the boss level!

    • @polycreativity
      @polycreativity 4 years ago +1

      @@berkebayraktar3556 Yeah, I'd love to once I can get it to train properly! So finicky.

    • @vibaj16
      @vibaj16 3 years ago

      Daniel McKinnon, me too! I'm working on the backpropagation; this math is hard

  • @SaintKhaled
    @SaintKhaled 1 year ago +93

    The quality of this education is top-tier. I absolutely am speechless that you make it freely accessible. Thank you so much!

  • @snf303
    @snf303 2 years ago +60

    When I finished university, I could not imagine that one chilly Sunday evening, almost 15 years after graduation, I would sit with a bottle of beer, watch math videos, and have so much fun! Thank you!

  • @antoniobernardo9884
    @antoniobernardo9884 7 years ago +537

    This is easily the best channel on YouTube today! Once I get a job I will be more than glad to support you!

    • @utsavprabhakar5072
      @utsavprabhakar5072 6 years ago +22

      Exactly what i was thinking!

    • @chinmayrath8494
      @chinmayrath8494 4 years ago +23

      It has been two years. Have you supported yet??

    • @pranaysingh3702
      @pranaysingh3702 4 years ago +21

      Did you get a job ?

    • @fitokay
      @fitokay 4 years ago +7

      Two years ago, could you get AI job?

    • @hozelda
      @hozelda 4 years ago +42

      @@fitokay I think AI won and got his job.

  • @vectozavr
    @vectozavr 6 years ago +331

    That is the reason for learning the math: to understand such beautiful things! That is awesome! Thanks a lot!!!

    • @anthead7405
      @anthead7405 3 years ago +12

      Math on its own is also a reason for learning math.

    • @vvii3250
      @vvii3250 3 years ago

      Interesting to run into you here. :)

    • @owaisfarooqui6485
      @owaisfarooqui6485 3 years ago +1

      And when I asked my math teacher, that person told me you need this to pass the test. That didn't make a lot of sense back then

    • @krenciak
      @krenciak 3 years ago +3

      @@vvii3250 Yeah, it feels like a kind of mini-club where a small crew has gathered to hang out))

    • @bocik2854
      @bocik2854 3 years ago

      @@krenciak Hahaha

  • @shofada
    @shofada 6 years ago +322

    This is what 21st-century teaching should look like. It feels like your work should be made a "human right". Thank you.

    • @fakecubed
      @fakecubed 5 months ago +1

      No human has the right to another human's labor. That's called slavery.

  • @bradleydennis210
    @bradleydennis210 4 years ago +45

    I just finished up calc III this semester, and I have never felt happier with myself for being able to apply my new knowledge than in this episode. I also don't think I have ever been more excited to hear calc III topics brought up in a field I am currently trying to teach myself. Thank you for making such a simple-to-understand series!

  • @vedant7090
    @vedant7090 3 years ago +29

    Man, you deserve a Nobel Prize for teaching machine learning with this simplicity.

  • @borg286
    @borg286 6 years ago +56

    The point where you addressed the concern that the example you were using was too simple, having only 1 edge, was spot on as you were leading me down this merry garden path. I appreciate how much you watch your own videos and predict where the watcher would mentally say, "but what about..."

  • @Jabrils
    @Jabrils 6 years ago +749

    You're a deity, Grant

    • @Jabrils
      @Jabrils 6 years ago +11

      Haha, why hello Bill. Nice to find you on 3B1B's channel :D

    • @bevel1702
      @bevel1702 6 years ago +10

      wut

    • @anjelpatel36
      @anjelpatel36 4 years ago +1

      wut

    • @gumbo64
      @gumbo64 4 years ago +1

      wut

    • @idr7789
      @idr7789 3 years ago +1

      wut

  • @thfreakinacage
    @thfreakinacage 6 years ago +46

    My god! A basic machine learning video series that actually makes sense to complete beginners!
    Subscribed, and waiting in great anticipation for the next one! :D

  • @elizfikret7489
    @elizfikret7489 1 year ago +17

    Thank you so much! I have understood more math from this channel than from all teachers I have had in high school or university in total.

  • @nomasan
    @nomasan 3 months ago +1

    Rewatching this over and over and over again... that really does help with understanding it.
    It builds connections in the brain

  • @Redrumy0
    @Redrumy0 6 years ago +21

    Literally the only YouTube channel that makes studying 2 hours of math go by in the blink of an eye

  • @samuelreed5481
    @samuelreed5481 6 years ago +20

    These videos are unbelievably well produced. Thank you so much for your effort. You've made this topic incredibly clear, and I cannot overstate how much I appreciate the amount of effort you put into these. You have incredible talent as a teacher.

  • @LimitedWard
    @LimitedWard 4 years ago +22

    Absolutely brilliant explanation! I took a course on deep learning in college, but ended up auditing it in the end because I couldn't grasp the concepts well enough to pass the tests. You just took the entire first unit of the course, which took several weeks, and condensed it into 4 easily digestible videos that anyone can understand!

  • @Kevin-cy2dr
    @Kevin-cy2dr 4 years ago +4

    Honestly, this channel doesn't deserve a dislike button. It took me days to figure out one video (at the beginning), but the concepts still remain in my head. This channel taught us that math is not just changing numbers; it's conceptual and intuitive, just like science. Grant, if you ever read this, please know that you are one of the very few people who change the world. I just don't have words for you, man; "great job" is an understatement. I promise once I earn enough I will contribute to your channel

  • @sainandandesetti3268
    @sainandandesetti3268 4 years ago +5

    Stunningly beautiful...
    The best part of the series (for me, obviously) is that the beauty of this series does NOT make it very easy to understand.
    No. Each video may need multiple views. But these videos are so beautifully made that you'd want to watch them again and again, not with the frustration of getting your head around a concept but with the thrill of unravelling a mystery...
    So for creating such excitement in me, thank you.

  • @Erioch
    @Erioch 6 years ago +4

    Honestly, this is one of the best (if not the best) channels on mathematics/science education I have seen. Intuitive but not oversimplified. Thank you so much for offering your spectacular work and for helping so many people understand these concepts.

  • @samarthsingla1082
    @samarthsingla1082 5 years ago +8

    The amount of help you are providing is nothing short of amazing.

  • @zilongzhao3274
    @zilongzhao3274 3 years ago +3

    Your videos should be shown in every university's lessons; the animation makes the calculation just so easy to understand.

  • @rohitdatla724
    @rohitdatla724 4 years ago +2

    You're not just teaching NN concepts but how to think about, break down, understand, and digest any complex problem. YOU ARE AWESOME!!!!!

  • @atiehhisham
    @atiehhisham 3 months ago

    This series explained all of this to me 100x better than the courses I paid for. You are a genius; keep up the great work!

  • @giron716
    @giron716 7 years ago +6

    I seriously have a hard time explaining how much I appreciate this video. I am far and away a symbolic thinker, as opposed to a geometric one, and while I love all of your videos and how intuitive you make the concepts, it's sometimes hard for me to think about the geometry. I am much more comfortable working with symbols and that's why I treasure videos like this. Thank you :)

  • @meeradad
    @meeradad 1 year ago +2

    These videos are the best ways to make a high schooler fall in love with calculus instead of hating it or fearing it. And open his/her mind to the joy of creativity rooted in mathematical insights.

  • @grigorioschatziandreou2558
    @grigorioschatziandreou2558 9 months ago +1

    THANK YOU - MSc student here. I've taken a module on machine learning. My university is world class and the module was very well taught, so I already had good knowledge of neural networks. But now I am doing research and need to dive deep into them, and I realised how much deep understanding I lacked. This has helped A LOT.

  • @sjgmc
    @sjgmc 7 years ago +6

    As a hobbyist programmer, I can't thank you enough! Once I finish my studies I will donate to you. :)

  • @micahsheller101
    @micahsheller101 6 years ago +4

    Beautiful work! Reminds me of my late father who was a math professor: he had the same gentle, happy style, and believed heartily in making math a safe place for everyone to learn and have fun. Gonna make me tear up :)

  • @ashkankiafard4493
    @ashkankiafard4493 2 years ago

    The fact that I can understand what you're talking about shows that your teaching is flawless!

  • @13thxenos
    @13thxenos 7 years ago +13

    Nicely done video.
    I had learned backpropagation before, but it was hard, and I didn't use it manually (I used frameworks like TensorFlow, which use computational graphs and backpropagate automatically), so I had forgotten how it actually worked.
    But this video is a great resource for newcomers to ANNs and for people like me who have forgotten the theory behind it all. Thank you.

  • @aravindkannan9490
    @aravindkannan9490 7 years ago +26

    This is by far the best video I have ever seen in Neural Networks. Thanks for this! :)

    • @tisajokt7676
      @tisajokt7676 7 years ago

      I also suggest the video series on neural networks by Welch Labs, or if you've seen it already, I'd be interested to hear your comparison between it and 3Blue1Brown's series.

    • @aravindkannan9490
      @aravindkannan9490 6 years ago +3

      Just completed their playlist! Equally good :) I like the application-oriented explanation.
      However, I would still recommend 3B1B for an absolute beginner because of the in-depth explanation and the help in visualizing the math behind it

  • @MeriaDuck
    @MeriaDuck 7 years ago +4

    After seeing a few pieces of books and descriptions on the internet about backpropagation, with this video I finally reached some kind of enlightenment (especially at about 4:50 into this video). Thank you so much for that!
    Just as a hobby, I was trying to implement a neural network from scratch in Java: plain objects for neurons, connections, and layers. I wanted to really visualize how a neural network WORKS (as computer code, but maybe I even want to create some visual for it...). This will certainly help me on my way!

  • @colorlace
    @colorlace 3 years ago +2

    Best explanation I've ever seen of the chain rule in a video that isn't about the chain rule

    • @cineblazer
      @cineblazer 3 years ago +1

      Ok but SAME like I genuinely didn't get the chain rule until this video and now it's like "wait -- that's it? It's that... simple?!" THIS IS WHY GRANT IS THE ABSOLUTE GOAT.

  • @bigbluetunafish4997
    @bigbluetunafish4997 10 months ago +2

    I finally finished these 4 chapters on neural networks, and some of your linear algebra and calculus stuff. I feel much better that I now have a deeper understanding of how a neural network works and have built up a base for further exploration of machine learning. Thanks very much for all your effort creating these great videos.

  • @GaborGyebnar
    @GaborGyebnar 6 years ago +80

    Awesome material. :) Please make a video about convolutional neural networks, too.

  • @prashamsht
    @prashamsht 5 years ago +6

    One of the best lectures I have ever heard. Great explanation of NNs, cost functions, activation functions, etc. Now I understand NNs far, far better... (P.S. I saw the previous videos, Parts 1, 2, and 3, as well.)

  • @ehsanmon
    @ehsanmon 6 years ago +6

    Thank you so much, Grant. I finally learned back prop, and I have become a patron. I wish I could do more.

  • @DavidUgarteChacon
    @DavidUgarteChacon 3 months ago

    It is just absolutely crazy how well this guy teaches

  • @n9537
    @n9537 4 years ago

    This 10 min video is pure gold. Lays down the math in an easy to understand, intuitive manner.

  • @Abstruct
    @Abstruct 7 years ago +51

    This stuff is an amazing supplement to Andrew Ng's courses; it gives a lot more intuition and visual understanding of the formulas.

    • @claytonharting9899
      @claytonharting9899 6 years ago +1

      It certainly is a huge help for backprop. Just the tree visual is a huge help. Btw, what do you think of 3b1b’s use of a bias value vs Ng’s use of a weighted bias node? I think 3b1b’s may be more clear, but the node version is more computationally efficient.

    • @Viplexify
      @Viplexify 6 years ago +13

      ... in which Ng mentioned that he still doesn't fully understand backprop. I wondered if it was true or just a consolation for beginners.

    • @ab452
      @ab452 6 years ago

      Consolation; it is just to soothe your frustration. But he could also be referring to the fact that you can understand how to compute it for a simple case, yet in a large instance you simply lose track of it. Without a computer it would be a hopeless task.

    • @tanmaybhayani
      @tanmaybhayani 5 years ago +1

      Andrew should link this series in his course, 'cause this is just beautiful!

    • @hayden.A0
      @hayden.A0 4 years ago

      I'm actually here in between Andrew Ng's course on machine learning. There were a few concepts I didn't completely understand, but they are quite clear now.

  • @bean217
    @bean217 2 years ago +2

    I am currently going through Michael Nielsen's "Neural Networks and Deep Learning" book. This video helps to clear up and visualize the chapter on backpropagation a lot. Thank you for making this video series.

  • @tekashisun585
    @tekashisun585 4 months ago

    Learned ML in an intro to AI course offered in my university, it’s the content of the last couple of weeks. Lots of details are left out, so this series has been putting things into perspective for me. Thanks

  • @kirilllosik7054
    @kirilllosik7054 1 year ago +7

    Thanks a lot for creating such fantastic content! Anticipating more videos about AI, ML, and deep learning!

  • @cowcannon8883
    @cowcannon8883 6 years ago +664

    Neural networks have layers, ogres have layers
    Shrek is an AI confirmed

    • @icegod4849
      @icegod4849 5 years ago +4

      Goddamn nice reference, would like it a thousand times over if I could

    • @minecraftermad
      @minecraftermad 5 years ago +4

      Shrek is our AI overlord

    • @Ammothief41
      @Ammothief41 5 years ago +6

      My AI overlord has decided they're both onions.

    • @inlandish
      @inlandish 5 years ago +2

      ~onions have layers too~

    • @shiveshramlall2809
      @shiveshramlall2809 5 years ago +1

      Big. Brain.

  • @zafar0132
    @zafar0132 2 years ago

    I don't know how you do it, 3Blue1Brown... but give YOURSELF a pat on the back. With only a single view of the first 3 videos and two views of the 4th, I actually understand how neural nets work. Congratulations to you for making a complex subject seem so easy!

  • @securemax
    @securemax 1 year ago +3

    For everyone wanting to implement backprop from scratch: don't use dC/da = 2*(a-y). Instead use dC/da = a-y. This is because the cost function would actually be defined with a factor 1/2 in front which is missing here. Hence, the derivative changes. All other derivatives are good :)

    • @carloscortes2391
      @carloscortes2391 9 months ago

      Why would there be a 1/2 factor? To average the square error, since there are 2 outputs?
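A concrete answer to the question above: the 1/2 is not about averaging over the two outputs; it is there purely so the factor of 2 from the power rule cancels on differentiation. Either convention is fine as long as the cost and its derivative match, since a constant factor on the gradient is absorbed into the learning rate. A quick finite-difference check with hypothetical values:

```python
a, y, eps = 0.7, 1.0, 1e-6

# Convention used in the video: C = (a - y)^2, so dC/da = 2(a - y).
c = lambda a: (a - y) ** 2
numeric = (c(a + eps) - c(a - eps)) / (2 * eps)
assert abs(numeric - 2 * (a - y)) < 1e-6

# Convention with the 1/2 factor: C = 0.5*(a - y)^2, so dC/da = a - y.
c_half = lambda a: 0.5 * (a - y) ** 2
numeric_half = (c_half(a + eps) - c_half(a - eps)) / (2 * eps)
assert abs(numeric_half - (a - y)) < 1e-6
```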

  • @qwert-cj4ld
    @qwert-cj4ld 4 years ago +13

    9:28 sums up the whole thing

  • @vil9386
    @vil9386 1 year ago

    How easy it is to understand this through your lectures in just 10 minutes. THANK YOU.

  • @antovrdoljak1317
    @antovrdoljak1317 3 years ago +2

    It really can't get any better than this. Awesome! This is truly the peak of learning methodology and didactics!

  • @thomasschwarz1973
    @thomasschwarz1973 1 year ago +5

    This is truly awesome, as pedagogy and as math and as programming and as machine learning. Thank you! ...One comment about layers: key to your presentation is the one-neuron-per-layer, four-layer setup, and key to the whole description of the cost-to-weight/cost-to-bias ratio analysis is your L (layer) and L - 1 notation. Problem: your rightmost neuron is the output layer, or "y" in your notation. So one cleanup in the description is to make some decisions: the rightmost layer is y, the output (no L value), because dC[0]/da[L] equals 2(a[L] - y). So the rightmost three neurons, from right to left, should be y (output), then L, then L minus one; then all the math works. Yes?

  • @notbobbobby
    @notbobbobby 7 years ago +7

    Right now, I am so thankful for having taken vector calculus and several numerical approximation courses. This was an AWESOME video to watch. Thanks! :)

  • @michaelarmstrong2251
    @michaelarmstrong2251 1 year ago

    Just watched this series of videos. I'm a mechanical engineer with no prior experience of machine learning; now I feel like I understand quite a few concepts that were hard to wrap my head around when learning from other sources. Absolutely awesome videos - well done!

  • @mukundholo6019
    @mukundholo6019 5 years ago +2

    This is math at its best, and art too at its highest: the grace of the animation, the subtle music, the perfectly paced narration, and the wonderful colour scheme! Math and art; or let's say, math is art!

  • @bilalsedef9545
    @bilalsedef9545 2 years ago +3

    This is a great and very educational video. But I think it needs one more part to show how the weights are updated.

  • @ssachdev1
    @ssachdev1 2 years ago

    "So pat yourself on the back! If all of this makes sense, you have now looked deep into the heart of backpropagation, the work horse behind how neural networks learn." felt soooo goooooood

  • @catchingphotons
    @catchingphotons 3 years ago

    Unarguably one of the best "tutorial" videos of all time! The carefully taken logical steps of understanding, the animations, the visualizations, the tempo, the examples... it boggles my mind! This is a masterpiece!
    Greetings
    -Chris

  • @蘇志雄-h6n
      @蘇志雄-h6n 5 years ago +5

    Truly a masterclass; explained so clearly, it's amazing! Awesome!!

  • @4AneR
    @4AneR 7 years ago +82

    What an art of math, Jesus Christ

  • @amulya1284
    @amulya1284 1 year ago

    Never in my life have I come across a calculus video explained so beautifully! In awe of this intuition

  • @Mizar88
    @Mizar88 4 years ago

    I am speechless. This channel undoubtedly contains the best pedagogical scientific material on YouTube, and possibly in the world. Thanks for making these videos; your skill and passion are unmatched!

  • @NitinNataraj-gf3vx
    @NitinNataraj-gf3vx 7 years ago +251

    Netflix can show these rather than some other questionable material.

    • @4.0.4
      @4.0.4 6 years ago +26

      Nitin, they would probably make it about the gender/race of different numbers, or draw some number in a way that fans of that number don't like.

    • @NitinNataraj-gf3vx
      @NitinNataraj-gf3vx 6 years ago +31

      A new phrase would emerge, "Backprop and chill"!

    • @sohailape
      @sohailape 6 years ago +5

      Netflix shows what people want. Don't blame them; blame the people around you, your friends, family, and YOU.

    • @vineetk1998
      @vineetk1998 5 years ago +1

      Then it wouldn't be free (inaccessible to those who can't afford it), and Netflix doesn't care; they get their hands on whatever earns them money

    • @vineetk1998
      @vineetk1998 5 years ago +2

      Btw, it would be great if they could make education addictive. Lol, that would be really great. Imagine, haha

  • @omerfarukozturk9720
    @omerfarukozturk9720 2 years ago

    Literally, thank you. In a 10-minute video I learned what I could not learn at school in 5 weeks. The animations of the video are absolutely magnificent. Thank you, thank you, thank you

  • @ErnestoEnriquez-du4jj
    @ErnestoEnriquez-du4jj 3 месяца назад

    Phew, it finally clicked. Oh boy what a time to be alive!

  • @youngnam1175
    @youngnam1175 6 лет назад +6

Thanks 3B1B. I'm understanding machine learning much better, and following your video while taking notes was the easiest method for learning.
I'm a little confused about what the change in C_0 with respect to a change in a_(L-1, k), for the k-th activation in layer L-1, really means (I changed to this notation because I feel more comfortable writing like this in one-line text). That's the 8:40 part of the video, I guess. It doesn't make intuitive sense to me why you need the summation of the impact of a_(L-1, k) on a_(L, 0~n), say without any multiplier or something.
Trying to understand the meaning of `dC_0/da_(L-1, k)`, I thought of a neural network with only two layers, input and output, the input layer having 1 neuron and the output layer having 2 neurons.
Does it ever make sense for a_(L-1, k) to be an activation (or neuron?) in an input layer? If so, I think it makes sense to add all the 'impacts' up, especially when the weights all have the same 'direction' or sign, because then summing them would give a greater number, and this would mean changing the input has the biggest impact in this scenario.
If not, I'm still confused about what `dC_0/da_(L-1, k)` is and why it has the summation.

    • @danielniels22
      @danielniels22 3 года назад

Hello, how are you? I know it's been 3 years since you made your comment, but for me it's my first few weeks learning Deep Learning, and now I'm trying to build a neural network from scratch before I learn frameworks like Tensorflow, Keras, etc.
Do you happen to know now why we sum over the next layer's activations? Before watching this video, I thought of square-rooting the sum of the squared values, like we determine the magnitude of a vector, and it turned out I was wrong after watching this video. I'm really looking forward to an explanation, or any source of information from you, about why we sum the terms of dC/da(L-1).
Thank you 😊🙏
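For anyone stuck on the same question in this thread: the sum appears because one activation in layer L-1 feeds into every neuron of layer L, so its total effect on the cost is the sum of its effects along each downstream path. A minimal numerical sketch (all weights, biases, and targets here are made-up toy values; sigmoid activation and squared-error cost as in the video), checked against a finite-difference derivative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One activation a_prev in layer L-1 feeding two neurons in layer L.
a_prev = 0.6                    # hypothetical activation
w = np.array([1.5, -2.0])       # hypothetical weights from a_prev to layer L
b = np.array([0.1, 0.3])        # hypothetical biases
y = np.array([1.0, 0.0])        # hypothetical desired outputs

z = w * a_prev + b
a = sigmoid(z)

# Chain rule along each path, then summed over the layer-L neurons j:
#   dC/da_prev = sum_j  dC/da_j * da_j/dz_j * dz_j/da_prev
dC_da = 2.0 * (a - y)           # derivative of the squared-error cost
da_dz = a * (1.0 - a)           # sigmoid'(z) written in terms of a
dz_daprev = w
dC_da_prev = np.sum(dC_da * da_dz * dz_daprev)

# Sanity check with a numerical derivative of the cost:
eps = 1e-6
cost = lambda ap: np.sum((sigmoid(w * ap + b) - y) ** 2)
numeric = (cost(a_prev + eps) - cost(a_prev - eps)) / (2.0 * eps)
print(dC_da_prev, numeric)      # the two agree
```

Dropping the sum and keeping only one path would ignore the influence a_prev has on the other output neuron, which is why the summed version matches the numerical derivative.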

  • @zhexiangxd
    @zhexiangxd 4 года назад

Sir, I can't thank you enough for how simply and clearly you explained this. It makes university professors look bad, tbh. Thank you so much!

  • @youngsoochoy5592
    @youngsoochoy5592 4 года назад

    This is the best mathematical explanation about the backpropagation of neural network. I've watched other coursera courses twice, but nothing can be compared to this well-visualized and easy to understand explanation.

  • @TheZenytram
    @TheZenytram 7 лет назад +4

WOW, I thought that the math behind it would be waaayy more complicated. I know this video has a lot of information to digest, but it's not complicated.

  • @deliciousnoodles5505
    @deliciousnoodles5505 4 года назад +1

    Wow this video made all my doubts from the first 3 videos clear! Longest 10 min video I've ever watched!

  • @homieboi5352
    @homieboi5352 5 месяцев назад

    I’m gonna need to rewatch this a few times to grasp it all, but wow, what a thorough explanation of back propagation! I adore how you referenced the entire equation earlier in the series and it made no sense, but now you’ve broken it down entirely. Phenomenal work!

  • @mike_o7874
    @mike_o7874 4 года назад +5

I watched this video like 10 times at least, and finally! Finally I was able to build my own neural network with backprop, in C#,
and it actually works!
Btw, I found out that if my layers have a lot of neurons the system is often overshooting, and dividing the cost for each neuron by how many neurons
are in that layer helps fix that issue; I mean, I can have the same training value for any type of neural network and it won't overshoot at all.
But in the video you just said to add the costs for each neuron together rather than taking the average cost for each neuron...
hmm...
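On the overshooting observation above: with a summed (rather than averaged) cost, the gradient magnitude grows with the number of neurons in a layer, so a step size tuned for a small layer overshoots on a large one; dividing by the neuron count keeps the magnitude roughly independent of layer width. A minimal sketch with made-up random activations and targets:

```python
import numpy as np

rng = np.random.default_rng(0)

def grad_norm(n_neurons, average=False):
    """Gradient magnitude of a squared-error cost over one layer."""
    a = rng.random(n_neurons)      # made-up activations
    y = rng.random(n_neurons)      # made-up targets
    grad = 2.0 * (a - y)           # dC/da, one term per neuron
    if average:
        grad = grad / n_neurons    # mean cost instead of summed cost
    return np.linalg.norm(grad)

# Summed cost: the gradient grows with layer width...
print(grad_norm(10), grad_norm(1000))
# ...averaged cost: it stays bounded, so one step size fits both widths.
print(grad_norm(10, average=True), grad_norm(1000, average=True))
```

Both versions point in the same direction, so summing vs. averaging only rescales the gradient; it is equivalent to changing the learning rate, which is why the averaged form is easier to tune across architectures.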

    • @honeyant3119
      @honeyant3119 4 года назад

      can you show your backprop code? What are your parameters and what does the cost get to? I can't get mine to go below 1.

    • @mike_o7874
      @mike_o7874 4 года назад

@@honeyant3119 well, here it is:
github.com/miko-t/UnityMyNN/blob/master/Assets/MyNet/SimpleBrain.cs
It's written in C#, and I'm not using matrices, just hard-coded arrays.
Look at the function Train.

    • @mike_o7874
      @mike_o7874 4 года назад

@@honeyant3119 it works fine when predicting stuff like graph slopes etc., or just the basic stuff. I tried to make it learn how to drive a "car" and it fails miserably, but that failure might just be due to the fact that I don't know how to calculate a real-time cost.

    • @honeyant3119
      @honeyant3119 4 года назад

@@mike_o7874 search up reinforcement learning for real-time tasks. Also, driving a car should be really easy. I trained a network to do so using evolution, and all it took was making a good cost function.

    • @mike_o7874
      @mike_o7874 4 года назад

@@honeyant3119 genetic algorithms are easy, yeah, I already did it that way; I just want to try using backprop for it because people do it.

  • @PowerOfTheMirror
    @PowerOfTheMirror 5 лет назад +5

    How do you determine the desired value of a neuron in the hidden layer?

    • @luciesayo
      @luciesayo 5 лет назад +1

From what I understand, that value is based on the weights and biases of the previous layer, which, if it is a hidden layer, depend on the weights and biases of the layer before it, and so on until the input layer is reached. So the value of a neuron in a hidden layer is ultimately based on the inputs of the neural network. The desired value is found from the cost function: whatever minimizes the cost over all the samples in the mini-batch for stochastic backpropagation, or over the entire training set for regular backpropagation.

  • @fossar_
    @fossar_ 2 года назад

    Had this recommended to me by RUclips for weeks now and glad I've finally have time to sit down and watch the miniseries. Excellent videos.

  • @funnyz1112
    @funnyz1112 2 года назад

Suddenly I know how this 'black magic' works. I was having trouble understanding this topic; you have no idea how helpful this video is. Thanks a lot!!!

  • @Freaklyfreakboy
    @Freaklyfreakboy 2 года назад

Months of working with deep neural networks, and I still have to come back to this video to re-digest the math. Don't get discouraged if you don't understand it! As 3blue1brown said, it is very complex and takes time to digest.

  • @OrangeXenon54
    @OrangeXenon54 4 года назад

    This saved my life and made so much sense! Why can't more teachers be ACTUAL teachers like you instead of just assuming you know everything?!

  • @nepali_8848
    @nepali_8848 5 лет назад +2

Just about an hour ago, I was a total alien to AI, especially when people said "the machine learns". Hats off to your selflessness in making even a medico able to understand how a machine actually learns; it became relatively easy once I compared it to the natural neural networks in our nervous system. Mark my words: with AI raging in healthcare, your videos will be the connecting link for someone far from AI to learn how it works.

  • @shaun2201
    @shaun2201 3 года назад

    @3Blue1Brown you are PICASSO of Math and Data Science. Love and Respect.

  • @Celastrous
    @Celastrous 7 лет назад +4

    What function is he saying at 2:00? He says Sigmoid and then something that I don't know how to spell to look up

    • @fasligand7034
      @fasligand7034 7 лет назад +8

      Sean Desert ReLU - rectified linear unit

    • @sage5296
      @sage5296 6 лет назад

Basically, it takes any real number and condenses it to something between 0 and 1, with 0 mapping to 0.5, larger values mapping closer to 1, and negative values closer to 0. He shows a graph in a previous video. It looks kind of like arctan.

    • @cedv37
      @cedv37 6 лет назад

      Sean Demers They talk about it at the end of the first video of the series.
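For anyone landing on this thread: the two functions mentioned, sigmoid and ReLU (rectified linear unit), can be sketched in a few lines. This is a minimal illustration, not code from the video:

```python
import math

def sigmoid(z):
    """Squashes any real number into (0, 1); sigmoid(0) = 0.5."""
    return 1.0 / (1.0 + math.exp(-z))

def relu(z):
    """Rectified linear unit: zero for negative inputs, identity for positive ones."""
    return max(0.0, z)

print(sigmoid(0.0))            # 0.5
print(round(sigmoid(5.0), 3))  # 0.993 (large inputs saturate near 1)
print(relu(-3.0), relu(2.5))   # 0.0 2.5
```

Sigmoid saturates at both ends, while ReLU keeps growing for positive inputs, which is part of why modern networks often prefer it.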

  • @sushimeng
    @sushimeng 6 месяцев назад

    It's lucky when you have a question and find 3b1b has a tutorial for it.

  • @semtex6412
    @semtex6412 8 месяцев назад

    six years now and this video is still a GEM!

  • @vivekpujaravp
    @vivekpujaravp Год назад

    Your insight and eloquence are exquisite. You are quite possibly the greatest educator to ever exist. Thank you for everything, please keep making more. I will continue studying and appreciating your brilliant work.

  • @rvg296
    @rvg296 4 года назад

This is what we call a "God Level Explanation". I have not found a video or resource that explained this better. Thank you very much, 3B1B.

  • @orientalnipper
    @orientalnipper 4 года назад

    You are the best lecturer that ever existed

  • @dhirajkumarsahu999
    @dhirajkumarsahu999 4 года назад

    You are a gift to humanity....I thank God, for such a valuable gift

  • @user-su9pg1jo4x
    @user-su9pg1jo4x Год назад

After completing this series, I watched the 2023 MIT lecture on this topic ("MIT Introduction to Deep Learning | 6.S191") and was constantly thinking "Uhh, I already know this" the whole time. In fact, there was hardly anything new: everything was just a far more formal and less understandable presentation of what is explained here. In conclusion, having watched this series first made the MIT lecture quite easy to understand and fun to watch.
Thank you very much for making this so clear!

  • @LouisEmery
    @LouisEmery 5 лет назад

I wish I had seen this video much earlier, since I'm good at the chain rule and also at optimization problems. I attended a lecture on neural networks in 1984. I didn't really understand how one can determine weights without fitting. It looked to me like backpropagation was a swindle until I saw this video. Now I can show my friends using a couple of lines on the whiteboard.

  • @steelcitysi
    @steelcitysi 4 года назад +1

    This was the best animated description I've come across. I hope you can continue on this topic more. Especially interested in the jump to CNNs, and the intuition for the effects of changing the number of layers and number of nodes in the hidden layers.

  • @aron.mp4
    @aron.mp4 Год назад

    You explained this better than any professor could at my university. Thank you, sir!

  • @aaradhyadixit9564
    @aaradhyadixit9564 Год назад

one of the best videos out there that explains the basics of neural networks, gradient descent, and backpropagation in such easy language, and intuitively. Great work!!

  • @prachisharma4494
    @prachisharma4494 5 лет назад +1

    You are the best to learn from with no weights and bias attached!!!

  • @Cube2deth
    @Cube2deth 3 года назад

    unbelievable. unbelievable. you are the best teacher on youtube.

  • @jasperbutcher2596
    @jasperbutcher2596 4 года назад

Man, here I am looking through the comments to answer questions, but 99% of them are compliments about how amazing these videos are...