The Most Important Algorithm in Machine Learning
- Published: 12 May 2024
- Shortform link:
shortform.com/artem
In this video we will talk about backpropagation, the algorithm powering the entire field of machine learning, and try to derive it from first principles.
OUTLINE:
00:00 Introduction
01:28 Historical background
02:50 Curve Fitting problem
06:26 Random vs guided adjustments
09:43 Derivatives
14:34 Gradient Descent
16:23 Higher dimensions
21:36 Chain Rule Intuition
27:01 Computational Graph and Autodiff
36:24 Summary
38:16 Shortform
39:20 Outro
USEFUL RESOURCES:
Andrej Karpathy's playlist: • Neural Networks: Zero ...
Jürgen Schmidhuber's blog on the history of backprop:
people.idsia.ch/~juergen/who-...
CREDITS:
Icons by www.freepik.com/
Join Shortform for awesome book guides and get 5 days of unlimited access! shortform.com/artem
Can you talk about liquid neural networks? I’m interested to know if that’s a revolutionary work that deserves more recognition and following.
arxiv.org/pdf/2006.04439.pdf
Back prop is a hard, heavy thing to explain, and this video does it extremely well. I mean, that section 'Computational Graph and Autodiff' might be the best explanation of that subject on the internet. I'm very impressed - well done!
You two are the best channels I have found in the SoME episodes. It's great to see this interaction between you guys.
Love your videos
It won't be complete without a mention of sine waves in neural networks.
Funnily enough, the calculus portion of the video is probably one of the best explained I've seen
Why would that be 'funnily enough'? What a diss lmao.
@@George70220 I don't think CuriousLad meant it as a diss; it's just that when Artem made the video, he explained the calculus section as background information. The partial derivatives and gradient descent weren't the main topic of the vid, yet you could show this to a Calculus I student and they would be thanking him for the explanation, even if they have no interest in learning backpropagation! That's why, funnily enough, while the intro calc topics weren't the main part of the video, that portion would be very helpful to anyone starting out in calc!
I don't agree. For example, the act of minimizing the loss function and gradient descent were not properly linked; they were just two pieces of information dumped in series, unprocessed.
"Wait, It's all derivatives?"
"Always has been"
Great work pal. Provides excellent clarity.
Looking forward to the second part.
😂 Turns out back propagation isn’t just magic
It makes sense that you would cover both computational neuroscience AND machine learning since they both play a significant role in AI research. The sort of content you're making is definitely 3Blue1Brown level. Keep up the good work!
The visuals in this video are from another planet. So good!!
By far the best ML explanation I have seen on the internet.
It's probably the best explanation of backpropagation out there. Hats off to your hard work and for sharing such valuable content.
This is by far the clearest explanation and simplification of backpropagation I have watched.
This is one of, if not the, best videos I've seen that thoroughly explains backpropagation. It will definitely help me better explain the algorithm to others, so thank you for creating it.
There could not have been a better explanation. Hats off to you
This is the best ML explanation I have seen on YT
So clear and concise! Thank you for creating this.
Damn, I was wondering where you'd been for over half a year while I was stuck on backpropagation 😂 and here you come back like a true mind reader. Glad to see you back ❤
He was calculating your backward step so you can make your next forward step (sorry, couldn't resist) XD
This is incredibly well done and helped me visualize derivatives comprehensively. Thank you.
This is a visual masterpiece! Well done!
Much of this was a review for me, as I took the time to go through all of it last year. I did an implementation of the MNIST handwritten-digit neural network and had to learn all the calculus covered here to work out the backpropagation math. You really do have to dig into it to get a good handle on it, but it's fun stuff.
Dude, this is the most beautiful ML video I've ever seen. Highly informative, yes, but also beautifully made. Thank you for your work.
This just might be the most underrated video on Back Propagation that I've ever seen! I hope more people come across this
This is the best ever explanation I have seen. Thanks for taking the time and doing something extraordinary.
This has to be the best explanation of the chain rule ever! Thanks
I just have to say this goes way beyond the quality of the many chain rule videos I've seen so far. Good job man, you've got some impressive skills to keep me watching a math video and taking notes past my usual bedtime.
you take notes?
He is back! Greetings from Brazil, we've all been waiting for this release!
All these basic concepts, such as derivatives and the least squares method, I'm learning in college right now. Watching these kinds of machine learning videos has helped me understand the practical applications of these theoretical concepts a bit better 😌
It's very, very nice to see that you are posting again.
Excellent video, thank you. I'm already looking forward to the synaptic plasticity video!
Thank you so much! The clearest explanation of the topic I've seen so far, amazing job! I wish I'd had these kinds of videos during my school education.
A great graphical experience with clear information. Really enjoyed the video throughout!!!
Always impressive! Looking forward to the second one.
Hands down the best explanation there is to backprop
The world needs more of you bro
This is the best youtube channel in my feed, and I have many.
Excellent visualization! Keep posting like this! 😃😃
Very insightful video. Can't wait to see the second part. I would really love to see a video from you on spiking neural networks too!
Great video! Very elegant explanation of back propagation, and I’m super excited to see the different mechanics of biological neural networks! Keep up the good work.
This is just superb, thank you Artem! The timing couldn't be any better, as the gradient descent algorithm was mentioned in Dehaene's "How We Learn", which I'm currently reading.
Make more videos like this. I learned so much. Thank you for making these great videos.
A million dollar explanation. Thank you @Artem
Amazing video. Underrated channel.
You are the best source of understanding computation that is biological and organic (all ml stuff), thank you.
I think I just found my favourite channel of all times.
I've been on YT since 2011 and never had a crush on a YT channel before today é.è
I have been doing ML research for a few years now, but somehow I was drawn to this video. I am glad to say that it did not disappoint! You have done an amazing job putting things in perspective and showing respect to calculus where it is due. We forget how simple derivatives power all of ML. Thank you for reminding us of that!
Thank you! That’s really nice to hear!
Excellent explanation - I already understood this conceptually but this video gives a very good intuition for the repeated chain rule application
I loved this content. You rock it! Congratulations! ❤
amazing video!!!!
I've recently been doing AI by Hand and was stuck on the backpropagation concept.
This really helped deepen my understanding of neural networks and backpropagation.
Thank you for this excellent explanation!
Wow, hats off to you! Can't even imagine how long it takes to make something like this
I've already watched the full video about 5 times these past weeks; this topic fascinates me.
Glad to see an ML-related video from you! As you have a neuroscience background, I would love to see a video comparing current state-of-the-art ML architectures with some of the inner workings of the brain. For example, whether there are any structures in the brain with some resemblance to the GPT/transformer architecture. Even though the brain is light-years away, I think that could be interesting :)
This is insane. I loved the video, keep it up!
31 years old now, had about 13 years of math in school and another 5 years at university, and this is the first time I really understood how derivatives work, because of visualization instead of "you calculate it this way and derive it that way, now memorize".
Great job Artem
Outstanding explanation. Thanks
Thank you for the illustrations!
Mindblowing. Just the video I was looking for. TBH, I was initially a bit put off by your English, as I am not a native speaker myself. However, the knowledge, competence, hard work, and research behind this video got me hooked. Liked and subscribed. And I will be watching this video many times. Well done!
Some people just want to see the world learning. Great Video Artem!
Absolutely brilliant
The animation is great, but more and more people are doing that now. What makes this special is the story; the complexity build-up is perfect and efficient. One needs a deep understanding of the subject and strong teaching skills to produce this.
omg, what an explanation. You legend, more power to you !!!
I cannot imagine just how much effort and work this took to make.
Artem back with another masterclass!
Amazing explanation!
This was amazing and mind blowing 🤩
Fantastic explanation and animations!
This video explains the mathematical basis of neural networks in a way that made me understand it, for the first time, well enough to explain it to somebody else. Thank you for that. I can't even imagine how much work you put into the animations. A masterpiece!
Wonderful video, many thanks!
Really nice work! Congrats.
I enjoy watching your videos, thank you .
Superb explanation.
great explanation!
Nice explanation!
I just implemented this in Python for a simple quadratic equation... THANK YOU!!!! I just learned Python and machine learning!
Using a desired output y = 0, I could also find one solution of the equation... wow, I love this so much!!
The only difference is that I made x the weight rather than the coefficients, which I wanted to be fixed inputs.
What you helped me realize is that any system that can be put into a computational graph like the one at 30:04 can have backpropagation embedded in it, regardless.
THANK YOU, I'm out of words.
Also, when the next loss after an iteration was bigger than or equal to the previous loss, I divided the learning rate by a factor of 2 or 10 for more accuracy, and when the next loss was smaller than the previous one, I multiplied the learning rate by a factor of 1.1 to 1.5 to speed up the process, thus getting results in hundreds or even thousands fewer generations/iterations and in less time!!!!!
I can use this for optimizing my desired outputs in any system!!! JUST WOW!!
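The scheme described above can be sketched in a few lines of Python. This is only an illustration of the idea, not the commenter's actual code: the particular quadratic, the starting point, and the growth/shrink factors (×1.2 on an improving step, ÷2 on a worse one, with the worse step rejected) are all assumptions made for the example.

```python
# Gradient descent on a single "weight" x of a fixed quadratic,
# driving the output toward a desired target y = 0 (i.e. root finding),
# with an adaptive learning rate.

def f(x, a=1.0, b=-3.0, c=2.0):
    return a * x**2 + b * x + c   # coefficients are fixed inputs; x is the weight

def loss(x, target=0.0):
    return (f(x) - target) ** 2   # squared error against the desired output

def grad(x, target=0.0, a=1.0, b=-3.0):
    # d(loss)/dx by the chain rule: 2 * (f(x) - target) * f'(x)
    return 2.0 * (f(x) - target) * (2.0 * a * x + b)

x, lr = 0.0, 0.1
prev_loss = loss(x)
for _ in range(200):
    candidate = x - lr * grad(x)      # tentative gradient step
    new_loss = loss(candidate)
    if new_loss >= prev_loss:
        lr /= 2.0                     # worse step: reject it and slow down
    else:
        lr *= 1.2                     # better step: accept it and speed up
        x, prev_loss = candidate, new_loss

print(x)  # x is now very close to a root of x^2 - 3x + 2
```

Rejecting the worse step (rather than keeping it) is one design choice among several; keeping the step and only shrinking the learning rate also works, it just wastes a few iterations climbing back down.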
This is the only thing I never understood; I hope to finally understand it now. It's weird how this video got recommended just as I wanted to google backpropagation.
Amazing, enjoying very much!
Excellent presentation. You made it seem like, starting from basic calculus, machine learning is just one simple step away. What would be interesting is: what are the theoretical underpinnings of this method? When do we say learning is successful? What is the computational complexity of neural networks?
Excellent explanation
Thanks Artem
Top notch visuals man
Man this is such a great channel.
Phenomenal video
Yo, I'm hyped for the next video
You're doing pure ML content now? Excellent! Always glad to see more of your work, looking forward to watching the beautiful manim visuals and clear explanations as usual.
thanks! ;)
Yep! The channel so far has been a reflection of my research interests, and since I've joined a more computation-and-theory-oriented neuro-AI lab, I figured more ML content covering topics related to what I'm learning could be a nice addition.
@@ArtemKirsanov The reason I watch your videos is exactly that you draw out the common traits and differences between biology (neuroscience) and ML "models" of it. Thank you for these!
Good Work, Congrats
Magnificent, as always!
I need the next video yesterday please!
Thank you sir.
Most Comprehensive Explanation EVER
My opinion: better than 3b1b. No offence to 3b1b, he's great at it and one of the pioneers of these kinds of visual explanations.
But I like your explanation, as it is slow-paced and comprehensive.
This is beautiful!
Aha! I get it now. Impressive effort to explain, thanks
This is up there with 3Blue1Brown for mathematical explanation, animation quality and overall elegance. Well done.
For your next video: there's some interesting work on RNA involvement in neuron plasticity in the sea hare. I think the focus on structural changes in neurons as the key driver for engrams ignores major influences within the individual cell on AP propagation. Vision, for instance, is really dependent on how the optic neurons encode and transmit rapidly fluctuating input signals, where that whole concept breaks down.
There are some great lectures on fly vision encoding that recently dropped, and obviously a ton of work on modeling different types of neurons and how they encode or collect information based on the context of nearby cells. I'm not a computational neuroscientist, just a molbio grad with a CS minor centered on GNNs, so I can only really comment on what I've seen in lectures that has influenced my understanding. I love your animations and approach to this sort of research, though.
I think this video alone made all my Calculus I and II classes make sense now
That is a very good explanation
Wow. Wow. Wow. Thank you so much. This is instrumental for my study. Makes AI math a lot more approachable.
Amazing video ❤
The legend is back!
This was amazing, Artem 🙏🙏
The first half explaining derivatives was one of the best calculus lessons I've ever seen, and it was only background for the main topic!! 🤯
One note: you pronounced g (gee) more like j (jay) and it was throwing me off for a lil bit 😅
Can't wait for part 2!! 🧠✨
hehe, thanks! :)
I'm curious: are your video editing skills superior, or do your tech skills take the lead? Your expertise is remarkable! I'd love to see a video on Transformers from you.
Great job, as always! I'm glad you don't forget about this channel and about us, your fans ^_^
The backpropagation topic is definitely important. Nevertheless, it's a neural network's implementation of feedback, a concept known from cybernetics. A neural network's simulation of source data, based on interpolation of the source data with subsequent extrapolation, is an equilibrium state of the network. Such interpolation is widely described in mathematics, for example in the Monte Carlo method.
Thank you for considering this topic.
Great video sir, thanks. Please continue with more videos on AI.