Join Shortform for awesome book guides and get 5 days of unlimited access! shortform.com/artem
Can you talk about liquid neural networks? I'm interested to know whether it's revolutionary work that deserves more recognition and a following.
arxiv.org/pdf/2006.04439.pdf
Artem, I want to thank you, not only for publishing excellent material (of the hundreds of DL/ML videos I've saved, yours is top 5 - really), but also for your great intonation, which helps A LOT in capturing attention in this day and age of constant distractions. THANK YOU 🙌
Back prop is a hard, heavy thing to explain, and this video does it extremely well. I mean, that section 'Computational Graph and Autodiff' might be the best explanation of that subject on the internet. I'm very impressed - well done!
You two are the best channels I have found in the SoME episodes. It's great to see this interaction between you guys.
Love your videos
If there is no mention of sine waves in neural networks, then it won't be complete.
Where is that 'Computational Graph and Autodiff' section?
Yeah really helped me get the significance of autodiff
Funnily enough, the calculus portion of the video is probably one of the best explained I've seen
Why would that be 'funnily enough'? What a diss lmao.
@@George70220 I don't think CuriousLad meant it as a diss; it's just that when Artem made the video, he explained the Calculus section as background information. The partial derivatives and gradient descent weren't the main topic of the vid, yet you could show this to a Calculus I student and they would be thanking him for the explanation, even if they had no interest in learning back propagation! That's why, funnily enough, while the intro Calc topics weren't the main part of the video, that portion would be very helpful to anyone starting out in Calc!
I don't agree. For example, the act of minimizing the loss function and gradient descent were not properly linked; they were just two pieces of information dumped in series, unprocessed.
I found it unnecessary. Anyone who clicks on a video about back propagation likely already knows calculus, and if they don’t, that short primer is not going to be enough foundation for the rest of the content.
Nasdaq please buy toggle 0:25
This is by far the clearest explanation and simplification of backpropagation I have watched.
It's probably the best explanation of backward propagation. Hats off to your hard work; I'm saving this very valuable content.
"Wait, It's all derivatives?"
"Always has been"
Great work pal. Provides excellent clarity.
Looking forward to the second part.
😂 Turns out back propagation isn’t just magic
It makes sense that you would cover both computational neuroscience AND machine learning since they both play a significant role in AI research. The sort of content you're making is definitely 3Blue1Brown level. Keep up the good work!
He also managed to squeeze an entire calc 1 course into this single video. It's amazing
By far the best ML explanation I have seen on the internet.
This just might be the most underrated video on Back Propagation that I've ever seen! I hope more people come across this
The visuals in this video are from another planet. So good!!!!!!!!
This has to be one of the greatest explanations of the inner workings of learning in ML. I love it!
indeed
I knew that calculus is important for machine learning, but never knew that 12th-grade derivatives are this important.
When you talked about the chain rule, it brought me back to my school days; I never thought that derivatives, integration, and probability would be used this way in the future.
Well-explained video.
Thanks for sharing this knowledge and conveying the process so simply.
Very True!
Remember when people said nobody needs higher dimensions except those stupid quantum scientists, and nothing useful would come out of it? yeah...^^
I've been trying to get into ML for quite a while now. This is by far the best explanation of gradient descent and back propagation hands down!!!
Amazing work!!!
That was an outstanding explanation. Your ability to explain higher mathematical concepts in such simple terms is really an amazing service to the rest of us who wanna understand these subjects but don’t have a mathematics degree. Thank you.
I actually pictured this all in my head successfully where I thought I had everything in a canonical deep neural network figured out the other day. It’s one thing to hold it, it’s another to do the detailed, gritty work of explaining it in video format. Very well done.
I've seen probably 20 videos on this and your explanation of the derivatives for someone not in calculus was really helpful. thanks.
Guru of Fundamentals. I can't resist subscribing to your channel and watching all of your videos. The way you explained the Chain Rule - the logic behind it - is awesome. I am trying to visualize the Quotient Rule of Derivatives in your way. A good Teacher always makes you THINK 🙏
There could not have been a better explanation. Hats off to you
This is a visual masterpiece! Well done!
Much of this was a review for me, as I took the time to go through all this last year. I did an implementation of the MNIST handwritten-number neural network and had to learn all the calculus covered here to work out the backpropagation math. You really do have to dig into it to get a good handle on it, but it's fun stuff.
I just have to say this goes way beyond the quality of the many chain-rule videos I've seen so far. Good job, man; you've got some impressive skills, keeping me watching a math video and taking notes past my usual bedtime.
you take notes?
@@marc_frank It's generally a good idea if you are trying to learn. Don't be passive if you want it to stick.
Taking notes, making sketches of the ideas, doing the math are excellent learning techniques. Old timers like me always do that 👍
Absolutely one of the best videos explaining data points and regression formulas I have ever seen. Amazing work
All these basic concepts, such as derivatives and the least-squares method, I'm learning in college. Watching these kinds of machine learning videos has made me understand the practical applications of these theoretical concepts a bit better now 😌
This is the best ever explanation I have seen. Thanks for taking the time and doing something extraordinary.
Dude, this is the most beautiful ML video I've ever seen. Highly informative, yes, but also beautifully made. Thank you for your work.
This is one of, if not the, best videos I've seen that thoroughly explains back propagation. It will definitely help me to better explain the algorithm to others, so thank you for creating it.
Damn, I was wondering where you'd been for over half a year while I was stuck on backpropagation 😂 and here you come back like a true mind reader. Glad to see you back ❤
He was calculating your backward step so you can make your next forward step (sorry, couldn't resist) XD
@@highchiller he just gave you the right explanation gradient so that you can optimize your learning loss function 😂
Understanding something complex requires high intelligence. Explaining it simply requires even higher intelligence. You are one of the best teachers that I have encountered in my life! I'm grateful!
I just made this in Python for a simple quadratic equation..... THANK YOU!!!! I just learned Python and machine learning!!!!!!!!!!
Using a desired y = 0, I could also find one solution of the equation... wow, I love this so much!!
The only thing I did differently was to make x the weight rather than the coefficients, which I wanted to keep as fixed inputs.
What you helped me realise is that any system that can be put into a computational graph like that (30:04) can have backpropagation embedded in it, regardless.
THANK YOU, I'm out of words.
Also, when the next loss was bigger than or equal to the previous loss after one iteration, I divided the learning rate by a factor of 2 or 10 for more accuracy, and if the next loss was smaller than the previous one, I multiplied the learning rate by a factor of 1.1 to 1.5 to speed up the process... thus getting results in hundreds or even thousands fewer generations/iterations and in less time!!!!!
I can use this for optimizing my desired outputs in any system!!! JUST WOW!!
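For anyone curious, the trick described above fits in a few lines. Here is a minimal Python sketch of gradient descent with that adaptive learning-rate heuristic; the quadratic, names, and constants are my own illustrative choices, not the commenter's actual code or anything from the video:

```python
# Minimal sketch: gradient descent on loss = (f(x) - target)^2, with the
# adaptive learning-rate heuristic described above. Illustrative only.

def f(x):
    # a fixed-coefficient quadratic; x itself plays the role of the "weight"
    return 2 * x**2 - 3 * x - 5

def loss(x, target=0.0):
    # squared distance between f(x) and the desired output y = 0
    return (f(x) - target) ** 2

def grad(x, target=0.0, eps=1e-6):
    # numerical derivative of the loss with respect to x
    return (loss(x + eps, target) - loss(x - eps, target)) / (2 * eps)

x, lr = 0.0, 0.01
prev = loss(x)
for i in range(1000):
    x_new = x - lr * grad(x)
    cur = loss(x_new)
    if cur >= prev:
        lr /= 2.0             # loss went up: halve the step and retry from x
    else:
        x, prev = x_new, cur
        lr *= 1.2             # loss went down: grow the step a little
    if prev < 1e-12:
        break

print(f"found root x ≈ {x:.6f}, f(x) ≈ {f(x):.2e} after {i + 1} iterations")
```

Starting from x = 0 this converges to the root x = -1 of the quadratic, and shrinking the step only on failed iterations does cut the iteration count compared to a fixed learning rate.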
Excellent explanation!! You have done a selfless service to humanity.
He is back! Greetings from Brazil, we've all been waiting for this release!
This is the best resource I have found to learn backpropagation. The visualization of each concept made this very clear. I can't even imagine the amount of effort you have put into this video.
The best and most understandable explanation I have ever seen. You explained the essential basis of Artificial Neural Networks so beautifully. I really congratulate you
I have been doing ML research for a few years now, but somehow I was drawn to this video. I am glad to say that it did not disappoint! You have done an amazing job putting things in perspective and showing respect to calculus where it is due. We forget how simple derivatives power all of ML. Thank you for the reminder!
Thank you! That’s really nice to hear!
This is the best educational video I've ever seen on the internet, explaining backpropagation with visualization. Amazingly super 😊😊
So clear and concise! Thank you for creating this.
This video explains the mathematical basis of neural networks in a way I understood the first time, well enough to be able to explain it to somebody else. Thank you for that. I can't even imagine how much work you put into the animations. A masterpiece!
The best explanation about Deep Learning. Grateful.
This has to be the best explanation of the chain rule ever! Thanks
I just watched this video after completing my first Deep Learning lecture, on Backpropagation and Gradient Descent.
thanks man! appreciated. Really solid content.
One of the best visual explanations of the backpropagation algorithm I've seen! The animations are really good.
Sure that it was the back propagation algorithm?
What a great explanation and clarification, especially of all the mathematics required to understand the backprop algorithm. I appreciate this so much.
Finally, a solid explanation of backpropagation. Thank you!!
As a student in this business who has passed through a bunch of professors, I can say with confidence: with this trader, you will both learn and earn and, importantly, receive advice. Everything is competent and clear, without a bunch of unnecessary movements! Keep up the good work! 🤣
In simple words, backpropagation is the method of computing the gradients used in gradient descent. To minimize the loss, you need to find the correct contribution of each function, and that is obtained with the chain rule, which is based on derivatives and calculus. It runs in the backward direction, hence it is known as backward propagation.
Still so much confusion in my mind regarding this process.
The video is very useful and the editing is extraordinary.
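For anyone equally confused by the summary above, here is a tiny Python sketch of the core idea: one forward pass through a computational graph, then the chain rule applied backwards with one local derivative per node. The toy function and numbers are illustrative only, not from the video:

```python
# Toy computational graph: loss = (w*x - t)^2. We want dloss/dw so that
# gradient descent can adjust w. Values here are illustrative only.

x, t = 3.0, 6.0   # fixed input and target
w = 1.0           # the parameter being learned

for step in range(50):
    # forward pass: evaluate the graph node by node
    a = w * x         # multiplication node
    e = a - t         # subtraction node (the error)
    loss = e ** 2     # squared-loss node

    # backward pass: chain rule, multiplying local derivatives along the path
    dloss_de = 2 * e  # d(e^2)/de
    de_da = 1.0       # d(a - t)/da
    da_dw = x         # d(w*x)/dw
    dloss_dw = dloss_de * de_da * da_dw

    w -= 0.01 * dloss_dw  # one gradient-descent step

print(f"w ≈ {w:.4f} (exact minimizer: t/x = {t / x})")
```

The backward pass never recomputes anything; it just multiplies the local derivatives it passes through, which is exactly what autodiff automates for bigger graphs.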
Where were you all this time...? I've been trying to understand this concept for the past 2 years, and now it's clear after watching this video. Honestly, my maths was not up to the mark. After seeing your video, so many important concepts have become clear.
Don't have enough words to thank you.... God bless you.
I'll share your videos with my friends. Please keep it up.....🙏🙏🙏🙏🙏🙏🙏🙏🙏
This is incredibly well done and helped me visualize derivatives comprehensively. Thank you.
Man, you really nailed it, especially the Computational Graph and Autodiff part. I've heard about them so many times in lectures from Stanford and others; even so, this was impressive.
Hands down the best explanation of backprop there is.
I think I just found my favourite channel of all time.
I've been on YT since 2011 and never had a crush on a YT channel before today é.è
this is the most intuitive video I have ever come across. Amazing work!!!!!
I should be watching gameplays, but here I am procrastinating. Jokes aside, I'm 13 minutes into the video and astonished by your crystal-clear explanations and the quality of your material. This is gold.
The best explanation of machine learning I have ever seen on YouTube. Amazing work, thank you 👍
I cannot tell you how excited this video got me once I realized I was understanding every single step effortlessly. 😂😂😂
Thanks so much for the explanation.
God bless you!🙏🙏🙏🙏🙏🙏
Another gem of a video, well done Artem!! This channel deserves 1M+ subscribers; there's nothing else like it on YouTube.
Best graphical experience with clear information. Really enjoyed it throughout the video!!!
This video has an amazing and easy-to-understand explanation of the basics of Calculus. Many Thanks to the Creator 🙏🏼
Magnificent work, from the beautiful, creative, elegant design, to the mastery in teaching. Thank you!
Some people just want to see the world learning. Great Video Artem!
Best description on the topic on the internet!
The world needs more of you bro
That was fire bro! Gonna have to rewatch to understand the back step, but a lot clearer than most videos
Glad to see an ML-related video from you! As you have a neuroscience background, I would love to see a video comparing current state-of-the-art architectures in ML with some of the inner workings of the brain. For example, whether there are any structures in the brain with some resemblance to the GPT/transformer architecture. Even though the brain is light-years away, I think that could be interesting :)
31 years old now, had like 13 years of math in school and another 5 years at university, and this is the first time I really understood how derivatives work, because of visualization instead of "you calculate it this way and derive it that way, now memorize".
May I ask which university you went to?
@@ArnaudMEURET Sorry, but I don't believe anybody who has no idea what the tangent line of a function at a point x looks like, or what it means (despite a million exercises teaching its meaning, and dozens of graphs in literally every workbook), could actually get through 5 years of a math-related subject. This guy is straight up lying or trolling.
@@WsciekleMleko exactly. People shit on school because "muh education system bad" and forget about all of the interesting stuff they actually teach.
No, you're not heroes trying to fight against the big bad. You're just lazy and want an excuse to keep being one.
Um, how long did it take you to graduate grade school? 13 years of math! Even if you did start learning maths in PK or K-12, it's basic mathematics until about grade 4, when you start building on that foundation with algebra, geometry, etc.
In other words, the maths ain't mathing!
Hands down the best explanation I have seen so far! So clear and easy to understand!!
Excellent explanation of back-propagation, the building block of machine learning. Thanks a lot.
This is the best video about this topic. Learned a lot of things. Took me 2 or more hours but I understand it now. Thank you!
It's very, very nice to see that you are posting again.
Very well explained how backpropagation works and how the loss function helps in determining the optimal minimum using calculus. Great detail, which helps newbies like me understand this complex topic much better.
This is the best youtube channel in my feed, and I have many.
So much effort in this video; the quality of the content is on the same level as 3B1B. Keep it going, man.
Amazing how you can explain it so well, so simply. You have a subscriber!
Bro, I'm 2 minutes in and your graphics are insanely good; I can already tell this is going to be a treat. Holy smokes man, I'm having a graphicgasm.
Your approach to trading is truly impressive. Thank you for teaching me so much!
Very good video, very well explained. But there is one problem you didn't mention: when training very deep neural networks with a sigmoid or tanh as the activation function, backpropagation loses its "powers". The learning process becomes extremely slow and the results are suboptimal. One of many solutions is to use a ReLU or ELU function in the hidden layers instead of sigmoid or tanh. How we initialize the weights at the beginning also matters - for example, He initialization...
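To make the point above concrete, here is a small numpy sketch (my own illustrative setup, not from the video): the sigmoid's derivative is at most 0.25, so a deep stack of sigmoid layers shrinks the backpropagated gradient at every layer, while ReLU with He initialization keeps its magnitude roughly stable:

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 20, 64

def surviving_gradient(activation, init_scale):
    """Push a gradient backwards through `depth` random layers and return
    its norm - a rough proxy for how much learning signal survives."""
    x = rng.standard_normal(width)     # input to the stack
    weights, local_derivs = [], []
    for _ in range(depth):             # forward pass, caching local derivatives
        W = rng.standard_normal((width, width)) * init_scale
        z = W @ x
        if activation == "sigmoid":
            s = 1.0 / (1.0 + np.exp(-z))
            x, dact = s, s * (1.0 - s)                    # sigmoid'(z) <= 0.25
        else:
            x, dact = np.maximum(z, 0.0), (z > 0.0) * 1.0  # relu'(z) is 0 or 1
        weights.append(W)
        local_derivs.append(dact)
    g = np.ones(width)                 # backward pass via the chain rule
    for W, dact in zip(reversed(weights), reversed(local_derivs)):
        g = W.T @ (g * dact)
    return np.linalg.norm(g)

print("sigmoid, small init :", surviving_gradient("sigmoid", 0.1))
print("relu,    He init    :", surviving_gradient("relu", np.sqrt(2.0 / width)))
```

With these (assumed) settings the sigmoid stack drives the gradient norm toward numerical zero after 20 layers, while the ReLU/He combination keeps it near order one - the vanishing-gradient problem the comment describes.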
simply the best presentation on the subject
Most comprehensive explanation EVER.
In my opinion: better than 3b1b. No offence to 3b1b - he's great at it and one of the pioneers of this kind of visual explanation.
But I like your explanation, as it is slow-paced & comprehensive.
Yeah, 3b1b definitely deserves respect from me, but I think even he would recognize that this video is very carefully done.
I like that these people just care about truth and perfection and, even with a little bit of envy, care about the best product being made.
This video has amazing animations. You/your team clearly have very high attention to detail.
Watching this video was like a breath of fresh air after some heavy math calculations! The visual explanations really helped solidify my understanding of backpropagation. I appreciate how clear and easy to follow the graphs were. Keep up the fantastic work! Can't wait for more graphic doses like this.
Brilliant video! The math, detailed visuals and explanation are excellent. Thank you.
Wow, amazing, thank you. I've read and watched many videos on this topic, and this is the one where I finally "got it".
This video is an absolute masterpiece, congratulations
On the computational graph @35:22, can anyone explain why the partial derivatives w.r.t. the polynomial terms multiplied by the constants (i.e. k_0, k_1 * x, ..., k_5 * x^5) suddenly had a negative sign (-2 * delta_y1)? The partials were back-propagated from a summation operation, so I'd think they would remain positive... Or is it just an error?
whoops, you're right, there shouldn't be a minus sign there! sorry about that.
Good catch!
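For anyone else pausing on this thread, the sign comes down to how the residual is defined. A quick check, in my own notation (not the video's):

```latex
% With loss L = \sum_i (\hat{y}_i - y_i)^2 and residual \delta_i = \hat{y}_i - y_i:
\[
  \frac{\partial L}{\partial \hat{y}_i} = 2(\hat{y}_i - y_i) = 2\,\delta_i .
\]
% Defining the residual the other way round, \delta_i = y_i - \hat{y}_i,
% would give \partial L / \partial \hat{y}_i = -2\,\delta_i instead.
% A summation node simply copies its incoming gradient to every input,
% so it cannot introduce a sign flip - consistent with the author's reply
% that the minus sign on screen was a slip.
```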
This is just superb, thank you Artem! The timing couldn't be any better, as the gradient descent algorithm was mentioned in Dehaene's "How We Learn", which I'm currently reading.
Waiting patiently for the second video 🫰♥️. Much love from Kenya, thank you for making me understand back propagation. Started watching your channel because of Obsidian, stayed for the AI lessons 🫰.
That's the most amazing way of explaining such hard-to-understand things.
Thanks
As soon as I saw this video, I knew it was going to be the best of this kind on the Internet. And it was. Fantastic video!
Excellent explanation. I am going to rewatch this a few more times. Well done and thank you.
I wish the Chain Rule was explained in this manner when I was in university. I understood how to do it on paper just fine, but this explanation makes the reasoning behind it make complete sense.
Excellent explanation - I already understood this conceptually but this video gives a very good intuition for the repeated chain rule application
Thanks!
WOW!!! The amount of animation you have made is just incredible. I wish I could forget everything I know about backprop, just to experience this video fresh and fully appreciate it!
Great video! Very elegant explanation of back propagation, and I’m super excited to see the different mechanics of biological neural networks! Keep up the good work.
You're doing pure ML content now? Excellent! Always glad to see more of your work, looking forward to watching the beautiful manim visuals and clear explanations as usual.
thanks! ;)
Yep! The channel so far has been a reflection of my research interests, and since I've joined a more computational/theory neuro-AI lab, I figured more ML content on topics related to what I'm learning could be a nice addition.
@@ArtemKirsanov The reason I watch your videos is exactly that you draw out the common traits and differences between biology (neuroscience) and ML "models" of it. Thank you for these!
A million dollar explanation. Thank you @Artem
Thanks, this is the best channel for combos, everything works, I'll give it a try.
Such a good video. Have liked and subscribed!
I love the Curve Fitter 6000 machine in the animations to explain these concepts. Most textbooks are just too abstract and confusing but you have done a great job.
Very insightful video. Can't wait to see the second part. I would really love to see a video from you on spiking neural networks too!
This video would have saved me so many of the days I spent researching backpropagation 2 years ago.
Wow, hats off to you! Can't even imagine how long it takes to make something like this