The Most Important Algorithm in Machine Learning

  • Published: 22 Dec 2024

Comments • 535

  • @ArtemKirsanov  8 months ago +37

    Join Shortform for awesome book guides and get 5 days of unlimited access! shortform.com/artem

    • @TNTsundar  8 months ago

      Can you talk about liquid neural networks? I’m interested to know if that’s a revolutionary work that deserves more recognition and following.
      arxiv.org/pdf/2006.04439.pdf

    • @webgpu  3 months ago +2

      Artem, I want to thank you, not only for publishing excellent material (out of the hundreds of DL/ML videos I've saved, yours is top 5, really), but also for having a great intonation, which helps A LOT in capturing attention in this day and age of constant distractions. THANK YOU 🙌

  • @Mutual_Information  8 months ago +561

    Back prop is a hard, heavy thing to explain, and this video does it extremely well. I mean, that section 'Computational Graph and Autodiff' might be the best explanation of that subject on the internet. I'm very impressed - well done!

    • @33gbm  8 months ago +5

      You two are the best channels I have found in the SoME episodes. It's great to see this interaction between you guys.

    • @dprophecyguy  8 months ago +3

      Love your videos

    • @michaelcharlesthearchangel  8 months ago +2

      If there is no mention of sine waves in neural networks, then it won't be complete.

    • @ExtantFrodo2  6 months ago +1

      Where is that section, 'Computational Graph and Autodiff'?

    • @entivreality  5 months ago +2

      Yeah really helped me get the significance of autodiff

  • @CuriousLad  8 months ago +301

    Funnily enough, the calculus portion of the video is probably one of the best explanations I've seen

    • @George70220  8 months ago +3

      Why would that be 'funnily enough'? What a diss lmao.

    • @balu6923  8 months ago +30

      @George70220 I don't think CuriousLad meant it as a diss. It's just that when Artem made the video, he explained the calculus section as background information. The partial derivatives and gradient descent weren't the main topic of the vid, yet you could show this to a Calculus I student and they would thank him for the explanation, even if they had no interest in learning back propagation! That's why, funnily enough, while the intro calc topics weren't the main part of the video, that portion would be very helpful to anyone starting out in calc!

    • @veritas7010  8 months ago +2

      I don't agree. For example, the act of minimizing the loss function and gradient descent were not properly linked; they were just two pieces of unprocessed information dumped in series.

    • @tonic4120  5 months ago +1

      I found it unnecessary. Anyone who clicks on a video about back propagation likely already knows calculus, and if they don’t, that short primer is not going to be enough foundation for the rest of the content.

  • @aminebouramada  8 months ago +13

    This is by far the clearest explanation and simplification of backpropagation I have watched

  • @shikhargairola5815  8 months ago +28

    It's probably the best explanation of backward propagation. Hats off to your hard work; I'm saving this very valuable content.

  • @undertheshadow  8 months ago +186

    "Wait, It's all derivatives?"
    "Always has been"
    Great work pal. Provides excellent clarity.
    Looking forward to the second part.

    • @rad6626  7 months ago +5

      😂 Turns out back propagation isn’t just magic

  • @vastabyss6496  8 months ago +110

    It makes sense that you would cover both computational neuroscience AND machine learning since they both play a significant role in AI research. The sort of content you're making is definitely 3Blue1Brown level. Keep up the good work!

    • @nickwissler6811  6 months ago +9

      He also managed to squeeze an entire calc 1 course into this single video. It's amazing

  • @matheusmendonca1332  8 months ago +23

    By far the best ML explanation I have seen on the internet.

  • @black_crest  8 months ago +6

    This just might be the most underrated video on Back Propagation that I've ever seen! I hope more people come across this

  • @ReighKnight  8 months ago +14

    The visuals in this video are from another planet. So good!!

  • @keithwallace5277  7 months ago +42

    This has to be one of the greatest explanations of the inner workings of learning in ML. I love it!

  • @priteshtadvi4946  6 months ago +38

    I knew that calculus is important for machine learning, but I never knew that 12th-grade derivatives were this important.
    When you talked about the chain rule, it brought me back to my school days; I never thought that derivatives, integration, and probability would be used this way in the future.
    Well-explained video.
    Thanks for sharing this knowledge and conveying the process so simply.

    • @arjunrao9978  5 months ago

      Very True!

    • @mb2776  18 days ago +1

      Remember when people said nobody needs higher dimensions except those stupid quantum scientists, and nothing useful would come out of it? yeah...^^

  • @maheshwaransivagnanam6452  6 months ago +14

    I've been trying to get into ML for quite a while now. This is by far the best explanation of gradient descent and back propagation hands down!!!
    Amazing work!!!

  • @Redant1Redant  6 months ago +6

    That was an outstanding explanation. Your ability to explain higher mathematical concepts in such simple terms is really an amazing service to the rest of us who wanna understand these subjects but don’t have a mathematics degree. Thank you.

  • @Alwaysiamcaesar  7 months ago +5

    I actually pictured all of this in my head successfully the other day, when I thought I had everything in a canonical deep neural network figured out. It's one thing to hold it; it's another to do the detailed, gritty work of explaining it in video format. Very well done.

  • @bungerwow7963  7 months ago +4

    I've seen probably 20 videos on this, and your explanation of the derivatives for someone not in calculus was really helpful. Thanks.

  • @ZavierBanerjea  4 months ago +1

    Guru of fundamentals. I can't resist subscribing to your channel and watching all of your videos. The way you explained the Chain Rule, and the logic behind it, is awesome. I am trying to visualize the Quotient Rule of derivatives in your way. A good teacher always makes you THINK 🙏

  • @pradhumnkanase8381  8 months ago +5

    There could not have been a better explanation. Hats off to you

  • @K9Megahertz  8 months ago +10

    This is a visual masterpiece! Well done!
    Much of this was a review for me, as I took the time to go through all of this last year. I did an implementation of the MNIST handwritten-digit neural network and had to learn all the calculus covered here to work out the backpropagation math. You really do have to dig into it to get a good handle on it, but it's fun stuff.

  • @gianlucanordio7200  8 months ago +8

    I just have to say this goes way beyond the quality of the many chain-rule videos I've seen so far. Good job man, you've got some impressive skills, to keep me watching a math video and taking notes past my usual bedtime

    • @marc_frank  7 months ago

      you take notes?

    • @kevinscales  5 months ago +2

      @@marc_frank It's generally a good idea if you are trying to learn. Don't be passive if you want it to stick.

    • @ZavierBanerjea  4 months ago +1

      Taking notes, making sketches of the ideas, doing the math are excellent learning techniques. Old timers like me always do that 👍

  • @ChaseGartner  5 months ago +1

    Absolutely one of the best videos explaining data points and regression formulas I have ever seen. Amazing work

  • @aabiddd  8 months ago +4

    All these basic concepts, such as derivatives and the least-squares method, I'm learning in college. Watching these kinds of machine learning videos has made me understand the practical applications of these theoretical concepts a bit better now 😌

  • @cachegrk  8 months ago +2

    This is the best ever explanation I have seen. Thanks for taking the time and doing something extraordinary.

  • @naveen_malla  7 months ago +3

    Dude, this is the most beautiful ML video I've ever seen. Highly informative, yes, but also beautifully made. Thank you for your work.

  • @AlexKelleyD  8 months ago +10

    This is one of, if not the, best videos I've seen that thoroughly explains back propagation. It will definitely help me to better explain the algorithm to others, so thank you for creating it.

  • @Anonymous-fr2op  8 months ago +40

    Damn, I was wondering where you'd been for over half a year, whilst I was stuck in backpropagation 😂 and here you came back like a true mind reader. Glad to see you back ❤

    • @highchiller  8 months ago +4

      He was calculating your backward step so you can make your next forward step (sorry, couldn't resist) XD

    • @David123456789345  4 months ago

      @highchiller He just gave you the right explanation gradient so that you can optimize your learning loss function 😂

  • @pavanmamidi7705  1 month ago

    Understanding something complex requires high intelligence. Explaining it simply requires even higher intelligence. You are one of the best teachers that I have encountered in my life! I'm grateful!

  • @TruthOfZ0  8 months ago +3

    I just made that in Python for a simple quadratic equation..... THANK YOU!!!! I just learned Python and machine learning!!!!!!!!!!
    Using a desired y = 0, I could also find one solution of the equation... wow, I love this so much!!
    The only thing I did differently was to make x the weight rather than the coefficients, which I wanted to be fixed inputs.
    What you helped me realise is that any system that can be put into a computational graph like that 30:04 ... can have backpropagation embedded in it, regardless.
    THANK YOU, I'm out of words.
    Also, when the next loss was bigger than or equal to the previous loss after one iteration, I divided the learning rate by a factor of 2 or 10 for more accuracy, and if the next loss was smaller than the previous one, I multiplied the learning rate by a factor of 1.1 to 1.5 to speed up the process... thus getting results in hundreds or even thousands fewer generations/iterations and in less time!!!!!
    I can use this for optimizing my desired outputs in any system!!! JUST WOW!!
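
    A minimal Python sketch of the adaptive learning-rate trick described above, under the same setup the comment mentions (a fixed quadratic, x as the single trainable weight, desired output y = 0). All names and constants here are illustrative, not the commenter's actual code:

        # Fit a single weight x so that f(x) = x^2 + 2x + 1 reaches the
        # desired output 0, using gradient descent with an adaptive
        # learning rate: shrink it when the loss gets worse, grow it
        # while the loss keeps improving.

        def f(x):
            return x**2 + 2*x + 1            # fixed-coefficient quadratic

        def loss(x, target=0.0):
            return (f(x) - target) ** 2      # squared error

        def grad(x, target=0.0):
            # chain rule: dL/dx = 2 * (f(x) - target) * f'(x), with f'(x) = 2x + 2
            return 2 * (f(x) - target) * (2*x + 2)

        x, lr = 5.0, 0.01
        prev_loss = loss(x)
        for step in range(10000):
            x -= lr * grad(x)
            cur_loss = loss(x)
            lr = lr / 2 if cur_loss >= prev_loss else lr * 1.2
            prev_loss = cur_loss
            if cur_loss < 1e-12:
                break

        print(f"x = {x:.6f} after {step + 1} steps")   # x approaches -1, a root

    Because the loss here is (x + 1)^4, its gradient flattens out near the minimum, so a fixed learning rate crawls; letting the step size grow while the loss improves is what keeps convergence fast, matching the speedup the comment reports.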

  • @sparkle2575  4 months ago +3

    Excellent explanation!! You have done a selfless service to humanity.

  • @tonsetz  8 months ago +1

    He is back! Greetings from Brazil, we've all been waiting for this release!

  • @techshivanik  2 months ago

    This is the best resource I have found to learn backpropagation. The visualization of each concept made this very clear. I can't even imagine the amount of effort you have put into this video.

  • @ahmeterdonmez9195  3 months ago +1

    The best and most understandable explanation I have ever seen. You explained the essential basis of Artificial Neural Networks so beautifully. I really congratulate you

  • @winterknight1159  8 months ago +1

    I have been doing ML research for a few years now, but somehow I was drawn to this video. I am glad to say that it did not disappoint! You have done an amazing job putting things in perspective and showing respect to calculus where it is due. We forget how simple derivatives power all of ML. Thank you for reminding us of that!

    • @ArtemKirsanov  7 months ago

      Thank you! That’s really nice to hear!

  • @shekharkumar1902  22 days ago

    This is the best educational video I have ever seen on the internet, explaining backpropagation with visualization. Amazingly super 😊😊

  • @moralboundaries1  8 months ago +6

    So clear and concise! Thank you for creating this.

  • @kltr007  8 months ago +7

    This video explains the mathematical basis of neural networks in a way I understood the first time, well enough to be able to explain it to somebody else. Thank you for that. I can't even imagine how much work you put into the animations. A masterpiece!

  • @MrRhainer  7 months ago +3

    The best explanation of deep learning. Grateful.

  • @Master_of_Chess_Shorts  8 months ago +1

    This has to be the best explanation of the chain rule ever! Thanks

  • @hasanrants  2 months ago

    I just watched this video after completing my first Deep Learning lecture on backpropagation and gradient descent.
    Thanks man, appreciated. Really solid content.

  • @halilzabun  6 months ago

    One of the best visual explanations of the backpropagation algorithm I've seen! The animations are really good.

    • @javastream5015  6 months ago

      Sure that it was the back propagation algorithm?

  • @TheMankeer  27 days ago

    What a great explanation and clarification, especially of all the mathematics required to understand the backprop algorithm. I appreciate this so much.

  • @slopesmonte  1 month ago

    Finally, a solid explanation of backpropagation. Thank you!!

  • @HimanshuPakhale-n3i  2 months ago +1

    In simple words, backpropagation is the method of finding the gradients used for gradient descent. To minimize the losses, we need to find the correct contribution of each function, and this is obtained via the chain rule, which is based on derivatives and calculus. It proceeds in the backward direction, hence it is known as backward propagation.
    Still so much confusion in my mind regarding this process.
    The video is very useful, and the editing is extraordinary.
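
    To make the "chain rule in the backward direction" summary above concrete, here is a minimal Python sketch of backprop through a tiny one-neuron-per-layer network. The architecture, values, and variable names are illustrative, not taken from the video:

        import math

        # Forward:  z = w1*x,  a = tanh(z),  y = w2*a,  L = (y - t)^2
        # Backward: walk the same graph in reverse, multiplying local
        # derivatives together (the chain rule), then step the weights.

        x, t = 1.5, 0.5           # input and target
        w1, w2 = 0.8, -0.3        # trainable weights
        lr = 0.1

        for _ in range(200):
            # forward pass
            z = w1 * x
            a = math.tanh(z)
            y = w2 * a
            L = (y - t) ** 2

            # backward pass, from the loss back to each weight
            dL_dy = 2 * (y - t)
            dL_dw2 = dL_dy * a               # since y = w2 * a
            dL_da = dL_dy * w2
            dL_dz = dL_da * (1 - a ** 2)     # d tanh(z)/dz = 1 - tanh(z)^2
            dL_dw1 = dL_dz * x               # since z = w1 * x

            # gradient descent step
            w1 -= lr * dL_dw1
            w2 -= lr * dL_dw2

        print(f"final loss: {L:.2e}")        # approaches 0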

  • @pankajgoikar4158  4 months ago

    Where have you been all this time?... I've been trying to understand this concept for the past 2 years, and now it's clear after watching this video. Honestly, my maths was not up to the mark. After seeing your video, so many important concepts are clear.
    I don't have enough words to thank you... God bless you.
    I'll share your videos with my friends. Please keep it up.....🙏🙏🙏🙏🙏🙏🙏🙏🙏

  • @Vinsce_lives  8 months ago +2

    This is incredibly well done and helped me visualize derivatives comprehensively. Thank you.

  • @omarbadr9469  5 months ago

    Man, you really nailed it, especially the Computational Graph and Autodiff part. I've heard about these so many times in lectures from Stanford and others. However, this was impressive.

  • @chilledpepsi  7 months ago +1

    Hands down the best explanation of backprop out there

  • @XxIgnirFirexX  8 months ago

    I think I just found my favourite channel of all time.
    I've been on YT since 2011 and never had a crush on a YT channel before today é.è

  • @krishnagupta31  5 months ago

    This is the most intuitive video I have ever come across. Amazing work!!!!!

  • @wrtcookiedelta7560  2 months ago

    I should be watching gameplays, but here I am procrastinating. Jokes aside, I'm 13 minutes into the video and astonished by your crystal-clear explanations and the quality of your material. This is gold.

  • @HozanKano  5 months ago

    The best explanation of machine learning I have ever seen on YouTube. Amazing work, thank you 👍

  • @kakandeemmanuel7410  6 months ago

    I cannot tell you how excited this video got me once I realized I was understanding every single step effortlessly. 😂😂😂
    Thanks so much for the explanation.
    God bless you! 🙏🙏🙏🙏🙏🙏

  • @MultiMojo  8 months ago +4

    Another gem of a video, well done Artem!! This channel deserves 1M+ subscribers; there's nothing else like it on YouTube.

  • @brahmatejachilumula2668  8 months ago +1

    Best graphical experience with clear information. Really enjoyed the whole video!!!

  • @arjunrao9978  5 months ago

    This video has an amazing and easy-to-understand explanation of the basics of Calculus. Many Thanks to the Creator 🙏🏼

  • @andrewshort6440  6 months ago

    Magnificent work, from the beautiful, creative, elegant design, to the mastery in teaching. Thank you!

  • @shizzm1990  8 months ago +1

    Some people just want to see the world learning. Great video, Artem!

  • @eradubbo  3 months ago

    Best description of this topic on the internet!

  • @Binue-n5g  7 months ago +1

    The world needs more of you bro

  • @Bhuvan_D  4 months ago

    That was fire, bro! I'm going to have to rewatch it to understand the backward step, but it's a lot clearer than most videos

  • @fosowl  8 months ago +2

    Glad to see an ML-related video from you! As you have a neuroscience background, I would love to see a video comparing how current state-of-the-art ML architectures work with some of the inner workings of the brain. For example, whether there are any structures in the brain with some resemblance to the GPT/transformer architecture; even though the brain is light-years away, I think that could be interesting :)

  • @asdasd-yr7wi  8 months ago +57

    31 years old now; I had about 13 years of math in school and another 5 years at university, and this is the first time I really understood how derivatives work, because of visualization instead of "you calculate it this way and derive it that way, now memorize"

    • @ArnaudMEURET  6 months ago +3

      May I ask which university you went to?

    • @WsciekleMleko  5 months ago +4

      @ArnaudMEURET Sorry, but I don't believe that anybody who has no idea what the tangent line of a function at a point x looks like, and what it means (despite a million exercises teaching its meaning, and dozens of graphs in literally every workbook), could actually get through 5 years of a math-related subject. This guy is straight up lying or trolling.

    • @abyssmage6979  4 months ago +1

      @WsciekleMleko Exactly. People shit on school because "muh education system bad" and forget about all of the interesting stuff they actually teach.
      No, you're not heroes trying to fight against the big bad. You're just lazy and want an excuse to stay that way.

    • @Dr_Larken  1 month ago

      Um, how long did it take you to graduate grade school? 13 years of math! Even if you did start learning maths in pre-K or K-12, it's basic mathematics until about grade 4, when you start building on that foundation with algebra, geometry, etc.
      In other words, the maths ain't mathing!

  • @nayanahgowda3219  7 months ago

    Hands down the best explanation I have seen so far! So clear and easy to understand!!

  • @balajimarisetti4245  2 months ago

    Excellent explanation of back-propagation, the building block of machine learning. Thanks a lot.

  • @ElSenorEls  4 months ago

    This is the best video about this topic. I learned a lot of things. It took me 2 or more hours, but I understand it now. Thank you!

  • @仁科欣  8 months ago +2

    It's very, very nice to see that you are uploading again.

  • @eurob12  6 months ago

    Very well explained how backpropagation works and how the loss function helps in determining the optimal minimum using calculus; great detail, which helps newbies like me understand this complex topic much better.

  • @martonbalassa8128  8 months ago

    This is the best YouTube channel in my feed, and I have many.

  • @OmarElghamry1  3 months ago

    So much effort in this video; the quality of the content is at the same level as 3B1B. Keep it going, man.

  • @ndungikyalo  1 month ago

    Amazing how you can explain it so well, so simply. You have a subscriber!

  • @4th_wall511  6 months ago

    Bro, I'm 2 minutes in and your graphics are insanely good; I can already tell this is going to be a treat. Holy smokes man, I'm having a graphicgasm

  • @yordanyordanov6719  5 months ago +1

    Very good video, very well explained. But there is one problem you didn't mention. When training very deep neural networks and using a sigmoid or tanh function as the activation function, backpropagation loses its "powers". The learning process becomes extremely slow and the results are suboptimal. One of many solutions to this is a ReLU or an ELU function in the hidden layers instead of sigmoid or tanh. It also matters how we initialize our weights at the beginning, for example with He initialization...
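
    A small NumPy sketch of the two remedies mentioned above (ReLU activations plus He initialization), compared against sigmoid with a naive initialization. The layer width, depth, and constants are illustrative:

        import numpy as np

        rng = np.random.default_rng(0)

        def he_init(fan_in, fan_out):
            # He initialization: variance 2/fan_in keeps activation
            # magnitudes roughly stable across ReLU layers.
            return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

        def relu(h):
            return np.maximum(0.0, h)

        def sigmoid(h):
            return 1.0 / (1.0 + np.exp(-h))

        # Push one batch through 50 layers with each recipe and compare the
        # spread of the activations (a proxy for how much signal survives).
        width, depth = 256, 50
        sig = rel = rng.normal(size=(1, width))
        for _ in range(depth):
            sig = sigmoid(sig @ rng.normal(0.0, 0.05, size=(width, width)))
            rel = relu(rel @ he_init(width, width))

        # Sigmoid activations cluster near a constant, and the local
        # derivative sigma*(1 - sigma) <= 0.25 shrinks gradients at every
        # layer; the ReLU + He activations keep a healthy spread.
        print("sigmoid spread:", sig.std())
        print("ReLU+He spread:", rel.std())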

  • @RoodJood  2 months ago +1

    simply the best presentation on the subject

  • @ram-my6fl  8 months ago +5

    Most comprehensive explanation EVER.
    My opinion: better than 3b1b. No offence to 3b1b; he's great at it and one of the pioneers of these kinds of visual explanations.
    But I like your explanation, as it is slow-paced & comprehensive

    • @domorobotics6172  6 months ago

      Yeah, 3b1b definitely deserves respect from me, but I think even he would recognize that this video is very carefully done.
      I like that these people just care about truth and perfection, and even with a little bit of envy, care about the best product being made.

  • @obsidianSt6761  4 months ago

    This video has amazing animations. You/your team clearly have very high attention to detail

  • @simaitools  8 months ago +1

    Watching this video was like a breath of fresh air after some heavy math calculations! The visual explanations really helped solidify my understanding of backpropagation. I appreciate how clear and easy to follow the graphs were. Keep up the fantastic work! Can't wait for more graphic doses like this.

  • @imran_v1.0  4 months ago

    Brilliant video! The math, detailed visuals and explanation are excellent. Thank you.

  • @1ProsperousPlanet  1 month ago

    Wow, amazing, thank you. I've read and watched many videos on this topic, and this is the one where I finally "got it"

  • @StratosFair  9 days ago

    This video is an absolute masterpiece, congratulations

  • @yewdimer1465  8 months ago +4

    On the computational graph @35:22, can anyone explain why the partial derivatives w.r.t. the polynomial terms multiplied by the constants (i.e. k_0, k_1 * x, ..., k_5 * x^5) suddenly had a negative sign (-2 * delta_y1)? The partials were back-propagated from a summation operation, so I'd think they would remain positive... Or is it just an error?

    • @ArtemKirsanov  8 months ago +1

      Whoops, you're right, there shouldn't be a minus sign there! Sorry about that.
      Good catch!
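
      A quick numeric check of the sign in question, assuming the graph feeds a squared-error loss (y - y1)^2 with delta_y1 = y - y1 and y produced by the summation node (definitions inferred from the thread; values illustrative):

          # A summation node passes the upstream gradient through unchanged,
          # so dL/d(term_i) should equal dL/dy = +2 * delta_y1 for every term.
          def loss(terms, y1):
              y = sum(terms)               # the summation node
              return (y - y1) ** 2

          terms, y1, eps = [1.0, 2.0, 3.0], 4.5, 1e-6
          analytic = 2 * (sum(terms) - y1)   # +2 * delta_y1 = 3.0
          numeric = (loss([terms[0] + eps] + terms[1:], y1) - loss(terms, y1)) / eps
          print(analytic, numeric)           # both ~ 3.0: positive, no minus sign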

  • @philipm3173  8 months ago

    This is just superb, thank you Artem! The timing couldn't be any better, as the gradient descent algorithm was mentioned in Dehaene's "How We Learn", which I'm currently reading.

  • @TurinBeats  6 months ago

    Waiting patiently for the second video 🫰♥️. Much love from Kenya; thank you for making me understand back propagation. I started watching your channel because of Obsidian and stayed for the AI lessons 🫰.

  • @AkshayKumar-sg8qm  4 months ago

    That's the most amazing way of explaining such hard-to-understand things

  • @maxwell77176  7 months ago

    Thanks

  • @delete7316  8 months ago +1

    As soon as I saw this video, I knew it was going to be the best of this kind on the Internet. And it was. Fantastic video!

  • @kalebnegussie8140  6 months ago

    Excellent explanation. I am going to rewatch this a few more times. Well done and thank you.

  • @ForTheOmnissiah  4 months ago

    I wish the Chain Rule was explained in this manner when I was in university. I understood how to do it on paper just fine, but this explanation makes the reasoning behind it make complete sense.

  • @benmuller6103  8 months ago

    Excellent explanation. I already understood this conceptually, but this video gives a very good intuition for the repeated chain-rule application

  • @connormcmk  8 months ago

    Thanks!

  • @Тима-щ2ю  7 months ago

    WOW!!! The amount of animation you have made is just incredible. I would really love to be able to forget backprop, just so I could fully appreciate this video!

  • @ahumanperson3649  8 months ago

    Great video! Very elegant explanation of back propagation, and I’m super excited to see the different mechanics of biological neural networks! Keep up the good work.

  • @TheForbiddenLOL  8 months ago +10

    You're doing pure ML content now? Excellent! Always glad to see more of your work, looking forward to watching the beautiful manim visuals and clear explanations as usual.

    • @ArtemKirsanov  8 months ago +22

      Thanks! ;)
      Yep! The channel so far has been a reflection of my research interests, and since I've joined a more computational/theory neuro-AI lab, I figured more ML content on topics relevant to what I'm learning could be a nice addition

    • @Amejonah  8 months ago

      @ArtemKirsanov The reason I am watching your videos is exactly that you draw out common traits and differences between biology (neuroscience) and ML "models" of it. Thank you for these!

  • @chakravarthyelumalai8408  8 months ago +1

    A million-dollar explanation. Thank you @Artem

  • @RohitKumar-pu4nm  7 months ago

    Thank you, this is the best channel for these setups; everything works, I'll try it out.

  • @benjamin6729  5 months ago

    Such a good video. Have liked and subscribed!
    I love the Curve Fitter 6000 machine in the animations to explain these concepts. Most textbooks are just too abstract and confusing but you have done a great job.

  • @francescobranca653  8 months ago

    Very insightful video. Can't wait to see the second part. I would really love to see a video from you on spiking neural networks too!

  • @ajay0909  2 months ago

    This video would have saved me the many days I spent researching backpropagation 2 years ago

  • @ks0ni  8 months ago

    Wow, hats off to you! I can't even imagine how long it takes to make something like this