Optimizers - EXPLAINED!

  • Published: 26 Oct 2024

Comments • 137

  • @varshitakolipaka7933
    @varshitakolipaka7933 6 days ago +2

    This is THE best video I’ve watched on this topic: clear, perfectly motivated, and insanely engaging

  • @tombratfred3102
    @tombratfred3102 4 years ago +141

    I like how you anthropomorphize optimizers. Makes me really empathize with their struggles.

    • @erich_l4644
      @erich_l4644 4 years ago +6

      with a profile pic like that- you would

    • @metaprog46and2
      @metaprog46and2 4 years ago +5

      @@erich_l4644 LMAO - your comment just won the internet. You'll soon receive an email from a Nigerian Prince with instructions on how to claim your winnings lol.

    • @metaprog46and2
      @metaprog46and2 4 years ago +2

      Died laughing like thrice. Witty joke.

  • @jonass1315
    @jonass1315 2 years ago +15

    This is what every lecture should be like: funny but perfectly explained, and greatly visualized. Thanks!

  • @diaojun161
    @diaojun161 4 years ago +50

    The best explanation of optimizers in DL I HAVE EVER WATCHED!

    • @CodeEmporium
      @CodeEmporium  4 years ago +2

      Thank you! More of this to come!

  • @1harru
    @1harru 3 years ago +10

    Hands down.. This is the best video on Optimizers.. !!! I've been trying to understand the complex math equations for the past few days and this one literally gave me the overall intuition in 7min 🙏🙏🙏

  • @dude8309
    @dude8309 4 years ago +26

    great little overview! love how you get to the point quickly yet provide all the needed intuition

    • @CodeEmporium
      @CodeEmporium  4 years ago +4

      Thanks! That's exactly what I was going for :)

  • @kevinelkin3943
    @kevinelkin3943 3 years ago +7

    Such an underrated channel! Great explanations and visuals!

  • @zhengyahnis848
    @zhengyahnis848 3 years ago +1

    Don't know why this video isn't spreading more widely; the explanation is great and the high-level summary helps me a lot.

    • @CodeEmporium
      @CodeEmporium  3 years ago

      Thank you! Mind fixing that by sharing this around? Would love to get more eyeballs here :)

  • @X_platform
    @X_platform 4 years ago +25

    Loving the sound effect

  • @abhikbanerjee3719
    @abhikbanerjee3719 4 years ago +6

    I am watching this at 2 am in the morning and that sudden effect 00:13 cracked me up!

  • @fahdciwan8709
    @fahdciwan8709 4 years ago +1

    thanks! one of those rare videos that explain the intuition perfectly instead of hovering around the terms

  • @shubhigautam9655
    @shubhigautam9655 3 years ago +1

    The only video that's ever made me laugh while explaining a concept. Love it, thank you!

  • @a.h.s.3006
    @a.h.s.3006 3 years ago +2

    That...... was........ one EXCELLENT VIDEO!!!!!
    Thank you so much, I thought I would struggle with optimizers but now it's all clear to me

  • @carebox6187
    @carebox6187 4 years ago +16

    This video was both informative and hilarious. I absolutely loved it!

    • @CodeEmporium
      @CodeEmporium  4 years ago +1

      That was the objective. Glad you liked it :)

  • @carlavirhuez4785
    @carlavirhuez4785 4 years ago +1

    Best video ever on optimizers. Thanks a lot.

  • @BlockDesignz
    @BlockDesignz 4 years ago +4

    Absolutely love this iterative explanation.

    • @CodeEmporium
      @CodeEmporium  3 years ago +2

      Thank you. I'm experimenting with different teaching styles :)

  • @DouweMr
    @DouweMr 2 years ago

    This is one hell of a video to refresh on this stuff! kindly appreciated!!

  • @lakshmisrinivas369
    @lakshmisrinivas369 3 years ago +3

    A great way of learning with a lot of fun. Thanks for such a funny and insightful video

  • @trocketflicks
    @trocketflicks 4 years ago +1

    Man, this video is slept on. Such a good explanation!

  • @Anja5233
    @Anja5233 1 year ago

    This is my new favorite video on the internet

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thanks so much for the compliments:) I try

  • @shouravpaul3092
    @shouravpaul3092 4 years ago +1

    Like your video, and mostly I wanted to see the graph that most people don't show, thank you

  • @anujlahoty8022
    @anujlahoty8022 2 years ago +1

    Very well explained and in a fun way.

  • @rendevous9253
    @rendevous9253 3 years ago

    Man you gave the best explanation, one that even a noob like me in machine learning can understand. Keep it up man 👍.

  • @10bokaj
    @10bokaj 3 years ago +1

    Very clear, very well explained 10/10

  • @ahmedaj2000
    @ahmedaj2000 1 year ago

    Love it! Thank you! Explained better than my professors. I finally get these now after so long

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Words that are too kind. Thanks for the kind words

  • @oskarbartosz9159
    @oskarbartosz9159 1 year ago

    m8, I was searching for a channel like this for a really long time

  • @ruxiz2007
    @ruxiz2007 4 years ago

    This video is so good, and it deserves 100X more attention!

  • @Hariharan-yy1fu
    @Hariharan-yy1fu 2 years ago

    Awesome work, easy to get a quick review before my interview. Keep going

  • @GauravSharma-ui4yd
    @GauravSharma-ui4yd 4 years ago +2

    Thanks Ajay for giving this a shot. Loved it❤️

    • @CodeEmporium
      @CodeEmporium  4 years ago +1

      Thanks for watching Gaurav (and the suggestion). Saw your comment on the last video too. And it was also in a line of videos I wanted to do. Probably not as "mathematical" as you'd like. I wanted to just explain why certain terms appear the way they do. Hopefully this helped that understanding. I might do a more mathematical video in the future though. But for now, this will do :)

    • @GauravSharma-ui4yd
      @GauravSharma-ui4yd 4 years ago

      @@CodeEmporium You did a pretty awesome job in just 7 minutes. It's both beginner-friendly and refreshing for intermediates.

  • @inteligenciamilgrau
    @inteligenciamilgrau 2 years ago

    Best explanation ever!! Thank you so much!!!

  • @igorg4129
    @igorg4129 1 year ago +1

    I think a critical point is missing here in the explanation:
    You have forgotten to mention that the loss surface is different for each sample, so there DOES NOT EXIST any universal loss surface for a given dataset, and this is a problem in stochastic gradient descent
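
A small numpy sketch (with made-up numbers) of the point above: each sample defines its own loss surface with its own minimum, stochastic gradient descent only ever steps on the surface of the sample or mini-batch it just drew, and the full-dataset loss is the average of those per-sample surfaces.

```python
import numpy as np

# Two hypothetical samples for a 1-D linear model y = w * x.
# Each sample's squared-error loss has a different minimizer,
# so the surface SGD descends changes with the sample drawn.
samples = [(1.0, 2.0), (1.0, 4.0)]  # (x, y) pairs; per-sample minima at w = 2 and w = 4

def per_sample_loss(w, x, y):
    return (w * x - y) ** 2

ws = np.linspace(0.0, 6.0, 61)
for x, y in samples:
    losses = np.array([per_sample_loss(w, x, y) for w in ws])
    print(f"sample (x={x}, y={y}): per-sample minimum near w = {ws[losses.argmin()]:.1f}")

# The full-dataset loss (the average of the two surfaces) is minimized at w = 3,
# which is not the minimum of either individual surface.
```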

  • @Fransphoenix
    @Fransphoenix 1 year ago

    Great explanation and fun, too. Thank you!

  • @ErturkKadir
    @ErturkKadir 4 years ago

    Such a clear and simple explanation of complicated things. Great job.

  • @crashedbboy
    @crashedbboy 9 months ago

    Never thought I would spit out my drink while watching a machine learning video

  • @eyesyt7571
    @eyesyt7571 1 year ago

    The first scene is precisely what happened to my neural network 2 weeks ago.

  • @r.y.y8073
    @r.y.y8073 1 year ago

    I like how you explained this!

  • @FanOfFunBuddy
    @FanOfFunBuddy 2 years ago

    Wow most complex topic in under 7 minutes 😊 with pretty good visualizations.

  • @ZobeirRaisi
    @ZobeirRaisi 4 years ago

    Your explanation went deep into my brain!

  • @artinbogdanov7229
    @artinbogdanov7229 4 years ago +1

    Great explanation. Thank you!

  • @MuhammadMujahidHaruna
    @MuhammadMujahidHaruna 1 month ago

    😮 I'm silenced by how you explain everything in detail

  • @sb7048
    @sb7048 4 years ago +1

    What does the alpha in the SGD momentum equation do? I mean, alpha is the learning rate in the first two equations, but after that you use η as the learning rate, so what is alpha from then on?
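
For reference, one common textbook convention for SGD with momentum (a hedged sketch; the video's exact symbols may differ) keeps alpha as the momentum decay factor applied to the accumulated velocity, while the learning rate still scales only the fresh gradient:

```python
def sgd_momentum_step(w, v, grad, lr=0.01, momentum=0.9):
    """One SGD-with-momentum update under a common convention:
    v <- momentum * v - lr * grad   (alpha, the momentum term, decays the old velocity)
    w <- w + v                      (the learning rate only scales the new gradient)
    """
    v = momentum * v - lr * grad
    w = w + v
    return w, v

# Toy usage on the quadratic loss L(w) = 0.5 * w**2, whose gradient is simply w.
w, v = 5.0, 0.0
for _ in range(5):
    w, v = sgd_momentum_step(w, v, grad=w)
print(w)  # drifts toward the minimum at w = 0
```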

  • @TawhidShahrior
    @TawhidShahrior 2 years ago +1

    man you do great work!

  • @eniolaajiboye4399
    @eniolaajiboye4399 3 years ago

    ❤️ the videos man. They're so clear

  • @m.a.flores7252
    @m.a.flores7252 4 years ago

    Please keep making this kind of video, I'm in love with ML and with u

    • @CodeEmporium
      @CodeEmporium  4 years ago

      Haha thank you so much for the support

  • @aafaq97in
    @aafaq97in 4 years ago +1

    amazing vid you just earned a subscriber! looking forward to more content like this!

  • @ThamizhanDaa1
    @ThamizhanDaa1 3 years ago +1

    Nice channel! better than my professors lol

    • @CodeEmporium
      @CodeEmporium  3 years ago +1

      Super happy this is helpful. Thanks!

  • @Simon-ed6zc
    @Simon-ed6zc 3 years ago +2

    Hey, thank you a lot for the explanations! Do you happen to know any heuristics with which to choose a specific optimizer? Right now I have a problem where every paper uses Natural Gradient descent, but when I use it it barely ever converges, while Adam always gets it right (or at least comes close)...

    • @SirPlotsalot
      @SirPlotsalot 2 years ago

      Your implementation might not be ideal, I'd try to use a KFAC preconditioning term maybe?
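
There is no universal rule for choosing an optimizer, but a common sanity check (sketched below with standard PyTorch optimizers; the model, data, and learning rates are placeholders, not from the video) is to confirm the problem trains at all with a forgiving default such as Adam before debugging a more specialized method like natural gradient descent:

```python
import torch
import torch.nn as nn

# Placeholder model and data purely for illustration.
model = nn.Linear(10, 1)
x, y = torch.randn(64, 10), torch.randn(64, 1)
loss_fn = nn.MSELoss()

# Swap the optimizer here to compare convergence behaviour;
# Adam's per-parameter step sizes make it a forgiving baseline.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# optimizer = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

for step in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```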

  • @mridulavijendran3062
    @mridulavijendran3062 4 years ago +2

    Hey. Great work on the video :D It was very clear and fascinating
    What's NAG? I wonder why Nadam isn't popular - it seems like a better choice.
    How would you describe RMSProp? You seem to have really great insight into DL concepts :D
    Also, why an expectation in particular for the Adam parameter updates?
    Sorry for the question bombardment. Just pretty curious
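
For anyone with the same questions: NAG stands for Nesterov Accelerated Gradient, and the "expectations" in Adam are exponential moving averages that estimate the mean gradient E[g] and mean squared gradient E[g²]. RMSProp keeps just the second of these, roughly as in the sketch below (standard textbook form, not necessarily the video's notation):

```python
import numpy as np

def rmsprop_step(w, s, grad, lr=0.001, beta=0.9, eps=1e-8):
    """RMSProp: track a running average of squared gradients (an estimate of
    E[g^2]) and divide each step by its square root, so coordinates with
    consistently large gradients take smaller effective steps."""
    s = beta * s + (1.0 - beta) * grad ** 2
    w = w - lr * grad / (np.sqrt(s) + eps)
    return w, s

# Toy usage on L(w) = 0.5 * w**2 (gradient = w).
w, s = 5.0, 0.0
for _ in range(3):
    w, s = rmsprop_step(w, s, grad=w)
print(w)
```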

  • @SujayAmberkar
    @SujayAmberkar 4 years ago +2

    Your voice is like some cool anime main character's. I wish I had a voice like yours. Anyway, great explanation.

  • @hariharans.j5246
    @hariharans.j5246 4 years ago +1

    do Neural ODEs and self-supervised learning techniques pls,
    great video btw

    • @CodeEmporium
      @CodeEmporium  4 years ago +2

      Thanks. I saw your comment on another video. I'll look into this a bit

  • @MrSinalta
    @MrSinalta 8 months ago

    If I understood correctly, acceleration should not be called 'deceleration' in this particular case?

  • @roshanid6523
    @roshanid6523 3 years ago

    Amazing explanation

  • @dan1ar
    @dan1ar 1 year ago

    Why would the gradient at 1:21 be large? Isn't it just the average over every element in the dataset? Same for mini-batch, only there the gradient is the average over every element in the batch

  • @YangQuanChen
    @YangQuanChen 3 years ago

    Nicely done! Thanks!

  • @mahdijavadi2747
    @mahdijavadi2747 3 years ago

    Thanks for the great clarification!

  • @rutweeksawant6567
    @rutweeksawant6567 4 years ago

    very nice explanation and visualization.

  • @ardhidattatreyavarma5337
    @ardhidattatreyavarma5337 1 year ago

    awesome explanation

  • @sia7001
    @sia7001 4 years ago +2

    So underrated 😭

  • @yahavx
    @yahavx 1 year ago

    The first part is not correct: the fact that you use a mini-batch in each step, rather than the entire dataset, does not give you a higher chance to converge to the optimum. Because even when considering the entire dataset in each step, you're still taking the average gradient, so the expected magnitude of the gradient does not change. It all depends on the step size.
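
The claim above can be checked with a small simulation on made-up per-sample gradients: a mini-batch gradient has the same expectation as the full-batch gradient, it is simply a noisier estimate of it, so any difference in behaviour comes from that noise and the step size, not from the gradient's expected magnitude.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-sample gradients for a single scalar parameter.
per_sample_grads = rng.normal(loc=1.0, scale=2.0, size=10_000)

full_batch_grad = per_sample_grads.mean()  # average over the whole dataset

# Gradients of many random mini-batches of size 32.
mini_batch_grads = [
    rng.choice(per_sample_grads, size=32, replace=False).mean()
    for _ in range(1_000)
]

print("full-batch gradient:  ", full_batch_grad)
print("mean mini-batch grad: ", np.mean(mini_batch_grads))  # ~ same expectation
print("mini-batch std dev:   ", np.std(mini_batch_grads))   # the extra per-step noise
```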

  • @karamjeetsinghgulati6989
    @karamjeetsinghgulati6989 2 years ago

    I just start laughing at the initial 20 sec, I am watching it in a loop

  • @lucha6262
    @lucha6262 4 years ago

    really good overview

    • @CodeEmporium
      @CodeEmporium  4 years ago

      Thanks! Making more of this stuff on the channel

  • @zshahlaie4740
    @zshahlaie4740 1 year ago

    this video was the bestttttt

  • @davisburnside9609
    @davisburnside9609 3 years ago

    very helpful, thank you

  • @rakeshsinghrawat99
    @rakeshsinghrawat99 4 years ago +2

    Always good

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 2 years ago

    Well explained.

  • @MrAdhito
    @MrAdhito 3 years ago

    0:43 this really cracks me up HAHA!

  • @shaflyhamzah3848
    @shaflyhamzah3848 4 years ago

    Nice explanation

  • @RichardMuenzer
    @RichardMuenzer 1 year ago

    Question: how does Newton's method play into this?????

  • @kidsfree6615
    @kidsfree6615 1 year ago

    Amazing video.

  • @sizhuanghe1361
    @sizhuanghe1361 1 year ago

    That's fun!

  • @negardeylami6039
    @negardeylami6039 2 years ago

    LMFAO the intro, I don't know why it's so funnyyyyyy to me

  • @nikab1852
    @nikab1852 3 years ago

    love this

  • @porimol108Tv
    @porimol108Tv 3 years ago

    I subscribed to your channel.

  • @yahavx
    @yahavx 1 year ago

    Not formal enough for me... the intuition is nice, but it needs to be a little more concrete about what is actually done

    • @CodeEmporium
      @CodeEmporium  1 year ago +1

      Fair. I did what I could in a short video like this. Thanks for watching!

    • @yahavx
      @yahavx 1 year ago

      @@CodeEmporium Thank you!

  • @chinmayeedongre5525
    @chinmayeedongre5525 4 years ago

    Great Video!

  • @aakarshrai5833
    @aakarshrai5833 4 months ago

    Bro could you please label your equations? It'll be helpful

  • @SeanKearney-g7d
    @SeanKearney-g7d 27 days ago

    excellent

  • @youtubecommenter5122
    @youtubecommenter5122 4 years ago

    What a good video!

  • @ArsalJalib
    @ArsalJalib 4 years ago

    Loved the start, watched 5 times.
    Also my first comment on YouTube. =)

    • @CodeEmporium
      @CodeEmporium  3 years ago

      Yas! Thanks for this comment! Absolutely love it

  • @theoutlet9300
    @theoutlet9300 4 years ago

    Dude, where did you study this? Understanding the maths makes the coding so much fun

  • @sumod12
    @sumod12 3 years ago

    Awesome ❣️❣️

  • @RohitashChandra
    @RohitashChandra 4 years ago

    well done!

  • @Eysh2009
    @Eysh2009 4 months ago

    Thanks!

    • @CodeEmporium
      @CodeEmporium  4 months ago

      Thanks so much for the donation! Glad you liked this content!

  • @EdeYOlorDSZs
    @EdeYOlorDSZs 1 year ago

    W explanation

  • @arsalan2780
    @arsalan2780 4 years ago

    wonderfulllllllllllllll ...........!!!!

  • @annarauscher8536
    @annarauscher8536 2 years ago

    I think I watched that intro like 7 times haha

  • @EngRiadAlmadani
    @EngRiadAlmadani 4 years ago

    Good job

  • @stepantoman4694
    @stepantoman4694 2 years ago

    hahahaha amazing

  • @MikeSieko17
    @MikeSieko17 7 months ago +1

    One critique: your notation is really weird and non-intuitive for beginners

  • @tostupidforname
    @tostupidforname 4 years ago

    Imo you should have gone into more detail on the math of the optimizers. I did not understand how the terms relate to the behaviour the optimizers are supposed to have.

  • @mennoliefstingh5687
    @mennoliefstingh5687 4 years ago +1

    Thanks for the great explanation!