Gradient Descent in 3 minutes

  • Published: 15 Jan 2025

Comments • 91

  • @user-sb6os · 3 years ago · +75

    This animation is really great for a small channel like this

  • @kitzelnsiebert · 3 years ago · +45

    Great job! I like that you were even able to talk about some of the different types of gradient descent algorithms, a tall task for 3 minutes.

    • @byronwilliams7977 · 2 years ago

      Same here. I studied applied mathematics, so I have to get up to speed on this rather quickly; I find these videos to be excellent.

  • @myelinsheathxd · 3 years ago · +9

    It's appealing to see these visual explanations after learning the concept!

  • @prakhargupta1745 · 1 year ago · +4

    Just a correction: 2:30 is mini-batch stochastic gradient descent, since we are iterating over batches.
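
For readers following this correction: a minimal NumPy sketch of mini-batch gradient descent (hypothetical toy data and hyperparameters, not the video's code), where each update uses the gradient of a random batch rather than the full dataset:

```python
import numpy as np

def minibatch_gd(X, y, lr=0.1, batch_size=2, epochs=200, seed=0):
    """Mini-batch (stochastic) gradient descent for least squares:
    each update uses the gradient of a random subset of the data."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        order = rng.permutation(n)          # reshuffle every epoch
        for start in range(0, n, batch_size):
            b = order[start:start + batch_size]
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)  # batch gradient
            w -= lr * grad                  # descend along it
    return w

# Toy data whose exact least-squares solution is w = 2
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([2.0, 4.0, 6.0, 8.0])
w = minibatch_gd(X, y)
```

The only difference from full-batch gradient descent is the inner loop over random index subsets.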

  • @billyin4771 · 2 years ago · +12

    Very comprehensive and short, love it! Quick and concise!

  • @vivekpujaravp · 1 year ago · +18

    Best explanation and visualization I've seen. You have incredible talent. Please keep making more.

  • @Leo-vv3jd · 7 months ago · +27

    I really liked the video and the visuals, but I think it would be better without the "generic music" in the background.

    • @VisuallyExplained · 7 months ago · +7

      Thank you for taking the time to post your feedback, this is very useful for the growth of this channel!

  • @0N3_01 · 9 months ago · +1

    Thank you for the clear explanation

  • @snowcamo · 7 months ago

    Honestly didn't really help with my questions, but I didn't expect a 3 minute video to answer them. This was very well done, the visualization was great, and everything it touched on (while brief) was concise and accurate. Subbed.

  • @hrishikeshsrivatsa · 2 years ago · +3

    The first video where I got a clear and precise understanding of the topic.

  • @209_Violate · 10 months ago

    wow. you have such a talent for explaining things so well compared to the rest of the youtube sphere. i hope you will continue to bless us with your talents.

  • @ah-rdk · 9 months ago

    Hello. Thanks for this great video.
    I just believe that at 2:20, the variant of gradient descent you explained is called mini-batch gradient descent, which uses a random subset of the training dataset.
    Stochastic gradient descent is the one that uses just one training record in each iteration.

    • @khabib8568 · 5 months ago

      Yeah, I have the same doubt.
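
For contrast with the mini-batch variant discussed above, "pure" stochastic gradient descent updates on exactly one randomly chosen training record per step. A hypothetical sketch with made-up data, fitting y = w·x (true w = 2):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2 * x                                  # targets consistent with w = 2

w = 0.0
for _ in range(500):
    i = rng.integers(len(x))               # pick a SINGLE random sample
    w -= 0.05 * x[i] * (x[i] * w - y[i])   # gradient of that one term only
```

Each step is noisy, but because every sample agrees on the same minimizer here, the iterates still settle at w = 2.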

  • @brainxyz · 3 years ago · +3

    Great job! Well done!

  • @bilalb95 · 1 year ago

    Thanks for the amazing explanation and visualization

  • @juanete69 · 2 years ago

    It mixes the theory and a practical example very well.

  • @unplandsitch · 1 year ago

    Wow, this is so well and intuitively explained.

  • @pdebuck1 · 3 years ago · +1

    My professor said “there is no excuse for gradient descent” when conjugate gradient is so easy to implement

  • @World_Admiror · 11 months ago

    Keep up the good work! This video and the whole channel are amazing!
    Bravo 👏🏻

  • @A0G7 · 5 months ago · +1

    I looooooooooove when the background music stops. It tells you to open your eyes and focus your ears; a really important revelation is coming...

  • @hariraj4184 · 3 years ago · +2

    Brilliant work 👍

  • @AK56fire · 3 years ago · +10

    Could you please make a video explaining how you made this video? That would be very VERY helpful. I've always wanted to use Blender to make animations like yours, but couldn't make heads or tails of it. Most Blender tutorials (and I've seen more than 100 videos) showcase heavy-duty animations that have nothing to do with mathematical explanations, i.e. how to make animations for maths-related videos. Yours is the first video in which I've seen such a thing. Please consider my request and kindly make a video tutorial about it (for the Blender part).

    • @VisuallyExplained · 3 years ago · +9

      Hey Amit, I am definitely planning to make a video about my workflow, and in particular, how I make the animations. So stay tuned for that! :)

    • @AK56fire · 3 years ago · +7

      @@VisuallyExplained Will definitely wait for it.. Thanks for considering it. I appreciate it a lot.

    • @photogyulai · 9 months ago

      Huge thanks, it would be fantastic! (Fellow teacher here :-)

  • @olivier306 · 2 years ago · +1

    This channel is a literal gift from god.

  • @YassinP10 · 19 days ago

    Bravo, my friend!

  • @MARCELSOCORÓGARRIGOSA-l6h · 1 year ago

    I have only seen one video and it is helping me a lot! Keep going!

  • @alanmdl · 1 year ago · +1

    Great video, but at around 1:30 my heart dropped; it felt like a scary movie since it was all dark lol

  • @eumatheus · 1 year ago

    Thanks for this, the visualization helps a lot!

  • @ranjanpal7217 · 1 year ago

    Amazing explanation and visualization

  • @sayyidj6406 · 10 months ago

    Excellent vid. Do you have a video about PPO in RL?

  • @thebrucecyou · 3 years ago · +1

    Excellent video!

  • @dannyk123 · 1 year ago · +1

    Great video. I was wondering: how does this work if there are multiple minimum points and the data has high dimensionality?

    • @herronproductions829 · 1 year ago · +1

      It works just fine with multiple minimum points in high dimensions. As long as you configure your hyperparameters (learning rate, batch size, etc.) correctly, you should have no problem converging to a "decent" minimum.

  • @rasmil77 · 5 months ago

    Helpful for revising the topic :)

  • @alirezaakhavi9943 · 11 months ago

    Really nice animation, explanation, and content, thank you very much for sharing! :)

  • @error-my9ut · 1 year ago

    Thanks for the visualization, it really helped.

  • @Darkev77 · 3 years ago · +3

    Proximal GD next please!

  • @MrWater2 · 1 year ago

    Fucking incredible explanation in just 3 minutes... wow!

  • @hassanabdullah6742 · 10 days ago

    It is basically using the principle of induction to create a cardinality symmetry.

  • @incrediblekullu7932 · 3 months ago

    How do you make these videos?
    Manim?

  • @Hateusernamearentu · 5 months ago

    Very smart. But I still need another video presenting differentiation to help me understand the slope and opposite-direction thing in 2D. But this is clear. Also, I like the small-step demonstration.

  • @zzzzzz-zzz-789 · 1 year ago

    great video, thanks!

  • @na50r24 · 1 month ago

    For vanilla GD, are you not supposed to divide by the number of samples in the data before performing the update? Or do you just take the sum of this 'accumulated gradient'?
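
Both conventions work: averaging the accumulated gradient only rescales the step, so summing with learning rate lr/n is equivalent to averaging with lr. A toy sketch (made-up data) showing the two produce the same iterates:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])   # exact least-squares solution: w = 2
n = len(x)

def g(w):
    """Summed ('accumulated') gradient of the least-squares cost."""
    return x @ (x * w - y)

w_sum, w_mean = 0.0, 0.0
for _ in range(100):
    w_sum -= (0.1 / n) * g(w_sum)      # sum convention with step 0.1/n ...
    w_mean -= 0.1 * (g(w_mean) / n)    # ... matches mean convention, step 0.1
```

Dividing by n is the more common choice simply because it makes a good learning rate independent of the dataset size.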

  • @benedictcoltman1983 · 14 days ago

    Beautiful 👍

  • @fabricetshinangi5042 · 3 years ago · +1

    Can you please post a link or the titles of materials (books) on this topic that one can go through? I really need to learn this topic. Thank you.

    • @VisuallyExplained · 3 years ago · +3

      Great idea! Boyd's book is a good starting point (page 463 of web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf). I will try to add more references to the video description in the future.

    • @fabricetshinangi5042 · 3 years ago · +1

      Thank you

  • @abdulghanialmasri5550 · 3 years ago · +1

    Great video. Would you consider some topics in numerical analysis, like Gaussian quadrature?

  • @AK56fire · 3 years ago · +2

    Which software did you use to make those animations?

    • @VisuallyExplained · 3 years ago · +5

      I used Blender3D (with Python) for all the 3D scenes. The rest is a combination of After Effects and the Python library manim.

  • @md.sarowarhossainrana4787 · 4 months ago · +1

    What is the η (eta)?

    • @tarekbenzyad6766 · 2 months ago · +2

      I was about to ask the same thing. He suddenly introduced it into the cost function as a parameter, then never talked about what it meant.
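
For anyone landing here: η (eta) is the learning rate, the step size in the update w ← w − η∇f(w). A minimal toy illustration on f(w) = w², where a small η converges and a too-large η diverges:

```python
def gd(eta, steps=25, w=10.0):
    """Run gradient descent on f(w) = w^2, whose gradient is 2w."""
    for _ in range(steps):
        w -= eta * 2 * w      # w <- w - eta * f'(w)
    return w

small_eta = gd(eta=0.1)   # shrinks steadily toward the minimum at 0
large_eta = gd(eta=1.1)   # overshoots on every step and blows up
```

Choosing η is a trade-off: too small and progress is slow, too large and the iterates oscillate or diverge.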

  • @JuanCarlosAraujoCabarcas · 2 years ago · +1

    Interesting topic and comparison. Since you are using information from past iterations, it would be very illustrative to include a quasi-Newton method in your comparison, for example BFGS.

  • @AK56fire · 3 years ago · +1

    Great Animation buddy.. Cool..

  • @wearedoingsomething · 2 years ago

    Huge thank you!

  • @jayp9158 · 1 year ago

    My fav video about this

  • @Urammar · 10 months ago

    How does this actually apply in reverse, though? How do you apply this

  • @charlesyang7233 · 2 years ago

    This is amazing

  • @TimCrinion-j2r · 9 months ago

    Is this the same as Newton's method, or the Newton-Raphson method?
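
They are related but not the same: gradient descent is first-order (it uses only the gradient and a fixed step size η), while Newton's method, i.e. Newton-Raphson applied to solving f′(x) = 0, also divides by the curvature f″, which lets it reach the minimum of a quadratic in one step. A toy comparison on f(x) = x²:

```python
def f_prime(x):
    return 2.0 * x    # derivative of f(x) = x^2

def f_second(x):
    return 2.0        # constant curvature of f(x) = x^2

# One Newton step: x <- x - f'(x) / f''(x)  (uses curvature)
x_newton = 10.0
x_newton -= f_prime(x_newton) / f_second(x_newton)

# Gradient descent: x <- x - eta * f'(x)  (first-order, many small steps)
x_gd = 10.0
for _ in range(50):
    x_gd -= 0.1 * f_prime(x_gd)
```

The trade-off: Newton's per-step cost involves second derivatives (a Hessian in higher dimensions), which is exactly what gradient descent avoids.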

  • @viddeshk8020 · 3 years ago

    Wow, 😊❤️ love it

  • @arnaupinto5890 · 3 years ago · +1

    awesome

  • @emirhandemir3872 · 1 year ago

    Amazing !!!

  • @John-wx3zn · 9 months ago

    What is gradient descent trying to find?

    • @at1with0 · 9 months ago

      To minimize cost/error of a learning model
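
Concretely, gradient descent searches for the parameter values at which the cost is (locally) smallest. A made-up bowl-shaped cost with its minimum at (3, -1):

```python
import numpy as np

def grad(p):
    """Gradient of the cost f(x, y) = (x - 3)^2 + (y + 1)^2."""
    x, y = p
    return np.array([2 * (x - 3), 2 * (y + 1)])

p = np.zeros(2)              # start anywhere
for _ in range(100):
    p -= 0.1 * grad(p)       # repeatedly step downhill
# p ends up near the minimizer (3, -1)
```

In machine learning the "cost" measures prediction error, so the minimizer is the set of model weights that fit the data best.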

  • @Zethuzzz · 9 months ago

    Adam usually works quite well.

  • @user-rw6iw8jg2t · 1 month ago

    Isn't it awesome, in a simplified way? I was just implementing OLS for vanilla linear regression to train a neural network with some weights and a bias, doing some stuff with the matrix and dot products, and this video popped up. I love mathematics!!! One thing: when we have the OLS algorithm directly, why do we need to implement OLS with GD? Then what's the use of having the OLS algorithm separately? Is it because of the volume of the data points?
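
On the closing question: the closed-form OLS solution solves the normal equations (XᵀX)w = Xᵀy directly, which needs all the data at once and roughly O(nd² + d³) work, while gradient descent only needs cheap gradient steps, so it scales to large datasets and to models (like neural networks) that have no closed form. A hypothetical sketch (synthetic data) showing both reach the same least-squares answer:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5])         # noise-free synthetic targets

# Closed-form OLS: solve the normal equations (X^T X) w = X^T y
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Gradient descent on the same least-squares cost
w_gd = np.zeros(3)
for _ in range(2000):
    w_gd -= 0.01 * (X.T @ (X @ w_gd - y)) / len(y)   # mean-gradient step
```

On tiny problems like this the closed form is simpler; the iterative route wins when n or d is large, or when the data only fits in memory batch by batch.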

  • @cedricmanouan2333 · 1 year ago

    magnificent 🔥😧

  • @tsunningwah3471 · 1 year ago

    simplex method please

  • @guillecobo_ · 3 years ago

    great video

  • @marinamaged962 · 2 months ago · +2

    i dont get it lol

  • @FaberLSH · 6 months ago

    Good!

  • @andrewt15 · 3 years ago · +2

    dope

  • @yassinom2466 · 1 year ago

    veeeeery nice

  • @Hans_Magnusson · 1 year ago

    As you might know I studied this topic in London…
    I obviously aced it.😂

  • @-leaflet · 1 year ago

    Wow

  • @Christoo228 · 7 months ago

    Sagapo (Greek for "I love you")

  • @fulljsu3glitches · 1 month ago

    And how the fuck do I get the n??

  • @tsunningwah3471 · 5 months ago

    Fangshan

  • @mohammadnasser9951 · 2 years ago

    Damn calculus.
    I understand nothing.