Conjugate Gradient Method

  • Published: 15 Sep 2024

Comments • 49

  • @zhangkin7896
    @zhangkin7896 3 years ago +8

    It's 2021, and your class is still great. ❤️

  • @pigfaced9985
    @pigfaced9985 11 months ago

    You are a life saver! I have an assignment that's related to this method and I understood it pretty well! THANK YOU!

  • @louisdavies5367
    @louisdavies5367 8 years ago +7

    Thank you for making this video!! It's really helpful with my studies :)

  • @mari0llly
    @mari0llly 6 years ago +21

    good video, but you used the Laplace operator instead of the nabla operator for the gradient.

  • @valentinzingan1151
    @valentinzingan1151 4 years ago +10

    The first method you described is called Steepest Descent (not Gradient Descent). Gradient Descent is the simplest one; Steepest Descent is an improvement on Gradient Descent, exactly as you described.
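For what it's worth, the distinction the commenter draws lives in the line search: steepest descent moves along the negative gradient with the exact minimizing step length, which for the quadratic f(x) = ½xᵀAx − bᵀx has a closed form. A minimal numpy sketch (illustrative; this is not code from the video):

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-10, max_iter=10000):
    """Steepest descent with exact line search for the quadratic
    f(x) = 0.5 * x^T A x - b^T x, with A symmetric positive definite."""
    x = x0.astype(float)
    for _ in range(max_iter):
        r = b - A @ x                      # residual = negative gradient
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ (A @ r))    # exact minimizer along r
        x = x + alpha * r
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = steepest_descent(A, b, np.zeros(2))
```

Plain gradient descent would use a fixed (or scheduled) step size in place of the closed-form alpha; everything else is identical.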

  • @songlinyang9248
    @songlinyang9248 6 years ago +3

    Very very clear and helpful, thank you very much

  • @alexlagrassa8961
    @alexlagrassa8961 4 years ago +3

    Good video, clear explanation.

  • @from-chimp-to-champ
    @from-chimp-to-champ 2 years ago

    Good job, Priya, elegant explanation!

  • @dmit10
    @dmit10 1 year ago +1

    Another interesting topic is Newton-CG and what to do if the Hessian is indefinite.
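On the indefinite-Hessian question: a common Newton-CG safeguard is to run CG on the Newton system H d = −g and abort as soon as a direction of negative curvature (pᵀHp ≤ 0) appears, falling back to the progress made so far, or to steepest descent on the very first iteration. A minimal numpy sketch (illustrative only; trust-region variants such as Steihaug-CG refine this further):

```python
import numpy as np

def newton_cg_direction(grad, hess, tol=1e-8):
    """Inner CG solve of hess @ d = -grad, aborting on negative
    curvature so the returned d is always a descent direction."""
    n = grad.size
    d = np.zeros(n)
    r = -grad
    p = r.copy()
    rs = r @ r
    for _ in range(n):
        Hp = hess @ p
        curvature = p @ Hp
        if curvature <= 0:                 # Hessian indefinite along p
            return -grad if not d.any() else d
        alpha = rs / curvature
        d = d + alpha * p
        r = r - alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d

# Positive definite Hessian: the full Newton step, d = -H^{-1} g.
d_pd = newton_cg_direction(np.array([2.0, 3.0]), np.diag([2.0, 3.0]))
# Indefinite Hessian: negative curvature on step one -> steepest descent.
d_indef = newton_cg_direction(np.array([1.0, 1.0]), np.diag([1.0, -1.0]))
```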

  • @edoardostefaninmustacchi2232
    @edoardostefaninmustacchi2232 3 years ago +2

    Excellent stuff. Really helped

  • @aboubekeurhamdi-cherif6962
    @aboubekeurhamdi-cherif6962 9 years ago +1

    Please note that x* is the minimizer and the minimum.

  • @vijayakrishnanaganoor9335
    @vijayakrishnanaganoor9335 3 years ago

    Great video!

  • @Koenentom
    @Koenentom 3 years ago

    great video. Thanks!!

  • @pablocesarherreraortiz5239
    @pablocesarherreraortiz5239 1 year ago

    thank you very much

  • @ryanmckenna2047
    @ryanmckenna2047 3 months ago

    What is a TOD?

  • @narvkar6307
    @narvkar6307 10 years ago +1

    How is the value of alpha1 updated?
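For anyone else wondering: in conjugate gradients the step size is recomputed every iteration from the current residual and search direction, alpha_k = (r_kᵀ r_k) / (p_kᵀ A p_k). A minimal numpy sketch of the full update loop (illustrative; not the video's code):

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """Plain CG for A x = b (A symmetric positive definite), showing
    how alpha_k and beta_k are recomputed at every iteration."""
    x = x0.astype(float)
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(len(b)):                # at most n steps in exact arithmetic
        Ap = A @ p
        alpha = rs_old / (p @ Ap)          # alpha_k = r_k.r_k / (p_k.A.p_k)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p      # beta_k = rs_new / rs_old
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x = conjugate_gradient(A, b, np.zeros(3))
```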

  • @lenargilmanov7893
    @lenargilmanov7893 1 year ago

    What I don't understand is: why use an iterative process if we know that there's exactly one minimum, just set the gradient to 0 and solve the resulting system of equations, no?

    • @mrlolkar6229
      @mrlolkar6229 1 year ago

      Those methods are used when you have, say, 10^6+ equations (for example in the Finite Element Method). With them you solve much faster than by setting all the derivatives equal to 0. And even though it seems you would need every step to reach the minimum, that's not true: even with that humongous number of equations you are usually close enough to the minimum to be satisfied with the answer well before the end, and that's why those methods are so powerful.

    • @lenargilmanov7893
      @lenargilmanov7893 1 year ago

      @@mrlolkar6229 Yeah, I kinda figured it out now.

  • @yubai6549
    @yubai6549 6 years ago

    Many thanks!

  • @exploreLehigh
    @exploreLehigh 3 years ago

    gold

  • @frankruta4701
    @frankruta4701 3 years ago

    Is alpha_k a matrix or a scalar quantity?

    • @frankruta4701
      @frankruta4701 3 years ago

      Scalar... I just didn't flatten my residual (which was a matrix in my case).

  • @xruan6582
    @xruan6582 4 years ago +5

    Lacks a detailed explanation and is hard to understand.

  • @bellfish188
    @bellfish188 5 months ago

    low volume

  • @aboubekeurhamdi-cherif6962
    @aboubekeurhamdi-cherif6962 9 years ago +1

    Sorry! Something was missing in my last comment. Please note that x* is the minimizer and NOT the minimum.

    • @kokori100
      @kokori100 8 years ago

      +Aboubekeur Hamdi-Cherif yep, I noticed the same.

    • @yashvander
      @yashvander 4 years ago

      hmm, that means x1 = x0 + x*
      right?

  • @beeseb
    @beeseb 1 year ago

    🍵

  • @bigsh0w1
    @bigsh0w1 9 years ago

    Please, can you share the code?

  • @Aarshyboy96
    @Aarshyboy96 4 years ago

    I don't understand how you updated alpha1.

  • @AdityaPrasad007
    @AdityaPrasad007 5 years ago +5

    Wow, interesting how she made one technical video and stopped. Motivation was lost, I guess?

    • @nickp7526
      @nickp7526 3 years ago +2

      Have you not seen Bear and Simba dumbass?

    • @AdityaPrasad007
      @AdityaPrasad007 3 years ago

      @@nickp7526 I said technical video my dear chap.

    • @ethandickson9490
      @ethandickson9490 3 years ago +1

      @@AdityaPrasad007 Think he was joking bruh

    • @AdityaPrasad007
      @AdityaPrasad007 3 years ago

      @@ethandickson9490 really? I'm pretty bad at sarcasm... @Nick was it a joke?

    • @PriyaDeo
      @PriyaDeo 3 years ago +6

      I made the video for a class. I guess I didn't expect it to get so many views and comments, especially for people to keep watching it after some years. But if there's a lot of interest I can make another video. Do you have any suggestions for topics?

  • @Marmelademeister
    @Marmelademeister 5 years ago +2

    It’s okay... It’s too slow at the beginning and too fast at the end. And why would you start with gradient descent? I would think that most people studying CG are already miles beyond gradient descent, have seen Newton’s method, and are now studying Newton-like methods.

  • @DLSMauu
    @DLSMauu 8 years ago +1

    cute lecture :P

  • @MyName-gl1bs
    @MyName-gl1bs 2 years ago

    I like fud

  • @erickdanielperez9463
    @erickdanielperez9463 5 years ago

    You don't use your mathematics to solve the whole problem. If a problem has more than 3 variables, you cannot see the solution without abstract mathematics. Multi-dimensional problems, e.g. chemical problems (pressure, temperature, flux, composition and rate), can only be visualized with math, not with a graph. Use your mathematics and numbers.

  • @kandidatfysikk86
    @kandidatfysikk86 7 years ago +1

    Great video!