Descent methods and line search: Golden section

  • Published: Nov 18, 2024

Comments • 7

  • @reY-bk2vl
    @reY-bk2vl 1 year ago +1

    Fabulous work. Many thanks!

  • @nabajyotimajumdar4511
    @nabajyotimajumdar4511 3 years ago +1

    Thanks!

  • @yassinebouzbiba1360
    @yassinebouzbiba1360 5 years ago +1

    Hi professor! Thank you for your explanation. I do wonder, however, why in your "proof" the ratio was (3-sqrt(5))/2, while the golden section is (1+sqrt(5))/2. Thank you from the Netherlands!!!
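The two constants in this question are in fact the same constant in disguise: writing phi = (1+sqrt(5))/2 for the golden section, the ratio (3-sqrt(5))/2 used in the proof equals 1/phi^2 = 1 - 1/phi, i.e. the fraction of the interval discarded at each iteration, while 1/phi (about 0.618) is the fraction kept. A quick numerical check (variable names are mine):

```python
import math

phi = (1 + math.sqrt(5)) / 2   # ~1.618, the golden section
rho = (3 - math.sqrt(5)) / 2   # ~0.382, the ratio from the proof

# rho is not a different constant: it is 1/phi^2 = 1 - 1/phi,
# the fraction of the interval discarded each iteration;
# 1 - rho = 1/phi is the fraction that is kept.
assert abs(rho - 1 / phi**2) < 1e-12
assert abs(rho - (1 - 1 / phi)) < 1e-12
print(phi, rho, 1 - rho)
```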

  • @isaacnewton1545
    @isaacnewton1545 3 years ago

    When does strict unimodality hold?

    • @MichelBierlaire
      @MichelBierlaire  3 years ago +1

      Proving unimodality is often hard. If it does not hold, the method will converge to one of the local optima.
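The behavior described in the reply can be seen in a minimal golden-section search sketch (this is a generic textbook version, not the lecturer's code): the interval shrinks by the factor 1/phi per iteration, reusing one interior evaluation each time, and on a non-unimodal function it still converges, but only to one of the local minimizers inside the starting bracket.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Shrink [a, b] by 1/phi per iteration, reusing one interior point.
    If f is unimodal on [a, b] this brackets the global minimizer;
    otherwise it still converges, but to some local minimizer."""
    invphi = (math.sqrt(5) - 1) / 2          # 1/phi ~ 0.618
    x1 = b - invphi * (b - a)
    x2 = a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                           # minimizer lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:                                 # minimizer lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Unimodal case: converges to the global minimizer x = 2.
print(golden_section(lambda x: (x - 2) ** 2, 0.0, 4.0))
```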

  • @pnachtwey
    @pnachtwey 6 months ago

    I would like to see a real example. In my case the line search starts at the current point and moves in the direction opposite to the gradient, so I must search along that line. I can simply step along it repeatedly and evaluate the cost function until it no longer decreases. You are suggesting searching between two points, but how far should the end point, or "low" point, be from the current "high" point? You are assuming the boundary of the end point is known and brackets the minimum along the line search. What if it isn't?
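One standard answer to this question (not necessarily the lecturer's, and the parameter names here are mine) is to build the bracket first with an expanding-step phase: starting from the current point, take a small step along the descent direction and keep growing it while the one-dimensional restriction phi(t) = f(x + t*d) keeps decreasing; the moment it increases, the last three points bracket a local minimizer, and golden section can take over on that interval.

```python
def bracket_minimum(phi, h0=1e-3, grow=2.0, max_iter=60):
    """Expand the step along the line until phi stops decreasing.
    phi(t) = f(x + t*d) is the 1-D restriction along the search
    direction d; h0 and grow are tuning choices, not fixed rules.
    Returns (a, b) with a < b bracketing a local minimizer of phi,
    assuming phi eventually increases along the ray."""
    t_prev, t = 0.0, h0
    f_cur = phi(h0)
    if f_cur >= phi(0.0):          # already increasing: bracket is [0, h0]
        return 0.0, h0
    for _ in range(max_iter):
        t_next = t * grow
        f_next = phi(t_next)
        if f_next >= f_cur:        # phi went back up: minimizer in (t_prev, t_next)
            return t_prev, t_next
        t_prev, t, f_cur = t, t_next, f_next
    raise RuntimeError("no bracket found: phi may be unbounded below")

# Example: phi(t) = (t - 5)**2 along the line; the bracket contains t = 5.
a, b = bracket_minimum(lambda t: (t - 5.0) ** 2)
print(a, b)
```

Because the step doubles, the expansion reaches any finite minimizer in logarithmically many evaluations, at the price of a bracket that can be up to grow times wider than necessary.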