Introduction To Optimization: Gradient Free Algorithms (2/2) Simulated Annealing, Nelder-Mead

  • Published: 28 Sep 2024

Comments • 17

  • @twisties.seeker
    @twisties.seeker 4 years ago

    Thank you for an amazing explanation.

  • @jayktharwani9822
    @jayktharwani9822 3 years ago +4

    This is the best series of videos to actually understand the optimization procedure.

    • @anniehamalian3077
      @anniehamalian3077 1 year ago

      you can check www.youtube.com/@decisionbrain to know more about optimization

  • @metaprog46and2
    @metaprog46and2 4 years ago

    Nice video. Your explanation syncs well with the graphics (which are awesome themselves - which design / video maker software did you use?)

    • @alphaopt2024
      @alphaopt2024  4 years ago +4

      PowerPoint, if you can believe it. You can do a lot with the Morph transition.

    • @metaprog46and2
      @metaprog46and2 4 years ago

      @@alphaopt2024 Wow. Color me surprised. I'll have to get over my natural disdain for PPT lol. Thanks for the response!

  • @alvarorodriguez8575
    @alvarorodriguez8575 6 years ago

    Hello, thank you for the video. I have a question: does the same classification apply to multi-objective optimization, or is it different? I'm especially interested in multi-disciplinary and multi-objective optimization problems for buildings. Thank you!

  • @grimonce
    @grimonce 6 years ago +2

    Really helpful and neat explanation, thanks for the video :)

  • @mitjadrab6529
    @mitjadrab6529 6 years ago +2

    What is the simulation program shown at 1:30?

    • @alphaopt2024
      @alphaopt2024  6 years ago

      Hi Mitja, I used Algodoo for the simulation: www.algodoo.com/

  • @domaminakoi5630
    @domaminakoi5630 2 years ago

    Do you have experience on when to use which of the gradient-free algorithms? PSO has worked best for me in the past; I haven't been successful implementing simulated annealing with good results yet. (A minimal simulated annealing sketch appears after the comments below.)

  • @adelsayyahi9665
    @adelsayyahi9665 1 year ago

    Thank you. What is the name of the Algodoo toolbox you used for simulated annealing?

  • @andrea-mj9ce
    @andrea-mj9ce 1 year ago

    The Nelder-Mead method is not explained in enough depth to understand it. (See the Nelder-Mead example after the comments below.)

  • @vi5hnupradeep
    @vi5hnupradeep 3 years ago

    Thank you so much 💯

  • @where-is-my-mind.
    @where-is-my-mind. 5 years ago

    Gradient-based optimisation also doesn't guarantee an optimal solution.

    • @where-is-my-mind.
      @where-is-my-mind. 5 years ago

      @Dat Boi When you say "guarantee an optimal solution", I presume you mean a global optimum. I don't know where you learnt that, but it's not correct. If you have a paper to back it up, please reference it so I can take a look too. To start with, there are infinitely many optimisation problems, and gradient-based optimisation can only solve a handful of them, because real-world problems are rarely differentiable; that is why derivative-free (non-gradient) optimisation algorithms emerged. Going back to what you said about second-order optimisation: it is more "efficient" in terms of convergence than first-order optimisation, but the optimality of the solution has nothing to do with the speed of convergence. Just like first-order methods, second-order methods are very likely to get stuck in local minima, so they don't guarantee an optimal solution. In fact, I've seen many studies that obtained better results optimising a specific problem with gradient descent instead of a second-order method. So to wrap up, it's not as simple as you've stated. (A small gradient-descent example illustrating local minima appears after the comments below.)

    • @parg2244
      @parg2244 4 years ago

      @@where-is-my-mind. Hi! Could you recommend a book on optimization?
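
For the question above about implementing simulated annealing: below is a minimal Python sketch of the basic accept/reject loop with a geometric cooling schedule. The test function, step size, and cooling parameters are illustrative assumptions, not settings from the video.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t_start=10.0, t_end=1e-3,
                        cooling=0.95, n_inner=20):
    """Minimize f starting from x0. All defaults here are illustrative."""
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    t = t_start
    while t > t_end:
        for _ in range(n_inner):
            # Propose a random neighbour of the current point.
            x_new = x + random.uniform(-step, step)
            fx_new = f(x_new)
            # Always accept improvements; accept worse moves with
            # probability exp(-delta / T), which shrinks as T cools.
            delta = fx_new - fx
            if delta < 0 or random.random() < math.exp(-delta / t):
                x, fx = x_new, fx_new
                if fx < best_fx:
                    best_x, best_fx = x, fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_fx

if __name__ == "__main__":
    # A multimodal 1-D test function with several local minima.
    f = lambda x: 0.1 * x ** 2 + math.sin(3 * x)
    print(simulated_annealing(f, x0=5.0))
```

In practice the results depend heavily on the step size and cooling rate: cooling too fast behaves like plain hill climbing and gets trapped, cooling too slowly wastes function evaluations.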
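For the comment above that Nelder-Mead is not explained in enough depth: the method keeps a simplex of n+1 points (a triangle in 2-D) and repeatedly reflects, expands, contracts, or shrinks it using only function values, no gradients. A quick way to experiment with it is SciPy's built-in implementation; the Rosenbrock test function and starting point below are illustrative choices, not anything from the video.

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function: a classic test problem whose minimum (value 0)
# lies at (1, 1) at the bottom of a long, curved valley.
def rosenbrock(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

# Nelder-Mead builds a simplex of 3 points in 2-D and moves it downhill
# using only function evaluations.
result = minimize(rosenbrock, x0=np.array([-1.5, 2.0]), method="Nelder-Mead",
                  options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 5000})

print(result.x, result.fun)  # should be close to [1, 1] and 0
```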
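On the thread above about gradient-based methods and optimality: a small sketch showing that plain gradient descent converges to whichever minimum's basin the starting point lies in, so it offers no global-optimality guarantee on non-convex functions. The test function, learning rate, and starting points are illustrative assumptions.

```python
def gradient_descent(grad, x0, lr=0.01, steps=2000):
    """Plain gradient descent; lr and steps are illustrative choices."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = x^4 - 3x^2 + x has a global minimum near x = -1.30
# and a shallower local minimum near x = 1.13.
f = lambda x: x ** 4 - 3 * x ** 2 + x
grad = lambda x: 4 * x ** 3 - 6 * x + 1

for x0 in (-2.0, 2.0):
    x_final = gradient_descent(grad, x0)
    print(f"start {x0:+.1f} -> x = {x_final:+.4f}, f(x) = {f(x_final):.4f}")
```

Starting from -2.0 it reaches the global minimum; starting from 2.0 it settles in the local minimum, which is the point made in the comment above.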