Adjoint Sensitivities over nonlinear equation with JAX Automatic Differentiation

  • Published: Sep 12, 2024

Comments • 4

  • @julienblanchon6082 2 years ago +2

    I really enjoy the format of your videos! Thank you very much.

    • @MachineLearningSimulation 2 years ago +1

      I'm really happy to hear that :). Thanks a lot! There is more like this coming over the next few weeks.

  • @jahongirxayrullaev108 1 year ago +1

    Thank you for the video. Can we calculate the gradient of a loss function that is constrained by ODEs?

    • @MachineLearningSimulation 1 year ago +1

      You're very welcome 🤗
      You can find two additional videos on the channel (search for "neural ODE") if you are interested in the theory. I don't have an implementation-based video (yet), but there is a draft I uploaded to the GitHub repo: github.com/Ceyron/machine-learning-and-simulation/blob/main/english/adjoints_sensitivities_automatic_differentiation/adjoint_ode.py
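The adjoint-sensitivity idea the video covers for a nonlinear equation can be sketched as follows. Note this is an illustrative example, not the video's code: the particular equation f(x, θ) = x + θ·eˣ − 1 = 0 and the loss J(x) = x² are assumptions chosen for brevity. The adjoint result is checked against differentiating straight through the Newton iterations with `jax.grad`.

```python
import jax
import jax.numpy as jnp

# Hypothetical scalar residual (not from the video):
#   f(x, theta) = x + theta * exp(x) - 1 = 0  defines x(theta) implicitly.
def f(x, theta):
    return x + theta * jnp.exp(x) - 1.0

def solve(theta, n_newton=30):
    # Plain Newton iteration for the forward (primal) solve.
    x = 0.0
    for _ in range(n_newton):
        x = x - f(x, theta) / jax.grad(f, argnums=0)(x, theta)
    return x

def J(x):
    # Illustrative loss on the solution of the nonlinear equation.
    return x ** 2

theta = 0.5
x_star = solve(theta)

# Adjoint (implicit-function-theorem) sensitivity:
#   solve (df/dx)^T lam = dJ/dx for the adjoint variable lam,
#   then dJ/dtheta = -lam * df/dtheta.
dfdx = jax.grad(f, argnums=0)(x_star, theta)
dfdtheta = jax.grad(f, argnums=1)(x_star, theta)
lam = jax.grad(J)(x_star) / dfdx
dJ_dtheta_adjoint = -lam * dfdtheta

# Reference: differentiate straight through the unrolled Newton solver.
dJ_dtheta_unrolled = jax.grad(lambda th: J(solve(th)))(theta)

print(dJ_dtheta_adjoint, dJ_dtheta_unrolled)  # the two should agree
```

The point of the adjoint route is that it needs only the converged solution plus one linear (here scalar) adjoint solve, instead of backpropagating through every Newton iteration.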