Neural Networks for Solving PDEs

  • Published: 24 Dec 2024

Comments • 31

  • @shailendrakaushik9281
    @shailendrakaushik9281 4 years ago +4

    An excellent review of PINNs and a very fascinating way to choose lambda to optimally weigh the losses on boundary versus interior points. Do you have a tutorial problem with code that exemplifies this approach? Please let me know. Thanks

    • @anastasiaborovykh120
      @anastasiaborovykh120 4 years ago +1

      Thank you :) I am happy to hear you found it interesting! Our code is available on Github: github.com/remcovandermeer/Optimally-Weighted-PINNs
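      As a rough sketch of the weighted-loss idea discussed in the talk (this is not the authors' code from the repository above; the function name and residual values are hypothetical), the interior and boundary losses can be combined with a single weight:

```python
import numpy as np

def weighted_pinn_loss(residual_interior, residual_boundary, lam):
    """Combine interior (PDE) and boundary losses with weight lam in [0, 1].

    residual_interior: PDE residuals at collocation points inside the domain.
    residual_boundary: mismatch with the boundary condition at boundary points.
    """
    loss_f = np.mean(residual_interior ** 2)  # interior / PDE loss
    loss_b = np.mean(residual_boundary ** 2)  # boundary loss
    return lam * loss_b + (1.0 - lam) * loss_f

# toy usage: residuals from some candidate solution
r_int = np.array([0.1, -0.2, 0.05])
r_bnd = np.array([0.01, -0.02])
total = weighted_pinn_loss(r_int, r_bnd, lam=0.7)
```

      The point of the paper's approach is choosing lam in a principled way rather than hand-tuning it; this sketch only shows where the weight enters.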

  • @Eta_Carinae__
    @Eta_Carinae__ 2 years ago +1

    Have you heard of SINDy from Brunton's lab at UW?

    • @keeperofthelight9681
      @keeperofthelight9681 1 year ago +1

      Steve Brunton is my favorite teacher when it comes to machine learning meets dynamical systems.

  • @abderrahmaneouachouach926
    @abderrahmaneouachouach926 1 year ago

    Could you please provide a citation for the theorem (MOB, 2020) that you mention at 5:09? I couldn't find it anywhere.

  • @bikmeyevAT
    @bikmeyevAT 3 years ago +2

    Great presentation and one of the most understandable explanations of an AI-based PDE solver!
    Many thanks!!!

  • @alexeychernyavskiy4193
    @alexeychernyavskiy4193 4 years ago +4

    Thank you, Anastasia. The approach of trying to find those collocation points that have the most effect on the final solution could indeed be a very promising direction of research. While you demonstrated a couple of model examples, it would be great to one day see these methods applied to, e.g., fluid flows for reservoir modelling, gas dynamics, etc.

    • @anastasiaborovykh120
      @anastasiaborovykh120 4 years ago +1

      Agree; those are very interesting future directions we are thinking about!

  • @mohammedaajaji2265
    @mohammedaajaji2265 4 years ago +2

    Hi @Anastasia Borovykh
    Thanks for this presentation. I read the article and I'm playing around with the code, and I wonder whether we can solve PDEs that depend on both time and space, or whether this method is limited to the spatial dimensions only.
    I would like to apply the approach to solve PDEs in finance (for example the Black-Scholes PDE), where only the boundary value at the final time is available and we are interested in the solution value at the initial time.
    It would be helpful if you could comment on this.

    • @anastasiaborovykh120
      @anastasiaborovykh120 4 years ago +5

      Hi! Thank you for your interest :) Yes, definitely! In that case you would just create collocation points over your time variable as well. I have not worked on the financial applications of this method myself, but my collaborators have a paper where they use the weighting of the loss function to compute various option prices: arxiv.org/pdf/2005.12059.pdf Specifically, the Black-Scholes model is discussed in section 3.1. Hope this helps! Anastasia
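      A minimal sketch of what sampling collocation points over both space and time could look like (the domain, point counts, and variable names below are illustrative assumptions, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# illustrative domain: space x in [0, 1], time t in [0, T]
T = 1.0
n_interior, n_terminal = 1000, 100

# interior collocation points: sample jointly over space AND time
xt_interior = rng.uniform([0.0, 0.0], [1.0, T], size=(n_interior, 2))

# terminal-condition points (e.g. an option payoff at maturity t = T),
# for problems where only the final-time boundary value is known
x_term = rng.uniform(0.0, 1.0, size=n_terminal)
xt_terminal = np.column_stack([x_term, np.full(n_terminal, T)])
```

      The network would then be trained on the PDE residual at `xt_interior` and on the payoff mismatch at `xt_terminal`, exactly as in the purely spatial case but with one extra input coordinate.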

  • @leon-tjomb
    @leon-tjomb 2 years ago

    Hello Anastasia,
    Your presentation is very interesting.
    I'm Leon, and I'm currently working on PINNs for a vibration problem: the case of a beam bridge.
    I would like to know: if we are dealing with a time-dependent PDE, where we have both boundary and initial conditions, how can we define the loss function, given that we would like to minimise the weights?
    Best regards,

  • @AhmedEmamAI1
    @AhmedEmamAI1 4 years ago +2

    Great explanation. Could you make a video on hidden physics models (HPM)?

  • @edvinbeqari7551
    @edvinbeqari7551 4 years ago

    Could you let lambda be a parameter and use gradient descent to find its optimal value? Meaning, for each training step, take the gradient of the loss with respect to lambda.

    • @oliverhennigh451
      @oliverhennigh451 4 years ago

      If you did this and optimized lambda on the same loss function, then lambda would converge to either 1 or 0. The network would learn either the zero solution (a constant), which would satisfy the PDE but not the boundary conditions, or it would only satisfy the boundary conditions but not the PDE at all.

    • @edvinbeqari7551
      @edvinbeqari7551 4 years ago

      @@oliverhennigh451 Thanks for the comment. My setup is slightly different: I am trying the inverse problem of fitting the parameters of an ODE, i.e. x'' + bx' + kx = 0. I sampled and perturbed the real solution and used that data as domain data. Hence, I have three losses: the ODE loss (loss_f), the IC loss (loss_ic), and the loss between predicted and sampled data (loss_u). I let the loss be λ² * (loss_f + loss_ic) + (1-λ²) * loss_u, and take derivatives of the loss with respect to b, k, λ. I square lambda so the loss remains positive. It is true that lambda becomes pretty small but not zero, but I am getting good results and b and k approach the actual values. Perhaps what I am doing does not make sense, but I am experimenting on my own. I would love some friends who know the material. Happy to share what I have.
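      The λ²-weighted loss described above can be sketched in a few lines (a toy illustration of the commenter's setup, not their actual code; the loss values and learning rate are made up):

```python
# Toy illustration of a trainable-weight loss:
#     L = lam**2 * (loss_f + loss_ic) + (1 - lam**2) * loss_u
# Squaring lambda keeps the weight non-negative.

def total_loss(loss_f, loss_ic, loss_u, lam):
    w = lam ** 2
    return w * (loss_f + loss_ic) + (1.0 - w) * loss_u

def dloss_dlam(loss_f, loss_ic, loss_u, lam):
    # d/dlam of the combined loss, by the chain rule on lam**2
    return 2.0 * lam * (loss_f + loss_ic - loss_u)

# one gradient-descent step on lam alone, holding the losses fixed
lam, lr = 0.5, 0.1
lam -= lr * dloss_dlam(loss_f=0.2, loss_ic=0.05, loss_u=0.4, lam=lam)
```

      Note that the gradient's sign depends only on which group of losses is currently larger, which is one way to see the collapse-to-0-or-1 behaviour Oliver Hennigh describes.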

    • @anastasiaborovykh120
      @anastasiaborovykh120 4 years ago +2

      @@edvinbeqari7551 That sounds interesting. The way I see it is that if we optimize lambda while training, then we just select the lambda which makes it easiest for the NN to make the loss small (which is what Oliver Hennigh also mentions). In our case it is not just about making the loss small, but about finding a weighting between interior and boundary such that a small loss implies a solution close to the true PDE solution. In your case I would view loss_f + loss_ic as a regularization-like term. But exactly what optimizing it during training would mean is something I'd have to think about a bit more...

    • @edvinbeqari7551
      @edvinbeqari7551 4 years ago

      @@anastasiaborovykh120 Hi Anastasia, do you have a document where I can see the full derivation of the optimal lambda? Perhaps a simple example. I would love to learn your method.

    • @anastasiaborovykh120
      @anastasiaborovykh120 4 years ago +2

      Yes, definitely. The derivation is in our paper: arxiv.org/pdf/2002.06269

  • @bingli1918
    @bingli1918 3 years ago

    Thanks for sharing this excellent presentation.

  • @samuelauerbacher7982
    @samuelauerbacher7982 3 years ago

    A really good and well-structured talk! It helped me a lot in preparing my bachelor thesis, which will be on this topic.

    • @ionlipsiuc8608
      @ionlipsiuc8608 1 year ago

      Hey Samuel, I was wondering if I could get some form of contact information from you, as I am also working on my bachelor thesis on the same topic and was hoping to get some insights from others. Thank you.

  • @gauravbokil8
    @gauravbokil8 3 years ago

    Thanks, Anastasia. If you ever see this comment, THANK YOU SO MUCH!

  • @BryceChudomelka
    @BryceChudomelka 4 years ago +1

    Bravo

  • @999nilbog
    @999nilbog 3 years ago +3

    whoa wait, you're so pretty