Solving Systems of Differential Equations with Eigenvalues and Eigenvectors

  • Published: 22 Oct 2024

Comments • 61

  • @ethandavis2369
    @ethandavis2369 2 years ago +58

    I'm going to speak for everyone here: we all really appreciate the effort you put into these videos, especially considering they are probably at least tertiary to your teaching and research. Thank you, Steve.

    • @macmos1
      @macmos1 2 years ago +2

      definitely

  • @sepxviii731
    @sepxviii731 2 years ago +26

    This is truly a piece of art. Years of frustration healed in three videos. A hero of education.

  • @johnstuder847
    @johnstuder847 2 years ago +11

    So grateful for RUclips and EigenSteve! I barely earned a physics degree back in the '90s, but never really internalized the math. Thanks to Dr. Kutz's lecture on the SVD, and your series on the same, I have been hooked on math videos for years now. In particular, thank you for not skipping steps, and for spelling things out for the non-geniuses among us. After all, if we didn't need things spelled out, we wouldn't need someone to show us in the first place. After many years of struggle, I am finally starting to understand the language of math. Thanks to RUclips, you have left a legacy for thousands to benefit from. I know it takes patience to keep from skipping steps… I can see you struggle… but from your RUclips students' point of view it's well worth it!

  • @marco_gallone
    @marco_gallone 2 years ago +9

    When I learned this the first time, my advanced control systems professor absolutely butchered the explanation. He robbed me of such an amazing discovery about eigenvalues and eigenvectors; I never understood how or why this works.
    Thank you for explaining so clearly, Steve, this series is amazing!

  • @sebastianrada4107
    @sebastianrada4107 4 months ago +5

    You know that it is going to be a great lecture series when Eigensteve is teaching you about eigenvalues and eigenvectors

  • @shakennotstired8392
    @shakennotstired8392 2 years ago +16

    this lecture is soooo good. I never really understood it when I was self-studying this from an ODE book recently. Now I have a clear and intuitive understanding. Thanks!

    • @whootoo1117
      @whootoo1117 2 years ago +1

      It is time to get stirred though 😂😂

  • @stevegale4215
    @stevegale4215 2 years ago +1

    It is just under 50 years since I covered this. On the Monday, one lecturer boringly introduced us to eigenvalues; on the Thursday, a control engineer took us from the “equations of motion” of a plane through matrix representation and on to eigenvalues. I have never forgotten the second approach. Your videos are great.

  • @vinylflouring
    @vinylflouring 10 months ago +3

    My top list of RUclips math teachers
    1) Linear Algebra - Pavel Grinfeld
    2) Engineering Math - Steve Brunton
    3) Diff Geo - Keenan Crane

  • @macmos1
    @macmos1 2 years ago +2

    What I like about these videos is that they are a much better and more concise version of the material I learned in college. My notes weren't very good, so I'm really glad there is a quick and easily accessible reference for this kind of material.

  • @rudypieplenbosch6752
    @rudypieplenbosch6752 2 years ago +3

    Very understandable explanation, amazing how you can simplify things by taking a detour into eigenspaces.

  • @liboyan7010
    @liboyan7010 1 year ago

    One of the best professors in the world!!! thanks!

  • @tabhashim3887
    @tabhashim3887 2 years ago +1

    Thank you Dr. Brunton, amazing series!
    Just a side note: I believe it would have been nice if you had shown how the D power series at 14:59 goes to e^(Dt), by showing that each diagonal entry of the matrices in the power series is its own scalar power series. This would prove why e^(Dt) equals a matrix with exponentials on the diagonal.
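    A sketch of the step this comment asks for (my own note, not from the video): because D is diagonal, every power D^k is diagonal with entries λ_i^k, so each diagonal slot carries its own scalar exponential series:

    ```latex
    e^{Dt} = \sum_{k=0}^{\infty} \frac{(Dt)^k}{k!}
           = \operatorname{diag}\!\left( \sum_{k=0}^{\infty} \frac{(\lambda_1 t)^k}{k!},\; \dots,\; \sum_{k=0}^{\infty} \frac{(\lambda_n t)^k}{k!} \right)
           = \operatorname{diag}\!\left( e^{\lambda_1 t},\; \dots,\; e^{\lambda_n t} \right).
    ```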

  • @sitrakaforler8696
    @sitrakaforler8696 2 years ago

    I feel old; lectures like these were years ago for me.
    Great video!

  • @hydropage2855
    @hydropage2855 2 months ago

    What’s interesting is this is just how to get the exact answers mathematically guaranteed. For a quick and easy approximation, and a reasonably small matrix, you really could just have a computer compute the first few Taylor expansion terms to get something for e^At, and it would actually get you somewhere. I find it interesting that this method is really the same thing but much more efficient and clever
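    A minimal sketch of the approximation this comment describes, using NumPy/SciPy (the matrix `A` and term count are my own illustrative choices, not from the video): truncate the Taylor series of e^(At) and compare it against SciPy's built-in matrix exponential.

    ```python
    import numpy as np
    from scipy.linalg import expm

    def taylor_expm(M, n_terms=20):
        """Approximate e^M by truncating its Taylor series sum_k M^k / k!."""
        result = np.eye(M.shape[0])
        term = np.eye(M.shape[0])
        for k in range(1, n_terms):
            term = term @ M / k          # builds M^k / k! incrementally
            result = result + term
        return result

    A = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])         # hypothetical 2x2 system matrix
    t = 0.5
    approx = taylor_expm(A * t)
    exact = expm(A * t)                  # SciPy's scaling-and-squaring result
    print(np.allclose(approx, exact))
    ```

    For small ||At|| the truncated series converges quickly; for large ||At|| it becomes numerically unreliable, which is one reason production routines use scaling-and-squaring instead.
    
    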

  • @umedina98
    @umedina98 1 year ago

    The last explanation was AMAZING!

  • @martian.07unfiltered
    @martian.07unfiltered 7 months ago

    This is so beautiful. Now I can tell where eigenvalues and eigenvectors are actually used.

  • @Matroskra
    @Matroskra 3 months ago

    Absolutely amazing. Thanks for the video!

  • @shashidharreddy2959
    @shashidharreddy2959 2 years ago +8

    Suggestion: You could include the lecture number in the video title.

    • @TNTsundar
      @TNTsundar 2 years ago

      Your comment is 11 days old but the video was uploaded only 4 hours ago? 😮

    • @AbrarShaikh2741
      @AbrarShaikh2741 2 years ago

      @@TNTsundar That is because the entire video playlist is available on the channel, but the individual publish dates are in the future time (t). To know when the next video will come out, you will have to solve the y(t) [youtube] diff eqn, or you can change the coordinate system by going to the playlist vector and watch all the videos from the future.

    • @AbrarShaikh2741
      @AbrarShaikh2741 2 years ago

      Huge thanks sir @Steve Brunton. The future generation of engineers will have real motivation to study eigenvalues and other fun stuff if they don't waste their time on PubG or TikTok. I wish these lectures had been there during my college days. All I heard in class for 4 semesters was the Cauchy-Riemann theorem and eigenvalues and eigenvectors, without actually understanding the crux of the matter.

  • @gabrielhz7549
    @gabrielhz7549 2 years ago

    Very cool. I had never learned this in my life; my university taught it to me in a very strange way, and the way you teach is much simpler and more intuitive than just doing automatic calculations using ready-made formulas. Incredible video

  • @VasilevArtem-g4u
    @VasilevArtem-g4u 2 years ago

    Steve, thank you for your work; I'm eternally thankful for this video series. The most difficult part of this topic for me is the notorious Jordan form for non-diagonalizable matrices. It would be most kind of you to make a video explaining this topic.

    • @VasilevArtem-g4u
      @VasilevArtem-g4u 2 years ago

      I'm a donkey. There is already a video that elaborates on this🙃

  • @thisisrr1
    @thisisrr1 1 year ago

    Beautiful. Really wish the best for you!!! Thank you for sharing such wonderful knowledge

  • @wolfisr
    @wolfisr 2 years ago +1

    Thanks for this excellent lecture!
    I wonder: what is the physical meaning of a system of ODEs whose matrix is not diagonalizable? After all, not every matrix can be transformed into a diagonal matrix via its eigenvalues and eigenvectors.
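    A hedged note on the question above (my own addition, not from the video): when A is not diagonalizable, the closest decomposition is the Jordan form A = T J T^{-1}, and the exponential of a single 2x2 Jordan block picks up a polynomial factor in t:

    ```latex
    J = \begin{bmatrix} \lambda & 1 \\ 0 & \lambda \end{bmatrix}
    \quad\Longrightarrow\quad
    e^{Jt} = e^{\lambda t} \begin{bmatrix} 1 & t \\ 0 & 1 \end{bmatrix}.
    ```

    Physically, the t e^{λt} terms correspond to resonance-like behavior: two modes share the same eigenvalue and one drives the other instead of each evolving independently.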

  • @Key1Kh1
    @Key1Kh1 5 months ago

    It was superb 🎉

  • @ΚωνσταντίνοςΛαζαρίδης-ξ9ι

    Thank you sir!

  • @willson8246
    @willson8246 2 years ago

    THIS VIDEO IS REALLY, REALLY COOL!

  • @manfredbogner9799
    @manfredbogner9799 10 months ago

    Very good

  • @lgl_137noname6
    @lgl_137noname6 2 years ago +1

    When is the book coming out ?

  • @alfredomaussa
    @alfredomaussa 2 years ago

    How different is it for discrete systems? And how does the limit dt -> 0 give the same results as the continuous system?
    Also, I have noticed the general solution for differential equations is of the form y = c1*v1*exp(\lambda1*t) + c2*v2*exp(\lambda2*t), while the P matrix is represented by {v1, v2}. Where is inv(P)?

  • @wedad.ibrahimwedad.ibrahim8042
    @wedad.ibrahimwedad.ibrahim8042 10 months ago

    Good lecture, but what about using this solution when we have boundary conditions and not initial conditions?

  • @manfredbogner9799
    @manfredbogner9799 10 months ago

    thanks a lot

  • @diveintoengineering6089
    @diveintoengineering6089 2 years ago

    Amazing!

  • @drskelebone
    @drskelebone 2 years ago +1

    I know this has been mirrored since the beginning, but it just occurred to me to look up your university page, and confirm that your hair is in fact flipped here.
    :)

  • @starriet
    @starriet 1 year ago

    Note for thinking) I think we don't have to expand e^(At).
    - We can just directly transform z(t)=e^(Dt)z(0) to x(t)=Te^(Dt)T^(-1)x(0), since x=Tz and thus T^(-1)x=z.
    Note 2) Oh, and the professor forgot to show us why x(t)=e^(At)x(0)=Te^(Dt)T^(-1)x(0) is a solution of x'=Ax. Just calculate x' and Ax, using the fact that A=TDT^(-1), and then we see it's the solution! Very trivial, but for logical completeness :)

    • @huangzhao-shen4514
      @huangzhao-shen4514 1 year ago

      This does confuse me. I think x(t)=e^(At)x(0) works only if A is diagonal.

    • @huangzhao-shen4514
      @huangzhao-shen4514 1 year ago

      ruclips.net/video/O85OWBJ2ayo/видео.html
      This video answered my question.

    • @APaleDot
      @APaleDot 1 year ago

      You do need to expand for the proof. How else would you show that e^(At) = Te^(Dt)T^-1?

    • @starriet
      @starriet 1 year ago

      @@APaleDot Thanks for the comment. Well I think it depends on how we think. If we want to express the solution of z'=Dz as the form of "matrix exponential", z=e^(Dt)z(0), then it includes the concept of the expansion of e^(Dt) which is simpler than expanding e^(At) since D is diagonal, and we don't have to directly show that e^(At) = Te^(Dt)T^-1, anyway. But A=TDT^-1 and everything is connected so... it's up to our viewpoint :)
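    A numerical check of the claim discussed in this thread, using NumPy/SciPy (the matrix `A`, initial condition, and time are my own illustrative choices): the eigendecomposition form T e^(Dt) T^(-1) x(0) agrees with the matrix exponential e^(At) x(0).

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Hypothetical 2x2 example with real eigenvalues -1 and -2
    A = np.array([[0.0, 1.0],
                  [-2.0, -3.0]])
    x0 = np.array([1.0, 0.0])
    t = 0.7

    # Diagonalize: A = T D T^{-1}; columns of T are eigenvectors of A
    evals, T = np.linalg.eig(A)

    # Eigendecomposition form of the solution: x(t) = T e^{Dt} T^{-1} x(0)
    x_eig = T @ np.diag(np.exp(evals * t)) @ np.linalg.inv(T) @ x0

    # Direct matrix exponential form: x(t) = e^{At} x(0)
    x_direct = expm(A * t) @ x0

    print(np.allclose(x_eig, x_direct))
    ```
    
    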

  • @yuriperez1221
    @yuriperez1221 1 year ago

    Steve, I really like your content, but I think the text for the equations and the font size for the code are too small, even watching on a big monitor...

  • @bonbonpony
    @bonbonpony 1 year ago

    Hmm, but what if the coefficients are not just simple constants, but functions of `t` instead?
    Then we won't have just simple numbers in our matrix `A`, but functions of `t` :q
    I suppose that means the directions of the eigenvectors will be moving (rotating) with time, perhaps even along with their centres (fixpoints), am I right?
    How can we deal with such equations?

    • @hydropage2855
      @hydropage2855 2 months ago

      Yes, then you can't use this method. It assumes constant coefficients from the beginning.

    • @bonbonpony
      @bonbonpony 2 months ago

      @@hydropage2855 Your reply was true, known to me, and completely useless. Read till the end and perhaps you will notice that I'm asking what to do with the cases in which this method doesn't apply, and what to do instead that will work.

    • @hydropage2855
      @hydropage2855 2 months ago

      @@bonbonpony Sorry, my bad. Are you autistic? That was a pretty robotic response
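    A hedged note on the time-varying question above (my own addition, not from the video): for x' = A(t) x the solution is written with a state-transition matrix rather than a single matrix exponential,

    ```latex
    \dot{x} = A(t)\, x, \qquad x(t) = \Phi(t, 0)\, x(0),
    ```

    and only in the special case where A(t) commutes with itself at different times, A(t_1)A(t_2) = A(t_2)A(t_1), does Φ(t, 0) reduce to exp(∫_0^t A(τ) dτ). In general Φ must be obtained numerically or via series expansions such as the Peano-Baker series.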

  • @TNTsundar
    @TNTsundar 2 years ago +3

    It’s raining Eigen vectors! 😂

  • @RoodJood
    @RoodJood 2 years ago

    Why is only one T and T inverse pair canceled when matrix A is squared?

    • @APaleDot
      @APaleDot 1 year ago

      Matrix multiplication is not commutative. You cannot rearrange the terms in a product as you please.
      So only the terms which touch each other directly will cancel.

    • @hydropage2855
      @hydropage2855 2 months ago

      You weren’t ready to watch this video if you’re asking this lol
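    The cancellation discussed in this thread, written out (my own note): only the adjacent inner pair T^{-1}T collapses to the identity; the outer T and T^{-1} are separated by D factors, so they remain:

    ```latex
    A^2 = \left( T D T^{-1} \right)\left( T D T^{-1} \right)
        = T D \underbrace{\left( T^{-1} T \right)}_{I} D\, T^{-1}
        = T D^2 T^{-1},
    \qquad A^k = T D^k T^{-1}.
    ```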

  • @davidmeyer9194
    @davidmeyer9194 2 months ago

    How do you people write backwards so neatly?

  • @curtpiazza1688
    @curtpiazza1688 7 months ago

    Fascinating stuff! 😂

  • @mariarahelvarnhagen2729
    @mariarahelvarnhagen2729 2 years ago

    No It Doesn't

  • @younique9710
    @younique9710 4 months ago

    At 11:40, does the multiplication between the matrices follow transpose(A)*A instead of A*A? I wonder if A*A is valid.