I'm going to speak for everyone here, but we all really appreciate the effort you put into these videos. Especially considering they are probably at least tertiary to teaching and research. Thank you, Steve.
definitely
This is truly a piece of art. Years of frustration healed in three videos. A hero of education.
So grateful for YouTube and EigenSteve! I barely earned a physics degree back in the '90s, but never really internalized the math. Thanks to Dr. Kutz's lecture on the SVD, and your series on the same, I have been hooked on math videos for years now. In particular, thank you for not skipping steps and spelling things out for the non-geniuses among us. After all, if we didn't need things spelled out, we wouldn't need someone to show us in the first place. After many years of struggle, I am finally starting to understand the language of math. Thanks to YouTube, you have left a legacy for thousands to benefit from. I know it takes patience to keep from skipping steps… I can see you struggle… but from your YouTube students' point of view it's well worth it!
When I learned this the first time, my advanced control systems professor absolutely butchered the explanation. He robbed me of discovering something so amazing about eigenvalues and eigenvectors, and I never understood how or why this works.
Thank you for explaining so clearly, Steve, this series is amazing!
You know that it is going to be a great lecture series when Eigensteve is teaching you about eigenvalues and eigenvectors
This lecture is soooo good. I never really understood it when I was self-studying this from an ODE book recently. Now I have a clear and intuitive understanding. Thanks!
It is time to get stirred though 😂😂
It was just under 50 years ago that I covered this. On the Monday, one lecturer boringly introduced us to eigenvalues; on the Thursday, a control engineer took us from the "equations of motion" of a plane through the matrix representation and on to eigenvalues. I have never forgotten the second approach. Your videos are great.
My top list of YouTube math teachers:
1) Linear Algebra - Pavel Grinfeld
2) Engineering Math - Steve Brunton
3) Differential Geometry - Keenan Crane
What I like about these videos is that they are a much better and more concise version of the material I learned in college. My notes weren't very good, so I'm really glad there is a quick and easily accessible reference for this kind of material.
Very understandable explanation, amazing how you can simplify things by taking a detour into eigenspaces.
One of the best professors in the world!!! thanks!
Thank you, Dr. Brunton, amazing series!
Just a side note: I believe it would have been nice to show how the D power series at 14:59 goes to e^(Dt), by showing that each diagonal entry of the matrices in the power series is its own scalar power series. That would prove why e^(Dt) equals a matrix with exponentials on the diagonal.
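A quick numerical sketch of that side note (with an illustrative diagonal matrix and time, not the one from the video): summing the power series I + Dt + (Dt)²/2! + … term by term, each diagonal entry accumulates its own scalar exponential series, so the partial sums converge to the diagonal matrix of exponentials.

```python
import numpy as np

# Hypothetical diagonal matrix D and time t (illustrative values).
D = np.diag([2.0, -1.0])
t = 0.5

# Partial sums of the matrix power series I + Dt + (Dt)^2/2! + ...
# Because D is diagonal, (Dt)^k is diagonal with entries (lambda_i * t)^k,
# so each diagonal slot builds up its own scalar exponential series.
S = np.eye(2)
term = np.eye(2)
for k in range(1, 30):
    term = term @ (D * t) / k
    S += term

# The limit is the diagonal matrix of scalar exponentials e^(lambda_i * t).
expected = np.diag(np.exp(np.diag(D) * t))
print(np.allclose(S, expected))  # True
```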
I feel old; those kinds of lectures were years ago for me.
Great video!
What's interesting is that this is how to get the exact answer, mathematically guaranteed. For a quick and easy approximation with a reasonably small matrix, you really could just have a computer compute the first few Taylor-expansion terms of e^(At), and it would actually get you somewhere. I find it interesting that this method is really the same thing, but much more efficient and clever.
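That comparison is easy to check numerically. A minimal sketch, with an illustrative 2×2 matrix: the truncated Taylor series of e^(At) and the eigendecomposition formula T e^(Dt) T⁻¹ agree to machine precision.

```python
import numpy as np

# A small illustrative matrix (eigenvalues -1 and -2).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
t = 0.1

# Brute force: truncate the Taylor series e^(At) ≈ sum_k (At)^k / k!
def expm_taylor(A, t, n_terms=20):
    out = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, n_terms):
        term = term @ (A * t) / k
        out += term
    return out

# The clever way: diagonalize A = T D T^{-1}, so e^(At) = T e^(Dt) T^{-1}.
evals, T = np.linalg.eig(A)
expm_eig = T @ np.diag(np.exp(evals * t)) @ np.linalg.inv(T)

print(np.allclose(expm_taylor(A, t), expm_eig.real))  # True
```

For small matrices and short times both work; the eigendecomposition version pays one diagonalization up front and then evaluates e^(At) at any t with just scalar exponentials.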
The last explanation was AMAZING!
This is so beautiful. Now I can tell where eigenvalues and eigenvectors are actually used.
Absolutely amazing. Thanks for the video!
Suggestion: You could include the lecture number in the video title.
Your comment is 11 days old when the video was uploaded only 4 hours back? 😮
@@TNTsundar That is because the entire video playlist is available on the channel, but the individual publish dates are in the future time (t). To know when the next video will come out, you will have to solve the y(t) [youtube] diff eqn, or you can change the coordinate system by going to the playlist vector and watch all the videos from the future.
Huge thanks, sir @Steve Brunton. Future generations of engineers will have real motivation to study eigenvalues and other fun stuff if they don't waste their time on PUBG or TikTok. I wish these lectures had existed during my college days. All I heard in class for 4 semesters was the Cauchy-Riemann theorem and eigenvalues and eigenvectors, without actually understanding the crux of the matter.
Very cool! I had never learned this in my life; my university taught it to me in a very strange way, and the way you teach is much simpler and more intuitive than just doing automatic calculations with ready-made formulas. Incredible video.
Steve, thank you for your work; I'm eternally thankful for this video series. The most difficult part of this topic for me is the notorious Jordan form for non-diagonalizable matrices. It would be most kind of you to make a video explaining this topic.
I'm a donkey. There is already a video that elaborates on this🙃
Beautiful. Really wish the best for you!!! Thank you for sharing such wonderful knowledge
Thanks for this excellent lecture!
I wonder what the physical meaning is of a system of ODEs whose matrix is not diagonalizable? After all, not every matrix can be transformed into a diagonal matrix via its eigenvalues and eigenvectors.
It was unmatched 🎉
Thank you sir!
THIS VIDEO IS REALLY REALLY COOL!
Very good
When is the book coming out ?
How different is it for discrete systems? And how does the limit dt → 0 give the same results as the continuous system?
Also, I have noticed the general solution for these differential equations has the form y = c1*v1*exp(λ1*t) + c2*v2*exp(λ2*t), while the P matrix is [v1, v2]. Where is inv(P)?
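One way to see where inv(P) hides (a sketch with an illustrative matrix, not one from the video): the coefficients c1, c2 are exactly the entries of P⁻¹ x(0), so the eigenvector sum and the matrix-exponential formula P e^(Dt) P⁻¹ x(0) give the same trajectory.

```python
import numpy as np

# Illustrative system x' = A x with distinct real eigenvalues.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])
t = 0.7

# Eigendecomposition: the columns of P are the eigenvectors v1, v2.
lam, P = np.linalg.eig(A)

# inv(P) enters through the coefficients: c = P^{-1} x(0),
# giving x(t) = c1*v1*exp(lam1*t) + c2*v2*exp(lam2*t).
c = np.linalg.inv(P) @ x0
x_sum = sum(c[i] * P[:, i] * np.exp(lam[i] * t) for i in range(2))

# Same answer as the compact form x(t) = P e^(Dt) P^{-1} x(0).
x_matrix = P @ np.diag(np.exp(lam * t)) @ np.linalg.inv(P) @ x0
print(np.allclose(x_sum, x_matrix))  # True
```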
Good lecture, but what about using this solution when we have boundary conditions and not initial conditions?
thanks a lot
Amazing!
I know this has been mirrored since the beginning, but it just occurred to me to look up your university page and confirm that your hair is in fact flipped here. :)
Note for thinking: I think we don't have to expand e^(At).
We can just directly transform z(t) = e^(Dt) z(0) into x(t) = T e^(Dt) T^(-1) x(0), since x = Tz and thus T^(-1) x = z.
Note 2: Oh, and the professor forgot to show us why x(t) = e^(At) x(0) = T e^(Dt) T^(-1) x(0) is a solution of x' = Ax. Just calculate x' and Ax, and use the fact that A = T D T^(-1); then we see it is the solution! Very trivial, but included for logical completeness :)
This does confuse me. I think x(t) = e^(At) x(0) works only if A is diagonal.
ruclips.net/video/O85OWBJ2ayo/видео.html
This video answered my question.
You do need to expand for the proof. How else would you show that e^(At) = Te^(Dt)T^-1?
@@APaleDot Thanks for the comment. Well, I think it depends on how we think about it. If we want to express the solution of z' = Dz in the form of a matrix exponential, z = e^(Dt) z(0), then that already involves expanding e^(Dt), which is simpler than expanding e^(At) since D is diagonal, and we don't have to directly show that e^(At) = T e^(Dt) T^(-1) anyway. But A = T D T^(-1) and everything is connected, so... it's up to our viewpoint :)
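The verification suggested in the note above is easy to run numerically (illustrative diagonalizable matrix, finite-difference derivative): x(t) = T e^(Dt) T⁻¹ x(0) does satisfy x' = Ax.

```python
import numpy as np

# Sanity check: with A = T D T^{-1}, the candidate solution
# x(t) = T e^(Dt) T^{-1} x(0) should satisfy x' = A x. (Illustrative A.)
A = np.array([[1.0, 2.0],
              [0.0, -1.0]])
x0 = np.array([1.0, 1.0])
lam, T = np.linalg.eig(A)
Tinv = np.linalg.inv(T)

def x(t):
    return T @ np.diag(np.exp(lam * t)) @ Tinv @ x0

# Approximate x'(t) with a central finite difference and compare to A x(t).
t, h = 0.3, 1e-6
xdot = (x(t + h) - x(t - h)) / (2 * h)
print(np.allclose(xdot, A @ x(t)))  # True
```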
Steve, I really like your content, but I think the text of the equations and the font size of the code are too small, even when watching on a big monitor...
They’re really not
Hmm, but what if the coefficients are not simple constants, but functions of t instead?
Then we won't have just numbers in our matrix A, but functions of t.
I suppose that means the directions of the eigenvectors will be moving (rotating) with time, perhaps even along with their centres (fixed points), am I right?
How can we deal with such equations?
Yes then you can’t use this method. This assumes constant coefficients from the beginning
@@hydropage2855 Your reply was true, known to me, and completely useless. Read to the end and perhaps you will notice that I'm asking what to do in the cases where this method doesn't apply, and what will work instead.
@@bonbonpony Sorry, my bad. Are you autistic? That was a pretty robotic response
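On the actual question of what works instead: for time-varying x' = A(t) x, the constant-coefficient recipe fails because A(t1) and A(t2) generally don't commute, so e^(∫A dt) is not the flow. A standard fallback is direct numerical integration (or Magnus/Peano-Baker series). A minimal sketch, with an illustrative A(t) and scipy assumed available:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative time-varying coefficient matrix (a driven oscillator).
def A(t):
    return np.array([[0.0, 1.0],
                     [-1.0 - 0.5 * np.sin(t), 0.0]])

def rhs(t, x):
    return A(t) @ x

# Integrate x' = A(t) x numerically from x(0) = x0 over [0, 5].
x0 = [1.0, 0.0]
sol = solve_ivp(rhs, (0.0, 5.0), x0, rtol=1e-9, atol=1e-12)
print(sol.success, sol.y[:, -1].shape)  # True (2,)
```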
It’s raining Eigen vectors! 😂
Why is only one T and T inverse pair canceled when matrix A is squared?
Matrix multiplication is not commutative. You cannot rearrange the terms in a product as you please.
So only the terms which touch each other directly will cancel.
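A quick numerical illustration of that point (with illustrative matrices): in A² = (T D T⁻¹)(T D T⁻¹), only the adjacent T⁻¹T cancels, leaving T D² T⁻¹, and reordering factors as if multiplication were commutative generally gives a different answer.

```python
import numpy as np

# Illustrative diagonalizable matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, T = np.linalg.eig(A)
D = np.diag(lam)
Tinv = np.linalg.inv(T)

# A^2 = (T D T^{-1})(T D T^{-1}): the inner T^{-1} T cancels,
# leaving T D^2 T^{-1}. The outer T and T^{-1} do not touch, so they stay.
print(np.allclose(A @ A, T @ D @ D @ Tinv))  # True

# Reordering factors (as if multiplication were commutative) fails:
B = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(np.allclose(A @ B, B @ A))  # False
```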
You weren’t ready to watch this video if you’re asking this lol
How do you people write backwards so neatly?
I think they flip the video.
Fascinating stuff! 😂
No It Doesn't
What are you talking about
At 11:40, does the multiplication between the matrices follow transpose(A)*A instead of A*A? I wonder if A*A is valid there.