Why are comments disabled on other videos?! We all want to say thank you to this dude
this man is such a great teacher
a legend
You are so right. He skips straight through things but he brings you along. I have met some good teachers but SB is incredible!
He also works as a teacher (the two things are not related, just saying he also works as a teacher).
@@romanemul1 a god
@@abeke5523 A myth
Steve Brunton is saving my master's as he saved my undergraduate. What a guy!
I don't know how they do it, but Steve Brunton and 3Blue1Brown can explain stuff in a very impressive, comprehensive way
Relating SVD to Fourier series is the most enlightening sentence I had ever heard. Thank you.
Yeah that blew my mind too!
A great lecturer. Some people are born teachers.
A great lecturer and I am buying their book to show my appreciation and thanks to Mr Brunton - and his colleague.
From downtown London, UK.
Every time I see a tutorial video from professors at US universities I envy their students; they don't have to learn everything twice, once in the classroom from a bad teacher and then again on RUclips from amazing teachers like Steve or others
@@robensonlarokulu4963 even at great schools, not all professors are equally good at teaching.
Amazing! Well explained! I've had to watch a few times, tbh, because I kept getting distracted by how impressively he writes backwards. He doesn't even get distracted by the effort!! Just keeps on talking, teaching, and drawing while writing tidy, informative, concise notes... backwards! Amazing!! Thank you!!
You’ve now become the only channel I’m subscribed to where I’ve also hit the bell - I really appreciate your approach
Awesome!
I feel so lucky that I found your videos and your channel; your lessons are amazing and they make me deeply happy. I can feel that you love sharing your knowledge with everyone. Thank you so much, Steve Brunton.
I've used PCA, but have never heard of SVD. After this video I can see how they are related and wonder how I never heard of it. Looking forward to the rest of the series!
Thank you so much, you are great at explaining!
Same for me; it's like you're told to use some math (PCA) but they left out the foundation (SVD). The lectures and the book together are really good, and that you get the material for free is awesome.
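For anyone else wondering exactly how PCA sits on top of SVD, here is a minimal numpy sketch (my own toy example, not from the video): PCA is just the SVD of the mean-centered data matrix, and the singular values give the explained variance.

```python
import numpy as np

# Toy data: 100 observations of 5 features (rows = observations here).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 5))

# PCA via SVD: center the data, then decompose.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Principal directions are the rows of Vt; the component scores
# are U * S; explained variance comes from the squared sigmas.
scores = U * S
explained_var = S**2 / (X.shape[0] - 1)

# Cross-check against the eigendecomposition of the covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
assert np.allclose(eigvals, explained_var)
```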
The fact that I'm seeing this video 2 years after it was posted makes me feel that I'm about two years behind the best in what exists in machine learning and data science. =(
Well, better late than never! Haha! Great class, Steve! You're an awesome teacher! Greetings from Brazil!
The instructor’s machine learning basics course is clear and makes complex concepts easy to understand.
The structure is well-organized, making it very beginner-friendly.
Thank you for the thoughtful teaching!
The professor has so patiently explained every individual part of the whole equation with so much attention and beauty that he literally made me "feel" the entire concept!
Sir, kindly teach others how to teach as well. We need a lot more people like you in this world.
Mr. Brunton certainly understands how to take the students along in the lecture and not leave them at sea. Also, I believe the video is a mirror reflection of Mr. Brunton's actions, which would produce the same effect.
This guy is like the greatest teacher ever!!
You obviously can see this through your viewership, but holy smokes you have an amazing delivery style. Thank you
I am doing an elective about this and you are practically saving my life
I think I have found a corner of YouTube that brings me true joy. Thank you.
A good RUclips rule of thumb is to never read the comments. A caveat to that rule is when the poster is a skilled educator. Thank you so much for your wonderful video!
Outstanding and brilliant with the intuition. A great teacher, and the best set of videos, bar none, on the topic.
Such smooth and intuitive explanations, saving me from perplexity about these topics. Thanks, Sir!! May God bless you! Love from students in Bharat!
This man is the Guru of so many topics. His explanation is so good even I can understand the material.
Thanks!
This is the first time I've learned about SVD and I can fully understand it. Thank you!!!
In my first semester of graduate school for ML, I was starting to think this wasn't for me based on how lost I had been on this topic. You saved me from imposter syndrome, thank you
Happy to help and thanks for watching!
Great videos!
Attending UW at the moment as a junior in ACMS and I have been referencing your videos for just about every class.
I haven't finished the video, but it is so great to have a preamble explaining what is what, especially for something very simple but sometimes ambiguous like the orientation of the matrix vectors! It might seem obvious to some, but in my experience this is often confusing: I tend to make an assumption, and often halfway through I have to retrace the whole calculation to evaluate the alternative interpretation.
I cannot thank you more for the splendid elucidation of SVD
This is the best video on SVD I have ever come across. Just wow.
Great stuff! You explain the story the mathematics is trying to say in such a clear and understandable way!
Steve, I feel a correction is required (I might be wrong), starting at 06:14, where you explain that UU^T = U^TU = I (nxn). Each will be an identity matrix, but will their dimensions be the same? Suppose U is an nxr matrix; then U^T is rxn, so UU^T is nxn while U^TU is rxr. Please guide.
In the full SVD, U and V are unitary matrices, so they have to be square.
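To make the dimension question concrete, here is a quick numpy sketch (my own example, assuming a tall data matrix with n > m as in the video): in the full SVD both UU^T and U^TU are nxn identities, while in the economy SVD only Uhat^T Uhat is an identity, exactly as the question suggests.

```python
import numpy as np

n, m = 6, 3                       # tall matrix, n > m
X = np.random.default_rng(1).normal(size=(n, m))

# Full SVD: U is n x n and unitary, so both products are the identity.
U, S, Vt = np.linalg.svd(X, full_matrices=True)
assert U.shape == (n, n)
assert np.allclose(U @ U.T, np.eye(n)) and np.allclose(U.T @ U, np.eye(n))

# Economy SVD: Uhat is n x m, so Uhat.T @ Uhat is the m x m identity,
# but Uhat @ Uhat.T is an n x n projection, NOT the identity.
Uhat, Shat, Vth = np.linalg.svd(X, full_matrices=False)
assert Uhat.shape == (n, m)
assert np.allclose(Uhat.T @ Uhat, np.eye(m))
assert not np.allclose(Uhat @ Uhat.T, np.eye(n))
```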
The SVD, in general, is not unique. Contrary to what the author said at 12:10, the non-uniqueness is not merely a question of the signs of the vectors in the unitary matrices. Rather, when singular values appear more than once, the corresponding singular vectors can be multiplied by any unitary transformation of the subspace corresponding to the repeated singular value.
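A small numpy sketch of this point (my own construction): build a matrix with a repeated singular value, rotate the corresponding pair of singular vectors by any 2x2 rotation, and the product is unchanged.

```python
import numpy as np

# Diagonal matrix with a repeated singular value (3 appears twice).
X = np.diag([3.0, 3.0, 1.0])
U, S, Vt = np.linalg.svd(X)

# Rotate the two singular vectors in the degenerate subspace.
theta = 0.7
R = np.eye(3)
R[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]

U2 = U @ R          # a different U ...
Vt2 = R.T @ Vt      # ... and a different V^T ...
X2 = U2 @ np.diag(S) @ Vt2
assert np.allclose(X, X2)   # ... but exactly the same X
```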
Dude, you and some other RUclips teachers have been making my life great and helping me love studying again. Thank you, truly.
Amazing content, amazing explanation, amazing videography. Thank you so much for your work. I am truly grateful and I wish I could be your student in my lifetime. I watched one video and immediately knew that I have to hit the subscribe button. Thank you once again
Oh my God. What a teacher. Thank you, sir. I needed to learn this mid-career and you are a godsend...
Incredible production! Congrats from Spain!
This is the best explanation I have seen so far! Great explanation!!
Awesome explanation; before coming to this channel I watched other videos and got a bit confused, but here the concept is explained so smoothly. Thanks!!
Buying Steve's book after watching a few of the videos. I am an experienced Aerospace Engineering PM and write Python code for automation. I want to learn SVD using Python. Thanks for the videos.
@steve brunton or anyone else: what does Steve mean at 9:43 when he says that big Sigma captures the "energy" of the flows? How is Sigma sorted? Is it done manually? What mathematical process does he use to get big Sigma from U?
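Not the author, but a minimal numpy sketch of what "energy" usually means here (assuming the standard convention that energy is the squared singular values): np.linalg.svd returns the sigmas already sorted in decreasing order, nothing is sorted manually, and Sigma comes out of the factorization of X itself rather than being computed from U.

```python
import numpy as np

X = np.random.default_rng(2).normal(size=(50, 20))
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# The singular values come back sorted largest-to-smallest.
assert np.all(S[:-1] >= S[1:])

# "Energy" captured by the first r modes: the fraction of the total
# squared Frobenius norm, sum of sigma_i^2 for i <= r over the sum of all.
energy = np.cumsum(S**2) / np.sum(S**2)
r = int(np.searchsorted(energy, 0.90)) + 1   # modes needed for 90% energy
print(f"{r} modes capture {energy[r-1]:.1%} of the energy")
```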
At 3:46 it is said that matrix U has shape nxn. Considering that U is a collection of eigenvectors of the data, how can the total number of eigenvectors exceed the number of examples (m)? Why isn't the shape nxm, where the 1st column represents the eigenvector of the 1st example in the data and the mth column represents the eigenvector of the mth example? What do the eigenvectors in columns m+1 through n represent?
Precise, comprehensive, tangible
Great❤
Wow. When you started describing the meaning of U, Sigma, and V, I could see how similar the concept is to the Fourier series and transform.
In this pandemic, online teaching is a must, and this style of presentation is excellent, as the student can face the teacher all the time. This method of teaching should be incorporated across the world.
You are amazing. Thank you .. The explanation was really clear and your way of teaching is great.
Wow, I can clearly visualise in my head how the multidimensional data is stacked and matched across time.
Looking like a nerd, teaching like a BOSS. Big like for that man.
Hello Prof. Steve, this is such an interesting series.
For A = (nxm) at 0:45, shouldn't n be faces and m be pixels?
I ask because people usually arrange data vertically (observations as rows) rather than horizontally.
This guy is awesome. FYI, this is the full SVD, not the reduced SVD.
This guy is amazing!! What a fantastic teacher 🙏
Thank you so much. Simple, clear and with examples, it's nice 👌
WOOOOWW ! Amazing teacher! Thanks Professor, I'll get the book.
OMG this is exactly what I was looking for! And you have explained it in the clearest way possible. Thank you! Instantly subscribed.
Thank you so much for this series! I just started and have a stupid question: what does it mean when you say eigen-something? I know eigenvectors are vectors that keep the same direction after a matrix transformation, but eigenflows/eigenfaces?
Good question. I use this to mean "characteristic" or "latent" (which is what it means for eigenvalues/vectors too). But eigenflows and eigenfaces are eigenvectors of a large data correlation matrix. So in a sense, they are eigenvectors of a particular linear algebra problem.
Specifically, if I stack images as column vectors in a matrix X, then the "eigen-images" of X are the eigenvectors of the pixel-space correlation matrix X * X^T.
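A quick numpy check of that claim (a sketch with made-up data, assuming images stacked as columns): the columns of U from the SVD of X are the eigenvectors of X X^T, up to sign, with eigenvalues equal to the squared singular values.

```python
import numpy as np

# Fake "image" data: 64 pixels per image, 10 images stacked as columns.
X = np.random.default_rng(3).normal(size=(64, 10))

U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Eigendecomposition of the pixel-space correlation matrix (64 x 64).
eigvals, eigvecs = np.linalg.eigh(X @ X.T)
order = np.argsort(eigvals)[::-1]            # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The top eigenvalues are the squared singular values ...
assert np.allclose(eigvals[:10], S**2)
# ... and each top eigenvector matches a column of U up to sign.
for k in range(10):
    assert abs(eigvecs[:, k] @ U[:, k]) > 0.999
```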
Thank you for all the awesome lectures. We wish you all the best.
Right-to-left writing behind a glass is revolutionizing online teaching; the first instance seems to be on Nancy's calculus channel. Super cerebrum-cerebellum-hand axis👍
You're a star , Mr Brunton. Thanks !
blew my mind with the explanation of each of the three elements. thanks!
Awesome, great to hear!
I cannot thank you enough for the awesome explanation! Thank you!
At 10:32, the eigen-mixtures concept of U w.r.t. VT is explained.
Much respect from a mathematics teacher
Thank you very much for explaining these in very simple words
Quick question: What software do you use in order to write on the screen?
I’m also wondering about the same thing
Math needs to be reviewed again and again, and now I am back in this class to pick up the SVD... Orz. By the way, if you want the framework of this method, this class is excellent; if you want to go deep into the math, maybe you should buy a book, try some exercises, and work out the solutions yourself, and then you can handle the concepts (or methods).
At 7:11 you stated that sigma m is greater than or equal to 0; reading through my linear algebra notes, I thought sigma m was strictly greater than zero.
I am still impressed with how the professor can mirror-write
Thanks a lot, teacher. I didn't know SVD was so simple to understand.
Thank god you made this video, sir. Awesome!
Jesus started to teach Math. Super clear and comprehensive
SVD does not only solve Ax = b, but also (and more directly) Ax = 0, or more accurately Ax + e = 0. The smallest singular value gives the least-squares solution: the solved coefficients for the simultaneous equations Ax + e = 0 are the last column of V (the last row of V transposed).
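For anyone who wants to try the homogeneous case, a minimal numpy sketch (my own example): the minimizer of ||Ax|| subject to ||x|| = 1 is the right singular vector for the smallest singular value, i.e. the last column of V.

```python
import numpy as np

# Overdetermined homogeneous system A x ~ 0.
rng = np.random.default_rng(4)
A = rng.normal(size=(30, 3))

U, S, Vt = np.linalg.svd(A)
x = Vt[-1]                  # last row of V^T = last column of V

# This unit vector minimizes ||A x|| over all unit vectors.
assert np.isclose(np.linalg.norm(x), 1.0)
best = np.linalg.norm(A @ x)
for _ in range(1000):       # sanity check against random unit vectors
    y = rng.normal(size=3)
    y /= np.linalg.norm(y)
    assert np.linalg.norm(A @ y) >= best - 1e-12
```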
Awesome!! I'm not really good at it, but learning it, or even just listening, enhances and adds to other skills that are relevant... if that makes sense...
I've been looking around the topics of linear algebra and its application to data science, and I found the best path is from Gilbert Strang to Steve Brunton! Your videos are awesomely inspiring, and they align with Gil's fundamental lectures. A nice baton pass from basics to contemporary application. Sir, are you intentionally building on his work, or influenced by it?
actually he mentions that he follows Brunton's lecture
My linear algebra teacher used Strang's textbook and it was a great experience for learning linear algebra
@steve Brunton! Amazingly explained!! It's super clear now in my head.
Keep up the great work! Excellent channel!
4:49 How come there are n U vectors? Shouldn't it be m of them?
Thank you so much for the clear explanation. I'm so touched.
This is the best SVD lecture I have ever known! And I have a question: why does SVD always assign importance to the vectors in order? If X is a random matrix, how does SVD decide/know that the first vector is the most important one, with the importance of the other vectors decreasing in order? Thank you.
Hi Professor Brunton, really appreciate the video. I am a bit confused about the 'columns' and 'rows' you refer to for U and V, and I was wondering if you could clarify a bit more. If I heard you correctly, the original X matrix is an n x m matrix where n represents the 'features' of a table and m represents the observations.
If the above is correct: U has shape nxn, where each column represents a feature of the table, aka a row of the original X matrix. Is that right?
Similarly: V has shape mxm, where each column of V represents an observation of the table, aka a column of the original X matrix.
I think in your video you mentioned that U represents each column of the X matrix, which differs from the above. Can you let me know what I missed?
Good question. This can be a bit tricky. The columns of "U" are linear combinations of the columns of "X" (and vice versa: the columns of "X" are linear combinations of the columns of "U"), so that is why I'm saying that U represents the column information in X. Similar for V and the rows of X. (See the sketch below this thread.)
@@Eigensteve Thank you so much for the response!! Just to make sure: the matrix of linear combinations of the columns of X is nxn... (meaning the column count is m, but after the linear combination it becomes n?) Is that right?
@@GabrielleYadventuremore I think of it a little differently. There will usually be r
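A small numpy sketch of the exchange above (my own example): from X = U Sigma V^T, each column of X is a linear combination of the columns of U, with the mixing coefficients sitting in Sigma V^T.

```python
import numpy as np

n, m = 5, 4
X = np.random.default_rng(5).normal(size=(n, m))

U, S, Vt = np.linalg.svd(X, full_matrices=False)
C = np.diag(S) @ Vt     # mixing coefficients, one column per column of X

# Column j of X is the columns of U weighted by column j of C.
for j in range(m):
    assert np.allclose(X[:, j], U @ C[:, j])
```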
Very, very nice explanation and presentation. Thank you!
Probably the best explanation I've seen. At least I understood it))
Why is V transpose represented differently @10:30?
Why does the data matrix necessarily have rank m? In general, isn't it possible for X to have rank less than both n and m?
That's a great lecture on SVD. Hoping for more intuitive videos.
Sir, you are the best; such a good explanation and material.
Really, sir, how can someone explain like this? Thanks for your efforts.
Great video, love the different real world examples
Just after the 10-minute point you talk about the physics; you seem to describe different multiplications depending on faces or flow fields, but the underlying multiplication is the same, correct? I understand, e.g., that there is no time element for the eigenfaces, but you seem to describe a different multiplication procedure?
You are a legend, Steve
This is so useful and clearly explained. Thank you
you made it so easy to understand.
Thanks!
This is how you teach. Thank you.
This is pure genius.
Brilliant approach. So intuitive.
Thanks for your great video(s). A technical question: as you're ordering your sigmas, are you assuming they're all real-valued?
Thanks for this great lecture!
btw, suppose matrix X is composed of 200 proteins (columns), and the level of each of those proteins was measured at 2000 positions (rows) on human skin. How can you explain/interpret the U, Sig, and VT matrices in this case? The technical calculation is straightforward with Python, but the meaning is quite ambiguous to me.
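Not the author, but one hedged way to read it for that layout (rows = 2000 skin positions, columns = 200 proteins): the columns of U would be spatial patterns across positions, the rows of VT the corresponding protein-weight combinations, and Sig how much of the data each pattern explains. A minimal sketch with fake data:

```python
import numpy as np

# Fake data: 2000 skin positions (rows) x 200 proteins (columns).
X = np.random.default_rng(6).normal(size=(2000, 200))

# Center so the modes describe variation about the mean level.
U, S, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)

# U[:, k]  : k-th spatial pattern over the 2000 positions
# Vt[k, :] : k-th combination (weighting) of the 200 proteins
# S[k]     : importance (energy) of pattern k
energy = S**2 / np.sum(S**2)
print("top 3 patterns explain", round(float(energy[:3].sum()), 3),
      "of the variance")
```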
amazing series! New subscriber here! Love the aesthetics and how clearly you explain everything.
Awesome, thank you!
Shouldn't each sample in X be a row vector instead of a column vector? I.e., the X matrix should be m*n instead of the n*m matrix shown in the video. Or am I missing something?
I cannot understand why the SVD of a matrix is unique.
Thanks for your great explanation.
I can't find a suitable word to describe your effort... Great thanks.
Can you make a lesson about how to write things backwards x)
Writing backwards 101: write regularly on a transparent glass board and then flip the video horizontally... :P
@@youmarcube You're a genius
@@youmarcube So you think he's just miming the hand movements afterwards then? That's ridiculous
@@xToTaLBoReDoMx Not sure what you mean... Just give it a try: write regularly on glass and flip the video horizontally... What is the proof that this is ridiculous? Could you point to the moment in the video? I'm really curious...
@@xToTaLBoReDoMx Put a transparent glass between the teacher and the camera. When the teacher writes something on the glass, the camera sees the teacher writing from right to left. Then if we flip the video horizontally, the teacher appears to write from left to right.