Why are comments disabled on other videos?! We all want to say thank you to this dude
This man is such a great teacher.
a legend
You are so right. He skips straight through things but he brings you along. I have met some good teachers but SB is incredible!
He also works as a teacher (the two things are not related, just saying he also works as a teacher).
@@romanemul1 a god
@@abeke5523 A myth
Steve Brunton is saving my masters as he saved my undergraduate. What a guy!
I don't know how they do it, but Steve Brunton and 3Blue1Brown can explain stuff in a very impressive comprehensive way
Relating SVD to Fourier series is the most enlightening sentence I have ever heard. Thank you.
Yeah that blew my mind too!
A great lecturer. Some people are born teachers.
A great lecturer and I am buying their book to show my appreciation and thanks to Mr Brunton - and his colleague.
From downtown London, UK.
Mr. Brunton certainly understands how to take the students along in the lecture and not leave them at sea. Also, I believe the video is a mirror reflection of Mr Brunton's actions, which would produce the same effect.
The instructor’s machine learning basics course is clear and makes complex concepts easy to understand.
The structure is well-organized, making it very beginner-friendly.
Thank you for the thoughtful teaching!
Every time I see a tutorial video from professors at US universities, I envy their students. They don't have to learn everything twice: once in the classroom from a bad teacher, then again on RUclips from amazing teachers like Steve or others.
Not all US universities are like that. These are the guys at top schools who have really mastered their subject's state of the art. At ordinary state universities, students cannot even get a proper education in basic subjects, and professors are too busy, depressed, and pressured by research to have the sanity to teach properly.
@@robensonlarokulu4963 Even at great schools, not all professors are equally good at teaching.
I feel so lucky that I can see your videos and your channel. Your lessons are amazing; they make me deeply happy. I can feel that you love sharing your knowledge with everyone. Thank you so much, Steve Brunton.
You’ve now become the only channel I’m subscribed to where I’ve also hit the bell - I really appreciate your approach
Awesome!
Amazing! Well explained! I've had to watch it a few times, tbh, because I kept getting distracted by how impressively he writes backwards. He doesn't even get distracted by the effort!! Just keeps on talking, teaching, and drawing while writing tidy, informative, concise notes... backwards! Amazing!! Thank you!!
This man is the Guru of so many topics. His explanation is so good even I can understand the material.
Thanks!
I've used PCA, but have never heard of SVD. After this video I can see how they are related and wonder how I never heard of it. Looking forward to the rest of the series!
Thank you so much, you are great at explaining!
Same for me; it's like you're told to use some math (PCA) but they left out the foundation (SVD). The lectures and the book together are really, really good, and that you get the material for free is awesome.
This guy is like the greatest teacher ever!!
You obviously can see this through your viewership, but holy smokes you have an amazing delivery style. Thank you
A good RUclips rule of thumb is to never read the comments. A caveat to that rule is when the poster is a skilled educator. Thank you so much for your wonderful video!
I think I have found a corner of youtube that brings me true joy. Thank you.
Great videos!
Attending UW at the moment as a junior in ACMS and I have been referencing your videos for just about every class.
I cannot thank you more for the splendid elucidation of SVD
The fact that I'm seeing this video 2 years after it was posted makes me feel that I'm about two years behind the best in what exists in machine learning and data science. =(
Well, better late than never! Haha! Great class, Steve! You're an awesome teacher! Greetings from Brazil!
I am doing an elective about this and you are practically saving my life.
Outstanding and brilliant with the intuition. A great teacher, and the best set of videos, bar none, on the topic.
I haven't finished the video, but it is so great to have a preamble explaining what is what, especially something very simple but sometimes ambiguous like the direction of the matrix vectors! It might seem obvious to some, but in my experience this is often confusing: I tend to make an assumption, but often halfway through I have to retrace the whole calculation to evaluate the alternative interpretation.
The SVD, in general, is not unique. Contrary to what the author said at 12:10, the non-uniqueness is not merely a question of the signs of the vectors in the unitary matrices. Rather, when singular values appear more than once, the corresponding singular vectors can be multiplied by any unitary transformation of the subspace associated with the repeated singular value.
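A small numerical check of this point (my own numpy sketch, not taken from the video): with a repeated singular value, rotating the corresponding singular vectors gives a different but equally valid SVD.

```python
import numpy as np

# My own illustration (not from the lecture): X has singular values 3, 3, 1,
# so sigma = 3 is repeated and its singular vectors are only defined up to a
# rotation of that two-dimensional subspace.
X = np.diag([3.0, 3.0, 1.0])
U, s, Vt = np.linalg.svd(X)

theta = 0.7                                   # any angle works
R = np.eye(3)
R[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta),  np.cos(theta)]]

U2, Vt2 = U @ R, R.T @ Vt                     # rotated singular vectors
print(np.allclose(X, U2 @ np.diag(s) @ Vt2))  # True: a different SVD of the same X
```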
This is the first time I've learned about SVD, and I could fully understand it. Thank you!!!
This is the best video on SVD I have ever come across. Just wow.
The professor has so patiently explained every individual part of the whole equation with so much attention and beauty that he literally made me "feel" the entire concept!
Sir, kindly teach others how to teach as well. We need a lot more people like you in this world.
In my first semester of graduate school for ML, I was starting to think this wasn't for me based on how lost I have been on this topic. You saved me from imposter syndrome, thank you.
Happy to help and thanks for watching!
Great stuff! You explain the story the mathematics is trying to tell in such a clear and understandable way!
Such a smooth and intuitive understanding, saving me from the perplexity of these topics. Thanks, Sir!! May God bless you! Love from students from Bharat!
Dude, you and some other RUclips teachers have been making my life great and helping me love studying again. Thank you, truly.
Looking like a nerd, teaching like a BOSS. Big like for that man.
This guy is amazing!! What a fantastic teacher 🙏
Incredible production! Congrats from Spain!
Wow I can clearly visualise it in my head how the multidimensional data is stacked and matched across time.
This is the best explanation I have seen so far! Great explanation!!
Oh my God. What a teacher. Thank you sir. I needed to learn this mid career and you are God send...
Amazing content, amazing explanation, amazing videography. Thank you so much for your work. I am truly grateful and I wish I could be your student in my lifetime. I watched one video and immediately knew that I have to hit the subscribe button. Thank you once again
Buying Steve's book after watching a few of the videos. I am an experienced Aerospace Engineering PM and write Python code for automation. I want to learn SVD using Python. Thanks for the videos.
This guy is awesome. FYI, this is the full SVD, not the reduced SVD.
Precise, comprehensive, tangible
Great❤
Awesome explanation. Before coming to this channel I watched other videos and got a bit confused, but here the concept is explained so smoothly. Thanks!!
Thank you so much. Simple, clear and with examples, it's nice 👌
You are amazing. Thank you. The explanation was really clear and your way of teaching is great.
Right-to-left writing behind glass is revolutionizing online teaching; the first example of it seems to be Nancy's calculus channel. Super cerebrum-cerebellum-hand axis👍
Jesus started to teach Math. Super clear and comprehensive
You're a star, Mr Brunton. Thanks!
Wow. When you started describing the meaning of U, Sigma, and V, I could see how it is similar in concept to the Fourier series and transform.
This is the best SVD lecture I have ever known! I have a question: why does SVD always assign importance to the vectors in order? If X is a random matrix, how does SVD decide/know that the first vector is the most important one, with the importance of the other vectors decreasing in order? Thank you.
Much respect from a mathematics teacher
Thank you for all the awesome lectures. We wish you all the best.
Thank you very much for explaining these in very simple words
Steve, I feel there is a correction required (I might be wrong also), starting from 06:14, where you explained that U U^T = U^T U = I (n x n). Both will be identity matrices, but will their dimensions be the same? Suppose U is an n x r matrix; then U^T will be r x n. In that case U U^T will be n x n and U^T U will be r x r. Please guide.
U and V are unitary matrices, so they have to be square (in the full SVD).
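For anyone with the same question, here is a quick shape check (my own numpy sketch, not from the lecture); the n x r case described above corresponds to the economy SVD rather than the full SVD.

```python
import numpy as np

# My own sketch: full vs. economy SVD for a tall matrix X (n x m, n > m).
n, m = 6, 3
X = np.random.default_rng(1).standard_normal((n, m))

U_full, s, Vt = np.linalg.svd(X, full_matrices=True)        # full SVD: U is n x n
U_econ, s_e, Vt_e = np.linalg.svd(X, full_matrices=False)   # economy SVD: U is n x m

print(U_full.shape, U_econ.shape)   # (6, 6) (6, 3)

# In the economy SVD, U^T U is the small identity, but U U^T is only a
# projection onto the column space, not the n x n identity.
print(np.allclose(U_econ.T @ U_econ, np.eye(m)))   # True
print(np.allclose(U_econ @ U_econ.T, np.eye(n)))   # False
```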
OMG this is exactly what I was looking for! And you have explained it in the clearest way possible. Thank you! Instantly subscribed.
SVD does not only solve Ax = b, but also (and more directly) Ax = 0, or more accurately Ax + e = 0. The least-squares solution of this homogeneous system is the right singular vector associated with the smallest singular value, i.e., the last column of V (the last row of V transposed).
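A minimal sketch of that idea (my own numpy example, not from the video): minimizing ||Ax|| over unit vectors x picks out the right singular vector of the smallest singular value, and the residual equals that singular value.

```python
import numpy as np

# My own example: homogeneous (total) least squares, A x ≈ 0 with ||x|| = 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 4))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
x = Vt[-1]                    # last row of V^T = last column of V

print(np.linalg.norm(A @ x))  # equals s[-1], the smallest singular value
print(s[-1])
```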
I am still impressed with how the professor can mirror-write
This is how you teach. Thank you.
Please do a lecture on wavelet decomposition; you are the best. The time and effort you have put in is so clearly visible. Please also do a lecture on how to give good presentations like yours and how to prepare for them 🤓🤓
Blew my mind with the explanation of each of the three elements. Thanks!
Awesome, great to hear!
What a fantastic teacher!
I cannot thank you enough for the awesome explanation! Thank you!
I've been looking around the topics of linear algebra and its application to data science, and I found the best path is from Gilbert Strang to Steve Brunton! Your videos are awesomely inspiring, and they align with Gil's fundamental lectures. A nice baton pass from basics to contemporary application. Sir, are you doing this intentionally, or are you influenced by his work?
Actually, he mentions that he follows Brunton's lecture.
My linear algebra teacher used Strang's textbook and it was a great experience for learning linear algebra
This is pure genius.
You are a legend, Steve
Thank you so much for this series! I just started and have a stupid question: what does it mean when you say eigen-something? I know eigenvectors are the vectors that keep the same direction after a matrix transformation, but eigen-flows/faces?
Good question. I use this to mean "characteristic" or "latent" (which is what it means for eigenvalues/vectors too). But eigenflows and eigenfaces are eigenvectors of a large data correlation matrix. So in a sense, they are eigenvectors of a particular linear algebra problem.
Specifically, if I stack images as column vectors in a matrix X, then the "eigen-images" of X are the eigenvectors of X^T * X.
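A tiny numerical check of that statement (my own numpy sketch, not from the video): the right singular vectors of X are eigenvectors of X^T X, and the eigenvalues are the squared singular values.

```python
import numpy as np

# My own sketch: connect the SVD of X to the eigendecomposition of X^T X.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 5))       # e.g. 100 pixels per image, 5 images as columns

U, s, Vt = np.linalg.svd(X, full_matrices=False)
evals, evecs = np.linalg.eigh(X.T @ X)  # eigh returns eigenvalues in ascending order

print(np.allclose(np.sort(s**2), evals))  # True: eigenvalues = squared singular values
```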
Dear Professor: May God bless you for your kindness of teaching us!!!
One question: in some country which has a huge population, by scanning a person's face, the authority can identify him/her in a couple of seconds. Considering the 'face library' is so huge, then the matrix will be huuuuge. There must be additional trick(s) to do this. Could you hint a little bit on it? Thank you very much!
You made it so easy to understand.
Thanks!
Math needs to be reviewed again and again, and now I am back to this class to pick up the SVD... Orz. By the way, if you want to understand the framework of this method, this class is excellent; if you want to go deep into the math, you should probably buy a book, try some exercises, and work out the solutions yourself, and then you can handle the concept (or method).
Brilliant approach. So intuitive.
Thank god you made this video, sir. Awesome!
Hi Professor Brunton, I really appreciate the video. I am a bit confused about the 'columns' and 'rows' you refer to for U and V, to be honest, and I was wondering if it is possible for you to clarify a bit more. If I heard you correctly, the original X matrix is an n x m matrix where n represents the 'features' of a table and m represents the observations of a table.
If the above is correct: U has shape n x n, where each column represents a feature of the table, i.e., a row of the original X matrix. Is that right?
And similarly: V has shape m x m, where each column of V represents an observation of the table, i.e., a column of the original X matrix.
I think in your video you mentioned that U represents each column of the X matrix, which is different from the above. Can you let me know what I missed?
Good question. This can be a bit tricky. The columns of "U" are linear combinations of the columns of "X" (and vice versa: the columns of "X" are linear combinations of the columns of "U"), so that is why I'm saying that U represents the column information in X. Similar for V and the rows of X
@@Eigensteve Thank you so much for the response!! Just to make sure: the dimension of the linear combination of columns of X is n x n... (which means the column count is m, but after the linear combination it becomes n?) Is that right?
@@GabrielleYadventuremore I think of it a little differently. There will usually be r
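To make the reply above concrete (my own numpy sketch): since X = U Σ V^T implies X V = U Σ, each column of U is X times the corresponding column of V divided by its singular value, i.e., a linear combination of the columns of X.

```python
import numpy as np

# My own sketch: each column of U is a linear combination of the columns of X.
rng = np.random.default_rng(3)
X = rng.standard_normal((8, 4))          # n = 8 features, m = 4 observations

U, s, Vt = np.linalg.svd(X, full_matrices=False)

j = 0
u_j = (X @ Vt[j]) / s[j]                 # weight X's columns by the j-th column of V
print(np.allclose(u_j, U[:, j]))         # True
```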
Quick question: What software do you use in order to write on the screen?
I’m also wondering about the same thing
I cannot understand why SVD of a matrix is unique.
Thanks for your great explanation.
Thank you so much Steve Brunton
This is so useful and clearly explained. Thank you
Thanks a lot, teacher. I didn't know SVD was so simple to understand.
Very, very nice explanation and presentation. Thank you!
Great video, love the different real world examples
Awesome!! I'm not really good at it, but learning it or listening enhances and adds to other skills that are relevant to this... if that makes sense...
Great video series... I don't understand this math part here... can you advise what topic I should be studying to understand this better?
Keep up the great work! Excellent channel!
That's a great lecture on SVD. Hoping for more intuitive videos.
Really, sir, how can someone explain like this? Thanks for your efforts.
Very good...
In this way, can I say that the transpose of U (or V) is equal to the inverse of U (or V)?
Thanks for your great video(s). A technical question: as you're ordering your sigmas, are you assuming they're all real-valued?
I can't find a suitable word to describe your effort... Great thanks.
@steve Brunton! Amazingly explained!! It's super clear now in my head.
Crazy how this guy explains everything AND writes it in mirror image so that we can read it!
WOOOOWW ! Amazing teacher! Thanks Professor, I'll get the book.
Sir, you are the best; such a good explanation and material.
Thanks for this great lecture!
By the way, assuming matrix X is composed of 200 proteins (columns), and the level of each of those proteins was measured at 2000 positions (rows) on human skin: how can you explain/interpret the U, Sigma, and V^T matrices in this case? The technical calculation is straightforward with Python, but the meaning is quite ambiguous to me.
Can't tell what is more impressive, the fact that he's able to explain SVD in such simple terms or the fact that he wrote the entire lecture... BACKWARDS?
He wrote it normally and flipped the video
In this pandemic, online teaching is a must, and this style of presentation is excellent, as the student can face the teacher all the time. This method of teaching should be incorporated across the world.
To clarify, the shape of the U matrix will always be n x n and the shape of the V^T matrix will always be m x m?
I didn't get the dimensions of U, S and V. In Wikipedia, a full SVD is: U (mxm), S (nxn) and V (nxn). In this video, the dimensions are different, U (nxm) and S (mxn).
amazing series! New subscriber here! Love the aesthetics and how clearly you explain everything.
Awesome, thank you!
Thanks for your great video! I will read your book as well.
Amazing teacher! Great work!
What a great teacher, thank you!!! 🇧🇷❤
So true!