It's amazing how genuinely interesting topics like these become when you understand what it could represent in the real world rather than treating it all abstractly.
Definitely
AGREEED
It’s interesting to mathematicians independent of any real world applications.
This is a really amazing video. It's no joke explaining a concept like SVD in such simple terms, and you have nailed it. The concepts are so much clearer now.
Those who think quality of teaching doesn't matter need to watch these videos, this guy explained SVD better than anyone I've ever encountered
This guy saved me almost 2-3 hours of time and 2 gigs of data that I was about to spend on YouTube if I hadn't found this video. Perfect explanation.
This vampire is good at teaching.
😂😂🧛🧛
hahaha it's funny~
🧛🏻‍♂️🧛🏻‍♂️🧛🏻‍♂️🧛🏻‍♂️🧛🏻‍♂️😂😂😂
By far the best explanation on SVD I've ever seen! Now I understand why it is called Singular Value Decomposition
The origin or the term is just historic and has nothing to do with what is explained here and with data science in general.
This is by far the best SVD explanation I've come across and I've watched a half dozen and read the same.
Excellent video! My brain still hurts from this, but I agree with other posters that this is about the best, easiest to understand explanation of SVD I've come across so far. Thank you.
Awesome, awesome.. in 13 mins I got more intuition on SVD than I got from reading lots of papers on SVD over the last few days.
The first entry of the first column, 0.13, is a very small decimal, so how can he say that entry heavily corresponds or relates to the first concept? That low number would suggest a low relation.
This has to be the best explanation I have come across for SVD!! Much appreciated!!
Thanks for teaching us to the point. Reading this topic for 3 years, no one could have explained it better.
Best video ever in human history
Brilliant! Went through so many videos and sites but this was the most lucid explanation done.
The example was one of the best ones for SVD that I've seen.
Well-explained Singular Value Decomposition (SVD) with one clear, easy example. Thanks!
This has to be the best explanation of SVD I have encountered on YouTube. Bravo!
Really, really amazing. Now it is clear. Before, from reading many blogs and watching videos, I had formed my own different concept: that SVD is A*A(transpose) or A(transpose)*A, that U = the eigenvectors of the first matrix and V = the eigenvectors of the second matrix, and that the singular values are eigenvectors of these matrices.
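For anyone who shared that confusion, the precise relationship can be checked numerically. A minimal numpy sketch (the matrix below is just a random stand-in, not from the video): U holds eigenvectors of A·Aᵀ, V holds eigenvectors of Aᵀ·A, and the singular values are the square roots of their shared nonzero eigenvalues, not eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 3))  # random stand-in matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# The eigen-decomposition view, stated precisely: the columns of U are
# eigenvectors of A @ A.T, the columns of V are eigenvectors of A.T @ A,
# and the singular values are the square roots of the shared nonzero
# eigenvalues -- eigenVALUES, not eigenvectors.
eigvals = np.linalg.eigvalsh(A.T @ A)[::-1]  # sorted descending
print(np.allclose(np.sqrt(eigvals), s))      # True
```

This is also why SVD can in principle be computed "manually" via the eigendecomposition of Aᵀ·A, though numerical libraries use more stable algorithms.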
Your explanation with visualization makes this concept so clear! Thank you so much for clearing up my confusion with this concept, which had me struggling for weeks!
Great work!
Superb explanation of SVD!! The best I have come across. I was struggling for the last few months, but this video gave me a clear idea about it. Thank you.
Excellent overview of SVD and one of its widely-used applications!
This is one of the most well explained concept videos I have ever watched. This video, and other like it, will go well alongside my Data Science course I am going through.
This is hands down one of the best explanations of SVD and its practical applications
The most explanatory video I have seen about SVD; it helped me understand latent semantic indexing.
This is the first time I learn it clearly! Thanks for the amazing video
Wow... I spent many months just trying to get a clear understanding of this, and you did it in just a few minutes.
It was indeed the most intuitive explanation of SVD. I'd been trying for about a week to understand this concept in a deeper way, and I couldn't find anything like this.
This is a really good video. I've been struggling with SVD in a course I am taking, and for the first time I "almost" understand it. I am still confused about the signs of the elements in the matrices U and V. Some numbers are positive and some are negative. Is it just the magnitude that matters? When I entered this in MATLAB I got different results for the signs. At first I thought the signs were all just the opposite, but upon closer inspection I can see that sometimes the signs are opposite and sometimes not. I can't detect a pattern. I also don't completely understand how we can tell which column in the U matrix corresponds to which concept. The sizes of U and V are different in the MATLAB output as well, but I notice that the truncated columns are those where the strength values in the Sigma matrix are zero, so I think that makes sense.
I'm not an expert, but I've heard that weird sign behavior with low values might have to do with round-off errors during the calculations.
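The sign differences are actually expected even without round-off: the SVD is only unique up to sign. A minimal numpy sketch (the matrix is a made-up example) shows that flipping the sign of the k-th column of U together with the k-th row of Vᵀ leaves the product unchanged, which is why different tools can legitimately return different signs.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])  # small made-up matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Flip the sign of singular pair k on both sides at once.
k = 0
U2, Vt2 = U.copy(), Vt.copy()
U2[:, k] *= -1
Vt2[k, :] *= -1

# Both factorizations reconstruct A (up to round-off), so the signs of
# the columns of U and V are not uniquely determined by A itself.
print(np.allclose(U @ np.diag(s) @ Vt, A))    # True
print(np.allclose(U2 @ np.diag(s) @ Vt2, A))  # True
```

So only the relative signs within a singular pair matter, not the absolute ones, which matches the "sometimes opposite, sometimes not" observation above.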
the best resource for SVD on youtube
Is this Dr. Leskovec? Very nice video, very nice professor as well. Thanks for your SNAP project as well.
Thanks for providing this insight, because it's all well and good knowing how to calculate the SVD, but it's equally important to know what insights it provides.
This is AWESOME! "If you can't explain it simply, you don't understand it well". Can't say more than that...
Amazing explanation in such a short and simple way.
Thanks!
What an excellent explanation.. Stanford professors are really smart
I am studying analytics with very limited prior background in linear algebra - you could not have made it easier for me. Thank you!
Best tutorial on SVD. Thanks a lot. And here is my question: How is r determined?
wow! just wow! one of the best SVD breakdown videos around
Thank you so much whoever you are! As a LitMajor it's been hard to understand some key concepts for Latent Semantic Analysis, but you explained it beautifully!
This was a remarkable presentation.
This is a very good channel.
Thanks!! Keep Learning and Sharing :)
Penaldo
Simple and precise example with meaning - couldn't be better, extremely well prepared, thanks so much for this.
Thank you, this really helps me to understand the concept of SVD!
Good intuition. On the formal-definition side, though, this is a bit at odds with Strang and Wikipedia, where U and V are defined as square matrices. But perhaps that's not so important in this context, since those extra rows and columns mostly multiply zeros, I think.
Wow, this is the teaching level at Stanford!!!! Hats off.
A good pro gives a good example
What a brilliant mind.... explaining something so well... Feynman would have approved! Thanks!
Pretty sure at 11:53 what he meant to say was that the V transpose matrix on the bottom is users to concepts because it has 5 columns corresponding to each person (and it makes sense that person 2 corresponds to the 3rd concept because their behavior was different in liking some romance movies). U is definitely movies to concepts as well, as there are 7 rows for each movie.
Awesome.. Now I truly understand the interpretation of SVD.
That is a clear example! Good job
Best Explanation - especially the example with movies! - I heard for SVD!
By far!!!
Amazing video. All clear now.
Thank you for this brief explanation of the SVD
The best explanation of SVD I've seen. Thanks for the video!
Great video, but how do we calculate the SVD manually? Could that also be explained in another video?
I never really understood the concept behind SVD until now. The example in the first minute made everything click!
simply, admirable.
thanks for this amazing video !! 👏👏
This is so much better than those MIT lectures. Good Lord, they did not make sense to me.
"It just models in some sense our noise" ... wow
I was wondering what that was for. And hence this line makes sense.
This is what I understood.
In general, the left singular matrix shows how the rows are related to each other, the right singular matrix shows how the columns are related to each other, and the diagonal matrix shows the strength of the relation.
But I have one question: in this case it was movies and audience, so we could find the correlation and attribute meaning to it. But if we don't know the correlation beforehand, can we still determine it?
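As a hedged illustration of "left singular vectors relate rows, right singular vectors relate columns", here is a numpy sketch using a ratings matrix reconstructed from the numbers quoted in these comments (7 movies by 5 users; the exact entries are an assumption, not taken directly from the video):

```python
import numpy as np

# 7 movies (rows) x 5 users (columns); the first four movies are rated
# mostly by sci-fi fans, the last three mostly by romance fans.
A = np.array([[1, 1, 1, 0, 0],
              [3, 3, 3, 0, 0],
              [4, 4, 4, 0, 0],
              [5, 5, 5, 0, 0],
              [0, 2, 0, 4, 4],
              [0, 0, 0, 5, 5],
              [0, 1, 0, 2, 2]], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Columns of U relate movies (rows of A) to concepts; rows of Vt relate
# users (columns of A) to concepts; s gives each concept's strength.
print(np.round(s, 1))        # concept strengths, largest first
print(np.round(U[:, 0], 2))  # how strongly each movie loads on concept 1
```

Even without labels, the first column of U separates the sci-fi-heavy rows from the romance-heavy ones, so the correlation structure emerges from the decomposition itself; naming the concepts, however, is up to human interpretation.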
Brilliant explanation, Thank You!
Thank you, sir, your video was rich in information... with all gratitude and respect.
a fantastic explanation of SVD, thank you very much!
Proved very helpful , thanks
Best way one can describe...
Could not stop myself from liking this video.
This video is the best on SVD. The best, the best!!!
Excellent explanation 👍 the background score matches SVD's awesome explanation.
Can you give me the name of the background score? Thanks.
Crystal clear. Appreciate it so much :)
Great lecture with good intuition about SVD
Best SVD explanation ever!!!
Excellent teacher !! Keep up with the good work !!
I love this explanation, thank you so much! However, I have a question at 12:03. The idea of concepts is well understood, although it gets a bit confusing towards the end, when explaining the concept-to-movie matrix with the 3 concepts represented as rows and the movies as columns. For the last movie, it's 0.09, -0.69, and 0.09 for the first, second, and third concepts respectively. You mentioned that the first concept carries more strength, based on the sigma matrix, than the rest, and that the third concept carries the least. The confusing part is: how can the movie belong to the second concept despite that concept having the lower 'score'?
Excellent explanation. Thanks,
@11:10 What about the third column? Can you please explain its lack of relevance mathematically?
Suppose there is a little bit of action in both the sci-fi and the romantic movies. The third column in U represents such an inconspicuous concept. It comes from the sigma matrix, which shows that the significance of the third concept is very low.
My thought exactly, thank you for your question
It’s not always going to have such clear separation between concepts. He just used a very easy to understand example.
But what we can know for sure is that the number of concepts will equal the rank of matrix A.
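That rank claim is easy to check numerically. A small numpy sketch with a matrix of known rank 2 (the data here is made up): the number of non-negligible singular values equals the rank, i.e. the number of "concepts".

```python
import numpy as np

rng = np.random.default_rng(0)
# A matrix of known rank 2: the sum of two random outer products.
A = (np.outer(rng.standard_normal(6), rng.standard_normal(4))
     + np.outer(rng.standard_normal(6), rng.standard_normal(4)))

s = np.linalg.svd(A, compute_uv=False)
# The number of non-negligible singular values is the rank of A,
# i.e. the number of "concepts" the data actually contains.
num_concepts = int(np.sum(s > 1e-10 * s[0]))
print(num_concepts)  # 2
```

In practice, real data is noisy and all singular values are nonzero, so one picks r by keeping the singular values that carry most of the energy rather than the exactly-nonzero ones.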
So far the best SVD explanation (still in 2020)
Great video and very clear explanations. Thank you
Awesome explanation! Why did some people give it a dislike?!!!!!!
Question: since sigma 1 is always the largest, how did you know 12.4 measures the strength of sci-fi rather than something else? If you change the matrix, the first sigma is still always the largest. Thank you.
I think that is potentially the downside of using this in practice to interpret the classes
This is absolutely brilliant.
amazing explanation!
I love this guy!!! super good explained
Great explanation. I have a question here, if anyone can help: why, in the U matrix, do the users who love the romance concept take negative values instead of positive ones like the sci-fi lovers?
Thank you very much!!!
Thank you so much. Awesome work awesome lecture. Very easy to understand
When I invoked SVD on the matrix A, I got 3 matrices: U, S, and V. But the order of U was 7x7, S was a 5x5 diagonal square matrix, and the order of V was 5x5. But your slide at 9:50 has different dimensions: U (7x3), S (3x3), and V (3x5). Why is it different?
6:27 I learnt linear algebra 8 years ago, so I have forgotten a lot of it. But doing some research online, I think orthogonal means that multiplying the matrix by the transpose of itself results in an identity matrix, as stated in the slide? The lecturer in the video says it means the dot product of any two columns of the matrix is 0, which I don't think is correct? Can someone correct me if I am wrong? I am really terrible at math now, but I want to get it right... Thanks
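Both statements can be reconciled numerically: "orthogonal" means UᵀU = I, which is equivalent to the columns being orthonormal, i.e. pairwise dot products of 0 plus unit length for each column. A small numpy check (random matrix, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))  # random illustration matrix
U, s, Vt = np.linalg.svd(A)      # full SVD: U is 5x5 and orthogonal

# "Orthogonal" as on the slide: U times its transpose gives the identity.
print(np.allclose(U.T @ U, np.eye(5)))  # True

# Equivalently, the columns are orthonormal: distinct columns have dot
# product 0 (the lecturer's statement) AND each column has length 1
# (the part that statement leaves implicit).
print(np.allclose(U[:, 0] @ U[:, 1], 0.0))        # True
print(np.allclose(np.linalg.norm(U[:, 0]), 1.0))  # True
```

So the lecturer's "dot product of any two columns is 0" is the orthogonality half of the definition; adding unit-length columns makes it fully equivalent to UᵀU = I.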
Best explanation I've seen!!! Love
The best explanation I have ever found! thank you so much! :)
GODLIKE EXPLANATION
this video actually does a great job explaining this hard concept
As far as I know, this is the best explanation of SVD on youtube.
OMG.. You made it super easy
Really good explanation!!!
Quick question! Is it fair to say that the left singular vectors reflect the distribution (statistical distribution) of the column vectors?!
Very well done huge thank you
Thanks for the video, awesome explanation. I got the intuition and the math, however I feel like I am missing a piece to the puzzle: How do we know what the different "concepts" correspond to?
After doing the decomposition, we obtain three matrices with numbers, how do we know what is the meaning of the different r concepts?
Thanks to anyone who will spend 5 minutes to answer this question.
As I've understood it (and I'm a noob) the concepts don't have any mapping. You'd have to look for yourself and interpret based on values. Similar to any clustering algorithm, since these are all unsupervised learning techniques
@@rolandheinze7182 yeah, I figured that out later on, hehe. I'm actually doing a recommendation-systems course for my Master's, and they pointed out that one of the main drawbacks of this method is that you don't have an interpretation of the latent factors (besides the fact that it works pretty poorly with sparse data, which is usually the case).
Thank you very much. it's very clear and very well explained !
This video has confused me, as most other sources claim the factors are U (mxm), Sigma (mxn), and V^t (nxn). That is, square, rectangular, square. However, this video claims it should be rectangular, square, rectangular. Anyone have insight on this?
You are right. However, due to rank deficiency, some of the singular values in 'your' rectangular matrix will be either zero or negligibly small. Notice that the corresponding columns (actually vectors in the U space) then play practically no role.
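The shape question in this thread can be seen directly in numpy, which exposes both conventions through the `full_matrices` flag (the 7x5 random matrix below is only a stand-in for the ratings example):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((7, 5))  # stand-in for the 7x5 ratings matrix

# Full SVD (the convention MATLAB's svd(A) and the textbooks use):
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, s.shape, Vt.shape)  # (7, 7) (5,) (5, 5)

# Thin / economy SVD: drops the columns of U that only multiply zeros.
Ue, se, Vte = np.linalg.svd(A, full_matrices=False)
print(Ue.shape)                    # (7, 5)

# Truncating further to r components gives the 7x3, 3x3, 3x5 shapes on
# the slide. (For this random A the truncation is lossy; the video's
# matrix has rank 3, so there it is exact.)
r = 3
Ur, sr, Vtr = Ue[:, :r], se[:r], Vte[:r, :]
print(Ur.shape, np.diag(sr).shape, Vtr.shape)  # (7, 3) (3, 3) (3, 5)
```

All three forms reconstruct the same A when the dropped singular values are zero, which is why the square and rectangular conventions coexist in different sources.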
So in the end we will take the first 2 columns of the user-to-concept matrix (as they have high strength) as our new features, right?
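If that feature-extraction idea is what's intended, here is a hedged numpy sketch (random stand-in matrix, not the video's data): keeping the top-k concepts means scaling the first k columns of U by their strengths, which is the same as projecting the rows of A onto the first k right singular vectors.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((7, 5))  # stand-in for the movie-user matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2  # keep only the two strongest concepts, as suggested above
# Two equivalent ways to get each row's coordinates in concept space:
features = U[:, :k] * s[:k]   # concept loadings scaled by strength
projected = A @ Vt[:k, :].T   # rows of A projected onto top-k concepts
print(np.allclose(features, projected))  # True
```

Each row of `features` is then a 2-dimensional representation of a movie, which is exactly the dimensionality reduction the comment describes.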
Great video, and great lecturer, thanks a lot.