Lecture 47 - Singular Value Decomposition | Stanford University

  • Published: 23 Oct 2024

Comments • 264

  • @awesome3dan
    @awesome3dan 4 years ago +125

    It's amazing how genuinely interesting topics like these become when you understand what it could represent in the real world rather than treating it all abstractly.

    • @MsAlarman
      @MsAlarman 2 years ago +1

      Definitely

    • @paolacalle5737
      @paolacalle5737 2 years ago +1

      AGREED

    • @jessewolf7649
      @jessewolf7649 1 year ago

      It’s interesting to mathematicians independent of any real world applications.

  • @pradyumnakaushik5836
    @pradyumnakaushik5836 8 years ago +217

    This is a really amazing video. It's no joke explaining a concept like SVD in such simple terms, and you have nailed it. The concepts become so much clearer now.

  • @LRCatFan
    @LRCatFan 4 years ago +20

    Those who think the quality of teaching doesn't matter need to watch these videos; this guy explained SVD better than anyone I've ever encountered.

  • @pravii444
    @pravii444 4 years ago +1

    This guy saved me the 2-3 hours of time and 2 gigs of data that I was about to spend on YouTube had I not found this video. Perfect explanation.

  • @mattbritzius570
    @mattbritzius570 5 years ago +322

    This vampire is good at teaching.

    • @MultiAtif
      @MultiAtif 4 years ago

      😂😂🧛🧛

    • @mark_grey
      @mark_grey 4 years ago

      hahaha it's funny~

    • @arindammitra2293
      @arindammitra2293 3 years ago

      🧛🏻‍♂️🧛🏻‍♂️🧛🏻‍♂️🧛🏻‍♂️🧛🏻‍♂️😂😂😂

  • @lighted321
    @lighted321 5 years ago +32

    By far the best explanation on SVD I've ever seen! Now I understand why it is called Singular Value Decomposition

    • @nournote
      @nournote 4 years ago +3

      The origin of the term is just historical and has nothing to do with what is explained here, or with data science in general.

  • @clapdrix72
    @clapdrix72 2 years ago +4

    This is by far the best SVD explanation I've come across, and I've watched half a dozen and read as many.

  • @diekluge
    @diekluge 5 years ago +7

    Excellent video! My brain still hurts from this, but I agree with other posters that this is about the best, easiest to understand explanation of SVD I've come across so far. Thank you.

  • @Girishlimaye
    @Girishlimaye 7 years ago +78

    Awesome, awesome.. in 13 minutes I got more intuition on SVD than I got from reading lots of papers on SVD over the last few days.

    • @leif1075
      @leif1075 3 years ago

      The first entry of the first column, 0.13, is a very small decimal, so how can he say that entry heavily corresponds or relates to the first concept? That low number would suggest a weak relation.

  • @chrisjerome3707
    @chrisjerome3707 3 years ago +2

    This has to be the best explanation I have come across for SVD!! Much appreciated!!

  • @sagnikroy6405
    @sagnikroy6405 1 year ago

    Thanks for teaching us, right to the point. I have been reading about this topic for 3 years; no one could have explained it better.

  • @yasinilulea
    @yasinilulea 4 years ago +1

    Best video ever in human history

  • @harineemosur6530
    @harineemosur6530 5 years ago +3

    Brilliant! I went through so many videos and sites, but this was the most lucid explanation.

  • @singhpratyush_
    @singhpratyush_ 8 years ago +4

    The example was one of the best ones for SVD that I've seen.

  • @Art_Blue_Liberalism
    @Art_Blue_Liberalism 2 months ago

    Explains Singular Value Decomposition (SVD) well, with one clear, easy example. Thanks!

  • @ryanshafiei2374
    @ryanshafiei2374 2 years ago

    This has to be the best explanation of SVD I have encountered on YouTube. Bravo!

  • @satyamgupta4808
    @satyamgupta4808 1 year ago

    Really, really amazing. Now it is clear. Before, from reading many blogs and watching videos, I had built my own different picture of SVD: that it is about A*A(transpose) or A(transpose)*A, with U = the eigenvectors of the first matrix, V = the eigenvectors of the second matrix, and the singular values the eigenvalues of those matrices.
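
A side note on the mechanics mentioned above, which can be checked numerically: U is indeed built from eigenvectors of A·A^T and V from eigenvectors of A^T·A, but the singular values are the square roots of the (shared, nonzero) eigenvalues of those matrices, not the eigenvectors themselves. A minimal NumPy sketch, using a random placeholder matrix rather than the one from the lecture:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((7, 5))                      # placeholder data matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Eigen-decomposition of the symmetric matrix A^T A (eigh returns ascending order).
eigvals, eigvecs = np.linalg.eigh(A.T @ A)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # reorder to match the SVD

print(np.allclose(np.sqrt(eigvals), s))              # singular values = sqrt(eigenvalues)
print(np.allclose(np.abs(eigvecs), np.abs(Vt.T)))    # right singular vectors, up to sign
```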

  • @impzhu3088
    @impzhu3088 4 years ago

    Your explanation with visualization makes this concept so clear! Thank you so much for clearing up my confusion with this concept, which had me struggling for weeks!
    Great work!

  • @unpatel1
    @unpatel1 2 years ago

    Superb explanation of SVD!! The best I have come across. I was struggling for the last few months, but this video gave me a clear idea about it. Thank you.

  • @jdm89s13
    @jdm89s13 1 year ago

    Excellent overview of SVD and one of its widely-used applications!

  • @kenbobcorn
    @kenbobcorn 6 years ago +2

    This is one of the most well-explained concept videos I have ever watched. This video, and others like it, will go well alongside the Data Science course I am going through.

  • @raamav.4837
    @raamav.4837 4 years ago

    This is hands down one of the best explanations of SVD and its practical applications

  • @jalalsadeghi66
    @jalalsadeghi66 4 years ago

    The most illuminating video that I have seen about SVD; it helped me understand latent semantic indexing.

  • @evayang1398
    @evayang1398 4 years ago +1

    This is the first time I've learned it clearly! Thanks for the amazing video.

  • @amitsinghrathore9012
    @amitsinghrathore9012 6 years ago +3

    Wow... I spent a lot of months just trying to get a clear understanding of this, and you did it in just a few minutes.

  • @rrooho
    @rrooho 5 years ago +5

    It was indeed the most intuitive explanation of SVD. I had been trying to understand this concept in a deeper way for about a week and couldn't find anything like this.

  • @paultoronto42
    @paultoronto42 4 years ago +8

    This is a really good video. I've been struggling with SVD in a course I am taking and for the first time I "almost" understand it. I am still confused about the signs of the elements in matrices U and V. Some numbers are positive and some are negative. Is it just the magnitude that matters? When I entered this in MATLAB I got different results for the signs. At first I thought the signs were all just the opposite, but upon closer inspection I can see that sometimes the signs are opposite and sometimes not. I can't detect a pattern. I also don't completely understand how we can tell which column in the U matrix corresponds to which concept. The sizes of U and V are different in the MATLAB output as well, but I notice that the truncated columns are those where the strength values in the Sigma matrix are zero, so I think that makes sense.

    • @louisebuijs3221
      @louisebuijs3221 3 years ago +1

      I'm not an expert, but I heard somewhere that weird sign behaviour with low values might have to do with round-off errors during the calculations.
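
A note on the sign question in this thread: the SVD is only determined up to the sign of each singular-vector pair, because flipping the sign of the i-th column of U together with the i-th row of V^T leaves the product U Σ V^T unchanged. So MATLAB, NumPy and the slides can legitimately disagree on signs while agreeing on magnitudes. A minimal NumPy sketch of that invariance, using a random placeholder matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((7, 5))               # placeholder data matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Flip the sign of the first singular-vector pair; the reconstruction is unchanged.
U2, Vt2 = U.copy(), Vt.copy()
U2[:, 0] *= -1
Vt2[0, :] *= -1

print(np.allclose(U @ np.diag(s) @ Vt, A))    # True: the SVD reconstructs A
print(np.allclose(U2 @ np.diag(s) @ Vt2, A))  # True: the sign flip does not matter
```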

  • @mikezhang2266
    @mikezhang2266 3 years ago

    the best resource for SVD on youtube

  • @xingfang8507
    @xingfang8507 8 years ago +8

    Is this Dr. Leskovec? Very nice video, very nice professor as well. Thanks for your SNAP project as well.

  • @rudypieplenbosch6752
    @rudypieplenbosch6752 2 years ago

    Thanks for providing this insight, because it's all well and good to know how to calculate the SVD, but it's equally important to know what insights it provides.

  • @jegankarunakaran5798
    @jegankarunakaran5798 6 years ago

    This is AWESOME! "If you can't explain it simply, you don't understand it well". Can't say more than that...

  • @vaigokhale222
    @vaigokhale222 7 years ago +2

    Amazing explanation in such a short and simple way.
    Thanks!

  • @TheStevenSinger
    @TheStevenSinger 6 years ago

    What an excellent explanation.. Stanford professors are really smart

  • @dusicamilosavljevic6718
    @dusicamilosavljevic6718 4 years ago

    I am studying analytics with very limited prior background in linear algebra - you could not have made it easier for me. Thank you!

  • @hadijafari622
    @hadijafari622 2 years ago

    Best tutorial on SVD. Thanks a lot. And here is my question: How is r determined?
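
On the question of how r is chosen: in the exact decomposition, r is simply the rank of A, i.e. the number of nonzero singular values. In practice one usually keeps only the top k singular values, and a common rule of thumb is to pick the smallest k that retains a large fraction (often around 80-90%) of the "energy", the sum of the squared singular values; treat the exact threshold as a convention rather than something fixed by this video. A small sketch:

```python
import numpy as np

def choose_k(singular_values, energy_frac=0.9):
    """Smallest k whose top-k singular values keep `energy_frac` of the total energy."""
    energy = np.asarray(singular_values, dtype=float) ** 2
    cumulative = np.cumsum(energy) / energy.sum()
    return int(np.searchsorted(cumulative, energy_frac) + 1)

s = np.array([12.4, 9.5, 1.3])   # singular values roughly like the example's (assumed)
print(choose_k(s))               # 2: the third concept carries almost no energy
```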

  • @abhisheksharma6617
    @abhisheksharma6617 7 years ago

    wow! just wow! one of the best SVD breakdown videos around

  • @silviagutierrez7296
    @silviagutierrez7296 8 years ago +2

    Thank you so much whoever you are! As a LitMajor it's been hard to understand some key concepts for Latent Semantic Analysis, but you explained it beautifully!

  • @rbnn
    @rbnn 1 year ago

    This was a remarkable presentation.

  • @BadriNathJK
    @BadriNathJK 8 years ago +22

    This is a very good channel.

  • @jerrycote1661
    @jerrycote1661 6 years ago

    Simple and precise example with meaning - couldn't be better, extremely well prepared, thanks so much for this.

  • @xiyuanliu2712
    @xiyuanliu2712 8 years ago +5

    Thank you, this really helps me to understand the concept of SVD!

  • @glarange72
    @glarange72 5 years ago +1

    Good intuition. On the formal-definition side, this is a bit at odds with Strang and Wikipedia, where U and V are defined as square matrices. But perhaps that's not so important in this context, since the corresponding extra rows and columns of Sigma are zeros, I think.

  • @biswasshubendu4
    @biswasshubendu4 5 years ago

    Wow, this is the teaching level at Stanford!!!! Hats off.

  • @rishikesanravichandran9914
    @rishikesanravichandran9914 3 years ago +1

    A good pro gives a good example

  • @systematictrader
    @systematictrader 6 years ago +3

    What a brilliant mind.... explaining something so well... Feynman would have approved! Thanks!

  • @b3arodactyl
    @b3arodactyl 4 years ago

    Pretty sure at 11:53 what he meant to say was that the V-transpose matrix on the bottom is users-to-concepts, because it has 5 columns corresponding to each person (and it makes sense that person 2 corresponds to the 3rd concept, because their behavior was different in liking some romance movies). U is definitely movies-to-concepts, as there are 7 rows, one for each movie.
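
Without taking a side on which axis is users and which is movies in this particular video, the shape rule the comment relies on is general: the rows of U always line up with the rows of A, and the columns of V^T with the columns of A, so whichever entity labels the rows of A is what U describes. A minimal check in NumPy, with a random placeholder matrix standing in for the ratings table:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((7, 5))       # placeholder 7x5 data matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(U.shape, s.shape, Vt.shape)     # (7, 5) (5,) (5, 5)

assert U.shape[0] == A.shape[0]       # rows of U     <-> rows of A
assert Vt.shape[1] == A.shape[1]      # columns of V^T <-> columns of A
```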

  • @sanjaykrish8719
    @sanjaykrish8719 7 years ago +1

    Awesome.. Now I truly understand the interpretation of SVD.

  • @marc2564
    @marc2564 3 years ago

    That is a clear example! Good job

  • @patrickmullan8356
    @patrickmullan8356 8 years ago +7

    Best explanation I have heard for SVD - especially the example with movies!
    By far!!!

  • @marcopierrefernandezburgos3116
    @marcopierrefernandezburgos3116 10 months ago

    Amazing video. All clear now.

  • @Magnetron692
    @Magnetron692 3 years ago

    Thank you for this brief explanation of the SVD

  • @Cappy1102
    @Cappy1102 6 years ago

    The best explanation of SVD I've seen. Thanks for the video!

  • @sasuke-6521
    @sasuke-6521 2 years ago +1

    Great video, but how do we calculate the SVD manually? Could that also be explained in another video?

  • @astroknight5
    @astroknight5 7 months ago

    I never really understood the concept behind SVD until now. The example in the first minute made everything click!

  • @daesoolee1083
    @daesoolee1083 4 years ago

    simply, admirable.

  • @Loudjazz
    @Loudjazz 1 year ago

    thanks for this amazing video !! 👏👏

  • @muhammadmubashirullah7152
    @muhammadmubashirullah7152 4 years ago

    This is so much better than those MIT lectures. Good Lord, they did not make sense to me.

  • @orhanraufakdemir900
    @orhanraufakdemir900 4 years ago +9

    "It just models in some sense our noise" ... wow

    • @TheOYENDRILA
      @TheOYENDRILA 3 years ago

      I was wondering what that was for. And hence this line makes sense.

  • @gamerfela8317
    @gamerfela8317 5 years ago

    This is what I understood:
    in general, the left singular matrix shows how the rows are related to each other, the right singular matrix shows how the columns are related to each other, and the diagonal matrix shows the strength of the relation.
    I have one question though: in this case it was movies and audiences, so we could spot the correlation and attribute meaning to it. But if we don't know the correlation beforehand, can we still determine it?

  • @gayatrivenugopal3
    @gayatrivenugopal3 3 years ago

    Brilliant explanation, Thank You!

  • @rakilachraf391
    @rakilachraf391 5 years ago +2

    Thank you sir, your video was rich in information... with all gratitude and respect.

  • @yoyomemory6825
    @yoyomemory6825 3 years ago

    a fantastic explanation of SVD, thank you very much!

  • @Akashphs7217
    @Akashphs7217 16 days ago

    Proved very helpful, thanks.

  • @VictorBanerjeeF
    @VictorBanerjeeF 7 years ago +1

    Best way one can describe...

  • @soumyaiter1
    @soumyaiter1 4 years ago

    Could not stop myself from liking this video.

  • @yangningxin1832
    @yangningxin1832 4 months ago

    This video is the best on SVD, the best, the best!!!

  • @firstkaransingh
    @firstkaransingh 1 year ago

    Excellent explanation 👍 the background score matches SVD's awesome explanation.
    Can you give me the name of the background score? Thanks.

  • @daesoolee1083
    @daesoolee1083 4 years ago +1

    Crystal clear. Appreciate it so much :)

  • @ruohongzhang7245
    @ruohongzhang7245 5 years ago

    Great lecture with good intuition about SVD

  • @karthik-ex4dm
    @karthik-ex4dm 5 years ago

    Best SVD explanation ever!!!

  • @dimitriskass1208
    @dimitriskass1208 4 years ago

    Excellent teacher !! Keep up with the good work !!

  • @scottk5083
    @scottk5083 3 years ago

    I love this explanation, thank you so much! However, I have a question at 12:03. The idea of concepts is well understood, although it gets a bit confusing towards the end when explaining the concept-to-movie matrix, which has the 3 concepts as rows and the movies as columns. For the last movie, the entries are 0.09, -0.69 and 0.09 for the first, second and third concepts respectively. You mentioned that the first concept carries more strength, based on the Sigma matrix, than the rest, and the third concept the least. The confusing part is: how can the movie belong to the second concept despite having the lowest 'score'?

  • @Ahmedkedir
    @Ahmedkedir 7 years ago +1

    Excellent explanation. Thanks,

  • @ueiwqoak
    @ueiwqoak 7 years ago +11

    @11:10 what about the third column? Can you please explain its lack of relevance mathematically?

    • @rajat4640rajat
      @rajat4640rajat 6 years ago +6

      Suppose there is a little bit of action in both the sci-fi and the romance movies. The third column of U represents such an inconspicuous concept. You can see from the Sigma matrix that the significance of this third concept is very low.

    • @billmanassas7746
      @billmanassas7746 5 years ago

      My thought exactly, thank you for your question

    • @yuichichi
      @yuichichi 4 years ago +1

      It's not always going to have such a clear separation between concepts. He just used a very easy-to-understand example.
      But what we can know for sure is that the number of concepts will equal the rank of matrix A.
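
A small sketch of the point made in this thread: the singular values in Σ rank the concepts by strength, the number of nonzero singular values equals the rank of A, and dropping the weak ones gives a low-rank matrix that is nearly identical to the original. The ratings table below is a made-up reconstruction in the spirit of the lecture's example (a sci-fi block, a romance block, and a little "noise"); treat the exact numbers as assumptions rather than a copy of the slide:

```python
import numpy as np

A = np.array([
    [1, 1, 1, 0, 0],
    [3, 3, 3, 0, 0],
    [4, 4, 4, 0, 0],
    [5, 5, 5, 0, 0],
    [0, 2, 0, 4, 4],
    [0, 0, 0, 5, 5],
    [0, 1, 0, 2, 2],
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(np.round(s, 1))        # the third value is tiny compared with the first two

# Keep only the two strongest concepts and reconstruct.
k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
print(np.round(np.linalg.norm(A - A_k) / np.linalg.norm(A), 3))  # small relative error
```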

  • @arnobarefin3946
    @arnobarefin3946 4 years ago

    So far the best SVD explanation (still in 2020)

  • @seemuNeu
    @seemuNeu 6 years ago

    Great video and very clear explanations. Thank you

  • @alifarahani4398
    @alifarahani4398 5 years ago

    Awesome explanation! Why did some people give it a dislike?!

  • @43SunSon
    @43SunSon 7 years ago +10

    Question: since sigma 1 is always the largest, how did you know that 12.4 measures the strength of sci-fi rather than something else? If you change the matrix, sigma 1 is still the largest. Thank you.

    • @TheBjjninja
      @TheBjjninja 5 years ago

      I think that is potentially the downside of using this in practice to interpret the classes

  • @ntcool123
    @ntcool123 7 years ago

    This is absolutely brilliant.

  • @vivekgr3001
    @vivekgr3001 3 years ago

    amazing explanation!

  • @jonathanuis
    @jonathanuis 5 years ago

    I love this guy!!! Super well explained.

  • @KARAKOZA22
    @KARAKOZA22 1 year ago

    Great explanation. I have a question, if anyone can help: why, in the U matrix, do the users who love the romance concept take negative values instead of positive ones like the sci-fi lovers?

  • @Андрюхаслазерки
    @Андрюхаслазерки 10 months ago

    Thank you very much!!!

  • @mohammadwahiduzzamankhan4397
    @mohammadwahiduzzamankhan4397 4 years ago

    Thank you so much. Awesome work, awesome lecture. Very easy to understand.

  • @prabhacar
    @prabhacar 4 years ago

    When I invoked SVD on the matrix A, I got 3 matrices U, S and V, but the order of U was 7x7, S had order 5x5 as a diagonal square matrix, and the order of V was 5x5. Your slide at 9:50 has different dimensions: U (7x3), S (3x3) and V (3x5). Why is it different?
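
On the dimension question above: most software returns the "full" SVD by default, with square U and V, while the slide shows the reduced form in which only the r strongest concepts are kept. In NumPy the difference looks like the sketch below (the 7x5 matrix is a placeholder); in MATLAB, svd(A,'econ') gives the economy-size factors, and the further truncation to r columns is something you do yourself by discarding near-zero singular values.

```python
import numpy as np

A = np.arange(35, dtype=float).reshape(7, 5)   # placeholder 7x5 matrix (rank 2)

# Full SVD: square U and V.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, s.shape, Vt.shape)              # (7, 7) (5,) (5, 5)

# Reduced ("economy") SVD.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(U.shape, s.shape, Vt.shape)              # (7, 5) (5,) (5, 5)

# Truncated form as on the slide: keep only the r non-negligible singular values.
r = int(np.sum(s > 1e-10 * s[0]))
print(U[:, :r].shape, np.diag(s[:r]).shape, Vt[:r, :].shape)   # (7, r) (r, r) (r, 5)
```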

  • @alsonyang230
    @alsonyang230 3 years ago

    6:27 I learnt linear algebra 8 years ago, so I have forgotten a lot of it. But from some research online, I think "orthogonal" means that multiplying the matrix by its inverse gives an identity matrix, as stated in the slide? The lecturer in the video says it means the dot product of any two columns of the matrix is 0, which I don't think is the same thing? Can someone correct me if I am wrong, as I am really rusty at math now but I want to get it right... Thanks
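
The two descriptions in the comment above are the same property stated two ways, once you add that each column has unit length. For a matrix Q with orthonormal columns, the (i, j) entry of Q^T Q is the dot product of columns i and j, so "distinct columns have dot product 0 and each column has length 1" is exactly Q^T Q = I; when Q is square, Q^T is then also its inverse. As a standard fact (not specific to this video):

```latex
(Q^\top Q)_{ij} = q_i \cdot q_j =
\begin{cases}
1, & i = j \quad \text{(unit-length columns)}\\
0, & i \neq j \quad \text{(orthogonal columns)}
\end{cases}
\quad\Longleftrightarrow\quad Q^\top Q = I .
```

For the U and V of an SVD the columns are orthonormal, so U^T U = I and V^T V = I hold even when U is rectangular (though U U^T is not the identity in that case).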

  • @aprilsun2572
    @aprilsun2572 7 years ago +1

    Best explanation I've seen!!! Love

  • @inesarous9117
    @inesarous9117 7 years ago +1

    The best explanation I have ever found! thank you so much! :)

  • @HaouasLeDocteur
    @HaouasLeDocteur 6 years ago +6

    GODLIKE EXPLANATION

  • @ProfessionalTycoons
    @ProfessionalTycoons 6 years ago

    this video actually does a great job explaining this hard concept

  • @jatayubaxi4553
    @jatayubaxi4553 4 years ago +2

    As far as I know, this is the best explanation of SVD on youtube.

  • @priyankrajsharma
    @priyankrajsharma 4 years ago

    OMG.. You made it super easy

  • @louisebuijs3221
    @louisebuijs3221 3 years ago

    Really good explanation!!!

  • @majidkh2695
    @majidkh2695 3 years ago

    Quick question! Is it fair to say that the left singular vectors reflect the distribution (statistical distribution) of the column vectors?!

  • @yevg3907
    @yevg3907 3 years ago

    Very well done huge thank you

  • @gabrielemazzola9652
    @gabrielemazzola9652 5 years ago +4

    Thanks for the video, awesome explanation. I got the intuition and the math; however, I feel like I am missing a piece of the puzzle: how do we know what the different "concepts" correspond to?
    After doing the decomposition, we obtain three matrices of numbers; how do we know the meaning of the r different concepts?
    Thanks to anyone who will spend 5 minutes answering this question.

    • @rolandheinze7182
      @rolandheinze7182 5 years ago +3

      As I've understood it (and I'm a noob) the concepts don't have any mapping. You'd have to look for yourself and interpret based on values. Similar to any clustering algorithm, since these are all unsupervised learning techniques

    • @gabrielemazzola9652
      @gabrielemazzola9652 5 years ago +1

      @@rolandheinze7182 yeah, I figured that out later on hehe. I'm actually doing a recommendation systems course for my Master's, and they pointed out that one of the main drawbacks of this method is that you don't have an interpretation of the latent factors (besides the fact that it works pretty poorly with sparse data, which is usually the case).
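
For anyone landing on this thread with the same question: the factors come out of the decomposition unlabeled, and the usual informal way to attach a meaning is to look at which rows or columns load most heavily on each singular vector, with the singular value telling you how much that factor matters. A hedged sketch of that inspection step, using a made-up ratings table and placeholder movie labels (not taken from the slides):

```python
import numpy as np

movies = ["SciFi1", "SciFi2", "SciFi3", "Romance1", "Romance2"]  # placeholder labels
A = np.array([
    [1, 1, 1, 0, 0],
    [3, 3, 3, 0, 0],
    [4, 4, 4, 0, 0],
    [5, 5, 5, 0, 0],
    [0, 2, 0, 4, 4],
    [0, 0, 0, 5, 5],
    [0, 1, 0, 2, 2],
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# For each of the strongest factors, list the movies with the largest |loading|.
# A human still has to read the list and name the factor ("sci-fi", "romance", ...).
for k in range(2):
    order = np.argsort(-np.abs(Vt[k]))
    print(f"factor {k} (strength {s[k]:.1f}):", [movies[j] for j in order[:3]])
```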

  • @h.moussa2190
    @h.moussa2190 6 years ago +1

    Thank you very much. It's very clear and very well explained!

  • @cooperunderwood7342
    @cooperunderwood7342 7 years ago +6

    This video has confused me, as most other sources state that the factors are U (mxm), Sigma (mxn) and V^T (nxn) - that is, square, rectangular, square. However, this video claims it should be rectangular, square, rectangular. Anyone have insight on this?

    • @rajat4640rajat
      @rajat4640rajat 6 years ago +3

      You are right. However, due to rank deficiency, some of the singular values in 'your' rectangular Sigma will be either zero or negligibly small, and the corresponding columns (vectors in the U space) play practically no role.

  • @adhiyamaanpon4168
    @adhiyamaanpon4168 4 years ago

    So at last we will take the first 2 columns of the user-to-concept matrix (as they have high strength) as our new features, right?
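
Roughly, yes. A common convention (an assumption here, not something stated in the video) is to take the first k columns of U scaled by the corresponding singular values, i.e. U_k Σ_k, as the reduced features; this is the same thing as projecting each row of A onto the top-k right singular vectors. A minimal sketch with placeholder data and variable names of my own:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((7, 5))             # placeholder users-x-items matrix

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2
features = U[:, :k] * s[:k]                 # U_k Sigma_k: one k-dimensional row per user
print(features.shape)                       # (7, 2)
print(np.allclose(features, A @ Vt[:k].T))  # True: identical to projecting A onto V_k
```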

  • @dvirginz4001
    @dvirginz4001 6 years ago

    Great video, and great lecturer, thanks a lot.