What is the Singular Value Decomposition?

  • Published: 23 Oct 2024

Comments • 132

  • @pablorc8091 · 3 years ago +115

    Thank you, I think I now understand the bigger picture of what SVD actually is, beyond just the formula. You remind me very much of 3B1B.

    • @psychedmortal · 1 year ago +8

      3B1B's software Manim (Math animator) is free (open-source) for anyone to use.

  • @stefanandritoiu · 1 year ago +26

    If anyone is confused at 2:55 about why he multiplies by V transpose, not the inverse, to get rid of V on the left side: it's because V is an orthogonal matrix (that is how we defined it to be), and the transpose of an orthogonal matrix is equal to its inverse. This stems from the fact that if a matrix Q is orthogonal, then QQ^T = Q^TQ = I. (A numeric check follows this thread.)

    • @shubhamgattani5357 · 7 months ago +1

      Thank you so much for the clarification

    • @fernandamacedo4472 · 1 month ago +1

      I think he should have mentioned at the beginning that the initial vectors are of length one, so the set of vectors is orthonormal, right?

    • @stefanandritoiu · 1 month ago +1

      @@fernandamacedo4472 Yes, you are correct. He should've said the initial vectors we are using are orthonormal, not just orthogonal. Unfortunately math has a lot of very similar words for very similar concepts, and it's easy to get confused.
      For instance, if a matrix has orthogonal but non-unit-length vectors as columns (or rows), it wouldn't satisfy the condition QQ^T = Q^TQ = I. Instead QQ^T = Q^TQ = D, where D is a diagonal matrix whose diagonal entries are the squares of the norms of the columns (or rows).
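
A quick NumPy check of the orthogonal-vs-orthonormal distinction in this thread (an illustrative sketch with a made-up rotation matrix, not code from the video):

```python
import numpy as np

# Orthonormal columns: a rotation matrix Q satisfies Q @ Q.T = I.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(Q @ Q.T, np.eye(2)))             # True: transpose = inverse

# Orthogonal but non-unit-length columns: scale each column of Q.
M = Q @ np.diag([2.0, 3.0])                        # columns still perpendicular
print(np.allclose(M.T @ M, np.diag([4.0, 9.0])))   # True: D of squared norms
print(np.allclose(M.T @ M, np.eye(2)))             # False: transpose != inverse
```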

  • @saketdeshmukh6881 · 2 years ago +44

    After watching Gilbert Strang, I needed exactly this to visualize and understand it very clearly. You taught one of the most difficult-to-explain concepts in linear algebra, machine learning, and even deep learning in 7 minutes. Really grateful. You are at the level of 3Blue1Brown and Gilbert Strang in terms of explaining a difficult concept so that it's understandable even to beginners. Keep up the good work. I would subscribe for paid content for sure. Sadly a very underappreciated channel, but I hope people will make it popular. Waiting for PCA.

  • @adityakhedekar9669 · 2 years ago +40

    You explained it better than Gilbert Strang.

    • @UtkarshiniEdutech · 1 year ago +8

      Don't compare, bro; trust me, he is inspired by Prof. Strang, like many of us...

    • @spiderjerusalem4009 · 1 year ago +4

      Yep. It is very unfortunate that Strang didn't get to mention that the whole point was "orthonormal vectors that remain orthogonal after being transformed".

  • @mysanoop · 2 years ago +8

    At 2:55, inverse(V) is equal to transpose(V) because V is an orthogonal matrix

  • @iagodantasf · 3 years ago +8

    Bro, please make the second video on PCA, I beg you. Your explanations are the best

  • @momidimidhilesh829 · 1 year ago +1

    I was looking for SVD on the 3blue1brown channel. And you created this using the same library. Thank you

  • @schwaartz · 2 months ago +2

    Make the sequel god damn it. I've been waiting for 3 years.

  • @dawne2780 · 1 year ago +2

    This is the clearest explanation I've been able to find, thank you!!

  • @vidajohn2199 · 3 years ago +4

    Great video, thank you. Eagerly waiting for your next PCA video.

  • @BjörnThorStefánsson · 5 months ago +3

    Where is the next video about PCA? Loved this video.

  • @arminneashrafi2846 · 2 years ago +4

    Hi! I was wondering when you would upload the PCA video, your method of explanation was really lucid and enlightening!

  • @zaccandels6695 · 5 months ago +1

    A (rather well-known) mathematician at my graduate institution once said, "The SVD is the single most beautiful thing that can be said about a matrix". After seeing this, I'm convinced that he's right.

  • @marlondeoliveiragomes4874 · 3 years ago +9

    Great work, keep it up!
    Minor typos at 3:45 (eigendecomposition), 5:21 (maximum).

  • @emersonruizrico1925 · 3 years ago +6

    Great video! Digestible and beautifully animated. This channel can become a good complement to 3b1b.

  • @omridrori3286 · 3 years ago +6

    Wowww, when is the next part?!
    It's amazing

  • @evanparshall1323 · 2 years ago +32

    There is a giant error in this video. At 3:07 the matrix A is defined as A = USV^T. Up until 6:43, the proof holds that the argmax will correspond to the vector with the largest corresponding eigenvalue. However, at 6:44 the error is that A^T A is NOT USU^T: A^T A = (USV^T)^T (USV^T) = V S U^T U S V^T = V S^2 V^T. This means that the vector corresponding to the maximum stretching is the first column of V, not U. This is very misleading to viewers.

    • @amaygada919 · 2 years ago +1

      Why does it have to be the first column and not the second?

    • @evanparshall1323 · 2 years ago

      @@amaygada919 The first column corresponds to the largest singular value: A v_i = sigma_i u_i. Every column u_i of U is unit length (as is every column of V), so A v_1 corresponds to the output of greatest magnitude, since sigma_1 is the largest.

    • @bulutosman · 2 years ago +1

      @@evanparshall1323 Why does sigma_1 have to be the largest? We did not assume any rule about that at the beginning. What makes sigma_1 bigger than sigma_2, for instance? (Btw, thanks for correcting the giant error; that was so important, I guess.)

    • @evanparshall1323 · 2 years ago +6

      @@bulutosman That is just how the SVD is defined: the largest singular value is always at the (1,1) entry, the second largest at (2,2), etc.

    • @phineasnyangoro1906 · 1 year ago +1

      You're right, and I also think it's important to emphasize that Σ contains the square roots of the eigenvalues on its diagonal, in descending order. (I.e., at 6:43, in addition to the U's needing to be switched to V's in the equation defining A^T A, the Σ should actually be Σ^2.) A NumPy check of this follows the thread.
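
A small NumPy sanity check of this thread's correction (a sketch with a random matrix, not the video's example). `np.linalg.svd` returns singular values in descending order, and A^T A reconstructs from V and Σ², not from U:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))

U, s, Vt = np.linalg.svd(A)        # A = U @ np.diag(s) @ Vt
print(np.all(np.diff(s) <= 0))     # True: singular values are descending

# A^T A = V @ diag(s^2) @ V^T, NOT U @ diag(s^2) @ U^T.
print(np.allclose(A.T @ A, Vt.T @ np.diag(s**2) @ Vt))   # True
print(np.allclose(A.T @ A, U @ np.diag(s**2) @ U.T))     # False in general
```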

  • @kaustuvray5066 · 2 years ago +2

    Thanks so much.
    This is a whole new perspective for me.
    Loved it!!!!
    Keep up the amazing work
    Manim will help in creating a lot of amazing content.

  • @debashishdutta · 3 years ago +1

    Eagerly waiting for your next video on PCA. Please upload soon. This video was brilliant.🤩💥💥

  • @werdasize · 2 years ago +4

    Very nice! But in the beginning A = USV^T, so AA^T = US^2U^T; yet towards the end you say A^T A = US^2U^T. I guess you switched V and U?

  • @devarpitabag9417 · 2 years ago

    I really like your video. It is minimalistic and helps with visualizing so well. I see it's been almost 1 year and there is no other video from you. Please come back. Eagerly waiting for your next video.

  • @JordanTensor · 3 years ago +6

    This is great. Well explained. In your next video you could also talk about compressing high-dimensional data by truncating the small-eigenvalue terms in the SVD. It's an important step in tensor network simulations, etc.

  • @marbles3662 · 3 years ago +1

    Great explanation, and also I couldn't find the next video on Principal Components.

  • @kiunthmo · 1 year ago

    You did a really good job with this, very very clear. Hope you find time to get back to doing this.

  • @raphael714 · 13 days ago +1

    Great video, but I'm a bit confused at 6:10: at what point is it proven that the maximum is obtained for an eigenvector of A^T A?

  • @haydenchiu4199 · 3 years ago

    Best explanation video on SVD ever

  • @josephj1643 · 2 years ago

    A video that really helps!!! Thanks, buddy, please continue to make videos!
    Love from India!

  • @zhangxiyuan3508 · 2 years ago

    Very insightful, and I really want to see the explanation of PCA in the next video

  • @shubhamgattani5357 · 7 months ago

    What an amazing explanation of SVD.

  • @nurulashikin4930 · 2 years ago

    I'm looking forward to watching your next video sir 😊

  • @tabhashim3887 · 2 years ago

    why does this not have a million views or at least a couple hundred thousand...

  • @erwanquintin3057 · 4 months ago

    I only have one word in reaction to your video: wow.
    This is so well done, man.

  • @NinjaAdvisor · 1 year ago

    Thank you very much. Please make more of these.

  • @pieternouwen4664 · 2 years ago +2

    epic work brotha

  • @yuanyuanding8606 · 1 year ago

    Very helpful!!!!!!!!! You make everything so clear

  • @jinyoungchoi3443 · 1 year ago

    You saved my life man. Thank you.

  • @MohammedQaiss-eg4is · 1 year ago

    one of the best videos I have ever seen. keep up the good work

  • @nomad3571 · 9 months ago +1

    @6:45 The eigenvectors of A^T A are the columns of matrix V, not U as he said, right? What am I missing?

  • @selayan4985 · 4 months ago

    Thanks! So well explained

  • @brucewalsh6784 · 1 year ago

    Excellent video! More please!!

  • @prologenjoyer · 8 days ago

    Bro dropped a banger and disappeared forever

  • @mihirsutaria8430 · 1 year ago

    Great explanation.

  • @tanveermuttaqueen7856 · 3 years ago +1

    Great Video. Waiting for the PCA

  • @pollenbarua5450 · 1 year ago

    Wow! Such a fluent explanation

  • @axscs1178 · 9 months ago

    Which software do you use for these animations? They are great!!

  • @easychemistry7560 · 3 years ago +1

    You make it very clear

  • @user-wr4yl7tx3w · 2 years ago

    This is really insightfully presented.

  • @francoiscollet2772 · 2 years ago

    wonderful video, very clear, intuitive.

  • @zuggrr · 2 years ago

    Thank you so much, I didn't get it in my uni lecture but got it thanks to you now :)

  • @GonzaloB-g2j · 1 year ago

    Amazing video. Thank you very much!

  • @martingreler6236 · 1 year ago

    Awesome Video! Thank you.

  • @M_a_t_z_ee · 3 months ago

    You have a typo ("eigne" instead of "eigen") at 3:44 and 4:17.
    Otherwise, great video, thank you very much! It's the first video I've watched from you, but if the others are of similar quality, I'm really surprised your channel hasn't blown up yet, considering this video is already three years old.

  • @papername1237 · 2 years ago

    This was great! Please do more!

  • @nicolaschou1396 · 2 years ago +2

    6:10 Why can A^T A x be written as lambda x? What if x is not an eigenvector of A^T A? (See the sketch after this thread.)

    • @张笑寒-k8n · 2 years ago

      I don't understand either; could you explain it to me?
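
A sketch of the missing step (an editor's note, assuming the spectral theorem: the symmetric matrix A^T A has an orthonormal eigenbasis v_1, ..., v_n with eigenvalues λ_1 ≥ ... ≥ λ_n ≥ 0). Expanding an arbitrary unit vector x in that basis shows why the maximizer is an eigenvector:

```latex
\[
x = \sum_i c_i v_i, \qquad \|x\| = 1 \iff \sum_i c_i^2 = 1
\]
\[
x^{\top} A^{\top} A\, x = \sum_i \lambda_i c_i^2
\;\le\; \lambda_1 \sum_i c_i^2 = \lambda_1
\]
% Equality holds exactly when all the weight sits on v_1 (c_1 = ±1),
% i.e. x = ±v_1. So a general x is not an eigenvector of A^T A,
% but the vector that achieves the maximum is.
```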

  • @Bulgogi_Haxen · 1 month ago

    So... the right singular vectors are an ONB for the row space, and the left singular vectors are an ONB for the images of the right singular vectors under A.
    The geometric interpretation is somewhat intuitive, but what do they mean?

  • @azizbounouh1576 · 1 year ago

    this video is amazing, it helped me so much, can't wait to see the next one ;)

  • @singhnaveen5694 · 1 year ago

    Best video on SVD

  • @user-wr4yl7tx3w · 2 years ago

    How does one come up with such an insightful way to explain things? What steps are needed?

  • @bhavinmoriya9216 · 1 year ago

    Thanks for the wonderful work! At 6:57, I guess you wanted to say V is the eigenvector matrix of A^T A.

  • @zaccandels6695 · 5 months ago

    Excellent vid

  • @joy2000cyber · 3 years ago +1

    Great! When is the next?

  • @usama57926 · 2 years ago +1

    2:21 On the right-hand side of the 3rd equation you write u1, u2 (unit vectors) as columns. Shouldn't we write them as rows? ❓ ❓ ❓

  • @mohamadalizeraatkar1348 · 1 year ago

    It is noted that "V is an orthogonal matrix because A^T A is symmetric." How can we say that?

  • @ridazouga4144 · 1 year ago

    Can't wait for PCA ❤️❤️❤️❤️

  • @alovyachowdhury9143 · 1 year ago

    please make the video on principal component analysis!

  • @zgbjnnw9306 · 3 years ago +2

    Where's the next part, the SVD and PCA?

  • @giantbee9763 · 2 years ago

    Awesome video! Look forward to the next one :D

  • @gamerscience9389 · 2 years ago

    Thanks for the geometric animations

  • @patakhte6367 · 2 years ago

    Excellent. Thank you so much.

  • @BB-mr3vy · 1 year ago

    Nice video. The last part doesn't make so much sense, though. All you seem to show is that x^T A^T A x attains the eigenvalues of A^T A when x is an eigenvector of A^T A. Why is one of those the max of x^T A^T A x as x ranges over all unit vectors? (See the numeric check below.) The rest of the video is great. Great visualizations.
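
A numeric illustration of the point in question (an editor's sketch with a random matrix; it complements the eigenbasis argument sketched at the 6:10 thread above): no random unit vector beats the top eigenvector of A^T A.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 3))
S = A.T @ A                                     # symmetric, positive semidefinite

lam, V = np.linalg.eigh(S)                      # eigenvalues in ascending order
lam_max, v_max = lam[-1], V[:, -1]

x = rng.normal(size=(3, 100_000))               # many random directions
x /= np.linalg.norm(x, axis=0)                  # normalize to unit vectors
vals = np.einsum('ij,ij->j', x, S @ x)          # x^T S x for each column
print(vals.max() <= lam_max + 1e-9)             # True: never exceeds lam_max
print(np.isclose(v_max @ S @ v_max, lam_max))   # True: attained at the eigenvector
```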

  • @BubbleBubble-3 · 2 years ago

    Great video! Sadly, where is the next one?

  • @zgbjnnw9306 · 3 years ago

    This is such an inspiring video! It shows the transformation so clearly! I got the intuition. Thank you so much! Subscribed : )

  • @PREMKUMAR-wz6gq · 1 year ago +1

    Where is the next video? I hope you are all right, given covid and all...

  • @jackryan3312 · 2 years ago +1

    You seem to assume that the solution to the argmax of x^T(A^TA)x is an eigenvector. What's the explanation?

    • @张笑寒-k8n · 2 years ago

      Hi, could you explain it to me if you have understood it?

  • @ionutbosie6017 · 1 year ago

    Thanks, very helpful

  • @Erik20766 · 2 years ago +1

    This gives a really good intuition. The only thing missing is the generalization to non-square matrices

  • @eneserdogan34 · 3 years ago

    nice visualization, keep going

  • @pechu1902 · 3 years ago +1

    Pretty cool my man

  • @tedsheridan8725 · 2 years ago

    Great video!

  • @diego898 · 1 year ago

    Please post your second video!

  • @nami1540 · 3 years ago

    Did you base this on Gilbert Strang's lecture?

  • @qq315465327 · 2 years ago

    Beautiful

  • @easychemistry7560 · 3 years ago

    This was really helpful

  • @muhammadasadhaider6893 · 2 years ago

    Thank you.

  • @niranjanshankaran3493 · 2 years ago

    Thank you very much

  • @MegaBender-yt8zj · 1 year ago

    What kind of a matrix has vectors as its elements? Isn't that a tensor?

  • @Wow_1991 · 2 years ago

    Nice video

  • @sebastianandrade8500 · 2 years ago

    Nice video, but I didn't properly understand what he did at 2:06. Can someone please explain it to me?

  • @gamerscience9389 · 2 years ago

    masterpiece genius !!

  • @chaoukimachreki6422 · 1 year ago

    But why are we asked to find an orthogonal set of vectors that remains orthogonal when transformed by A?

  • @mydaiel6647 · 3 years ago +1

    Great video, very simple and clear. Just a question: I can’t understand why the columns of U are the eigenvectors of A^T A. Could you please explain this? Thanks

    • @satishreddy3668 · 3 years ago +1

      That is because A^T A is a symmetric matrix and can be decomposed using its eigenvectors in the form Q(lambda)Q^T, where Q would be U in this case. (See the sketch below.)
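
A compact NumPy illustration of the spectral decomposition this reply invokes (a sketch with a random symmetric matrix, not the video's example):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3))
S = A.T @ A                                    # A^T A is symmetric

lam, Q = np.linalg.eigh(S)                     # spectral decomposition of S
print(np.allclose(Q @ np.diag(lam) @ Q.T, S))  # True: S = Q (lambda) Q^T
print(np.allclose(Q.T @ Q, np.eye(3)))         # True: eigenvector matrix is orthogonal
```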

  • @The2378AlpacaMan · 2 years ago

    Well explained, but I don't agree with your statement about u_1 having the largest eigenvalue. After all, you could just switch v_1 and v_2, and u_1 and u_2, and then the second column would have the largest eigenvalue.

  • @nidhipatil7692 · 9 months ago

    thank you

  • @zhilihuang6609 · 2 years ago

    Thank u!

  • @724younjinlee · 2 years ago

    At 6:00, how does ||Ax||^2 become (Ax)^T (Ax)?? (See the identity below.)
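
A one-line sketch of the identity being asked about: for any real vector y, the squared Euclidean norm is the dot product of y with itself, so with y = Ax:

```latex
\[
\|y\|^2 = \sum_i y_i^2 = y^{\top} y
\quad\Longrightarrow\quad
\|Ax\|^2 = (Ax)^{\top}(Ax) = x^{\top} A^{\top} A\, x
\]
```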

  • @450eli · 2 years ago

    TY!!!
    Where is the second video?

  • @aryamanjhunjhunwala9064 · 2 years ago

    PLEASE UPLOAD MORE

  • @noamzilo6730 · 1 year ago

    I wish this same very good video existed without the background music. It's really distracting.

  • @yassirelhamidi139 · 1 year ago

    The next video about SVD in PCA will never be out XD

  • @easychemistry7560 · 3 years ago

    amazing