QR Factorization | Practical Linear Algebra (Lecture 10)

  • Published: 23 Oct 2024

Comments • 7

  • @aleksanderchelyunskii5968
    @aleksanderchelyunskii5968 1 year ago

    This is a neat way to identify linearly dependent columns in a matrix. Before truly understanding QR factorization, I used to do Gaussian forward elimination to spot linearly dependent columns/rows.
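    A minimal sketch of that idea, assuming NumPy: with a (reduced) QR factorization, a near-zero diagonal entry of R flags a column that adds no new direction. (Plain QR without column pivoting is not a guaranteed rank-revealing tool in general, but it works for a simple case like this.)

    import numpy as np

    # 4x3 matrix whose third column is the sum of the first two,
    # so the columns are linearly dependent.
    A = np.array([[1.0, 2.0, 3.0],
                  [0.0, 1.0, 1.0],
                  [2.0, 0.0, 2.0],
                  [1.0, 1.0, 2.0]])

    Q, R = np.linalg.qr(A)                     # reduced QR: Q is 4x3, R is 3x3

    # A near-zero diagonal entry of R marks a column that is a
    # combination of the columns before it.
    print(np.abs(np.diag(R)))                  # last entry is ~0
    print(np.sum(np.abs(np.diag(R)) > 1e-10))  # numerical rank: 2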

  • @davidmurphy563
    @davidmurphy563 1 year ago

    Ahhh. I see. I've stopped at 3:40 to digest.
    So, if I've understood right, this is a transform which goes from any non-orthonormal basis to an orthonormal basis while preserving the vectors within. That's different to how you phrased it, so it's possible I'm subtly wrong...
    You arbitrarily pick the X axis. Normalise. Then take the dot product and deduct the X component from the Y. Then you normalise again. Excellent. Makes perfect sense. Although I suspect this could be done with a single matrix-matrix multiplication... I wonder, would that generalise? Maybe; there's a lot of the video left.
    Listen, that is lesson 10. I'll skip back to the first now and work my way from the beginning.
    Simply superb explanation, thank you. Subbed.
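    A minimal code sketch of the procedure described in the comment above (pick a direction, normalise, subtract off the components along earlier directions, normalise again), assuming NumPy. Classical Gram-Schmidt is easy to read, though it is not the numerically robust way to compute a QR factorization.

    import numpy as np

    def gram_schmidt_qr(A):
        # Classical Gram-Schmidt QR for a matrix with independent columns:
        # returns Q with orthonormal columns and upper-triangular R, A = Q R.
        m, n = A.shape
        Q = np.zeros((m, n))
        R = np.zeros((n, n))
        for j in range(n):
            v = A[:, j].copy()
            for i in range(j):
                R[i, j] = Q[:, i] @ A[:, j]   # component along an earlier direction
                v -= R[i, j] * Q[:, i]        # subtract it off
            R[j, j] = np.linalg.norm(v)
            Q[:, j] = v / R[j, j]             # normalise what is left
        return Q, R

    A = np.array([[2.0, 1.0],
                  [0.0, 1.0]])
    Q, R = gram_schmidt_qr(A)
    print(np.allclose(Q @ R, A))              # True
    print(np.allclose(Q.T @ Q, np.eye(2)))    # True: columns are orthonormal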

    • @control-room
      @control-room  1 year ago

      Thanks! Glad you found it helpful.
      Actually, QR doesn't preserve the original vectors at all. It finds different vectors that happen to describe the exact same space, just in a more useful way. The orthonormal vectors from QR may point in completely different directions from the original vectors (except the first vector, if you follow the Gram-Schmidt procedure).
      As for your suspicion that you can do this all with one matrix-matrix multiplication... yes, good intuition! That's exactly the next step in the video.
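      A quick numerical illustration of that point, assuming NumPy: only the first column of Q lines up with the first column of A (up to sign and scale); the other columns span the same space but point in different directions.

      import numpy as np

      A = np.array([[3.0, 1.0],
                    [4.0, 2.0]])
      Q, R = np.linalg.qr(A)

      # The first column of Q is the first column of A normalised (up to sign);
      # the second column of Q generally points in a different direction
      # from the second column of A.
      print(Q[:, 0], A[:, 0] / np.linalg.norm(A[:, 0]))
      print(Q[:, 1], A[:, 1] / np.linalg.norm(A[:, 1]))

      # Yet Q's columns describe the same column space: every column of A
      # is rebuilt from them through the triangular R.
      print(np.allclose(Q @ R, A))            # True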

    • @davidmurphy563
      @davidmurphy563 1 year ago

      @control-room Oh wow, ok, I appreciate the explanation. I suspected I was getting ahead of myself, so I started the course from the beginning and I've been through the first half a dozen videos or so. I'm up to the robot arm. All very clear so far, but I'm taking my time to digest.
      I like how you give practical examples. What I want to do is code up a few vector / matrix class libraries myself, along with transforms like this as methods or functions - maybe in a game engine - so I can play around with it and get a proper feel for it. If you mess about with this stuff first in lower dimensionality, it's easier to see what's happening.
      Really excellent, excellent stuff. Thank you so much.

  • @aleksanderchelyunskii5968
    @aleksanderchelyunskii5968 1 year ago

    I saw in Lecture 5 that you mentioned the proof of equality between the column rank and row rank of a matrix would be illustrated using QR factorization. However, I'm having difficulty finding the proof here. Did I miss it somewhere?

    • @control-room
      @control-room  1 year ago +1

      Great comment. Thanks for calling me out.
      Yes, I did say that QR would be used to prove "column rank = row rank". I didn't prove it in the QR video because, honestly, it's a lengthy proof and the video was already running long.
      However, you can find the full proof here:
      arxiv.org/pdf/2112.06638.pdf
      There are three proofs in the paper -- see the second one (section 3, "Gram-Schmidt Way"). It uses the UTV decomposition, which is outside the scope of this course, but it's closely related to QR, as the paper explains.
      They also prove the UTV decomposition there, so everything's self-contained: you have all the tools you need to understand the proofs. Give it a try ;)
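      For reference, there is also a classical short argument via a rank factorization; this is not the QR/UTV route taken in the linked paper, just a sketch of the standard proof in LaTeX:

      \text{Let } A \in \mathbb{R}^{m \times n} \text{ have column rank } r,
      \text{ and pick } C \in \mathbb{R}^{m \times r} \text{ whose columns form a basis of the column space of } A.
      \text{Then } A = C F \text{ for some coefficient matrix } F \in \mathbb{R}^{r \times n}.
      \text{Every row of } A \text{ is a combination of the } r \text{ rows of } F, \text{ so }
      \text{row rank}(A) \le r = \text{column rank}(A).
      \text{Applying the same bound to } A^{\mathsf{T}} \text{ gives the reverse inequality, hence equality.}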

    • @aleksanderchelyunskii5968
      @aleksanderchelyunskii5968 1 year ago

      @control-room It is quite involved indeed. Thank you for sharing the reference; I will make the effort.