The Strang Decomposition A = CR

  • Published: 25 Aug 2024
  • This video is about a new matrix decomposition invented by Gilbert Strang.

Comments • 54

  • @lanimulrepus
    @lanimulrepus 1 year ago +11

    Wonderful to have you back... I still keep your excellent book on Tensor Analysis & The Calculus of Moving Surfaces on my desk... Strang and Grinfeld: great books and lectures to learn by...

  • @mikelevels1
    @mikelevels1 1 year ago +15

    You're back! Thank you so much for this upload.

  • @shlokdave6360
    @shlokdave6360 1 year ago +14

    Professor, please continue making videos regularly. I absolutely love your lecture series and books. Your book on Tensor Calculus is amazing. And thank you for making this particular lecture available online. If there is one math book I like more than yours, it's by Prof. Strang. You two have an amazing command of these critical subjects.

    • @benstear581
      @benstear581 1 year ago +2

      Dr Grinfeld studied under Strang at MIT :)

    • @shlokdave6360
      @shlokdave6360 1 year ago +1

      @@benstear581 and it shows. Both are just amazing! But then, Dr PGs use of R as column space is a massive offense!

  • @133839297
    @133839297 1 year ago +4

    Thank you, Mr. Strang's intellectual heir.

  • @antoniolewis1016
    @antoniolewis1016 1 year ago +2

    I appreciate the emphasis on beauty and aesthetics in these formulas. They help make mathematics less dry and more beautiful.

  • @Nothing-jo8ci
    @Nothing-jo8ci 1 year ago +6

    Welcome back, Professor Grinfeld. Your Linear Algebra playlist is the true one, and I've learned so much so far; now I'm addicted to your teaching approach for math. I wish you had a Calculus 1 playlist too. Cheers

  • @thelocalsage
    @thelocalsage 1 year ago +1

    You are one of my favorite math teachers on YouTube. I would definitely be lost trying to study math after college without your work.

  • @shlokdave6360
    @shlokdave6360 1 year ago +6

    Professor, kindly make a series on ordinary differential equations as well. It would be great to relearn the subject using your insights on the topic. Thank you.

  • @ichwillkeinenchanne1
    @ichwillkeinenchanne1 1 year ago +3

    Thank you for being back, with a new style, so to speak.

  • @ronaldjorgensen6839
    @ronaldjorgensen6839 1 year ago +2

    thank you for your persistence

  • @Danny-hj2qg
    @Danny-hj2qg 1 year ago +21

    Must be very cool for those kids to have Seth Rogen as their math professor.

  • @kabuda1949
    @kabuda1949 1 year ago +4

    Nice presentation. I learnt a lot.

  • @jimlbeaver
    @jimlbeaver 1 year ago +2

    Great video! Welcome back!

  • @ebog4841
    @ebog4841 1 year ago +3

    YAAAAAS! NEW VIDYA! LESS GO! MathTheUploader!

  • @wakari5444
    @wakari5444 1 year ago +5

    „Let me dive right into it…“ is at 11:16

    • @yoelcalev2763
      @yoelcalev2763 1 year ago +2

      Thank you so much, you saved me 6 minutes of fluff :)

  • @donaldbesong8853
    @donaldbesong8853 1 year ago +6

    Yep! Professor Gilbert Strang is the main go-to person for anything Linear Algebra.

    • @adamsniffen5187
      @adamsniffen5187 1 year ago +1

      lol

    • @Nothing-jo8ci
      @Nothing-jo8ci 1 year ago +3

      I tried Professor Strang first. I watched the first video of his playlist and learned it quite well. Then the second, and I didn't know what was going on. Only with Dr. Grinfeld's playlist did I truly learn the essence of Linear Algebra, going from total beginner to enjoying LA. I don't know how you learned from Strang. Maybe in an engineering way? Could you tell me more?

    • @donaldbesong8853
      @donaldbesong8853 1 year ago

      @aDBo'Ch 1 The 'Albert' was used for fun, as an allusion to Einstein. But this one is Gilbert.

  • @unknownvariablex7
    @unknownvariablex7 1 year ago +2

    And he is back

  • @iamtraditi4075
    @iamtraditi4075 1 year ago +1

    Lovely! Thank you so much

  • @barryzeeberg3672
    @barryzeeberg3672 1 year ago +2

    That is a very Strang[e] decomposition :)

  • @seanflanigan4597
    @seanflanigan4597 1 year ago +2

    What he said is portable as potentially portraiture of dancers in flying.

  • @horaciormartinez1551
    @horaciormartinez1551 1 year ago +2

    Welcome back !!!

  • @brockobama257
    @brockobama257 1 year ago +1

    Are you telling me I should’ve learned this but didn’t? It’s very nice

  • @Nothing-jo8ci
    @Nothing-jo8ci 1 year ago +3

    Hello, Professor Grinfeld! Have you ever thought of creating a calculus playlist for beginners, like your Linear Algebra playlist? It would be phenomenal for people like myself who enjoy math and want to become mathematicians.

    • @weicheng4608
      @weicheng4608 1 year ago +1

      I also wish Professor Grinfeld would create a calculus playlist. His Linear Algebra playlist is the course that made me confident with Linear Algebra (I went through several others before). Your insight into the field helps me understand it more deeply.

  • @ebog4841
    @ebog4841 1 year ago +3

    the statement about algebra is so true

  • @tuongnguyen9391
    @tuongnguyen9391 1 year ago +1

    How do you do this CR decomposition in Matlab?
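
    A minimal sketch of one common way to compute A = CR (not taken from the video): use row reduction to find the pivot columns, take those columns of A as C, and take the nonzero rows of rref(A) as R. The matrix A below is a made-up example, and the choice of NumPy/SymPy is just for illustration; in Matlab, the second output of rref gives the same pivot-column indices.

```python
import numpy as np
from sympy import Matrix

# Made-up example: the third column equals the sum of the first two, so rank(A) = 2.
A = np.array([[1, 2, 3],
              [2, 4, 6],
              [1, 1, 2]])

# Row-reduce A; `pivots` holds the indices of the independent (pivot) columns.
R_rref, pivots = Matrix(A.tolist()).rref()

C = A[:, list(pivots)]                                       # C: the pivot columns of A
R = np.array(R_rref.tolist(), dtype=float)[:len(pivots), :]  # R: nonzero rows of rref(A)

print(np.allclose(A, C @ R))                                 # True, i.e. A = CR
```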

  • @cmilkau
    @cmilkau 1 year ago +1

    I'm certain I've used this many times before without realizing it has merit on its own.

  • @dilsydiltak0101
    @dilsydiltak0101 1 year ago +1

    Sir, will you make videos on the applications of Linear Algebra, like the Data Science field, etc.?

  • @eytansuchard8640
    @eytansuchard8640 1 year ago

    Hi professor, 1) First perform Gram-Schmidt on the columns and throw away the dependent columns, which are mapped to 0 by the GS algorithm. The orthogonal vectors are then discarded once they have done the job of exposing the dependent columns. This process yields the C matrix. 2) Derive the coefficients of the linear combinations that yield the dependent columns. For some applications, keeping the orthogonal columns from (1) instead of the original independent columns may be more attractive, especially in Machine Learning. (A rough code sketch of this idea appears after this thread.)

    • @MathTheBeautiful
      @MathTheBeautiful 1 year ago +2

      Hi Eytan, I like the idea of using GS to eliminate the linearly dependent vectors, as it injects the discussion with geometric ideas. However, I think it's less efficient. Also, the Strang decomposition is "affine", i.e. it does not require the notion of an inner product. Meanwhile, GS needs an inner product.

    • @eytansuchard8640
      @eytansuchard8640 1 year ago +1

      @@MathTheBeautiful An inner product exists both in the C and in the R fields. From my experience, it is a numerically highly efficient process unless the column vectors are nearly parallel. In ML applications a near representation is sufficient. If, after normalization, the removal of projections on previous columns leaves a small norm, the new column is not normalized but discarded, even if its norm is not zero. A threshold on the norm is defined before the process starts. A representation which is nearly perfect is desirable in Machine Learning.
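
    A rough sketch of the Gram-Schmidt-based idea discussed in this thread, not the algorithm presented in the video: Gram-Schmidt is used only to decide which columns are independent, the original independent columns form C, and R is recovered by solving CR = A. The helper name cr_via_gram_schmidt, the tolerance tol, and the example matrix are all invented for illustration.

```python
import numpy as np

def cr_via_gram_schmidt(A, tol=1e-10):
    """Use Gram-Schmidt only to detect independent columns; keep the original
    columns in C and recover R from C @ R = A (sketch of the idea in the thread)."""
    n = A.shape[1]
    Q = []        # orthonormal directions built so far (used only for detection)
    keep = []     # indices of columns judged independent
    for j in range(n):
        v = A[:, j].astype(float)
        for q in Q:
            v = v - (q @ v) * q              # remove projection onto previous directions
        norm = np.linalg.norm(v)
        if norm > tol:                       # norm above threshold => a new direction
            Q.append(v / norm)
            keep.append(j)
    C = A[:, keep]
    # Every column of A lies (up to tol) in the span of C's columns,
    # so the least-squares solution of C @ R = A recovers R exactly.
    R, *_ = np.linalg.lstsq(C, A, rcond=None)
    return C, R, keep

# Made-up example: the third column is the sum of the first two.
A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 1., 2.]])
C, R, keep = cr_via_gram_schmidt(A)
print(keep)                   # [0, 1]
print(np.allclose(A, C @ R))  # True
```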

  • @scottmiller2591
    @scottmiller2591 1 year ago +1

    Gilbert Strang's video on CR factorization: ruclips.net/video/BFYU9wywgtY/видео.html

  • @thechannelwithoutanyconten6364

    How can I find that decomposition on the internet?

  • @rishiaman2288
    @rishiaman2288 6 months ago +2

    strang

  • @mathisnotforthefaintofheart
    @mathisnotforthefaintofheart 1 year ago

    His outfit matches the color of the chalk board...

  • @aaronrobertcattell8859
    @aaronrobertcattell8859 1 year ago +1

    cool

  • @juice-ih6fw
    @juice-ih6fw 1 year ago +3

    orange juice

    • @Rayquesto
      @Rayquesto 1 year ago

      Give him some bongos. I’ll even join in with an alto sax to add some quantum math to it!

  • @iRReligious
    @iRReligious 1 year ago +1

    I didn't understand shit! But it was really interesting 😂😂

  • @bmebri1
    @bmebri1 1 year ago

    Video actually starts at 11:13. You're welcome.

  • @jasoncampbell1464
    @jasoncampbell1464 8 months ago

    No offense but "how useful is it" is about the dumbest question I've ever heard asked in a math class, hands down. Followed by "Who's {the first name of the author of the method}?" What a knee slapper. 🤣