Square root of a matrix

  • Published: 11 Apr 2019
  • Square root of a matrix: definition and calculation using eigenvalues. What does it mean for a matrix to have a square root?
    Check out my Eigenvalues playlist: • Diagonalize 2x2 matrix
    Subscribe to my channel: / @drpeyam

Comments • 239

  • @blackpenredpen
    @blackpenredpen 5 years ago +363

    I never learnt this until now....

    • @hardeejd
      @hardeejd 5 years ago +9

      Linear Algebra, Hoffman and Kunze: Theorem 13 in Section 9.5

    • @umbraemilitos
      @umbraemilitos 5 years ago +16

      This is exploited in Quantum Mechanics.

    • @ffggddss
      @ffggddss 5 years ago +13

      I got this in my linear algebra course in my undergrad years. We did it like this, as I recall:
      1) Find the eigenvalues, λ, & eigenvectors of A
      2) Use the eigenvectors to make a matrix, S, & its inverse
      3) Diagonalize A by similarity transformation, B = SAS⁻¹. The diagonal elements of B will be the λ's
      4) √B is then just the diagonal matrix whose diagonal elements are √λ's [Either sign can be taken, ±√λ, with each λ]
      5) Reverse the similarity transformation, √A = S⁻¹(√B)S. Done!
      NB1: Not every matrix is diagonalizable; not every matrix has a square root.
      I don't recall whether either set (or both!) is a subset of the other.
      Every symmetric matrix has both properties, however.
      NB2: This can also be done with (some) complex matrices.
      Fred
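
      A minimal NumPy sketch of the recipe above, run on the video's matrix A = [[5, 4], [4, 5]] (the code and names are my own illustration, not from the video; note that numpy.linalg.eig returns A = P D P⁻¹, so P plays the role of S⁻¹ in the notation of this comment):

      ```python
      import numpy as np

      A = np.array([[5.0, 4.0],
                    [4.0, 5.0]])

      # steps 1-2: eigenvalues and eigenvectors (columns of P)
      eigvals, P = np.linalg.eig(A)              # eigvals ~ [9., 1.] (order may vary)

      # steps 3-4: square root of the diagonal matrix, taking the + branch of each sqrt
      sqrt_D = np.diag(np.sqrt(eigvals))

      # step 5: undo the similarity transformation
      sqrt_A = P @ sqrt_D @ np.linalg.inv(P)

      print(np.round(sqrt_A, 6))                 # [[2. 1.] [1. 2.]]
      print(np.allclose(sqrt_A @ sqrt_A, A))     # True
      ```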

    • @Safwan.Hossain
      @Safwan.Hossain 5 years ago +16

      Damnnnnn. Even the Calculus God didn't know about this linear algebra topic

    • @umbraemilitos
      @umbraemilitos 5 years ago +4

      @@Safwan.Hossain I know, right? He's brilliant, but I was surprised he hadn't seen the Taylor Series expansion of a matrix argument of a function.

  • @FaithInNachos
    @FaithInNachos 5 years ago +160

    I swear this man's too friendly

  • @mildlyacidic
    @mildlyacidic 5 years ago +74

    Now natural log of a matrix, Peyam ;)
    so we can get a sneaky introduction to Lie groups :P

    • @NessHX
      @NessHX 5 years ago +6

      The method he uses to take the square root of a matrix works for any function f: f(A) = P f(D) P^-1. This works for any function that can be expanded into a Taylor series.

  • @77Chester77
    @77Chester77 5 years ago +53

    I hate to be that guy, but: at 4:02 the matrix should be [4 -4; -4 4].
    Great video, doctor ;-)

  • @woowooNeedsFaith
    @woowooNeedsFaith 5 years ago +13

    Your enthusiasm is fun to watch. If I had only the √ of it, I would be fine.

  • @michaelnobibux2886
    @michaelnobibux2886 5 years ago +10

    A very enthusiastic math teacher! Very rare these days!!!

  • @genekisayan6564
    @genekisayan6564 5 years ago +15

    this is the most excited dude on all of YouTube

  • @eneapane5831
    @eneapane5831 5 years ago +38

    Try cubing it and call it matrix trilogy😂

  • @danielmilyutin9914
    @danielmilyutin9914 5 years ago +37

    With spectral calculus you can define any function of a matrix, either formally or by a converging power series. 💪

  • @madcapprof
    @madcapprof 5 years ago +14

    There is a small mistake in the number of "square roots" of a diagonalizable positive semi-definite matrix A.
    It would be 2^k, where k is the number of non-zero eigenvalues of A.
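
    A rough sketch of this count for the video's matrix (my own illustration, not from the video): with k = 2 nonzero eigenvalues, flipping the sign of each square-rooted eigenvalue independently gives 2^2 = 4 square roots that share A's eigenvectors.

    ```python
    import itertools
    import numpy as np

    A = np.array([[5.0, 4.0],
                  [4.0, 5.0]])
    eigvals, P = np.linalg.eig(A)
    P_inv = np.linalg.inv(P)

    for signs in itertools.product([1.0, -1.0], repeat=len(eigvals)):
        sqrt_D = np.diag(np.array(signs) * np.sqrt(eigvals))
        B = P @ sqrt_D @ P_inv
        assert np.allclose(B @ B, A)   # each sign choice really squares back to A
        print(np.round(B, 6))          # [[2,1],[1,2]], [[1,2],[2,1]] and their negatives
    ```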

  • @sam-kx3ty
    @sam-kx3ty 4 years ago

    I love the way you combine knowledge of both linear algebra and calculus to help solve new math problems every time. I sure hope I can be as calm as you in the future, and thanks for this video.

  • @FatihKarakurt
    @FatihKarakurt 3 years ago +8

    [[1,2],[2,1]] is also a solution, as are the negatives of both. So there are 4 solutions in total.

  • @anhthiensaigon
    @anhthiensaigon 5 years ago +5

    this man must love teaching so much

  • @etienneparcollet727
    @etienneparcollet727 5 years ago +5

    You can also do this for other matrices, but generally the answer is non-double. You need to use subspace-stability arguments to get the answers.

  • @joaohenriquerochadonascime1615
    @joaohenriquerochadonascime1615 5 years ago +3

    Is there such a thing as a matrix that has matrices as components?

    • @drpeyam
      @drpeyam  5 years ago +3

      Sadly not, the elements of a matrix have to be in a field, and we can’t divide by every nonzero matrix

  • @piguyalamode164
    @piguyalamode164 5 years ago +4

    You can do this in other cases too, such as a rotation matrix a = [[cos θ, -sin θ],[sin θ, cos θ]] -> √a = [[cos(θ/2), -sin(θ/2)],[sin(θ/2), cos(θ/2)]], but it is unclear in general when you can do this.
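
    A quick numerical check of this half-angle claim (my own sketch; the test angle is arbitrary):

    ```python
    import numpy as np

    def rotation(theta):
        # 2x2 rotation matrix through angle theta
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

    theta = 1.1
    print(np.allclose(rotation(theta / 2) @ rotation(theta / 2), rotation(theta)))   # True
    ```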

  • @dsantistevan99
    @dsantistevan99 5 years ago +9

    Where was this when I was taking Linear Algebra lol. Blame YouTube for not showing me your channel sooner

  • @GoingsOn
    @GoingsOn 1 year ago

    The way I found one of the square roots of A = [[5, 4],[4, 5]] was by reasoning that it is some matrix R such that |R| times itself is |A| = 9, which means |R| = 3 or -3. Then I guessed that [[1, 2],[2, 1]] worked, checked its determinant (which is -3), and then multiplied the matrix by itself, and it worked. This method of diagonalization you have demonstrated is a lot more intuitive!

  • @cipres6539
    @cipres6539 5 years ago +30

    You are really crazy

  • @luisrosano3510
    @luisrosano3510 5 years ago +2

    What if one of the eigenvalues is negative?

    • @drpeyam
      @drpeyam  5 years ago +2

      You’d get imaginary matrices, which are technically also square roots

  • @RandomExtreme
    @RandomExtreme 5 years ago +7

    I used another method and got 4 matrix results for the square root of A; all of them work

  • @bjoernaagaard
    @bjoernaagaard 5 years ago +1

    I love your passion. Big ups

  • @Arup497
    @Arup497 5 years ago

    Thank you. Your videos are helping me a lot.

  • @Safwan.Hossain
    @Safwan.Hossain 5 years ago +2

    U seem really happy teaching this :)

  • @catherinehubbard1382
    @catherinehubbard1382 3 years ago

    This put a smile on my face, thank you :)

  • @friendlyneighborhoodbastar3723
    @friendlyneighborhoodbastar3723 9 months ago

    love the enthusiasm bro

  • @ebeb9156
    @ebeb9156 5 years ago +2

    Listen to me very carefully....you are the maaaaaaaaan! Omg never been so satisfied by mathematics!!!!

  • @shmuelzehavi4940
    @shmuelzehavi4940 5 years ago +8

    There is another positive solution, which is the matrix: [1 2 ; 2 1].

  • @theproofessayist8441
    @theproofessayist8441 2 years ago

    Wow diagonalization really does wonders - and now you can generalize this for any fractional power of a matrix.

  • @daemond8093
    @daemond8093 3 years ago

    @dr Peyam What is the reason we ignore the negative sign of the square root of the matrix?

  • @ashisranjanudgata5526
    @ashisranjanudgata5526 5 years ago +1

    Very interesting and shortcut method of matrix inverse

  • @mehdisi9194
    @mehdisi9194 5 years ago

    Thank you so much. Very nice video.

  • @tdiaz5555
    @tdiaz5555 5 years ago +5

    That is so cool! Thanks for this linear algebra refresher.
    I do have a question: how would you go about finding the other matrices that give B^2 = A?

    • @drpeyam
      @drpeyam  5 years ago +5

      Use minus the square roots of the eigenvalues, and rearrange the eigenvectors in any order you wish. That should pretty much give you all of them

  • @danielfisher587
    @danielfisher587 5 years ago +2

    Thank you for the video. At 1:13 you mention there are N^2 square roots. Are there not 2^N square roots, obtained by independently choosing between the positive and negative square roots of each of the N eigenvalues?

    • @drpeyam
      @drpeyam  5 years ago +3

      Yeah, 2^n, my bad

  • @patrykszlufik4020
    @patrykszlufik4020 4 years ago +1

    Doesn't it leave like n^2 (for an nxn matrix) possibilities of square roots (just by this definition; over complex numbers?). Also, why eigenvalues and not the determinant after all?

  • @afrolichesmain777
    @afrolichesmain777 5 years ago +3

    Hello Dr Peyam, I really enjoy your linear algebra videos. Could you do a video talking about the matrix exponential? I know what it is but I don't think I understand it completely. Cheers!

    • @drpeyam
      @drpeyam  5 years ago +2

      Sunday 😉

    • @drpeyam
      @drpeyam  5 years ago +2

      Haha, Monday actually, I’m gonna take a rest day tomorrow

    • @Bjowolf2
      @Bjowolf2 5 years ago +1

      @@drpeyam A while curvature? 😂

  • @rizwankhan-st5sl
    @rizwankhan-st5sl 21 days ago

    Why find the modal matrix and the diagonal matrix if we can find the square root or any higher power of a matrix with the eigenvalues from the characteristic equation alone?

  • @ixian98
    @ixian98 3 years ago

    that was awesome.... thank you!

  • @lanceayapana2755
    @lanceayapana2755 5 years ago +2

    How about considering an n by n matrix in general?

  • @fikrimuhammad5511
    @fikrimuhammad5511 5 years ago +2

    Holy cow, I encountered this when I was studying Unscented Transform for UKF...

  • @hadisehfallah1125
    @hadisehfallah1125 2 years ago

    Thanks for your great teaching. How can we compute A^(2^1/2)? I used this method, but my teacher said it's not valid here.

  • @federicopagano6590
    @federicopagano6590 5 years ago +4

    But the opposite of the matrix that was found is also a solution; what happened to that matrix?

    • @Bjowolf2
      @Bjowolf2 5 years ago +2

      Yes, they are "-sqrt(A)" 😎

  • @VladTepesh409
    @VladTepesh409 5 years ago +3

    I'm looking forward to linear algebra.

  • @omarsalameh3132
    @omarsalameh3132 4 years ago

    Thank you very much

  • @gurindersinghkiom1
    @gurindersinghkiom1 3 years ago

    I learnt it from u thanks

  • @mastershooter64
    @mastershooter64 1 year ago +1

    is this analogous to the square root of a differential operator?

  • @spockfan2000
    @spockfan2000 5 years ago

    Thanks, doc!

  • @sushmitamukherjee2215
    @sushmitamukherjee2215 5 years ago +1

    0:33 it should have been "if and only if y>=0", otherwise
    the square root of 4 can be -2 according to your definition

  • @sharzilkhan9150
    @sharzilkhan9150 2 years ago

    oh i really forgot the concepts... thanks for the revision

  • @brendanlawlor2214
    @brendanlawlor2214 3 years ago

    I love his linear algebra, especially the fun and clarity

  • @silasrodrigues1446
    @silasrodrigues1446 5 years ago

    Find a polynomial p(x) such that p(y) = f(y) for every eigenvalue y of A; then p(A) = f(A), where f(x) is any function defined at the points x = y.

  • @MessedUpSystem
    @MessedUpSystem 1 year ago

    You can extend this to non-diagonalizable matrices with the Taylor series of the square root

    • @drpeyam
      @drpeyam  1 year ago +1

      Or Jordan form :)
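
      A small sketch of the series idea on a non-diagonalizable example (my own choice of matrix, not from the thread): write A = I + N with N nilpotent, and use the binomial series √(I + N) = I + N/2 − N²/8 + …, which terminates here because N² = 0.

      ```python
      import numpy as np

      A = np.array([[1.0, 1.0],
                    [0.0, 1.0]])              # Jordan block: not diagonalizable

      N = A - np.eye(2)                       # nilpotent part, N @ N == 0
      sqrt_A = np.eye(2) + N / 2              # truncated binomial series for sqrt(I + N)

      print(sqrt_A)                           # [[1.  0.5] [0.  1. ]]
      print(np.allclose(sqrt_A @ sqrt_A, A))  # True
      ```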

  • @garyhuntress6871
    @garyhuntress6871 5 years ago

    I'm here because of BPRP and wow I'm loving these !!!!

    • @drpeyam
      @drpeyam  5 years ago

      ❤️❤️❤️

  • @johnchristian5027
    @johnchristian5027 5 years ago

    great vid!

  • @maximilianmueller4707
    @maximilianmueller4707 2 years ago

    Can we use sqrt(A) = e^(0.5 ln(A)), so we have to calculate ln(A) with its Taylor series and use the same argument for the exponential?

    • @drpeyam
      @drpeyam  2 years ago +1

      That works too
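
      A sketch of that idea with SciPy doing the series work (assuming SciPy is available; scipy.linalg.sqrtm is used here only as a cross-check):

      ```python
      import numpy as np
      from scipy.linalg import expm, logm, sqrtm

      A = np.array([[5.0, 4.0],
                    [4.0, 5.0]])

      B = np.real_if_close(expm(0.5 * logm(A)))   # sqrt(A) = exp((1/2) ln A)

      print(np.round(B, 6))                # [[2. 1.] [1. 2.]]
      print(np.allclose(B @ B, A))         # True
      print(np.allclose(B, sqrtm(A)))      # matches SciPy's direct matrix square root
      ```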

  • @bilepancreas
    @bilepancreas 3 years ago

    Thank you!

  • @77Fortran
    @77Fortran 3 years ago

    For anyone interested in gravity, there is a quite recent formulation of general relativity due to Krasnov where the Lagrangian for the theory involves the trace of the square root of a matrix!

  • @happyface4234
    @happyface4234 3 years ago

    Dr. Peyam should do a course in a graduate math study series.

  • @bagusamartya5325
    @bagusamartya5325 5 years ago +3

    I didn't even know what an eigenvalue is, yet I still watched it till the end

    • @pabloagsutinnavavieyra2308
      @pabloagsutinnavavieyra2308 5 years ago +2

      There's a nice YouTube series by 3Blue1Brown where he explains the essence of linear algebra.
      It took me a little while to get to grips with it, but it was REALLY satisfying when I got concepts like the eigenvalue

    • @Bjowolf2
      @Bjowolf2 5 years ago +1

      In a physical system ( mechanics, acoustics, quantum mechanics...) the eigenvalues are related ( squared) to the natural frequencies of the system - i.e. where it tends to oscillate more easily by itself ( without being forced - think of a swing, a metal rod, a string, a membrane or a bridge etc.).
      The associated eigenvectors then describe the various possible oscillating states - where each point of the system is at its extreme position - before returning to the opposite position ( ideally, when there is no energy lost to internal heat, damping or air resistance etc.)
      For a one-dimensional string:
      A set of possible sine waves (eigenfunctions) with fixed nodes at the end points, which depend on the length of the string and its density etc.
      In this case the number of nodes ( the positions of EACH coordinate of these eigenvectors! ) becomes infinite as the eigenvector turns into a (continuous) eigenfunction in the limit ( depending on how accurate a mathematical description we want of our physical system ).

  • @tcoren1
    @tcoren1 5 years ago +3

    Why demand that the matrix is positive definite? If not, you just get a complex matrix as the root, but I feel like that is a valid answer... if someone asks you what the square root of minus 1 is, you answer that it is i or minus i. Likewise there is no shame in returning a complex answer when asked for the square root of a non-positive-definite matrix...

    • @kazedcat
      @kazedcat 5 years ago +2

      Because having complex values in your matrix is tensor algebra.

    • @Bjowolf2
      @Bjowolf2 5 years ago +1

      I guess he was only referring to the normal (real) square root function, not the more advanced complex square root function, which just "happens" to coincide with the former on the non-negative real axis.

  • @GhostyOcean
    @GhostyOcean 4 years ago +1

    Since you said this can be extended to all real roots for "non negative" matrices, could a "negative" matrix have an odd root?

    • @drpeyam
      @drpeyam  4 years ago +1

      It could have an imaginary root, like the square root of -1 is i

    • @GhostyOcean
      @GhostyOcean 4 years ago +1

      @@drpeyam oh, nice!

  • @ThePharphis
    @ThePharphis 5 years ago +1

    wow, this was very different from what I would have guessed. I thought you were just going to take B = [a b; c d], square it and set the entries equal to the matrix you're taking the root of, i.e. 4 equations with 4 unknowns.
    Well, I'm too lazy to do it, but would this work?

    • @drpeyam
      @drpeyam  5 years ago +2

      I think it might be possible, but a huge mess!
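
      For the curious, the brute-force route is small enough for a symbolic solver to handle on a 2×2 example (a sketch assuming SymPy; the variable names are mine):

      ```python
      import sympy as sp

      a, b, c, d = sp.symbols('a b c d')
      B = sp.Matrix([[a, b], [c, d]])
      A = sp.Matrix([[5, 4], [4, 5]])

      # four equations (the entries of B*B - A) in four unknowns
      for sol in sp.solve(list(B * B - A), [a, b, c, d], dict=True):
          print(B.subs(sol))   # should list [[2,1],[1,2]], [[1,2],[2,1]] and their negatives
      ```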

  • @miro.s
    @miro.s 4 years ago

    Another nice video would be finding a square root through direct decomposition into subspaces and applying it to non-diagonalizable matrices, especially nilpotent matrices. E.g. you can find the square root of the finite first-derivative operator matrix applied to polynomials and how to interpret its coefficients. Another way to handle it would be to use the exponential of the operator matrix.

    • @miro.s
      @miro.s 4 years ago

      Since a nilpotent matrix can never get back nonzero diagonal elements - sqrt(N) has to have all eigenvalues 0 again - any square root of a nilpotent matrix is itself nilpotent, if it exists. After each sqrt(N) the dimension of its null space is at least 1 lower, so if we get to the end of the chain with only one eigenvector, I suppose the sqrt of that matrix doesn't exist, because it would change the nilpotent matrix into a regular matrix, which cannot have 0 as an eigenvalue.

  • @afifakimih8823
    @afifakimih8823 4 years ago +1

    "What I'm learning day by day is that everything is possible in math and physics"

  • @WildAnimalChannel
    @WildAnimalChannel 5 years ago +1

    what about the other square roots? I can see at least 3 more.

    • @drpeyam
      @drpeyam  5 years ago +1

      There are at least 4, up to conjugation

  • @freddy3940
    @freddy3940 1 year ago

    THANK YOU VERY MUCH, THIS WAS A GREAT HELP. I'M SUBSCRIBING

  • @Tomaplen
    @Tomaplen 5 years ago +11

    me some months ago when the course started: Pff, I already did a linear algebra course years ago, how boring this will be
    me after the limit of a matrix and this: O.O holy shit, this course is sick!

  • @chibigato3x311
    @chibigato3x311 5 years ago +14

    I want to be a square root

    • @ruffifuffler8711
      @ruffifuffler8711 5 years ago +1

      Generally speaking, I want to be a cube root, because that is much better to be than merely a square root, but the grand champ has to be all the complex roots of __.

    • @ashtonsmith1730
      @ashtonsmith1730 3 years ago

      then don't be a perfect square

  • @MrRyanroberson1
    @MrRyanroberson1 5 years ago +1

    0:35 why is this still a condition XD because actually that condition works just as well allowing negative x values, and certainly doing this with matrices would make the distinction very blurry because you haven't defined comparing matrices, as far as I've seen (edit: you then go on to define that, but I don't see why eigenvalues can't be negative or complex)

  • @mathisnotforthefaintofheart
    @mathisnotforthefaintofheart 2 years ago

    The interesting thing is that a 2 by 2 matrix can have 4 square roots! That is because each eigenvalue has 2 roots and there are 2 eigenvalues, thus 4 possibilities. They can be found much more easily using Cayley-Hamilton and completing the square with respect to the constant instead of the linear term in the characteristic equation. And to make things crazier, the unit matrix has infinitely many square roots!
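
    A sketch of the Cayley-Hamilton route (my own summary of one standard version of this trick, not taken from the comment or the video): if B² = A, then B² − (tr B)B + (det B)I = 0, so B = (A + (det B)I)/tr B, with det B = ±√(det A) and tr B = ±√(tr A + 2 det B).

    ```python
    import numpy as np

    def sqrt_2x2(A):
        # square roots of a 2x2 matrix via B = (A + (det B) I) / (tr B),
        # trying both signs of det B and tr B
        roots = []
        for s in (1.0, -1.0):
            det_B = s * np.sqrt(np.linalg.det(A))
            tr_sq = np.trace(A) + 2.0 * det_B
            if tr_sq <= 0:
                continue                 # tr B = 0: the degenerate branch (e.g. for the identity matrix)
            for t in (1.0, -1.0):
                B = (A + det_B * np.eye(2)) / (t * np.sqrt(tr_sq))
                if np.allclose(B @ B, A):
                    roots.append(B)
        return roots

    for B in sqrt_2x2(np.array([[5.0, 4.0], [4.0, 5.0]])):
        print(np.round(B, 6))            # the same four square roots as the eigenvalue method
    ```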

  • @LouisEmery
    @LouisEmery 2 years ago

    After 40 years of using eigenvectors in my work in optics I never once thought of taking the square root of a matrix. We have the equivalent but it is not called the "square root". In a periodic transport system we would call that quantity one half of the transport matrix.

  • @JoseTorres-pe6tt
    @JoseTorres-pe6tt 5 years ago +1

    I would like to understand this but I'm not prepared. Even if I don't, I have to say that watching the video was interesting and fun

  • @schirox1
    @schirox1 5 years ago

    Very nice

  • @mikeburns6603
    @mikeburns6603 4 years ago

    When the original matrix is not positive semi-definite, can you extend this method to have the square root be complex numbers, or does it break down?

    • @drpeyam
      @drpeyam  4 years ago

      Complex numbers, like the square root of -1

  • @ibraheemalani3584
    @ibraheemalani3584 5 years ago +5

    But isn't the matrix
    [1 2]
    [2 1]
    an answer also?

    • @drpeyam
      @drpeyam  5 years ago +5

      Yep! That's because the columns of P are not unique; your matrix comes from P but with the columns flipped, which is also ok. So more than 4 square roots

    • @Bjowolf2
      @Bjowolf2 5 years ago +1

      @Chig Bungus Sort of
      They are the "- sqrt(A)" solutions, since they have negative eigenvalues - i.e.
      B = -sqrt(A) is not positive definite.

    • @martinpohl2383
      @martinpohl2383 2 years ago

      @Chig Bungus it's like sqrt(4) has 2 as its unique solution but x^2 = 4 has solutions -2 and 2.

  • @user-ip9bn7lt6g
    @user-ip9bn7lt6g 2 years ago

    Ok. But for the square roots of 1 and 9 we get ±1 and ±3. So we have 4 roots for matrix A. Why do we have only one solution?

  • @grecuandy309
    @grecuandy309 5 years ago +1

    Well...Jordan decomposition is good at something here

  • @juanghpx
    @juanghpx 5 years ago

    I have a question:
    for the matrix to be positive, is it required that all its eigenvalues be positive?
    If it has positive and negative eigenvalues, can the definition not be applied?
    Thank you for the video, very interesting (and sorry for my bad English)

    • @drpeyam
      @drpeyam  5 years ago +2

      In that case the matrix is neither positive nor negative (some people say it’s hyperbolic)

    • @ffggddss
      @ffggddss 5 years ago +1

      I think it can still be done, but only in matrices over the complex field, ℂ.
      And keep in mind that, in complex vector spaces, the dot product always has a complex conjugate on one of the vectors.
      (When you transpose, e.g., to turn a row vector into a column vector, or vice versa, you must also conjugate.)
      Is this about right, Dr. ∏M?
      Fred

    • @mohammadw.alomari1322
      @mohammadw.alomari1322 5 years ago

      The answer to your question is: not necessarily, unless the matrix is self-adjoint

  • @MrConverse
    @MrConverse 5 years ago +2

    Can someone refresh my memory? Is matrix multiplication associative? That is, does (AB)C = A(BC) for matrices A, B, & C?

  • @wooyoungkim2925
    @wooyoungkim2925 4 years ago

    so....clear !!!!!

  • @miknscy
    @miknscy 4 years ago

    This 2x2 matrix (which is positive definite) has 4 different choices of square-root matrices, and of the four possible choices he showed the one that is positive definite.

  • @mikoajwalak701
    @mikoajwalak701 4 months ago

    Just started the introduction to Time Series and found out that somehow no one has taught me how to calculate square roots of matrices. Wish me luck on my colloquium tomorrow lol

    • @drpeyam
      @drpeyam  4 months ago

      Good luck!!!

  • @alejrandom6592
    @alejrandom6592 2 years ago

    Notice that f(A), where A = [[a,b],[b,a]], is equal to (1/2)·[[f(a+b)+f(a-b) , f(a+b)-f(a-b)],[f(a+b)-f(a-b) , f(a+b)+f(a-b)]]
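
    A quick numerical check of this identity (my own sketch; the function f and the sample values a = 5, b = 4 are arbitrary, chosen to reproduce the video's example):

    ```python
    import numpy as np

    def f_of_A(f, a, b):
        # f([[a, b], [b, a]]) via the closed form above
        p, m = f(a + b), f(a - b)
        return 0.5 * np.array([[p + m, p - m],
                               [p - m, p + m]])

    print(f_of_A(np.sqrt, 5.0, 4.0))     # [[2. 1.] [1. 2.]], the video's square root

    # cross-check against diagonalization for f = exp
    A = np.array([[5.0, 4.0], [4.0, 5.0]])
    w, P = np.linalg.eig(A)
    print(np.allclose(f_of_A(np.exp, 5.0, 4.0),
                      P @ np.diag(np.exp(w)) @ np.linalg.inv(P)))   # True
    ```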

  • @Nukestarmaster
    @Nukestarmaster 5 years ago +1

    So if your matrix is of complex numbers, the requirement for the determinants to be zero goes away, just like with numbers.

    • @drpeyam
      @drpeyam  5 years ago

      The determinant is still zero by definition of an eigenvalue

    • @Nukestarmaster
      @Nukestarmaster 5 years ago

      @@drpeyam I'm dumb, what I wrote wasn't anything like what I meant to say. What I meant to say is that the requirement for the eigenvalues to be positive goes away when the matrix is of complex numbers.

    • @drpeyam
      @drpeyam  5 years ago

      That makes more sense :)

  • @AnindoSarker
    @AnindoSarker 4 years ago +1

    How do you calculate the square root of a 3x3 matrix? I can't solve it

    • @drpeyam
      @drpeyam  4 years ago +1

      Same idea, diagonalize it

    • @AnindoSarker
      @AnindoSarker 4 years ago +1

      @@drpeyam for some matrices it gives complex numbers as the square root. Then if I square that matrix, it doesn't give the real matrix back.

    • @drpeyam
      @drpeyam  4 years ago +1

      It should give the real matrix back

  • @rataleprosa1780
    @rataleprosa1780 2 years ago

    I really enjoy these "something of a matrix" ideas. What about "cos of a matrix"?

    • @drpeyam
      @drpeyam  2 years ago

      Already done ✅

  • @Bjowolf2
    @Bjowolf2 5 years ago

    Guessed the result straight away 😎

  • @Meedamon
    @Meedamon 5 years ago +1

    At 1:14 you say there are at least n^2 values; that should be 2^n instead.

  • @diegogambaro3823
    @diegogambaro3823 5 years ago

    D^(1/2) may be [1 0; 0 3] or [-1 0; 0 3] or [1 0; 0 -3] or [-1 0; 0 -3]

  • @gabrieletrovato3939
    @gabrieletrovato3939 3 months ago

    What is that "NUL"? ("noss"??) 3:25

  • @wolframalpha8634
    @wolframalpha8634 5 years ago +4

    Heyyy Dr. Peyam

  • @Algebrodadio
    @Algebrodadio 5 years ago

    So now you can show people how to compute the exponential of a matrix (by expanding e^A in a Taylor series). Then you can talk about Lie groups and Lie group actions!

    • @drpeyam
      @drpeyam  5 years ago

      Already done ✅
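
      A tiny sketch of e^A by truncated Taylor series (my own illustration, cross-checked against scipy.linalg.expm):

      ```python
      import numpy as np
      from scipy.linalg import expm

      def expm_taylor(A, terms=40):
          # sum the first `terms` terms of e^A = sum_k A^k / k!
          result = np.eye(A.shape[0])
          term = np.eye(A.shape[0])
          for k in range(1, terms):
              term = term @ A / k        # builds A^k / k! incrementally
              result = result + term
          return result

      A = np.array([[5.0, 4.0], [4.0, 5.0]])
      print(np.allclose(expm_taylor(A), expm(A)))   # True
      ```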

  • @modolief
    @modolief 5 years ago

    I was hoping for one or more families of equations describing all solutions to the problem. I.e. en.m.wikipedia.org/wiki/Square_root_of_a_2_by_2_matrix

    • @drpeyam
      @drpeyam  5 years ago

      That’s more or less what I did, I just used eigenvalues instead of trace and determinants

  • @lanceayapana2755
    @lanceayapana2755 5 years ago

    There has to be a certain condition for this. [1 2 ; 2 1] may also work.

    • @drpeyam
      @drpeyam  5 years ago +2

      Well, up to choice and rearrangement of the eigenvectors

  • @DanNguyen-oc3xr
    @DanNguyen-oc3xr 5 years ago

    What school do you teach at?

  • @Jay-pp6bu
    @Jay-pp6bu 5 years ago

    [2 1; 1 2] works, but so does [1 2; 2 1], and so do [-2 -1; -1 -2] and [-1 -2; -2 -1]. All four matrices can be squared to produce [5 4; 4 5]. Why can’t all four matrices be considered solutions to sqrt([5 4; 4 5])?

    • @Jay-pp6bu
      @Jay-pp6bu 5 years ago

      I see Orang found these solutions as well...

    • @drpeyam
      @drpeyam  5 years ago +1

      Sure they can! They are the same up to rearrangement of the eigenvectors

  • @sugarfrosted2005
    @sugarfrosted2005 5 years ago +1

    Algebraic closure of the matrices. :3 I don't remember enough about 251 to say if this actually works.

  • @chimetimepaprika
    @chimetimepaprika 5 years ago

    Fresh!

  • @jakubxyz9507
    @jakubxyz9507 5 years ago +1

    n-th, where n is a natural number

  • @crazy71achmed
    @crazy71achmed 5 years ago

    Good explanation. Thank you. :)