Matrix Proof: det(exp A) = exp(Tr A)

  • Published: 22 Nov 2024

Comments • 75

  • @APaleDot
    @APaleDot 1 month ago +96

    7:10
    Small correction. You can't order the matrices inside the trace function however you want. You can only rearrange them cyclically. So, for instance, you can do JP⁻¹P or P⁻¹PJ but not JPP⁻¹. (A quick numerical check follows this thread.)

    • @slavinojunepri7648
      @slavinojunepri7648 1 month ago +5

      Can the rearrangement be arbitrary if the matrices involved in the multiplication (whose product we are taking the trace of) all have the same dimensions? The cyclic rearrangement is obviously required if the dimensions are different.

    • @APaleDot
      @APaleDot 1 month ago +24

      @@slavinojunepri7648
      No, even if they are all square matrices, they have to be rearranged cyclically (unless they otherwise commute).
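
A quick numerical illustration of the cyclic-trace point above; a minimal NumPy sketch in which the 3×3 matrices and the RNG seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = rng.random((3, 3, 3))   # three arbitrary 3x3 matrices

# Cyclic permutations of a product leave the trace unchanged...
t_abc = np.trace(A @ B @ C)
t_bca = np.trace(B @ C @ A)
t_cab = np.trace(C @ A @ B)
print(np.isclose(t_abc, t_bca), np.isclose(t_abc, t_cab))   # True True

# ...but a non-cyclic reordering generally changes the trace.
t_acb = np.trace(A @ C @ B)
print(np.isclose(t_abc, t_acb))   # False for generic matrices
```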

  •  1 month ago +10

    From Morocco: thank you very much, son. You took me back 40 years to my university studies.

  • @Geenimetsuri
    @Geenimetsuri 28 days ago +4

    Thanks. I understood it. Excellent presentation.

  • @khastakachori123
    @khastakachori123 15 days ago

    What a beautiful result!

  • @aremathukr
    @aremathukr 1 month ago +6

    I actually have this formula tattooed on my left forearm :) I remember seeing it for the first time in a 3b1b video and immediately trying to prove it in my head. I managed to do it for the case of a diagonal matrix as I was falling asleep, and when I woke up I knew how to generalize it to any square matrix. After I got the tattoo, this formula started appearing everywhere in my studies, which is funny :)
    Thanks for the video

  • @i.h.i.d9725
    @i.h.i.d9725 1 month ago +4

    I just started linear algebra this semester, and this video makes me excited about the subject.

  • @berlinisvictorious
    @berlinisvictorious 15 days ago

    Always loved me some linear algebra

  • @shehbazthakur1
    @shehbazthakur1 19 days ago +1

    superb, loving it. subscribing.

  • @alejrandom6592
    @alejrandom6592 1 month ago +5

    Note that f(PJP^-1) = P f(J) P^-1 for any f(•) that has a Taylor series.
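
A sanity check of this similarity rule for the particular choice f = exp, using scipy.linalg.expm; a sketch in which J, P, and the seed are arbitrary illustrative choices:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
J = np.triu(rng.random((3, 3)))          # an arbitrary upper-triangular matrix
P = rng.random((3, 3)) + 3 * np.eye(3)   # generically invertible
P_inv = np.linalg.inv(P)

A = P @ J @ P_inv
# f(P J P^-1) = P f(J) P^-1, here with f = exp:
print(np.allclose(expm(A), P @ expm(J) @ P_inv))   # True
```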

  • @tomkerruish2982
    @tomkerruish2982 1 month ago +77

    Alternatively, tr = ln○det○exp. (A numerical check follows this thread.)

    • @BenfanichAbderrahmane
      @BenfanichAbderrahmane 1 month ago +10

      det=exp(tr(ln))

    • @considerthehumbleworm
      @considerthehumbleworm 1 month ago +12

      Ah, what an elegant way of calculating trace!

    • @CrazyShores
      @CrazyShores 1 month ago +2

      Excellent explanation! Only the notation for the elements of a matrix is a bit off; most books use lowercase letters.
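
Both compositions above can be checked numerically; a sketch assuming a random matrix B for tr = ln∘det∘exp, and a symmetric positive-definite example A for det = exp(tr(ln)) so that the principal matrix logarithm exists:

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(2)
B = rng.random((3, 3))

# tr = ln ∘ det ∘ exp  (valid for any real square B, since det(e^B) = e^{tr B} > 0)
print(np.isclose(np.trace(B), np.log(np.linalg.det(expm(B)))))   # True

# det = exp(tr(ln))  (use an SPD matrix so logm returns the real principal logarithm)
A = B @ B.T + np.eye(3)
print(np.isclose(np.linalg.det(A), np.exp(np.trace(logm(A)))))   # True
```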

  • @JohnSmall314
    @JohnSmall314 1 month ago +3

    Very nice and clear explanation

  • @coshy2748
    @coshy2748 1 month ago +1

    Very clear. Good presentation.

  • @alejrandom6592
    @alejrandom6592 1 month ago +1

    Very nice and easy to follow

  • @andreagourion9104
    @andreagourion9104 15 days ago

    I think you can prove it with this method too: (lambda, v) is an eigenvalue/eigenvector pair of A iff (e^{lambda}, v) is an eigenvalue/eigenvector pair of exp(A) (provable by applying the definition of an eigenvector to the Taylor expansion; the converse direction is a bit harder). The other ingredients are det(A) = lambda_1 ⋯ lambda_n and tr(A) = lambda_1 + … + lambda_n (provable from the root factorization of the characteristic polynomial of A). Finally, plug exp(A) into the determinant, use the last two equalities, and you have the result.
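
A small numerical check of the two ingredients described above; a NumPy/SciPy sketch on an arbitrary random 4×4 matrix:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
A = rng.random((4, 4))

lam, V = np.linalg.eig(A)   # eigenvalues and (column) eigenvectors of A

# Each eigenvector v of A is also an eigenvector of exp(A), with eigenvalue e^lambda.
for k in range(4):
    v = V[:, k]
    print(np.allclose(expm(A) @ v, np.exp(lam[k]) * v))   # True

# det and trace in terms of the eigenvalues (counted with multiplicity):
print(np.isclose(np.linalg.det(A), np.prod(lam)))   # True
print(np.isclose(np.trace(A), np.sum(lam)))         # True
```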

  • @nikkatalnikov
    @nikkatalnikov 21 days ago

    very clear, thank you very much

  • @jceepf
    @jceepf 21 days ago

    I believe that you can prove this directly by using the formula exp(xA) = lim_{N->infinity} (1 + xA/N)^N and expanding the determinant in x/N for small x/N. The leading order is related to the trace and the rest goes away as N -> infinity. (A numerical look at this limit follows the thread.)
    Of course, the Jordan decomposition provides an elegant proof if you assume the existence of the Jordan decomposition.

    • @m4gh3
      @m4gh3 21 days ago

      Jordan decomposition is always possible over an algebraically closed field, for example the complex numbers.

    • @jceepf
      @jceepf 21 days ago

      @@m4gh3 Yes, I did not say it was wrong.

    • @dominiquelarchey-wendling5829
      @dominiquelarchey-wendling5829 19 days ago

      Then you have to explain how to get the algebraic closure, which is not as easy as this exercise...
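
A numerical look at the limit formula from the comment above; a sketch where x = 1, the 3×3 matrix is arbitrary, and N just increases:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(4)
A = rng.random((3, 3))
I = np.eye(3)

# (I + A/N)^N approaches exp(A) as N grows.
for N in (10, 100, 1000, 10000):
    approx = np.linalg.matrix_power(I + A / N, N)
    print(N, np.max(np.abs(approx - expm(A))))   # error shrinks roughly like 1/N

# And since det(I + A/N) = 1 + tr(A)/N + O(1/N^2), det((I + A/N)^N) -> e^{tr A}.
print(np.isclose(np.linalg.det(expm(A)), np.exp(np.trace(A))))   # True
```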

  • @thomasjefferson6225
    @thomasjefferson6225 1 month ago +2

    This is a very good video. I really enjoyed it.

  • @ehudkotegaro
    @ehudkotegaro 1 month ago +1

    Take the function det(e^(At)).
    The derivative of det at a matrix A, applied to a matrix X, is det(A)*tr(A^-1*X) (Jacobi's formula). So the derivative of this function is
    det(e^(At))*tr(e^(-At)*A*e^(At)) = det(e^(At))*tr(A).
    Also, det(e^(A*0)) = 1.
    And so we get
    det(e^(At)) = e^(t*tr(A)),
    and substituting t = 1, we get
    det(e^A) = e^tr(A).
    (A finite-difference check of this derivative identity follows the thread.)

    • @98danielray
      @98danielray 1 month ago

      I suppose you are using that both functions satisfy the same linear differential equation?
      f'(t) = trA f(t) and g'(t) = trA g(t)
      f(0)=g(0)=1
      Otherwise, I don't see where you show that their derivatives are equal.

    • @ehudkotegaro
      @ehudkotegaro 1 month ago

      @@98danielray Yes, they are equal because the second function is the only solution of the differential equation that the first function satisfies.
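
A finite-difference check of the derivative identity used in this argument, d/dt det(e^(At)) = det(e^(At))*tr(A); a sketch in which the matrix, the evaluation point t, and the step h are arbitrary choices:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(5)
A = rng.random((3, 3))

def f(t):
    # f(t) = det(e^{At})
    return np.linalg.det(expm(A * t))

t, h = 0.7, 1e-5
numeric = (f(t + h) - f(t - h)) / (2 * h)   # central-difference derivative
claimed = f(t) * np.trace(A)                # det(e^{At}) * tr(A)
print(np.isclose(numeric, claimed))         # True

# Closed form obtained from the ODE argument:
print(np.isclose(f(t), np.exp(t * np.trace(A))))   # True
```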

  • @Ajay-ib1xk
    @Ajay-ib1xk 1 month ago

    Nice collection of topics

  • @flyingtiger123
    @flyingtiger123 23 days ago

    Good work ❤

  • @soyoltoi
    @soyoltoi 1 month ago

    Very clear and easy to follow

  • @ironbutterfly3701
    @ironbutterfly3701 26 days ago

    Order matters if there are more than two factors inside the trace; only cyclic rotation is allowed.

  • @nabla_mat
    @nabla_mat 1 month ago

    You’re back!

  • @matteovissani1071
    @matteovissani1071 1 month ago

    Love this topic

  • @Mini_Wolf.
    @Mini_Wolf. 27 days ago

    Fabulous

  • @ilheussauer323
    @ilheussauer323 1 month ago

    Thank you!! How I wish you had a vid on Hermitian matrices :(

  • @ulychun
    @ulychun 28 days ago

    Is there a geometric explanation to this equation?

  • @pahom2
    @pahom2 1 month ago +1

    Does it only work for e or for any base?

    • @MuPrimeMath
      @MuPrimeMath  1 month ago +1

      The result holds for any positive base.
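
A quick illustration of the "any positive base" remark, defining b^A = exp(A·ln b); a sketch where the bases and the 3×3 matrix are arbitrary examples:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(6)
A = rng.random((3, 3))

# For a positive base b, define b^A = exp(A * ln b). Then det(b^A) = b^{tr A}.
for b in (2.0, 10.0, 0.5):
    lhs = np.linalg.det(expm(A * np.log(b)))
    rhs = b ** np.trace(A)
    print(b, np.isclose(lhs, rhs))   # True for each base
```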

  • @arpitdwivedi9175
    @arpitdwivedi9175 19 days ago

    It would be shorter if we bring eigenvalues into the picture: det(e^A) = product of the eigenvalues of e^A = e^L1 · … · e^Ln = e^(L1 + … + Ln) = e^(Tr A),
    where L1, ..., Ln are the eigenvalues of A (counted with multiplicity).
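
A numeric version of this shortcut; a NumPy/SciPy sketch with the eigenvalues taken (with multiplicity) from np.linalg.eigvals:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(7)
A = rng.random((4, 4))
lam = np.linalg.eigvals(A)          # L1, ..., Ln (with multiplicity)

lhs = np.linalg.det(expm(A))        # det(e^A)
via_product = np.prod(np.exp(lam))  # e^L1 * ... * e^Ln
rhs = np.exp(np.trace(A))           # e^(L1 + ... + Ln) = e^(Tr A)

print(np.isclose(lhs, via_product), np.isclose(lhs, rhs))   # True True
```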

  • @antormosabbir4750
    @antormosabbir4750 1 month ago

    Wow!

  • @dgsndmt4963
    @dgsndmt4963 29 days ago

    Nice...

  • @jakeaustria5445
    @jakeaustria5445 1 month ago

    Thank You

  • @wargreymon2024
    @wargreymon2024 1 month ago

    👍🏻👍🏻👍🏻👍🏻👍🏻🔥🔥🔥🔥

  • @ManishSingh-gc5fv
    @ManishSingh-gc5fv 3 days ago

    It is only valid for square matrices and not for rectangular matrices.

  • @aquamanGR
    @aquamanGR 25 days ago

    Hmm... I don't think J is *always* upper-triangular, as you have assumed. For example, if A is 2x2 with complex eigenvalues, then I don't think J can be upper-triangular.

    • @MuPrimeMath
      @MuPrimeMath  25 days ago +2

      In fact the Jordan canonical form is always upper triangular. We permit J to have complex entries, and the rest of the proof still holds. (A concrete example follows this thread.)

    • @aquamanGR
      @aquamanGR 25 days ago

      @@MuPrimeMath OK I missed the "complex entries" thing although you did state it in the video. Makes perfect sense now, thanks!
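
A concrete example of the point resolved in this thread: over the complex numbers the Jordan form of a real matrix with non-real eigenvalues is still upper triangular, just with complex entries. A SymPy sketch using the 90-degree rotation matrix as an arbitrary example:

```python
from sympy import Matrix, simplify

A = Matrix([[0, -1],
            [1,  0]])        # real 2x2 matrix with eigenvalues +i and -i

P, J = A.jordan_form()       # A = P * J * P**(-1)
print(J)                     # diagonal with entries ±I, hence upper triangular
print(J.is_upper)            # True
print(simplify(P * J * P.inv() - A))   # zero matrix, confirming the factorization
```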

  • @licks1_
    @licks1_ 1 month ago +1

    When we write A as PJP^-1, you said P is _any_ matrix. Just to clarify, I'm assuming you meant P is any _invertible_ matrix?

    • @MuPrimeMath
      @MuPrimeMath  1 month ago +4

      More specifically, P is a change-of-basis matrix, which is always invertible.

    • @alejrandom6592
      @alejrandom6592 1 month ago

      Yeah it's a slight mistake

    • @sagnikbiswas3268
      @sagnikbiswas3268 1 month ago

      I'm not great at linear algebra, but isn't P just the matrix of eigenvectors?

    • @wargreymon2024
      @wargreymon2024 1 month ago +1

      Dude, you see P^-1; that says it's invertible.

  • @csilval18
    @csilval18 1 month ago

    Very cool video. Interesting, to the point, well explained...
    You should get more views

  • @sirshabiswas3010
    @sirshabiswas3010 1 month ago +5

    Sir? This could be an insult to you but I was wondering if you could give me a tip on how to find the limits while finding the area using integration. Please don't mind. I really have a hard time figuring out the limits. And lots of love and support! :-)

    • @MuPrimeMath
      @MuPrimeMath  1 month ago +1

      Sorry, the question is not clear. It's hard for me to give advice on such general topics. I wish you the best of luck.

    • @sirshabiswas3010
      @sirshabiswas3010 1 month ago

      @@MuPrimeMath it's okay, no worries. Thanks for your reply. Keep uploading. Lots of love!

    • @alejrandom6592
      @alejrandom6592 1 month ago

      @@sirshabiswas3010 If the limits aren't given, you might be referring to intersection points. Equate the functions and solve for the intersection, is my guess. For these topics there are good resources online. Good luck!

    • @sirshabiswas3010
      @sirshabiswas3010 1 month ago

      @@alejrandom6592 thanks! good luck too!

  • @MyOneFiftiethOfADollar
    @MyOneFiftiethOfADollar 1 month ago +3

    Note that we could call this a Simply Beautiful solution, BUT not as beautiful as a Cowboy cheerleader.

  • @edofarmer812
    @edofarmer812 11 days ago

    Me on a first date

  • @spaceman392001
    @spaceman392001 1 month ago +2

    The infinite sum for e^A starts with I, the identity matrix, not the number 1

    • @MuPrimeMath
      @MuPrimeMath  1 month ago +1

      Yes. Here 1 denotes the multiplicative identity in the ring of matrices, which is the identity matrix.

  • @jaimeduncan6167
    @jaimeduncan6167 16 days ago

    He keeps saying "any matrix", but the power of a matrix is not well defined in general. For square matrices it is: A^n always exists, but that is not the case for the more general rectangular (m x n) matrices one can define. This is a major flaw of the video, in particular for a formal mathematics one.

  • @MrNerst
    @MrNerst 1 month ago +6

    Further assumptions are needed. For example, that A is a square matrix is never explicitly mentioned in the video; otherwise det(e^A) doesn't make any sense, as the determinant is defined only for square matrices. Also needed is a proof that the power expansion of e^A converges, which holds because A is a bounded linear operator in finite dimensions.

    • @wargreymon2024
      @wargreymon2024 1 month ago +2

      No, he did it right. What you describe isn't in the domain of the determinant to begin with; your objection is excessive.

    • @MrNerst
      @MrNerst 1 month ago

      @@wargreymon2024 First, study a little about endomorphisms and then share your opinion on social media.

    • @wargreymon2024
      @wargreymon2024 1 month ago +1

      ambiguous and excessive

  • @newdayrising
    @newdayrising 27 days ago

    The series expansion of e^A starts with the identity matrix, not 1.

    • @MuPrimeMath
      @MuPrimeMath  27 days ago +1

      Yes, here 1 denotes the multiplicative identity of the matrix ring.

  • @harrypewpew901
    @harrypewpew901 22 days ago

    I do not know why, but these math/cs geeks creep me the f out, they give serial killer vibe.

  • @kingfrozen4257
    @kingfrozen4257 27 days ago +1

    This proof is so wrong in so many aspects that I can't address it in a comment, so I will just leave a dislike and tell viewers not to use this proof in their HW or any actual work they are doing!

    • @louis8041
      @louis8041 27 days ago

      I 100% agree, and I'll do it myself: given a matrix A, considered as a complex matrix, its characteristic polynomial has n roots counted with multiplicities (Gauss's fundamental theorem of algebra), so A is triangularizable over C and the diagonal coefficients of the triangular form are its eigenvalues lambda. Thus exp(A) is similar to a triangular matrix whose diagonal coefficients are exp(lambda). Taking the determinant gives the result. No need to talk about Jordan's form (an absolutely non-trivial result which can't just be used as if it were nothing!!)
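
A numerical companion to this triangularization argument, using the always-available complex Schur form in place of an explicit triangularization; a SciPy sketch on an arbitrary real matrix:

```python
import numpy as np
from scipy.linalg import expm, schur

rng = np.random.default_rng(8)
A = rng.random((4, 4))

# Complex Schur form: A = Z T Z^H with T upper triangular,
# so the diagonal of T carries the eigenvalues of A.
T, Z = schur(A, output='complex')

print(np.allclose(np.tril(T, -1), 0))            # T is upper triangular
print(np.isclose(np.linalg.det(expm(A)),
                 np.prod(np.exp(np.diag(T)))))   # det(e^A) = product of e^lambda
print(np.isclose(np.linalg.det(expm(A)),
                 np.exp(np.trace(A))))           # ... = e^(tr A)
```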