Square root of a matrix
- Published: 11 Apr 2019
- Square root of a matrix: definition and calculation using eigenvalues. What does it mean for a matrix to have a square root?
Check out my Eigenvalues playlist: • Diagonalize 2x2 matrix
I never learnt this until now....
Linear Algebra by Hoffman and Kunze, Theorem 13 in Section 9.5.
This is exploited in Quantum Mechanics.
I got this in my linear algebra course in my undergrad years. We did it like this, as I recall:
1) Find the eigenvalues, λ, & eigenvectors of A
2) Use the eigenvectors to make a matrix, S, & its inverse
3) Diagonalize A by similarity transformation, B = SAS⁻¹. The diagonal elements of B will be the λ's
4) √B is then just the diagonal matrix whose diagonal elements are √λ's [Either sign can be taken, ±√λ, with each λ]
5) Reverse the similarity transformation, √A = S⁻¹(√B)S. Done!
NB1: Not every matrix is diagonalizable; not every matrix has a square root.
I don't recall whether either set (or both!) is a subset of the other.
Every symmetric matrix has both properties, however.
NB2: This can also be done with (some) complex matrices.
Fred
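The five steps above can be sketched in NumPy (a minimal sketch, assuming A is diagonalizable with nonnegative eigenvalues; the variable names are my own, and I use the eigenvector matrix P directly rather than its inverse S):

```python
import numpy as np

# Sketch of the 5-step recipe (assumes A is diagonalizable with
# nonnegative eigenvalues; names are illustrative, not canonical).
A = np.array([[5.0, 4.0],
              [4.0, 5.0]])

eigvals, P = np.linalg.eig(A)           # steps 1-2: eigenvalues and eigenvectors
sqrt_D = np.diag(np.sqrt(eigvals))      # step 4: square roots on the diagonal
sqrt_A = P @ sqrt_D @ np.linalg.inv(P)  # steps 3 and 5: undo the similarity transform

print(np.round(sqrt_A, 6))              # this root squares back to A
assert np.allclose(sqrt_A @ sqrt_A, A)
```

For this A the positive-definite root comes out as [[2, 1], [1, 2]].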
Damnnnnn. Even the Calculus God didn't know about this linear Algebra topic
@@Safwan.Hossain I know, right? He's brilliant, but I was surprised he hadn't seen the Taylor Series expansion of a matrix argument of a function.
I swear this man's too friendly
Now natural log of matrix Peyam ;)
so we can get sneaky introduction into Lie Groups :P
The method he uses to take the square root of a matrix works for any function f: f(A) = P f(D) P^-1. This works for every function that can be expanded into a Taylor series.
I don't like to be that guy, but at 4:02 the matrix should be [4 -4; -4 4].
Great video doctor ;-)
same row reduction
Your enthusiasm is fun to watch. If I had only √ of it, I would be fine.
A very enthusiastic math teacher! Very rare these days!!!
this is the most excited dude in the whole YT
Try cubing it and call it matrix trilogy😂
With spectral calculus you can define any function of matrix. Either formally or by converging power series. 💪
There is a small mistake in the number of "square roots" of a diagonalizable positive semi-definite matrix A.
It would be 2^k, where k is the number of non-zero eigenvalues of A.
I love the way you combine knowledge of both linear algebra and calculus to help solve new math problems every time. I sure hope I can be as calm as you in the future. Thanks for this video.
[[1,2],[2,1]] is also a solution, and so are the negatives of both. So there are 4 solutions in total.
this man must love teaching so much
You can also do this for other matrices, but generally the answer is not unique. You need to use subspace stability arguments to get all the answers.
Is there such a thing as a matrix that has matrices as components?
Sadly not; the entries of a matrix have to be in a field, and we can't divide by an arbitrary nonzero matrix.
You can do this in other cases too, such as a rotation matrix a = [[cosø, -sinø],[sinø, cosø]] -> √a = [[cos(ø/2), -sin(ø/2)],[sin(ø/2), cos(ø/2)]], but it is unclear in general when you can do this.
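The half-angle claim is easy to verify numerically; a quick sketch (the angle chosen here is arbitrary):

```python
import numpy as np

# Check that rot(theta/2) is a square root of rot(theta).
def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

theta = 1.1  # any angle works
R = rot(theta)
half = rot(theta / 2)
assert np.allclose(half @ half, R)  # rot(theta/2) squared gives rot(theta)
```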
Where was this when I was taking Linear Algebra lol. Shame on YouTube for not showing me your channel sooner.
The way I found one of the square roots of A = [[5, 4],[4, 5]] was by reasoning that it is some matrix R such that |R| times |R| is |A| = 9, which means |R| = 3 or -3. Then I guessed that [[1, 2],[2, 1]] worked, checked its determinant (which is -3), multiplied the matrix by itself, and it worked. The method of diagonalization you have demonstrated is a lot more intuitive!
You are really crazy
What if one of the eigenvalues is negative?
You’d get imaginary matrices, which are technically also square roots
I used another method and I got 4 matrices for the square root of A; all of them work.
I love your passion. Big ups
Thank you. Your videos are helping me a lot.
U seem really happy teaching this :)
This put a smile on my face, thank you :)
love the enthusiasm bro
Listen to me very carefully... you are the maaaaaaaaan! Omg, never been so satisfied by mathematics!!!!
There is another positive solution, which is the matrix: [1 2 ; 2 1].
Wow diagonalization really does wonders - and now you can generalize this for any fractional power of a matrix.
@dr Peyam What is the reason we ignore the negative sign of the square root of the matrix?
A very interesting shortcut method for the matrix inverse.
Thank you so much. Very nice video.
That is so cool! Thanks for this linear algebra refresher.
I do have a question, how would you go about finding the other matrices that give B^2 = A?
Use minus the square roots of the eigenvalues, and rearrange the eigenvectors in any order you wish. That should pretty much give you all of them
Thank you for the video. At 1:13 you mention there are N^2 square roots. Are there not 2^N square roots, obtained by independently choosing between the positive and negative square roots of each of the N eigenvalues?
Yeah, 2^n, my bad
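The 2^n count can be demonstrated directly: pick a sign for each eigenvalue independently (a sketch, assuming distinct nonzero eigenvalues so the roots are all different):

```python
import numpy as np
from itertools import product

# One square root per sign choice for each eigenvalue: 2^n in total.
A = np.array([[5.0, 4.0], [4.0, 5.0]])
eigvals, P = np.linalg.eig(A)
Pinv = np.linalg.inv(P)

roots = []
for signs in product([1.0, -1.0], repeat=len(eigvals)):
    B = P @ np.diag(np.array(signs) * np.sqrt(eigvals)) @ Pinv
    assert np.allclose(B @ B, A)  # every sign choice squares back to A
    roots.append(B)

print(len(roots))  # 2^2 = 4 square roots for this 2x2 matrix
```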
Doesn't this definition leave something like n^2 possibilities for the square root of an n×n matrix (over the complex numbers)? Also, why eigenvalues and not the determinant after all?
Hello Dr Peyam, I really enjoy your linear algebra videos. Could you do a video talking about the matrix exponential? I know what it is but I don't think I understand it completely. Cheers!
Sunday 😉
Haha, Monday actually, I’m gonna take a rest day tomorrow
@@drpeyam A while curvature? 😂
Why find the modal matrix and diagonal matrix if we can find the square root, or any higher power of a matrix, using just the eigenvalues from the characteristic equation?
that was awesome.... thank you!
How about considering an n by n matrix in general?
Same idea
Holy cow, I encountered this when I was studying Unscented Transform for UKF...
Cool! I am using it too.
Thanks for your great teaching. How can we compute A^(√2)? I used this method but my teacher said it's not valid here.
But the opposite of the matrix that was found is also a solution. What happened to that matrix?
Yes, they are "-sqrt(A)" 😎
I'm looking forward to linear algebra.
Thank you very much
I learnt it from u thanks
is this analogous to the square root of a differential operator?
Yep
Thanks, doc!
0:33 it should have been "if and only if y >= 0"; otherwise the square root of 4 could be -2 according to your definition.
oh i really forgot the concepts... thanks for the revision
I love his linear algebra esp fun and clarity
Find a polynomial p(x) such that p(y) = f(y) for all y, where the y are eigenvalues of A; then p(A) = f(A), where f(x) is any function defined at the points x = y.
You can extend this to non-diagonalizable matrices with the Taylor series of square root
Or Jordan form :)
I'm here because of BPRP and wow I'm loving these !!!!
❤️❤️❤️
great vid!
Can we use sqrt(A) = e^(0.5 ln A)? Then we have to calculate ln(A) with its Taylor series, and use the same argument for the exponential.
That works too
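A sketch of that identity. One caveat worth hedging: the raw Taylor series of ln(I + X) only converges when A's eigenvalues are near 1, so here both ln and exp are computed through the eigendecomposition instead (`via_eig` is my own helper name):

```python
import numpy as np

# sqrt(A) = exp(0.5 * ln(A)), with ln and exp taken via eigendecomposition.
def via_eig(A, f):
    w, P = np.linalg.eig(A)
    return P @ np.diag(f(w)) @ np.linalg.inv(P)

A = np.array([[5.0, 4.0], [4.0, 5.0]])
B = via_eig(0.5 * via_eig(A, np.log), np.exp)
assert np.allclose(B @ B, A)  # B is the positive-definite square root
```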
Thank you!
For anyone interested in gravity, there is a quite recent formulation of general relativity due to Krasnov where the Lagrangian for the theory involves the trace of the square root of a matrix!
Dr. Peyam should do a course in a graduate math study series.
I didn't even know what an eigenvalue is, yet I still watched it till the end.
There's a nice YouTube series by 3Blue1Brown where he explains the essence of linear algebra.
It took me a little while to get to grips with it, but it was REALLY satisfying when I got concepts like the eigenvalue.
In a physical system (mechanics, acoustics, quantum mechanics...) the eigenvalues are related (squared) to the natural frequencies of the system - i.e. where it tends to oscillate most easily by itself (without being forced - think of a swing, a metal rod, a string, a membrane, a bridge, etc.).
The associated eigenvectors then describe the various possible oscillating states - where each point of the system is at its extreme position - before returning to the opposite position (ideally, when there is no energy lost to internal heat, damping, air resistance, etc.).
For a one-dimensional string:
A set of possible sine waves (eigenfunctions) with fixed nodes at the end points, which depend on the length of the string, its density, etc.
In this case the number of nodes (the positions of EACH coordinate of these eigenvectors!) becomes infinite as the eigenvector turns into a (continuous) eigenfunction in the limit (depending on how accurate a mathematical description we want of our physical system).
Why demand that the matrix is positive definite? If not, you just get a complex matrix as the root, but I feel like that is a valid answer... if someone asks you what the square root of minus 1 is, you answer that it is i or minus i. Likewise, there is no shame in returning a complex answer when asked for the square root of a non-positive-definite matrix...
Because having complex values in your matrix is tensor algebra.
I guess he was only referring to the normal (real) square root function, not the more advanced complex square root function, which just "happens" to coincide with the former on the non-negative real axis.
Since you said this can be extended to all real roots for "non negative" matrices, could a "negative" matrix have an odd root?
It could have an imaginary root, like the square root of -1 is i
@@drpeyam oh, nice!
wow, this was very different from what I would have guessed. I thought you were just going to take B = [[a, b],[c, d]], square it, and set the entries equal to the matrix you're taking the root of, i.e. 4 equations with 4 unknowns.
Well I'm too lazy to do it but would this work?
I think it might be possible, but a huge mess!
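It is indeed possible for this 2x2 example; a sketch solving the four quadratic equations symbolically with sympy (assuming sympy is available):

```python
import sympy as sp

# Brute-force approach: B = [[a,b],[c,d]], solve B*B = A entrywise.
a, b, c, d = sp.symbols('a b c d')
B = sp.Matrix([[a, b], [c, d]])
A = sp.Matrix([[5, 4], [4, 5]])
sols = sp.solve(B * B - A, [a, b, c, d], dict=True)
print(len(sols))  # four solutions: +/-[[2,1],[1,2]] and +/-[[1,2],[2,1]]
```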
Another nice video would be to find a square root through direct decomposition into subspaces and apply it to non-diagonalizable matrices, especially nilpotent matrices. E.g. you can find the square root of the (finite) first-derivative operator matrix applied to polynomials and see how to interpret its coefficients. Another way to handle it would be using the exponential of the operator matrix.
Since a nilpotent matrix can never get back nonzero diagonal elements - sqrt(N) has to have all eigenvalues 0 again - the nilpotent predecessor is always nilpotent if it exists. After each sqrt(N), the dimension of its null space is at least 1 lower, so if we get to the end of the chain with only one eigenvector, I suppose the sqrt of that matrix doesn't exist, because it would turn a nilpotent matrix into a regular matrix that cannot have any eigenvalue 0.
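The simplest nilpotent case can be checked symbolically; a sketch with sympy (assuming it is available) shows the 2x2 nilpotent Jordan block has no square root at all:

```python
import sympy as sp

# Solve B*B = N for the nilpotent Jordan block N = [[0,1],[0,0]].
a, b, c, d = sp.symbols('a b c d')
B = sp.Matrix([[a, b], [c, d]])
N = sp.Matrix([[0, 1], [0, 0]])
solutions = sp.solve(B * B - N, [a, b, c, d], dict=True)
print(solutions)  # []: no square root exists, even over the complex numbers
```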
"What I'm learning day by day is that everything is possible in math and physics"
what about the other square roots? I can see at least 3 more.
There are at least 4, up to conjugation
THANK YOU VERY MUCH, IT WAS A GREAT HELP. I'M SUBSCRIBING
You're welcome :)
me some months ago when the course started: Pff, I already did a linear algebra course years ago, how boring this will be
me after the limit of a matrix and this: O.O holy shit, this course is sick!
I want to be a square root
Generally speaking, I want to be a cube root, because that is much better to be than merely a square root, but the grand champ has to be all the complex roots of __.
then don't be a perfect square
0:35 why is this still a condition XD Actually that condition works just as well allowing negative x values, and certainly doing this with matrices would make the distinction very blurry, because you haven't defined comparing matrices, as far as I've seen (edit: you then go on to define that, but I don't see why eigenvalues can't be negative or complex).
The interesting thing is that a 2 by 2 matrix can have 4 square roots! That is because each eigenvalue has 2 roots and there are 2 eigenvalues, thus 4 possibilities. They can be found much easier using Cayley Hamilton and completing the square with respect to the constant instead of the linear term in the Characteristic equation. And to make things crazier, the unit matrix has infinitely many square roots!
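That last claim about the unit matrix is fun to verify: every 2x2 reflection squares to the identity, so I has a whole circle's worth of square roots. A small sketch:

```python
import numpy as np

# Reflections [[cos t, sin t], [sin t, -cos t]] all square to I.
for t in np.linspace(0.0, 3.0, 7):
    S = np.array([[np.cos(t), np.sin(t)],
                  [np.sin(t), -np.cos(t)]])
    assert np.allclose(S @ S, np.eye(2))  # infinitely many roots of I
```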
After 40 years of using eigenvectors in my work in optics, I never once thought of taking the square root of a matrix. We have the equivalent, but it is not called the "square root". In a periodic transport system we would call that quantity one half of the transport matrix.
I would like to understand this but I'm not prepared. Even so, I have to say that watching the video was interesting and fun.
Very nice
When the original matrix is not positive semi-definite, can you extend this method to have the square root be complex numbers, or does it break down?
Complex numbers, like the square root of -1
But isn't the matrix
[1 2]
[2 1]
an answer also?
Yep! That’s because the columns of P are not unique, your matrix comes from P but flipping the columns, which is also ok. So more than 4 square roots
@Chig Bungus Sort of
They are the "- sqrt(A)" solutions, since they have negative eigenvalues - i.e.
B = -sqrt(A) is not positive definite.
@Chig Bungus its like sqrt(4) has 2 as its unique solution but x^2 = 4 has solutions -2 and 2.
Ok. But for the sqrt of 1 and 9 we get ±1 and ±3. So we have 4 roots for matrix A. Why do we have only one solution?
Well...Jordan decomposition is good at something here
I have a question:
For the matrix to be positive, is it required that all its eigenvalues be positive?
If it has both positive and negative eigenvalues, can the definition not be applied?
Thank you for the video, very interesting (and sorry for my bad English)
In that case the matrix is neither positive nor negative (some people say it’s hyperbolic)
I think it can still be done, but only in matrices over the complex field, ℂ.
And keep in mind that, in complex vector spaces, the dot product always has a complex conjugate on one of the vectors.
(When you transpose, e.g., to turn a row vector into a column vector, or vice versa, you must also conjugate.)
Is this about right, Dr. ∏M?
Fred
The answer to your question is: not necessarily, unless the matrix is self-adjoint.
Can someone refresh my memory? Is matrix multiplication associative? That is, does (AB)C = A(BC) for matrices A, B, & C?
Yeah
Thanks!
Associative yes, but AB =/= BA usually
so....clear !!!!!
This 2x2 matrix (which is positive definite) has 4 different choices of square-root matrices, and of the four possible choices he showed the one that is positive definite.
In fact, the unit matrix has infinitely many square roots.
Just started an introduction to Time Series and found out that somehow no one has taught me how to calculate square roots of matrices. Wish me luck on my colloquium tomorrow lol
Good luck!!!
Notice that f(A), where A = ([a,b],[b,a]), is equal to ([f(a+b)+f(a-b), f(a+b)-f(a-b)], [f(a+b)-f(a-b), f(a+b)+f(a-b)]) * 1/2
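A numerical check of that closed form (a sketch: the two eigenvalues of [[a,b],[b,a]] are a+b and a-b, and f acts on them; `f_of_A` is my own name):

```python
import numpy as np

# f(A) for A = [[a,b],[b,a]] via the closed form in the comment above.
def f_of_A(a, b, f):
    p, m = f(a + b), f(a - b)
    return 0.5 * np.array([[p + m, p - m],
                           [p - m, p + m]])

a, b = 5.0, 4.0
B = f_of_A(a, b, np.sqrt)  # gives [[2, 1], [1, 2]]
assert np.allclose(B @ B, np.array([[a, b], [b, a]]))
```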
So if your matrix is of complex numbers, the requirement for the determinants to be zero goes away, just like with numbers.
The determinant is still zero by definition of an eigenvalue
@@drpeyam I'm dumb, what I wrote wasn't anything like what I meant to say. What I meant was that the requirement for the eigenvalues to be positive goes away when the matrix is of complex numbers.
That makes more sense :)
How to calculate the square root of a 3x3 matrix? I can't solve it
Same idea, diagonalize it
@@drpeyam for some matrices it gives complex numbers in the square root. Then if I square that matrix, it doesn't give the real matrix back.
It should give the real matrix back
I really enjoy those "something of a matrix" ideas. What about cos of a matrix?
Already done ✅
Guessed the result straight away 😎
At 1:14 you say there are at least n^2 values, that should be 2^n instead.
Yeah
D^(1/2) may be [1 0; 0 3] or [-1 0; 0 3] or [1 0; 0 -3] or [-1 0; 0 -3]
What is that "NUL"? ("noss"??) 3:25
Heyyy Dr.Payam
Hiiii!!!
So now you can show people how to compute the exponential of a matrix (by expanding e^A in a taylor series). Then you can talk about Lie Groups and Lie Group actions !
Already done ✅
I was hoping for one or more families of equations describing all solutions to the problem. I.e. en.m.wikipedia.org/wiki/Square_root_of_a_2_by_2_matrix
That’s more or less what I did, I just used eigenvalues instead of trace and determinants
There has to be a certain condition for this. [1 2 ; 2 1] may apply.
Well, up to choice and rearrangement of the eigenvectors
What school do you teach at?
UC Irvine 🙂
@@drpeyam Cool!
[2 1; 1 2] works, but so does [1 2; 2 1], and so do [-2 -1; -1 -2] and [-1 -2; -2 -1]. All four matrices can be squared to produce [5 4; 4 5]. Why can't all four matrices be considered solutions to sqrt([5 4; 4 5])?
I see Orang found these solutions as well...
Sure they can! They are the same up to rearrangement of the eigenvectors
Algebraic closure of the matrices. :3 I don't remember enough about 251 to say if this actually works.
Fresh!
n-th root, where n is a natural number
Same idea
Good explanation. Thank you. :)