"This is math. We can do whatever we want."
As the beaten to death Thanos meme goes, "reality can be whatever I want" - and this is true in linear algebra where you can choose any basis!
Legendary quote. I’m going to put it at the top of my syllabus
@@angeldude101 Nice, I was just watching some of her videos!
challenge accepted
*let 1 = 2*
A matrix that contains a matrix as an element
"Do what you want with this information. I don't know what this is useful for, and, to be honest, I don't care, because it's just beautiful as it is."
Spoken like a pure mathematician. Study math because math is beautiful!
It doesn't need to have any immediate clear uses;
it just _might_ turn out to be useful for something at some point, for whatever reason.
So math is a little bit like preparing a "toolbox", where things are as general and flexible as possible, just in case they turn out to be needed.
Well… I mean….
This is totally batshit crazy, I love it
Cool exercise. It teaches us something about the domain of math and how to explore it. Just a small slip of notation there, though: x^(1/n) is not (1/n)√x it is n√x.
Thank you!!!
@@drpeyam Dr P is always so courteous 😜
From the category of calculus to the category of linear algebra, there is a fully faithful functor. Perhaps contravariant?
Do you pay property taxes for your forehead? That’s a lot of acres man…
Yep
I just finished my intro to linear algebra course and I was hoping to never see anything related to linear again but this was really interesting and fun to watch! What's even better is that I actually understood the steps you were taking.
Exactly how I felt watching this
I've been having a bad, lingering anxiety attack, and what do I find that saves me from my somber mood? This gem! Genial! The Mad Man did it! I am so happy to see these bizarre beauties on your channel!
I've only studied math up to C1 for my business degree, and to be honest it is not my favorite subject, but it is awesome to see how passionate you sound in your videos. Keep up the good work; your content is very interesting.
This is not madness but mathness
So he should be called Mad Maths!
I have used exponential matrices and the logarithm of matrices before. Writing some kind of matrixth root is just a nice possibility to consider.
OK, thank you for blowing my brains out. Linear algebra was one of my favorite subjects in college, but this is exquisitely nuts stuff.
I'll have to subscribe after seeing this. I haven't seen the Matrix since college so I will need to go back and review some more of Dr. Peyam's videos.
Thank you!!!
So fitting that December is the release month of the Matrix Resurrections!
Funny because the new trailer just released a few hours ago. After all...I still know math fu...
@@citizencj3389 i know! Are you pumped to go see it?
@@devsquaredTV Yeah I just hope it is at least half as good as the first one. I still liked the other two though.
I'm a math major and just finished my linear algebra sequences. And let me tell you that I've never dreamt that this could be done. It's weird lol. But beautiful
2:00 there is a fair argument you can make in favor of what you are doing. Essentially, a^b is a left-right association, but at the same time you could find a mathematical use for treating roots and powers differently, as the nth root of x is a power-base ordered phrasing, so you could actually want to use e^(n^-1 ln(x)) for roots, and e^(ln(x) n) for regular powers. In this case, it boils down to convention, as long as it's forever consistent.
I think you might want roots to still be the inverses of powers, so you need to keep the convention consistent between them.
Omg, diagonalization is so powerful! It seems to be the main technique in linear algebra, invented by Grassmann in 1848; "matrix" (Latin for womb) was coined by Sylvester, an actuary, in 1850, with Cayley defining the inverse in the 1860s.
Another beauty by Dr Peyam, always upbeat and chirpy 😜👏🏿👏🏿👏🏿
Yup, algebra is amazing.
It is the most potent form of meta-mathematics that exists, studying decompositions, representations and data compression of structures.
It is like a detective game, but within mathematical structures.
No math would prosper without algebra ✌
This is insane in every definition of the word! Great job :)
I've taken scalar to matrix and matrix to scalar powers before, but never matrix to matrix. Very cool
I never thought the answer would look like this, but your explanation was so simple that I got it almost at once. Thank you for an interesting video!
"I don't know what this is useful for, and to be honest, I don't care" - every mathematician's favorite sentence
Amazing! Can you do the matrixth derivative of a matrix?
That is the essence of a mathematician: generalizing concepts and operations.
An intelligent observation, my algebraic friend.
WOW THAT IS CRAZY!!!
right is always right
I fucking love how much this guy is enjoying himself. King.
When he said "[two, minus one, minus three, second]th", I felt that.
There is a little mistake in the video, but it is just a notation problem: the nth root of x equals x^(1/n), whereas the (1/n)th root of x is actually x^n. But the video is very entertaining; I have subscribed to your channel and liked this video. :)
I'm in 12th class currently and don't know much about matrices at this level, but when I saw the thumbnail of the video I just went crazy and tapped on it immediately... This is truly wonderful clickbait.
Perhaps one can derive some kind of rule for similar problems? I notice that the matrix whose root is being taken simply has its bottom rows divided by 2 and 4, which possibly has something to do with the 2's in the diagonal of the other matrix (the one above the root symbol). And both matrices also happen to contain 1, 2 and 3.
Since it's all based on the diagonalized eigenmatrix, maybe you can use that directly?
"I'm sorry ln(DeGeneres) this is my time to shine" - 😂🤣😅🤣😂🤣😅 I can't believe how much I laughed.
🤣💀
I agree with your statement about not caring what this is useful for, but I do think it would be worthwhile to try to obtain some intuition about what this means. What is the meaning of taking the matrixth root of something? Very strange, but the analysis shows that it works, and therefore there is probably some meaning behind it. Oftentimes things like this can reveal something about the operation in question; we can view root extraction as something far more general than just an operation on scalars.
I think a lot of people would appreciate it if you'd explain a bit more about why you can just apply a function like ln or e^x to a diagonalized matrix the way you did. I know I didn't understand that bit, but my linear algebra is a bit ancient and weak 🥴
There’s a video on matrix exponentials that explains this, it basically applies to any function that has a power series
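The trick this reply refers to can be sketched in a few lines (a minimal illustration, assuming numpy/scipy and a made-up symmetric 2x2 matrix, not the one from the video): for a diagonalizable A = P D P^-1, define f(A) = P f(D) P^-1, applying f entrywise to the diagonal of D.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical example: A = P D P^-1, so define f(A) := P f(D) P^-1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # symmetric, hence diagonalizable

evals, P = np.linalg.eig(A)             # columns of P are eigenvectors
exp_A = P @ np.diag(np.exp(evals)) @ np.linalg.inv(P)

# Agrees with scipy's matrix exponential:
print(np.allclose(exp_A, expm(A)))      # → True
```

The same recipe works for ln (via np.log on the eigenvalues) as long as no eigenvalue lands on the branch cut, which is presumably how the video applies ln and e^x to the diagonalized matrix.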
I didn't think he'd actually do it, lol!
"This is Math, we can do whatever we want"! Love it!
"I don't know what this is useful for, to be honest I don't care, because it's just beautiful as it is"
I think that's something my mother says.
Seriously amazing concept
The way to do this is to write X=exp(log(X)), and then use the series expansions for log and exp.
Then you will have to deal with the convergence issues though.
@@hOREP245 Of course but that just falls upon eigenvalues of the matrix.
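A minimal numerical sketch of that idea (assuming scipy, and a made-up matrix X close enough to the identity that the log series converges):

```python
import numpy as np
from scipy.linalg import expm, logm

X = np.array([[1.1,  0.05],
              [0.02, 0.9 ]])
N = X - np.eye(2)                        # small, so the series converges

# Mercator series for matrices: log(X) = sum_{k>=1} (-1)^(k+1) N^k / k
log_series = np.zeros_like(X)
term = np.eye(2)
for k in range(1, 50):
    term = term @ N
    log_series += ((-1) ** (k + 1) / k) * term

print(np.allclose(log_series, logm(X)))   # → True
print(np.allclose(expm(log_series), X))   # → True  (X = exp(log X))
```

As the reply says, convergence hinges on the eigenvalues: the series needs the spectral radius of X - I to be below 1, while logm itself works for any matrix with no eigenvalues on the closed negative real axis.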
Very interesting! What is the practical applicability?
I like it when a math person assumes that we know what he's talking about. Sounds like my mathematical physics teacher 40 years ago; I was the only student who liked him. Not stated is that, in the power series expansion of any function, the eigenvector matrix and its inverse end up adjacent to each other, giving the identity and leaving the diagonal matrix of a particular power.
It’s because I’ve done countless videos on this; check out my eigenvalues playlist.
Doctor Peyam I absolutely love your videos!! It's so inspiring to see such a knowledgeable man as you at work! It instantly makes me want to study :p
I just finished my linear algebra final and this… THIS THING! Shows up in my recommended!?
What's next? αth derivative of a matrix function with respect to a matrix variable, where α is also a matrix?
Fractional derivative of the curve integral of homological chain complexes of Lie algebras or some other crazy shit lol
@@Wabbelpaddel something that's more likely to be taught at hogwarts, honestly
This looks like smth you would watch procrastinating at 3am.
mad
absolutely crazy love it
Me not knowing ANYTHING about a mathematical matrix and still watching:
_Interesting_
"...because right is always right"
Just a reminder that Dr Peyam is left handed.
This was recommended to me. Im proud of myself
This is absolutely CRAZY but wonderful!!!
Why didn’t I ever think of this in 6 decades?
I want more insanity!!!
Thanks so much!!!
You're incredibly entertaining to watch! Greetings from Italy ✋🍕🔥
Great job sir
At 0:28, did you mean $\sqrt[n]{x} = x^{1/n}$ rather than $\sqrt[1/n]{x} = x^{1/n}$?
This is another level... thank you very much for bringing light into the cave.
Ok, so if we consider scalars to be 1x1 matrices, then for an nxn matrix, it appears we can define the 1x1 root as well as the nxn root of it. Can this be generalized to any mxm matrix root? Or is there something special about 1 and n in producing the roots?
I believe that there is indeed something special about 1 and n in this context, since we're in an algebra (the algebra of matrices, which is essentially a vector space with an additional product operation, like in a ring), and in this algebra we can define multiplication either by scalars (1x1 matrices if you like) or by other elements of the algebra (nxn matrices). So, in that sense, I can't think of a natural way to generalize this root operation to accept other sizes of matrices.
This is insane, I love it
You're a literal god Dr. Peyam
Thanks so much!!!
VERY GREAT EXERCISE SIR
YOU ARE REAL MATHS MASTER SIR
THANK YOU SIR
It's a cool exercise on raising a matrix to the power of a matrix. It must find an interesting application some day.
Full immersion, I'm in love
I love this. Thank you very, very much.
A true mad lad, thanks for this 🤣
"This is math, we can do whatever we want": this statement is mathematically false 😁❤ ... salute to you ❤
Linear algebra final on Wednesday, this is perfect
Bravo, Maestro! Bravissimo! I never even thought of this , let alone how to do it! Live and learn, the Weird!
Sweet.
I'm sorry, we're revoking your math license.
Couldn't we compute the logarithm of a matrix A=RDR^-1 as log(A)=log(RDR^-1)=log(R)+log(D)+log(R^-1)=log(D) ? I know that this would probably hold only if the matrices commuted, but it could be nice.
Sadly logs don’t operate this way for matrices, in fact we don’t even have identities like exp(A+B) = exp(A) exp(B) for matrices
@@drpeyam Sadface
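That failure is easy to see numerically; a small sketch with two hypothetical non-commuting nilpotent matrices:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [1.0, 0.0]])

print(np.allclose(A @ B, B @ A))                      # → False, they don't commute
print(np.allclose(expm(A + B), expm(A) @ expm(B)))    # → False, identity fails
# For commuting matrices (e.g. B = A) the identity does hold:
print(np.allclose(expm(A + A), expm(A) @ expm(A)))    # → True
```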
The root of a matrix. That's a good one!
Sounds applicable for some tensor calc in GR
Matrixth is my new favorite word
This is sooo crazy!!!
I don’t understand such high level of math…but I fcking loved this. Instant sub
Thank youuuu
Matrices as exponents are in fact useful in machine learning.
excellent thanx a lot!!
This is so cool. I used to philosophize about this kind of shit in high school and college. Cool to see that it is possible to do a problem like this.
Makes me wonder... can the gamma function be extended to matrices in order to get a smooth matrix factorial? 🤯
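No idea whether there is a standard definition, but for diagonalizable matrices one naive sketch (an assumption, not an established construction) is to push the scalar gamma function through the eigendecomposition:

```python
import numpy as np
from scipy.special import gamma

A = np.array([[3.0, 1.0],
              [0.0, 5.0]])               # made-up matrix with eigenvalues 3 and 5

evals, P = np.linalg.eig(A)
gamma_A = P @ np.diag(gamma(evals)) @ np.linalg.inv(P)

# Gamma(n) = (n-1)!, so gamma_A should have eigenvalues 2! = 2 and 4! = 24.
print(np.allclose(np.sort(np.linalg.eigvals(gamma_A).real), [2.0, 24.0]))   # → True
```

Since the scalar gamma function is smooth away from the non-positive integers, this even varies smoothly with A as long as no eigenvalue crosses a pole.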
What would be the general form of A^B, where A is the matrix [[a, b], [c, d]] and B is the matrix [[w, x], [y, z]]?
Left as an exercise to the reader :)
Loved it! Btw, at 0:35 should it not be "x to the power of n"? Since you're inverting an inverse? Lol
He wrote it wrong at first... he meant the nth root, not the (1/n)th root.
Not a particular fan of these numerical examples, since the small computational problems keep me distracted from seeing the big picture. I would rather like a more generalized approach, say a 2x2 matrix ([a1, a2], [a3, a4]) or even an nxn matrix.
LOL, well good luck with that
@@drpeyam It is completely doable to do matrix-matrix exponentials for normal nonsingular matrices A, B, such that A^B = exp(log(A) B). However, I guess the case A^(B^-1) is just a matter of handwork. Any idea if diagonalization of B will make it doable?
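A toy check of the exp(log(A) B) recipe from this reply (assuming scipy; diagonal matrices chosen so the answer is obvious, and note that exp(log(A) B) and exp(B log(A)) only agree when the factors commute):

```python
import numpy as np
from scipy.linalg import expm, logm

A = np.array([[4.0, 0.0],
              [0.0, 9.0]])
B = np.array([[0.5, 0.0],
              [0.0, 0.5]])               # B = (1/2) I, so A^B should be sqrt(A)

A_pow_B = expm(logm(A) @ B)
print(np.allclose(A_pow_B, [[2.0, 0.0],
                            [0.0, 3.0]]))   # → True
```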
He is crazy but in a good way!
Thanks!
Omg thanks so much for the super thanks!!!
That was really a funny example!
The physical significance of the matrix root of another matrix is the one to one mapping of the galaxies of one universe onto its neighboring universe assuming that the mapped universe is invertible. The mapping is unique and conforms to the laws of relativity
That was.................interesting
"This is Math, we can do whatever we want. " - Dr. Peyam
So in my history of Math, I was never wrong. I just did whatever I wanted. 😁
Very interesting 👍🏼
Your mathematicians were so preoccupied with whether or not they could, they didn’t stop to think if they should.
Hurting our heads so early in the Holiday season.
Math be crazy
I'd say the best way to find applications for this kind of math is to model it in a simulation.
Great stuff. But how does one come across a matrixth root or a matrixth power of a matrix?
Hi Dr. P. Where can I read more about this? Could you pls help me
Check out the playlist
@@drpeyam Thank you for the prompt reply, sir. Specifically, I am asking if there is a published book or something similar. I have studied matrix functions extended from real-valued functions, but I have never seen such a thing.
Peyam do you prefer using pens or pencils for doing math?
Pencil for sure
@@drpeyam
Thanks
@@dfdxdfdydfdz all mathematicians love doing analytical math via pencil...and paper respectively.
Considering how they still refuse to define a matrix division, I do think all these other operations are dubiously based.
I think I've seen this type of linear algebra used in Kalman filtering, but I'm not an expert on it. Neat vid though
Ooooh interesting!!
This reminds me of Kalman filters ... if there is any interest, perhaps see if this might apply somehow to moving-target tracking. Cheers.
That's great, but could an eigenvalue be zero, and can you apply such diagonalization to any square matrix? I mean, we have characteristic polynomials for the eigenvalues, and those sometimes have complex roots.
It’s fine, ln(0+) = - infinity and if you exponentiate that you get 0. And ln(-1) is complex so also ok
Honestly, I think I wanna know more, like why is the result made up of rational numbers? I'm sure that's not always going to be the case. How did it happen here?
It’s just by coincidence since the eigenvalues are so nice. They can certainly be irrational
The best way to set a school exam is to suddenly give this task on the exam and see if the students have a real clue about math or not :D
Trueeee
@@drpeyam Wow, look, can you please make a video about a fast way to solve a geometric sum of matrices, like I + M + M^2 + ... = (I - M)^-1?
There’s a video on the geometric series
@@drpeyam Thanks, I'll check that. Also, maybe you have some advice on an efficient way of solving the Leontief matrix: the linear equation of the form X(I-A) = Y, with solution X = Y(I-A)^-1, where X, Y are vectors and A is a very big matrix.
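For the Leontief question, the geometric-series identity (I - A)^-1 = I + A + A^2 + ... holds whenever A's spectral radius is below 1; a minimal sketch with a hypothetical small A:

```python
import numpy as np

# Neumann series: if the spectral radius of A is < 1, then
# (I - A)^-1 = I + A + A^2 + ...   (the Leontief inverse)
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])               # made-up small input matrix
I = np.eye(2)
assert max(abs(np.linalg.eigvals(A))) < 1   # convergence condition

partial = np.zeros_like(A)
term = I.copy()
for _ in range(200):                     # truncated series
    partial += term
    term = term @ A

print(np.allclose(partial, np.linalg.inv(I - A)))   # → True
```

For a genuinely big A, though, one would not form the inverse at all: solve the linear system instead, e.g. X = np.linalg.solve((I - A).T, Y.T).T for X(I - A) = Y.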
Ha ha ha ha, this was such giddy fun. The stuff we do with maths!
Right is always right?
Isn't the equation at 0:35 wrong? I think it's x^n.
No it's correct. Think of √4, it's the same as 4^(1/2) = 2.
This is because 4^(1/2)*4^(1/2) = 4^(1/2+1/2) = 4^1 = 4, so it follows that (4^(1/2))^2 = 4, so it is in fact the square root of 4.
Yeah he accidentally wrote 1/n on the left side
@@bomboid That's it
@@ubs7239 oh yeah you're right, I'm sorry, it is the n-th root yeah (or just x^n as you said).
@@bomboid that would make it a very interesting problem!!
Can we somehow decompose a 3×3 matrix into several 2×2 matrices such that the operation is unique and an inverse decomposition yields the same 3×3 matrix?