I'm a simple man. I see linear algebra, I like the video.
I've spent 2 days just trying to understand what this process is even attempting to do, and in the first 3 minutes you have totally done it. Thank u so much, sheesh!
I love your videos. You're a great explainer. It's awesome to see that things we had to learn by heart at university are in fact so logical and simple when you explain them.
I was reading my textbook and this process looked like magic, I had no idea what was going on, and now I understand it. The textbook has a lot of drawings in it, but it is simply not a substitute for a professor explaining it while drawing in real time and pointing to the drawing. A textbook can only go so far. Obviously if it is not a good professor it won't matter, but Dr. Peyam is a good professor.
At first I was confused because I was like "don't we need a set of 4 vectors to make a basis for a 4-dimensional space?" And then I realized we only wanted to find a basis for the span of {u1, u2, u3}, which is three-dimensional...
Great video, I love linear algebra. One of my favorite topics in mathematics :)
It was a very big surprise when I found your video. Before that I have seen only Calculus videos! Thank you for your work
The way you mix humor to make it entertaining is amazing!!😅❤
Great video! I couldn't understand what this process was doing by reading my textbook and now I understand it!
Just for those that may not understand (or for future me; I often forget things even after I've understood them once), the reason why
û2 = v1 times (u2 dot v1 / (v1 dot v1))
is because of the following:
||û2|| = ||u2|| times the cosine of the angle between u2 and v1
||û2|| = (u2 dot v1) / ||v1||
if you take ||û2|| v1, you get a vector on the same line as v1 (good), but its length will be ||v1|| multiplied by ||û2||, instead of ||û2||.
Therefore, we need to divide the result by ||v1||.
So, û2 = v1 ||û2|| / ||v1||
û2 = v1 (u2 dot v1 / ||v1||^2)
û2 = v1 (u2 dot v1 / (v1 dot v1))
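A minimal numpy sketch of the projection and orthogonalization step described in the comment above, for anyone who wants to poke at it; the function names (project, gram_schmidt) and the example vectors are just made up for illustration, not taken from the video:

```python
import numpy as np

def project(u, v):
    # projection of u onto the line spanned by v: (u . v / v . v) * v
    return (np.dot(u, v) / np.dot(v, v)) * v

def gram_schmidt(vectors):
    # classical Gram-Schmidt: v_k = u_k minus the projections of u_k onto the earlier v's
    basis = []
    for u in vectors:
        u = np.asarray(u, dtype=float)
        v = u - sum(project(u, b) for b in basis)  # the empty sum is 0, so v1 = u1
        basis.append(v)
    return basis

u1, u2, u3 = np.array([1., 1., 0.]), np.array([1., 0., 1.]), np.array([0., 1., 1.])
v1, v2, v3 = gram_schmidt([u1, u2, u3])
print(np.dot(v1, v2), np.dot(v1, v3), np.dot(v2, v3))  # all ~0: the v's are mutually orthogonal
```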
Super clear and straight to the point, great video!
This video, along with Khan Academy's video on basis, helped me finally understand the G.S. orthonormalization process. Thank you!
Great video as always!! More linear Algebra!! It's fascinating how much you can achieve with vectors and matrices
Linear algebra is amazingly intuitive. I learned about this many years ago and have not applied it since. Yet, within the time you spent to introduce this (< 1 min), I was able to fully reconstruct how it works. Not sure why this is a good thing tho lol (need to watch more of your videos I guess :P). Also, there are of course infinitely many orthonormal bases (for the same space). Not so sure what's so special about the one that GS spits out. You could also do the vectors in any order, so for an original basis with 3 vectors, you have 3! = 6 possible bases that GS can generate for you, each giving different "preservation priority" to the original 3 vectors.
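One quick way to see the ordering dependence mentioned above, reusing the gram_schmidt sketch from the earlier comment (again just an illustrative guess, not anything from the video):

```python
import itertools
import numpy as np

u1, u2, u3 = np.array([1., 1., 0.]), np.array([1., 0., 1.]), np.array([0., 1., 1.])

for order in itertools.permutations([u1, u2, u3]):
    basis = gram_schmidt(list(order))                       # gram_schmidt from the sketch above
    unit = [np.round(v / np.linalg.norm(v), 3) for v in basis]
    print(unit)  # each of the 6 orderings yields a (generally) different orthonormal basis
```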
I instantly feel better when I see that smile :)
Super clear. Thank you Dr Peyam
I’ve been trying to understand this topic! Thank you!
I have an interesting question: if I have a set of representation vectors (distributed vectors), what would happen after running Gram-Schmidt on those vectors? Let's assume the dimension of the vectors is 300, and we have 10 observations. What's the best interpretation of these orthonormal vectors?
Very clear presentation. Can you also cover the Lanczos method?
3:53 Now I finally find out what WTF means
Thank you Dr. Peyam. For us statisticians, your videos are really important. This one especially is useful. The one on the Lebesgue integral... Fantastic. Can you do the Riemann-Stieltjes integral? That one is very important to us because it allows us to ignore the difference between a discrete and a continuous random variable without the extreme complications of Lebesgue.
Thank you!!! And I’ll think about it; I can definitely see how Riemann-Stieltjes is an intermediate between the two
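For anyone wondering why it bridges the two cases, the usual one-line statement (a sketch, not something from the video) is that the expectation becomes a single Riemann-Stieltjes integral against the CDF F:

```latex
\mathbb{E}[X] = \int_{-\infty}^{\infty} x \, dF(x)
% If F has a density f, this reduces to \int x f(x)\,dx (continuous case);
% if F is a step function with jumps p_i at points x_i, it reduces to \sum_i x_i p_i (discrete case).
```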
You explained it better than my £65 textbook. Kudos!
Get the pdf lol.
When I was in my first semester my professor once talked about "Otto-normal Basen". I thought "WTF are those? And what is the opposite? Ungewöhnliche Basen (unusual bases)?"
Boy, was I wrong :D
(Also, this comment probably won't make any sense if your German is not so good ^^)
Hahaha, got it 😂😂😂
Use the Chen Lu!
AndDiracisHisProphet
You are right, I didn't get it
Don't be sad. It isn't that funny anyway. More of a situational thing.
oh ok.
Second Like for this lovely video by the lovely Dr. Peyam :)
Could you do a video about the existence of bases for infinite dimensional vector spaces? Your videos are great and always very helpful
That’s a great idea! I’ll think about it :)
I was actually looking for videos about the Gram-Schmidt process a few hours ago. What are the odds??
Is it extraneous to check the given vectors for linear independence first?
Linear independence is important here because otherwise the Gram-Schmidt process will spit out the zero vector!
The horror!! Haha I thought so, and I know all of that is usually implied, I was just making sure :)
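A tiny check of the warning in the reply above, using the gram_schmidt sketch from the earlier comment (example values made up):

```python
import numpy as np

u1 = np.array([1., 2., 3.])
u2 = 2 * u1                      # deliberately a multiple of u1, so the set is linearly dependent
v1, v2 = gram_schmidt([u1, u2])  # gram_schmidt as in the sketch further up
print(v2)                        # [0. 0. 0.] -- the zero vector the reply warns about
```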
Well, I don't really have something to add this time, but I do have a question: why do you write v.v as [....].[....] and not just ||[....]||^2? It is shorter.
Oh, I started watching the Measure Theory series by "Fematika"; it is pretty good.
I like u.u because it looks more like hugging, hahaha.
Dr. Peyam's Show I should have known this...
Peyam, I am afraid to say I think u are making me like linear algebra
Oh no, hahaha 😂
Wonderful, thank you kindly.
Love this topic!!
Could you please solve more PDEs?
There’s a PDE-ish topic coming on Friday!
I like this guy
thank you
shouldn't you rationalize the denominators?
Not necessary for an orthogonal basis. For an orthonormal basis you need to rationalize it, but just at the end!
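If it helps, "just at the end" can be read as: run the process to get an orthogonal basis (possibly with ugly fractions), then divide each vector by its norm as the last step. A small sketch reusing the gram_schmidt function from the earlier comment (the inputs are made up):

```python
import numpy as np

orthogonal = gram_schmidt([np.array([3., 1.]), np.array([2., 2.])])    # orthogonal, not unit length
orthonormal = [v / np.linalg.norm(v) for v in orthogonal]              # normalize only at the very end
print(orthonormal)   # e.g. [3, 1]/sqrt(10) and [-0.4, 1.2]/sqrt(1.6)
```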
Dr. Peyam, I love your videos. I am a physics PhD too lol. But, I beg you, use lower-case letters. It is painful to see everything spelled out like that. It feels like you are yelling at us. There is always time to change :). I beg you, oh please :)
Otherwise, I love your work!
I can’t, unfortunately. If I write in lower-case I’d have to write in cursive, and most people can’t read cursive handwriting.
Try it ;) I know you can do it :D I love your enthusiasm!
Yes first!!!
You are a talented man, but please don't stand in front of the whiteboard when you are writing something. Please don't mind, I am extremely sorry.
WTF {v1, v2, v3}