Vector addition and basis vectors | Linear algebra makes sense
- Published: 17 May 2024
- Vectors may seem very difficult when you're first introduced to them, but I hope this video helps you see they're not that scary! This video will be especially useful for vectors in physics. We'll cover vector addition and what vectors are. This is the start of a whole series on linear algebra, in which I will cover vectors, adding vectors in physics, the scalar product, matrices, eigenvalues/eigenvectors and Dirac notation.
Next video: • Matrices, matrix multi...
Check out Brilliant: brilliant.org/LookingGlassUni...
DON'T FORGET TO DO YOUR HOMEWORK:
Prove these 2 statements about bases
1. If you have two different bases for the same space, then they must have the same number of basis elements in them. (E.g., there are many different choices of basis for the plane, but no matter which basis you choose, there are only 2 vectors in each basis.)
2. Once you pick a basis (say {v_1, v_2}), there's only one correct way to write another vector as a linear combination of the basis vectors. E.g., say v = a v_1 + b v_2. Then you can't also write v = a' v_1 + b' v_2, where a' and b' are different from a and b.
The multiple choice questions from Brilliant:
Q1. Which of these vectors is redundant (i.e. can be written as a linear combination of the other 2):
i) (1 2 3)
ii) (1 3 5)
iii) (2 5 8)
iv) Each of the above
The full solution is available here: brilliant.org/practice/linear-independence/?p=2
Q2. Consider the following 3 vector spaces:
A= The vector space spanned by {(1 2)}
B= The vector space spanned by {(1 2), (2 3)}
C= The vector space spanned by {(1 2), (2 4), (3 6)}
Question: which of the following is true?
i) A is a subspace of B, which is a subspace of C
ii) C is a subspace of A, which is a subspace of B
iii) B is a subspace of C, which is a subspace of A
iv) A is a subspace of B and C, which are not subspaces of each other
The full solution is here: brilliant.org/practice/subspaces-and-span/?p=6
ANSWERS FOR THE BRILLIANT.ORG QUESTIONS:
*
*
*
*
*
*
*
*
Q1) D
Q2) ii
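If you'd like to double-check these answers yourself, here's a small Python sketch (exact rational arithmetic; the vectors are copied straight from the questions above):

```python
# Quick check of the two Brilliant answers above, using exact
# rationals so there are no floating-point surprises.
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination over exact rationals."""
    m = [[Fraction(x) for x in row] for row in rows]
    r, cols = 0, len(m[0])
    for c in range(cols):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Q1: the three vectors only span a 2D plane, so the set is
# linearly dependent -- any one of them is redundant.
q1 = [(1, 2, 3), (1, 3, 5), (2, 5, 8)]
print(rank(q1))                 # 2, not 3

# Q2: A and C span the same line (adding C's vectors to A's
# doesn't raise the rank), and both sit inside B's plane.
A, B, C = [(1, 2)], [(1, 2), (2, 3)], [(1, 2), (2, 4), (3, 6)]
print(rank(A + C) == rank(A))   # True: A and C are the same line
print(rank(B))                  # 2: B is the whole plane
```

The rank test is the same linear-independence check the Brilliant questions are about: fewer pivots than vectors means the set is dependent.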
HINTS FOR PROOF QUESTIONS:
2 is easier, so let's do that first
Hint 2.1 Let's do the case with just 2 basis vectors first. If there are 2 basis vectors v_1 and v_2, the one thing you know about them is that they are not just a multiple of each other (otherwise it wouldn't be a basis). Try and get a contradiction with this fact.
*
*
*
Hint 2.2 Assume v= a v_1+ b v_2 = a' v_1+ b' v_2, but a and a' aren't equal, and b and b' aren't equal.
*
*
*
Hint 2.3 Use the above equation to write a relationship between v_1 and v_2. Oh no, that looks like they are multiples of each other!
*
*
*
Hint 2.4 Now do the case where there are n basis vectors. What you know about them is that you can't write them as linear combinations of the others. Try and get a contradiction with this fact.
*
*
*
Hint 1.1. Imagine you had 3 vectors and they span 2D space. Doesn't that mean one of them is redundant? Try the following case first:
B1={u_1,u_2}
B2={v_1, v_2,v_3}
Write each v_i in terms of B1.
Remember that there is a redundancy in B2 if you can write c v_1+ d v_2= v_3. So write this, and let's see if we can find a solution for c and d.
Plug your expressions for the v_i into c v_1+ d v_2= v_3
You now have a vector on the right hand side written in B1 and a vector on the left hand side written in B1
Using the result from question 2 (dammit, I really should have swapped the order of these questions), you know that the coefficients in front of u_1 and u_2 must be the same on both sides (since there is only one unique way to write a vector in B1)
So now you have 2 linear equations with 2 unknowns (c and d- everything else is 'known').
Show that the only way this system has no solution for c and d is if B2 was actually linearly dependent all along - so either way, B2 can't be a basis. (Yes this will require you to know some linear algebra to do efficiently (although it's technically possible without)).
*
*
*
When you assume that the two bases can be any size it's most efficient to do this with linear algebra (sorry!!)
*
*
*
The solution: yutsumura.com/if-there-are-more-vectors-than-a-spanning-set-then-vectors-are-linearly-dependent/
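If you'd rather see the 2-vs-3 case of Hint 1.1 run with actual numbers, here's a sketch (the particular vectors are made up for illustration):

```python
# A concrete run of Hint 1.1 (example vectors chosen for
# illustration): write three would-be basis vectors of the plane
# in B1-coordinates, then solve c*v1 + d*v2 = v3. Matching the
# coefficients of u_1 and u_2 gives 2 equations in c and d.
from fractions import Fraction as F

# B1-coordinates of v1, v2, v3 (hypothetical example values)
v1, v2, v3 = (F(1), F(1)), (F(1), F(-1)), (F(3), F(5))

# Solve the 2x2 system by Cramer's rule:
#   c*v1[0] + d*v2[0] = v3[0]   (coefficients of u_1)
#   c*v1[1] + d*v2[1] = v3[1]   (coefficients of u_2)
det = v1[0] * v2[1] - v2[0] * v1[1]
assert det != 0  # v1, v2 are not multiples of each other
c = (v3[0] * v2[1] - v2[0] * v3[1]) / det
d = (v1[0] * v3[1] - v3[0] * v1[1]) / det
print(c, d)  # 4 -1: the coefficients exhibiting the redundancy

# Check: v3 really is c*v1 + d*v2, so {v1, v2, v3} is dependent
assert all(c * a + d * b == t for a, b, t in zip(v1, v2, v3))
```

(If det were 0 instead, v1 and v2 would be multiples of each other, so B2 would be linearly dependent either way.)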
Music: Epidemic sound, Summer nights 2
making a poll and stuff is brilliant, it makes it hard to lose concentration because you're doing something and not only listening (did not see that pun until the ad rolled in)
I love the way you teach, the way you encourage us to investigate for ourselves to deepen and reinforce our knowledge, and the way you make your sponsors attractive.
You really know about pedagogy and the way the human mind works when learning, being evident in the means you use to convey your ideas.
You are an inspiration for me, because I want to build myself up to be a teacher in the future.
Thanks for your content!
wish my linear algebra professor taught this as perfectly as you did.
I like the music you've incorporated in the background, and I appreciate the production effort you're putting in. I've seen a few people comment now that they don't like the music track so I just thought I'd throw this in so that you knew that not everyone thought so. Cheers!
Aww, that's really thoughtful of you! Yay :)
Very excited about this series!! I just started to learn linear algebra as a preparation to study some machine learning later, so this will be really helpful. Amazing work you do, keep up with the awesome content. Cheers from Brazil. :)))
Homework Time!
-I am lazy :P-
First, we start by proving uniqueness of representation. Let the vectors be, *v1* , *v2* , *v3* ... *vN* . Because they are linearly independent,
a1 ⋅ *v1* + a2 ⋅ *v2* ... + aN ⋅ *vN* = 0, only when all of the coefficients are simultaneously zero (this can be seen by taking any one vector to the other side, then dividing by its coefficient and saying that no non-zero solution to that equation exists because linear independence kicks in).
So, once we are given two representations of the same vector, we start by equating both representations, and taking one to the other side, resulting in a subtraction
Thus, if a1 ⋅ *v1* + a2 ⋅ *v2* + ... + aN ⋅ *vN* and b1 ⋅ *v1* + b2 ⋅ *v2* + ... + bN ⋅ *vN* represent the same vector,
then sum over i of (ai-bi) ⋅ *vi* = 0, so linear independence forces ai = bi for all i
say, x + y = 5, it has infinite solutions (every point on that line in cartesian plane)
Which means if we select B' as our basis, then we can represent the same vector in infinitely many ways, which violates uniqueness and it cannot be our basis. Thus, number of elements in our basis must be the same as B (not lower as dim(span B') decreases).
I guess this works. Does it?
In a linear algebra fashion, I would have wanted to go for a transformation matrix which preserves the number of vectors and then show redundancy, but I am not sure how to prove everything along that path.
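A quick sanity check of the uniqueness argument above, as a Python sketch (the basis and target vector are example values, not from the video):

```python
# Uniqueness of coordinates, made concrete (example basis chosen
# for illustration): in a basis {b1, b2} of the plane, the
# coordinates of any vector are forced, because the 2x2 system
# has a nonzero determinant.
from fractions import Fraction as F

b1, b2 = (F(1), F(2)), (F(2), F(3))
target = (F(5), F(8))

det = b1[0] * b2[1] - b2[0] * b1[1]
assert det != 0  # b1, b2 independent => a unique solution exists

a = (target[0] * b2[1] - b2[0] * target[1]) / det
b = (b1[0] * target[1] - target[0] * b1[1]) / det
print(a, b)  # 1 2

# Any other pair (a', b') representing the same vector satisfies
# (a - a')*b1 + (b - b')*b2 = 0, which the nonzero determinant
# rules out unless a' = a and b' = b.
assert a * b1[0] + b * b2[0] == target[0]
assert a * b1[1] + b * b2[1] == target[1]
```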
dear miss, I rejoiced with this new video of yours. I've been wanting to learn Dirac notation for years and I have also decided to study linear algebra this summer, because I wanted to audit a course on quantum computation next fall. thank you so much for being back. I can't wait for your next video!
So this enigmatic |v> just means v is a vector!! Curse you Dirac!!!
Yep, |a>, mathematically, means "A1+A2+A3+..." and can be rewritten in matrix form:
A1
A2
A3
...
YES! Look forward to the rest of the series! :D The polling thing built into the video was super helpful!
Yay! I really enjoyed using them, so I think I'll do it more often!
Yeah, nice feature!
@@LookingGlassUniverse did you learn algebraic topology for quantum computing
This video is amazing! Thank you so much for doing this.
Looking forward to your lessons on the Dirac notation. That will be very helpful :-)
Hi thank you for all your great videos! Could you tell me how do you make your videos? I've seen your video on how to get started with quantum physics and wanted to try your technique and make a video of the things I dont feel quite comfortable with.
I'm mindblown with your gripping dedication, thank you.
Really original presentation format and very accessible and enjoyable to follow!
This video would have helped me so much in my first-year physics courses at uni!
Really. This was more intuitive than my classes of Analytical Geometry and Linear Algebra. Love.
Very happy to hear it!
now so many things make sense!!!! I love your videos. Keep going. thanks
i’d never thought about vectors having a hidden established basis in 3 directions. thanks for this
great explanation, keep up the great videos (aka: thanks!)
I did the second exercise of making the blue vector out of the purple/pink and green vectors in my head, not on a piece of paper, but doing that did actually help me to understand the problem a lot better. So you were right about that!
Because by doing the problem you posed in my head, I instantly _[or as good as instantly, anyways]_ realized that you can indeed make any vector out of a linear combination of 2 other vectors. Just draw a line through the endpoint of the vector you wish to make, parallel to one of the 2 original vectors. Then extend the other original vector by whatever factor needed so that the extended vector ends on the line that was just drawn. Finally, add whatever positive or negative multiple of the first vector along the parallel line so that you end up with the vector that was to be created.
*GREAT JOB* in picking such a clear exercise that makes it so much easier to see that!!
I had to use a little more than your explanation for the 3D case, though. Maybe it's because 3D-grids with something other than right angles are hard to imagine, but..... For me, I had to remind myself that every 2 planes intersect in a line. So if you do the following:
- Take 2 of the original vectors and consider the plane they are both on.
- Take the other original vector and the "target" vector and consider the plane they are on.
- There will be a line -- in other words, a *vector* -- that will lie on both those planes.
From the 2D-case, we know we can make the "intersecting vector" from those first 2 original vectors. Then, again from the 2D-case, we can make the target vector from the 3rd original vector and the "intersecting vector", thus proving we can make anything in 3D.
The first of the questions on the "Brilliant" side is a bit weird. Yes, removing any of the 3 of them from the set would still make the set span the Vector Space, so the answer they are looking for is "All 3 of them are redundant", but once you remove any 1, the others are no longer redundant. So, really, shouldn't the answer be "Whichever one you decide to remove"?
Actually, the other "Brilliant" question is also weird. A and C are the *same* Vector Space. The bases of those Vector Spaces is thus the same Set, and that means that their bases are Subsets of *each other* . So I would say that both answers ii) and iv) are correct. Following the description in the question though, I am guessing they want us to go with answer iv), but I don't think that their reasoning for putting answer ii) as incorrect is valid.
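The parallel-line construction described above has a direct algebraic counterpart: finding the combination is just solving a linear system. A sketch (with made-up example vectors):

```python
# Expressing a target vector as a combination of 3 independent
# 3D vectors = solving a 3x3 linear system, here by Gaussian
# elimination over exact rationals. All vectors are example
# values chosen for illustration.
from fractions import Fraction

def solve(cols, target):
    """Solve sum_j x_j * cols[j] = target by elimination."""
    n = len(target)
    # augmented matrix [cols | target]
    m = [[Fraction(cols[j][i]) for j in range(n)] + [Fraction(target[i])]
         for i in range(n)]
    for c in range(n):
        piv = next(i for i in range(c, n) if m[i][c] != 0)
        m[c], m[piv] = m[piv], m[c]
        p = m[c][c]
        m[c] = [x / p for x in m[c]]
        for i in range(n):
            if i != c and m[i][c] != 0:
                f = m[i][c]
                m[i] = [x - f * y for x, y in zip(m[i], m[c])]
    return [m[i][n] for i in range(n)]

v1, v2, v3 = (1, 0, 0), (1, 1, 0), (1, 1, 1)   # hypothetical basis
target = (2, 3, 5)
coeffs = solve([v1, v2, v3], target)
print(coeffs)

# Check the combination really reconstructs the target
assert all(coeffs[0] * a + coeffs[1] * b + coeffs[2] * c == t
           for a, b, c, t in zip(v1, v2, v3, target))
```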
I really loved 3blue1browns series about linear algebra, but I think you're doing it better hahah. But both are amazing, great work :)
I like how you went back to the hand drawn stop-action style.
Love that you're doing this! I'm one of those students who didn't get any real intuition from their linear algebra classes, and after one semester I can only remember some concepts sparsely. Thanks
I really hope I can help at least a little! Thank you :D
The tired gray cells are stimulated again. Dankeschön.
You have a great way of conveying.
:O
cant wait for the quantum computing stuff :)
A nice proof for part 1 that involves a bit more linear algebra than you gave in the video: Let a basis B = {|v_1>, ... , |v_n>} (may as well use Dirac notation since it's your favourite). Then the identity map is Id = |v_1><v_1| + ... + |v_n><v_n| and the trace is trace(Id) = n. Since the trace is basis independent, all bases have the same size (namely n).
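For anyone who wants to see the basis independence of the trace in action, here's a small sketch (the matrices are arbitrary examples):

```python
# Checking the trace argument above on a small example (matrices
# chosen for illustration): trace(P^-1 * A * P) = trace(A) for
# any invertible change of basis P, so trace(Id) = n pins down
# the basis size.
from fractions import Fraction as F

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def inv2(P):
    (a, b), (c, d) = P
    det = a * d - b * c
    assert det != 0
    return [[d / det, -b / det], [-c / det, a / det]]

def trace(X):
    return sum(X[i][i] for i in range(len(X)))

A = [[F(2), F(1)], [F(0), F(3)]]
P = [[F(1), F(2)], [F(1), F(3)]]      # an invertible change of basis
B = matmul(matmul(inv2(P), A), P)      # A in the new basis

print(trace(A), trace(B))              # 5 5: equal traces
I = [[F(1), F(0)], [F(0), F(1)]]
print(trace(matmul(matmul(inv2(P), I), P)))  # 2 = the dimension
```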
Love your way of teaching... 🤩
This is so wonderful and playful. Presented this way, maybe I would've cared about learning math back in school so many years ago. Instead I have just done my best to ignore or avoid math all those years. And now, approaching 40, a self taught programmer, I decided that I want to move into machine learning/deep learning... So now I'm actually motivated and have a purpose to learning math, from getting more solid basic algebra knowledge to calculus and linear algebra. Can't afford to go back to school so it has to be self studies mostly late nights when the kids have gone to bed. But when the goal is being able to build AI? That's an amazing payoff! Thank you again for all your delightful videos!
keep going. I love your work.
You actually gave homework questions and answers....🎉i have never seen anyone in RUclips doing that....really grt...🎉keep doing it ...
Nice to see you again, and I love the addition of background music.
Yay! I stressed so much about trying to do that right.
You actually made me work on these problems. Good job on you!
YAY! That makes me so happy to hear.
Thank you very much for all the very insightful and informative videos over the past few years.
Thanks very much for sticking around for that long (especially as I've been very inconsistent).
This is a great video, very clear. But I have no clue how to solve that question on Brilliant about finding the redundant vector from the set. How is it possible to figure that out without knowing what the basis is? Help!
Fuar this might be even better than the Strang lectures and 3b1b vids!! Thank you for some awesome content!
Only found this channel because of Grant's video with the 3 houses problem, so glad I did! This is great content!
Thank you :D!
Is the basis vector made up of actual concrete values? Like you have been writing the basis vector so it consists of the values v1, v2, v3,... but when you have a matrix with actual numbers do the v's have actual number values that coorespond or do they remain just variables?
you are awesome and thanks for the class.
These videos are amazing.
Hi. You make awesome videos. I love your videos. I have also seen your other videos of quantum mechanics. May you please make some video related to philosophy of why quantum physics came in physics. And also some more topics like what is the meaning of mass, space, time , momentum , force. Please try to make some video on some of the topics.
I found your channel yesterday sis love from India .What software are you using to make these videos???
Thank you ma'am for the video
1:05
I can relate to that so much. Certainly the effort put and output are disproportionate, but for some reason one feels so proud of that creation. This is impossible to convey to the viewer. They shall never share our love for the ephemeral object but we know that thing deserves its own video.
I just watched one of your videos! It was fantastic- and the thing is though, even if people don't consciously understand the effort involved, they can feel it in the quality of the video. So basically, chin up, both of us.
Very nice video!!! love the visuals and the voice!
Thank you :)
Been enjoying your channel for years. You once had a video explaining bra-ket notation which I am unable to find. Can't find it on your site. Will you please post it again?
On what platform do you make these videos? Please let us know. Thank you
We have clearance Clarence.
Roger Roger!
What’s our vector Victor?
This comment is great.
Over Oveur!
This was great, I enjoy your teaching approach. Everyone is making the point that the polls were great and I 100% agree, that was a wonderful addition.
One thing I think would have been worth mentioning is how to interpret those columns of numbers as arrows, and how to read some of the other ways to write vectors. I think showing that would help a lot in future, but that might just be me?
Regardless, 10/10 loved it
Great point! I'll see if I can put that into the next video naturally.
Thank you very much!
Thank you.
Great video as always!
Thank you :)
I think you make wonderful videos that are both on point and easy to understand, i hope you will have a great weekend!
That's so sweet! I hope you do too :)
I love this
I'm just happy that something like brilliant actually knows about your videos and like them also
I was just trying to find out more about vectors to try and understand christoffel symbols when this popped up so yeah thanks for the convenient timing also also
can you please do a video or two around the christoffel symbols
Thank you!
Hmm, I wasn't planning to do Christoffel symbols in this since that's more under differential geometry... although it'd be great to learn it eventually so I cover GR. What are you learning them for?
Tks from Brazil
is span only defined for pair of vectors like constant multiple of one vector
Wait, what does the stack shorthand have to do with dirac notation being better than "arrow hat" notation? Arrow hat notation doesn't assume a basis either. It's just a symbol under an arrow. Am I missing something?
AH, you will see the magic of Dirac notation soon...
But basically: Dirac notation never used the column thing. This way the basis always has to be written out. That might sound tedious and make calculations more difficult, but actually the way you do matrix multiplication etc. is (I think) quite streamlined in that notation.
Dirac notation is the best.
I'm planning on studying more linear algebra for a quantum mechanics course that i'm taking next semester. I was thinking about studying the "Linear Algebra Done Right" by Sheldon Axler. Do you think it is a good idea, or that book has way too much content for what I'll be needing?
There are some short videos by the author based on the book here on YT.
So, I am not a physicist or a QM person, but I really loved Axler for my Linear Algebra class. The thing to be aware of is that it is very focused on abstraction and concepts. There is relatively little interaction with matrices because it focuses on linear transformations (which matrices represent). Determinants are left until the last chapter. However, I think it gives a really great idea of what these things represent to a mathematician and it isn't done in a super computational way.
When I was preparing for this series I wanted to get a good textbook on Linear Algebra, and I wanted to go for that one since it had great reviews. Unfortunately it was out at my library- but now the undergrads are on holiday so it's back in! I'll go have a look.
A book that I did end up getting was 'Linear Algebra: A Pure Mathematical Approach' by Rose. It's pretty good- although a lot drier than usual linear algebra books, since it's all proof based. I think that's great for learning linear algebra a second time, but maybe not for the first?
Thanks.
The book you mention has a review on amazon which says that its end of chapter problems are quite interesting, and I know that doing those kinds of problems can really help one understand the beauty of the subject. So, I'll try to get the book.
Concerning the "Linear Algebra Done Right", I was wondering if I could leave out of my study topics such as - Operators on Real Vector Spaces- since, as I understand, quantum mechanics deals in complex vector spaces.
I had a quick leaf through linear algebra done right today. It looked good, but I didn't see what was covered in the chapter you mentioned so I can't say whether you should skip it. Maybe not though? Good to build intuition in familiar vector spaces. But yes, the important topics for QM are about unitaries, hermitian matrices and eigenvectors. Oh and tensor products are something you need to understand really well. Also, if you go further, Hilbert space theory is also very useful.
I wish I had this video when I was taking Linear Algebra last semester. By the way, I flunked :P
awesome
Discovered your channel this week and really loving your style. (Also, I'm a huge Alice nerd, so that was kinda gonna pull me in...) Do you draw your own lineart for Alice, the White Knight, Chesh, etc? It's very heavily influenced by Tenniel's originals (obvious in this vid where you show them in parallel, but also visible elsewhere eg seeing Alice in the same pose where she was looking up at Chesh, but with her hands placed differently), but not absolutely identical.
Yay! A fellow Alice fan :D Yes, I draw them, but I'm really not very good. I want to get a lot better. I love the original drawings, but I suppose I like to change a couple of things with them. And I've been trying out watercolours a bit recently, and I like the look. We'll see where it goes :)!
@@LookingGlassUniverse "Not very good"? Well, I respectfully disagree there :) Any Alice fan should recognize the Tenniel influence, and it's clear that the Tim Burton Chesh has inspired your depiction of his eyes. And they're integral to the videos. (See, I can do math puns too...) I love 'em!
@@rosuav Oh god, you're far far too kind! Seriously, that's so kind and encouraging :D!
Hurrah!!!
To what extent is Brilliant's quiz model based on Derek Muller's PhD thesis? If i'm not wrong, it was on exactly this topic, and found this model to be most effective
so what is last one's answer?
i think it's (ii) but polls suggest it should be (iv)
I am so thankful for my teachers and for my country's education system that taught me this way. I'm horrified to hear that in other places people are learning it otherwise.
Happy to have the sponsorship, but you might want to also consider Patreon
Thank you :) I've decided I will. But only once I've got my consistency up, because I want it to be worthwhile for anyone who pledges.
I have a question how do we know where we should add the vectors
That was so adorably nerdy. ^^
Yay, glad you liked it!
I remember getting so frustrated with how linear algebra was taught when I did it. It's a beautiful subject but when concepts like linear independence, basis etc are taught without intuition it drove me crazy. 3B1B and yourself deal with this brilliantly. I hate how people say they may have aced exams in college but never really understood the stuff. That baffles me.
Okay, I see a lot of people got the last question wrong (currently 32% correct). So let me explain.
First, we're talking about the space these vectors span. So when talking about the vector space of (1,2) we're not talking about this vector with its length, but about all points which can be reached by a linear combination of the basis vectors (in this case only one). So with one vector we are bound to going in one direction, but we can stretch, squish and invert it as we want. Therefore we can reach every point that lies on the line our vector lies on, i.e. our vector space is a line. Precisely the line that contains all points that can be represented as a multiple of (1,2).
Let's look at the third basis next. I think if you got it wrong, really draw it. (2,4) = 2*(1,2) and (3,6) = 3*(1,2). This means both new vectors are linearly dependent on (1,2). Visually you can see that they lie on the same line. So if we combine those, where can we get? Well, still only to any point on the line. It's a bit like if you had a chess piece that can move up or down as much as it wants, but not left or right. It can make 7 moves from wherever it stands, but all of its moves, viewed as vectors, are linearly dependent. It can't go outside this line, so this is its vector space. If we introduce another piece that can move up or down one step at a time, it can take only two moves, one in each direction. But eventually it can reach every square in its column, just like our more powerful piece before. So essentially, because both can reach only one column, they have the same vector space (assuming they start in the same column), even though piece one can do all the moves piece two can do plus some additional moves - those additional moves don't add anything to the reachable squares of the piece.
The same thing happens with our vectors. "Move" (1,2) can be done by both bases, but the second can also do "moves" (2,4) and (3,6). However, both of these can be "mimicked" by doing move (1,2) two or three times.
As for how to prove this mathematically: Any point in the vector space of the third basis is representable as a*(1,2) + b*(2,4) + c*(3,6), i.e. a linear combination of the basis vectors. We can rewrite this as a*(1,2) + b*2*(1,2) + c*3*(1,2) = (a+2b+3c)*(1,2), because as we just noticed we can rewrite the last two vectors in terms of the first one. But look what we have now: any point in the vector space can be written as (a+2b+3c)*(1,2), i.e. some number times (1,2). This is exactly the same vector space as our first basis! And again, visually it makes sense. All the vectors lie on a line, so no matter how we combine them, we have to stay on that line. Therefore this vector space is exactly the same line as for the first basis.
The second basis is different. (2,3) and (1,2) are linearly independent. If you draw it out, you'll notice that they are not on a line anymore. If you want, (2,3) allows us to "escape" the line that was the vector space in the other two bases. With the method shown at 5:55 you can see that with those two vectors you can reach every point on the plane. So our vector space IS the plane.
One last quirk: It's easy to see that A is a subspace of B and C is a subspace of B because A and C are the same subspace and the line that is their subspace is in the 2D plane because every line in 2D is on the 2D plane. But also A is a subspace of C and C is a subspace of A. Why? Well, it's a bit like with sets. The set {1,2} is a subset of {1,2,3} because every element of {1,2} is in {1,2,3}. But by that definition also the set {1,2,3} is a subset of {1,2,3} because every element of {1,2,3} is in {1,2,3}. Similarly the line that is the vector space of A is contained in the line that is vector space of C.
If you're not familiar with sets: We can think of "subnumbers" as follows: a is a subnumber of b if a ≤ b.
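The collapsing step in the explanation above (everything becomes (a+2b+3c)*(1,2)) is easy to verify directly; a sketch with arbitrary example coefficients:

```python
# Any combination of (1,2), (2,4), (3,6) collapses to
# (a + 2b + 3c) * (1,2), so the span is just the line through
# (1,2). The coefficient triples below are arbitrary examples.
from fractions import Fraction as F

vs = [(F(1), F(2)), (F(2), F(4)), (F(3), F(6))]
for a, b, c in [(F(1), F(0), F(0)),
                (F(2), F(-1), F(5)),
                (F(1, 2), F(3), F(-2))]:
    x = a * vs[0][0] + b * vs[1][0] + c * vs[2][0]
    y = a * vs[0][1] + b * vs[1][1] + c * vs[2][1]
    s = a + 2 * b + 3 * c
    print((x, y) == (s, 2 * s))   # True: a multiple of (1, 2)
```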
Thank you for this amazingly thorough explanation! It's fantastic, and the analogy with chess is very interesting. I looked up the Knight's tour- what a fun little problem. I'm going to try solve it at some point so thanks for the entertainment :) Could be a nice video maybe?
If only my teacher would have explained like this
Did you learn algebraic topology for quantum computing
3Blue1Brown's videos are great: very, very visual.
Wow, thanks so much LGU! I really learned a lot with this video and I am thick headed for maths. I'm not sure I got that tricky question though. I thought it was A and saw in the poll more people picked D. So I'm wondering if I missed something there.
B isn't a subspace of C (the (2,3) is not the same as (2,4))... you had to catch that. That was the "tricky bit".
The answer is actually ii). Yeah, as Brent said, B isn't a subspace of C. In fact A=C. That was a tough one!
I guess I should have re-looked at what D said too! :-). P.S. I love your (Kiwi?) accent.
there's a typo in the poll, second question, choice b).
In the description, answer b is (1,3,5); in the poll it's (1,2,5).
Thank you for noticing!
Ma'am, if I am not wrong, a vector is any physical quantity which has magnitude and direction. But by this definition electric current is also a vector. But it is not. I think that, to be a vector, a quantity should also obey the laws of vector addition.
For the first brilliant question, I just guessed that any of them is redundant because if one of them is, then the remaining two define a plane, which the third vector must have been on to have been redundant. I also eliminated two of them being on the same line because there aren't any multiple choice answers that say "a or b" or anything like that, but also the second and third coefficients of each vector are coprime, meaning none of them can lie on the same line from the origin (I think, this is hardly a rigorous proof but it feels intuitively true). The entirety of my knowledge on this subject comes from a single day in my high school physics class and this video, so I have no idea if the intuition to which I'm defaulting to solve this problem is reliable or not.
You're absolutely right that none of them are multiples of the others! Your intuition for that is correct. You're also correct in saying that "if one of them is [redundant], then the remaining two define a plane," but, I think you were saying that this is not the case. Actually it is. Say you have vectors v_1, v_2 and v_3 and that v_1 can be written as a linear combo of the other ones. Then actually, this means that v_2 can be written as a combo of v_1 and v_3 (try to prove this!). So actually, any 2 of them define the same plane. So I think you were very close to the right reasoning but missed this point.
The poll usage here is pretty cool. Also, I recently found out about this interactive book immersivemath.com/ila/index.html and while I haven't really taken my time to explore it in depth, it looks like a really nice and hands-on way to learn about linear algebra
also i could probably cook up a 3d graph for the series if you would like to I would give it a shot
That's so cool! For the series itself I've got some ideas for how to do some of the 3D stuff, but some of them might be a bit hard. I'll let you know :) Thanks very very much for offering!
What's the background music?
(8:06) "* use your imagination"
* Use more play dough ! :P
Curious fact:
I've been wanting to see Prof. Gilbert Strang videos on linear algebra at MIT (browser tab open for weeks now), so hopefully your videos will jump start me, or even (who knows) do the job by themselves ?! That would be amazing !!
PS: Please, please, pleeeeaaase... avoid more buzz sounds like at (7:11). I would be deaf by now if I was wearing headphones !
Haha! yup, sorry.
I haven't watched those videos but I bet they're great! I hope this can serve as a gentle intro though :)
Hey, thanks for the great video! From a Physics perspective though, how does geometrically and trigonometricaly manipulating lengths on graphs give us the magnitude of the resultant? Furthermore, what could be the proof or reasoning that the Triangle or Parallelogram laws for vectors actually give us the resultants?
Thanks
idgi, why is the "direction and magnitude" concept any less "just a stack of numbers" than the other format? Maybe it's because I'm already familiar with vectors, and if I weren't it would be more intuitive, but I don't see a meaningful difference between these two ways of representing a vector
I would appreciate if anyone could tell me where I would find the "poll - button"
Here's what I have for the first proof question... except I just kind of wrote myself in circles and didn't actually prove anything. But I spent too much time on this already so I'll submit what I have (which is reminiscent of how I did homework for actual classes in school...):
We have a space with a basis of N vectors; call them v_1, v_2, ... v_N (arrows are difficult in typed RUclips comments, sorry!). Let's make a new basis for the same space, with M vectors (w_1, w_2, ... w_M). Our goal is to show that N has to equal M.
We start by noting that if we have an arbitrary vector K and it can be written as a linear combination of v's (uniquely, in fact, in accordance with Proof 2), it must also be able to be written as a linear combination of w's. That's what it means for them to span the same space. So written out, this means:
A_1 v_1 + A_2 v_2 + ... + A_N v_N = B_1 w_1 + B_2 w_2 + ... + B_M w_M
(where A_n and B_m are the coefficients in the respective bases). But w_1 can also be written as a linear combination of v's, as can w_2, w_3, ... w_M. If we do this out, on the RHS, for the |v_1〉 term (I'm switching to Dirac notation even though you didn't talk about bra-vectors and dot products because I'm a cheater and jumping ahead - though really, 〈 w_1|v_1〉 is just a constant scalar that gives the component of w_1 in the v_1 direction):
( B_1〈 w_1|v_1〉 + B_2〈 w_2|v_1〉+ ... + B_M〈 w_M|v_1〉) |v_1〉
And of course all the other |v_n〉's have similar terms. Alternatively, we could've instead expanded out the v's on the LHS in terms of the w's. Collecting the |w_1〉 term, this would give
( A_1〈 v_1|w_1〉 + A_2〈 v_2|w_1〉+ ... + A_N〈 v_N|w_1〉) |w_1〉
...But now I'm just stuck on where to go from here. It probably involves equating equal terms on both sides, which I think we can do because our original K was an arbitrary vector, so in order for this equation to hold true always, we have to be able to break it up by basis vectors. So
A_1 = B_1〈 w_1|v_1〉 + B_2〈 w_2|v_1〉+ ... + B_M〈 w_M|v_1〉
etc. for A_2 ... A_N
and
B_1 = A_1〈 v_1|w_1〉 + A_2〈 v_2|w_1〉+ ... + A_N〈 v_N|w_1〉
etc. for B_2 ... B_M
Which naturally can be rewritten as two matrix equations (wherein we resort back to the "meaningless column of numbers" form of linear algebra that you seem to dislike, with A = [A_1; A_2; ...; A_N] and B = [B_1; B_2; ...; B_M]):
A = P B
B = Q A
where P is the matrix whose (n,m)th element is 〈 w_m|v_n〉 and vice versa for Q. ...Except that means that P is the transpose of Q, which means that I must've accidentally assumed that the basis was orthogonal somewhere because the dot product is commutative. Whoops.
But trudging on anyway, we can compose these two equations (subbing the second into the first where B is, and also the first into the second where A is) to get:
A = PQ A
B = QP B
which means that P and Q are both left inverses and right inverses of each other. I think there's a theorem in linear algebra that says that the only way that matrices can be both left and right inverses of each other is if they are square and of the same size... So QED if I can cheat and say that, but I think that theorem is actually this one I'm trying to prove right now written in different words. So... dunno. I probably should've looked at the hints before trying this.
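For what it's worth, the mutual-inverse step at the end can be checked numerically on a concrete pair of bases (a sketch in NumPy; the two bases below are made-up examples, not from the video):

```python
import numpy as np

# Two different bases for the plane, stored as columns (made-up examples).
V = np.array([[1.0, 0.0],
              [0.0, 1.0]])
W = np.array([[1.0,  1.0],
              [1.0, -1.0]])

# P converts w-coordinates to v-coordinates: its columns are the w_j
# written in the v basis.  Q goes the other way.
P = np.linalg.solve(V, W)
Q = np.linalg.solve(W, V)

# Composing both changes of basis must do nothing, so P and Q are
# two-sided inverses of each other (hence square and the same size).
assert np.allclose(P @ Q, np.eye(2))
assert np.allclose(Q @ P, np.eye(2))
```
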
This is just the best, thank you so much for doing this. You have no idea how glad I am you wrote all this out regardless. I set this question thinking that you must be able to prove it in a very similar way to what you were attempting at the start (write each of the basis elements in terms of the other basis vectors), and I thought you'd get some simple and elegant contradiction somewhere. But then I tried writing it out (after I finished the video, I'm such an idiot) and got myself in a huge knot, and then realised that to solve it you need to resort to writing out matrices and doing some Gaussian elimination and oh my god. I didn't want to set a question about linear algebra that required so much... linear algebra! So yeah... In the description I explained how you might do it for a simple case where n=2, m=3 and hoped people would get the general idea from that. Massive cop-out!
So I'm glad you really truly tried it- sorry for setting such a tricky question. I hope it was instructive for you anyway. It certainly was for me. And I do actually believe that getting stuck and trying things is the way you should learn maths, so.... I guess this was educational? I'm trying to justify it to myself here.
I appreciate your use of Dirac notation! It is just so much better right? Did you study physics?
Yep, a physics guy here. Dirac notation is definitely super useful in a lot of cases (like this one... if the proof were to have actually gone somewhere). Anyway, thanks so much for your videos; I really like them!
Lol love the bra-ket vectors in the corners!
:D Thank you for noticing!!
Oh, great spot! :)
Victor's vectors vex vexillologists victoriously. Vicariously, Victor veers voyeurs' void vectors.
Love your videos! Can you tell us why your channel is Alice themed?
I don't know but maybe it's that she just likes the book
I've always assumed it's a reference to the surrealness of quantum mechanics.
Lewis Carroll was actually a mathematician and there are plenty of similarities between Alice in Wonderland and quantum 'weirdness'... but they also both just happen to be awesome, don't you think? :)
I should properly tell you guys one day, shouldn't I? But I think all the other replies here basically explain it.
I'm starstruck :D
Can I do this for the proof? (Here I am representing basis vectors as |1》 etc.)
|v》=a|1》+b|2》=a'|1》+b'|2》
Assuming the coefficients are not equal,
|v》×|v》 = 0 = (a|1》+b|2》)×(a'|1》+b'|2》) = (ab'-a'b)(|1》×|2》)
so ab'-a'b = 0, i.e. a/b = a'/b'
so the coefficients are proportional, hence we have a scale factor, say k (so a'=ka and b'=kb). If the basis vectors have the angle ß between them, then
|v|^2 = a^2 + b^2 + 2ab cosß = k^2 (a^2 + b^2 + 2ab cosß)
Dividing and square-rooting, we get k = ±1.
But if k = -1, then from the rule
|v》•|v》 = |v|^2
we would get |v|^2 = -(a^2 + b^2 + 2ab cosß),
which is not true since |v|^2 is positive. Hence k = 1, so a' = a and b' = b, contradicting the assumption.
Hence proved.
That's a quarter of semester's worth of information in under 10 minutes.
v = a v1 + G v2
a' is defined as a coefficient different in value to a, and G' is different from G.
If: a v1 + G v2 = a' v1 + G' v2
Then: av1=a'v1, Gv2=G'v2
This is only true when a=a', G=G' since we can only add like terms, and v1 and v2 aren't like terms since they are linearly independent
So if a ≠ a' and G ≠ G', then a v1 + G v2 ≠ a' v1 + G' v2.
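The same uniqueness claim can be sanity-checked numerically: with linearly independent basis vectors, solving for the coefficients is a square invertible system, so there is exactly one answer (a sketch in NumPy; the basis vectors here are made-up examples):

```python
import numpy as np

# A (non-orthogonal) basis of the plane -- made-up example vectors.
v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, 1.0])

# Build a vector with known coefficients a=2, G=5.
v = 2.0 * v1 + 5.0 * v2

# Writing v = a*v1 + G*v2 is a square linear system.  Because v1 and v2
# are linearly independent, the matrix is invertible and the solution
# is unique -- we must recover exactly (2, 5).
a, G = np.linalg.solve(np.column_stack([v1, v2]), v)
assert np.isclose(a, 2.0) and np.isclose(G, 5.0)
```
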
We saw her hand and part of an arm!!! PogChamp
Vectors are a mathematical tool. You can use them to represent points in a space (with arrows if you want to be fancy), and that's a nice way to visualize them, but nothing in their definition implies they are arrows. I'm curious what you think a matrix is.
Tungom if they are a point in space, then inherent to their definition is the origin of that space. So, two points. So, an arrow. Also, how do you add two points? It doesn't really make sense.
No, they are not points in space; that is just a way to visualize them. Read again.
There are many interpretations though. Go ahead and check out 3blue1brown's essence of linear algebra series (especially the first episode) for more (also rigorous) interpretations!
Can anyone explain Dirac notation to me? This video got me intrigued.
I will :) Eventually. But if someone else could explain it here that'd be great!
2v2 - v1?
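A guess like this is easy to check mechanically (a sketch in NumPy, using the Q1 vectors from the description):

```python
import numpy as np

# The three vectors from Q1, as rows.
M = np.array([[1, 2, 3],
              [1, 3, 5],
              [2, 5, 8]], dtype=float)

# Rank 2 with 3 rows means one vector is a combination of the other two.
assert np.linalg.matrix_rank(M) == 2

# Solve for the coefficients writing the third vector as c1*v1 + c2*v2.
coeffs, *_ = np.linalg.lstsq(M[:2].T, M[2], rcond=None)
assert np.allclose(coeffs, [1.0, 1.0])   # (2 5 8) = (1 2 3) + (1 3 5)
```
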
I’m sorry, but I’m not sure if that condition written in cyan (bottom right) at 9:20 is entirely correct. It should require only that at least one a_k is non-zero, rather than all of them. This could give some people the wrong idea (namely, that if a vector is linearly dependent on only a proper subset of the other vectors, it isn’t necessarily redundant).
I love your videos btw 😁
Won’t a vector space with 2 basis vectors always be a subspace of a vector space with 3 basis vectors? Because with 3 linearly independent vectors, you can reach any point in R3 and with 2 linearly independent vectors you can reach any point in R2... Or am I misunderstanding ‘subspace’?
There are trivial (degenerate) cases that make your statement false. For instance, imagine vectors A and B spanning a plane. Then imagine vectors C, D, and E that compose a basis but are co-linear. So span(A, B) is not a subset of span(C, D, E).
heyandy x So by co-linear you mean that C=kD=mE (k, m some constants), right? But I thought a basis could not be composed of vectors like this, because this would imply redundancy, right? (Clearly C can be written as a linear combination of D and of E.)
Oh. Yeah that could be possible. I was not aware of that rule for a basis.
DANGER!
3D Universe about to collapse in 7:11
Hmmm, I feel like this is going to be very similar to 3b1b's series on this topic; he went about it in a very similar way.
I think the start is. But the real reason I wanted to do these videos is because the kind of Linear Algebra you need for QM has a different (and more narrow) focus. It's all about orthonormal bases and unitaries. So I wanted to make a short video series just covering that.
Looking Glass Universe Skimming over the Wikipedia articles for those, it makes even more sense why you chose this introduction. Interesting.
I think there is an error in your description of the linear dependence condition you give at 8:55. There is no reason why you should be able to write a vector which, when added to your basis, makes it linearly dependent in such a way that all the coefficients of your original basis are non-zero.
Oh yeah! Thanks for pointing that out. I was meant to move all the vectors to one side of the equation and write =0 on the other side. Oops....
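Written with everything moved to one side, the corrected condition is that a_1 v_1 + ... + a_n v_n = 0 for some coefficients that aren't all zero. Concretely, with the Q1 vectors from the description (a quick NumPy check):

```python
import numpy as np

v1 = np.array([1, 2, 3])
v2 = np.array([1, 3, 5])
v3 = np.array([2, 5, 8])

# (a_1, a_2, a_3) = (1, 1, -1) is a non-trivial combination giving zero,
# so the three vectors are linearly dependent -- no need for *every*
# coefficient to be non-zero, just at least one.
assert np.array_equal(1 * v1 + 1 * v2 - 1 * v3, np.zeros(3))
```
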
A hand!
But you said that last time too!
XD
Where is the poll? Has it been disabled?
Yes, YouTube disabled it.
So I guess the answer is that C is a subspace of A (actually C and A have the same span, so I’m assuming I can say a vector space is a subspace of itself) and A is a subspace of B. Is that correct?
That's correct!! And yes, you can say a space is a subspace of itself.
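For anyone who wants to verify this mechanically, span containment can be tested by comparing ranks: span(S) sits inside span(T) exactly when adding S's vectors to T doesn't increase the rank (a sketch in NumPy; `span_dim` and `is_subspace` are my own helper names, not a library API):

```python
import numpy as np

def span_dim(vectors):
    """Dimension of the span = rank of the matrix with the vectors as rows."""
    return np.linalg.matrix_rank(np.array(vectors, dtype=float))

def is_subspace(small, big):
    """span(small) is contained in span(big) iff adding small's vectors
    to big doesn't enlarge the span."""
    return span_dim(list(big) + list(small)) == span_dim(big)

# The three spaces from Q2 in the description.
A = [(1, 2)]
B = [(1, 2), (2, 3)]
C = [(1, 2), (2, 4), (3, 6)]

assert is_subspace(C, A) and is_subspace(A, C)  # C and A span the same line
assert is_subspace(A, B)                        # the line A sits inside the plane B
assert not is_subspace(B, A)                    # but B is not inside A
```
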