Basis and Dimension
- Published: Jan 19, 2025
- MIT 18.06SC Linear Algebra, Fall 2011
View the complete course: ocw.mit.edu/18...
Instructor: Ana Rita Pires
A teaching assistant works through a problem on basis and dimension.
License: Creative Commons BY-NC-SA
More information at ocw.mit.edu/terms
More courses at ocw.mit.edu
When you write the vectors as columns and perform row operations on the matrix, the operations mix the components within each vector. For example, a row operation on a column vector like [x, y, z, w] produces combinations such as x + y inside that same vector, so its original components are no longer kept separate. When you write the vectors as rows, however, row operations don't mix components within a single vector: each row is a different vector, and you only add, subtract, or exchange whole vectors, which is fine. That's why you cannot use the final columns of the echelon matrix when the vectors are written as columns and you eliminate by rows: you are not forming linear combinations of different vectors, you are mixing up the components inside each vector.
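A minimal sympy sketch of the point above, using made-up vectors rather than the matrix from the video: with vectors as rows, a row operation only adds or subtracts whole vectors, while the same operation with vectors as columns mixes components inside each vector.

```python
from sympy import Matrix

# Hypothetical vectors, for illustration only (not the video's data).
v1 = [1, 1, -2, 0]
v2 = [1, 2, 0, -4]

rows = Matrix([v1, v2])      # vectors stored as ROWS (2x4)
cols = rows.T                # the same vectors stored as COLUMNS (4x2)

# Row op R2 -> R2 - R1 on the "rows" matrix: row 2 becomes the vector v2 - v1,
# still a combination of the original vectors, so the span is untouched.
rows_after = rows.copy()
rows_after[1, :] = rows[1, :] - rows[0, :]
print(rows_after)

# The same row op on the "cols" matrix subtracts component 1 from component 2
# INSIDE each column, (x, y, ...) -> (x, y - x, ...): the resulting columns are
# neither the original vectors nor combinations of them.
cols_after = cols.copy()
cols_after[1, :] = cols[1, :] - cols[0, :]
print(cols_after)
```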
Thanks 🎉 ... good work. It's a basic thought, but yeah, there should be clarity.
Thank you so much for your explanation!
These MIT lectures are too good to be true. Thanks to all behind these videos.
Thanks a lot for showing the case of these vectors as columns. I had solved the matrix for pivots and chosen the first three columns of the echelon matrix as the basis. But clearly, as you pointed out, I was wrong. Awesome tutorial.
But in the next lecture, #10, at ruclips.net/video/nHlE7EgJFds/видео.html, Prof. Gilbert Strang says that the basis is the PIVOT COLUMNS!!
@@bridge5189 Actually, a few seconds before that, Prof. Strang explains it by saying "the pivot columns I'm interested in are columns of A, the ORIGINAL A".
ruclips.net/video/nHlE7EgJFds/видео.html
@Indrajeet It would be possible to pick the final columns if we performed column elimination, because then we would only have performed linear combinations of the columns.
Of course, then it would be the same as writing the vectors as rows and doing row elimination, which was the first method explained in the video.
@@bridge5189 The basis could be the pivot columns of the initial matrix, not the pivot columns of the matrix after elimination.
@@bridge5189 Pivot columns here means the columns of the original matrix at the pivot positions, not the columns of the matrix after elimination.
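A small sketch of what this thread is saying, with an illustrative matrix rather than Prof. Strang's example: the pivot indices reported by rref are used to pick out columns of the original A, and sympy's columnspace() makes the same selection.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 7],
            [1, 2, 4]])               # column 1 is twice column 0, so rank 2

R, pivot_cols = A.rref()
print(pivot_cols)                     # (0, 2): the pivot positions

basis = [A[:, j] for j in pivot_cols] # take those columns from the ORIGINAL A
print(basis)                          # a basis for the column space C(A)
print(A.columnspace())                # sympy selects the same original columns
```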
I think this girl and the Asian one are the best TAs so far in any MIT OCW course.
I just fell in love with this teacher. You gave me such a great understanding.
Thanks, MIT, for sharing such a great teacher and her teaching with the whole world. A learner from India.
Thanks for pointing out that you can use the transpose matrix to solve the problem. That was exactly my question.
Hi, do you know why they have the same number of pivots (a matrix and its transpose)?
zhao peter It happens to be the same pivot positions here, but in general a matrix and its transpose always have the same rank, so when you do rref you always get the same number of pivots :)
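A quick check of that reply, using a small matrix made up for illustration: a matrix and its transpose always have the same rank, i.e. the same number of pivots, even though the pivot positions can differ.

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [2, 4],
            [0, 1]])                    # 3x2 matrix of rank 2

_, pivots_A  = A.rref()
_, pivots_AT = A.T.rref()

print(len(pivots_A) == len(pivots_AT))  # True: rank(A) == rank(A^T)
print(pivots_A, pivots_AT)              # (0, 1) vs (0, 2): positions differ
```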
It would be good to know why you can use the rows of the echelon matrix when doing vectors-as-rows, but can't use the columns of the echelon matrix when doing vectors-as-columns. The fact is stated, and justification is given in terms of the example ("not enough numbers"). But the reason why the methods aren't symmetrical is not explained. I believe there ought to be a good geometric explanation for this, or at least something in terms of the definitions of the spaces.
This is because the columns of the echelon matrix are not linear combinations of the original columns. Producing the echelon matrix is a series of row operations (new rows are linear combinations of the original and already-modified rows), which preserves the row space and the linear independence of the pivot rows, not of the pivot columns. That's why she said you may even use the original rows that correspond to the pivot rows to form a basis for that space.
@@robertchu4092 Hey, thanks! I had the same doubt. This was a good explanation.
@@robertchu4092 This was helpful! Thanks!
Since there are 5 column vectors and each vector belongs to R^4 (we can have at most 4 linearly independent vectors in R^4), we don't even need to do Gaussian elimination to check that they are dependent.
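A quick sanity check of that observation, with an arbitrary 4x5 matrix made up for illustration: the rank of a 4x5 matrix is at most 4, so its 5 columns can never be linearly independent.

```python
from sympy import Matrix

M = Matrix([[1, 0, 2, -1, 3],
            [0, 1, 1,  4, 0],
            [2, 1, 5,  2, 6],
            [1, 1, 3,  3, 3]])   # any five vectors of R^4 written as columns

print(M.rank())                  # the rank can be at most 4
print(M.rank() < M.cols)         # True: the 5 columns must be dependent
```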
Why is it that elimination on the column vectors changes the column space (7:06), but elimination on the row vectors doesn't change the row space (4:40)?
I got the answer.
Lec 10, 24:00: a row transformation on a matrix A doesn't change its row space but does change its column space.
@@ashutoshtiwari4398 i was about to comment the same thing :)
@@ashutoshtiwari4398 thanks
Because you are performing row operations on the column vectors, you inevitably change the column space. If you performed column operations on the column vectors, you would not change the column space. After transposing the matrix and performing row operations, the column positions of the leading ones correspond to the row positions in the original matrix.
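A tiny illustration of the Lecture 10 fact quoted in this thread, with a made-up matrix: a row operation is left-multiplication by an elimination matrix E, so the rows of EA are combinations of the rows of A (same row space), while the columns of EA are E times the columns of A (generally a different column space).

```python
from sympy import Matrix, eye

A = Matrix([[1, 2, 0],
            [1, 2, 1],
            [2, 4, 1]])

E = eye(3)
E[1, 0] = -1                  # E encodes the row operation R2 -> R2 - R1
B = E * A                     # same as applying that row operation to A

def same_span(gens1, gens2):
    # Two sets of row vectors span the same space iff stacking the two
    # generating sets does not increase the rank.
    stacked = Matrix.vstack(gens1, gens2)
    return gens1.rank() == gens2.rank() == stacked.rank()

print(same_span(A, B))        # True:  the row space is unchanged
print(same_span(A.T, B.T))    # False: the column space has changed
```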
So if I wrote the vectors as rows and did the elimination, I can directly use the final 3 rows (with pivots)?
Yep
Why do those vectors become rows instead of columns of the matrix? Is it because they have to follow the rule of forming an mxn matrix where m < n? I'm confused.
We can write them either as rows or as columns; the rank of the resulting matrix is the same, so the dimension of the space they span is the same either way.
At 4:17, how can you conclude that the vectors are linearly independent based only on the number of pivots?
Thank you Ana and MIT!
Please confirm: is (1,1,-2,0,-1) a row vector or a column vector? While solving, the TA takes it as a row vector. Is that correct?
thanks a lot for the clarification at the end
Don't we take the columns as the basis?
I still don't understand how the new vectors of the echelon form span the same space.
Can I solve it by finding the rref of the given matrix?
excellent way of teaching👏👏👏👏
love this course
Very clear. Thank you very much!
Great recitation!
Just wooow thank you 😢
Hey, I know this is a stupid question. What is the transpose of this universe?
"esrevinu " 😂
thank you so much
Thank you!
Thank you :)
Wow....
Thank you very much!
Thank you!