Bro you're a goat. I never comment, but u made everything so much easier to understand than the other tutors who just yap about definitions; you explain the intuition. Love it, def gonna start watching u more for linear.
Thank you so much for your comment. Are there any linear algebra topics you would like to see?
@DrewWerbowski
Determine if U is a subspace or not, with justification.
Finding eigenvectors and eigenvalues and diagonalization.
Gram-Schmidt Orthogonalization Algorithm and computing a projection
Finding a basis for a vector space
Finding the matrix that describes the linear transformation (9.1).
Least Squares Approximation
Singular Value Decomposition
Proof of an important Theorem
@Ahmed-yo7gb Thank you for the comprehensive list! I already have videos on many of those topics on my channel, and I will add some of the others to my list.
I never comment on videos, but you, my friend, just aced this chapter. Khan Academy complicates it for no reason. Great job
Appreciate the support! Thank you!
Thank you for this; you make things much easier to understand.
omg i literally have my final tmrw and u just explained the concepts i've been dreading the most in the most understandable way ever omfg ur the goat
Thank you! Hope your final went well!
Thank you so much. Finally understood the concept perfectly
OMG THANK YOU SO MUCH. You are a life saver. I was having so much trouble with a question on MyOpenMath and now I understand 😭
Great! Thanks for this simple and intelligent explanation!
Best explanation of the topic... finally I understood it. It is simple, but our teacher makes it very hard.
Thanks for the good explanation, may God bless you abundantly
Interesting that you say applying a linear transformation is 'shifting space'. That is one way to think about it: as a mapping between two spaces, the departure space and the arrival space, or as a transformation of the departure space.
A linear transformation is equivalent to matrix multiplication, and for the null space we are looking for solutions to A*x = 0, where x is an n x 1 matrix of "solutions" and A is a given m x n matrix. As x varies you have a map from R^n -> R^m, defined by x -> A*x.
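The solutions-to-A*x = 0 description can be checked directly; here is a minimal sketch using sympy, with a made-up 2x3 matrix A (so the map x -> A*x goes from R^3 to R^2):

```python
from sympy import Matrix

# Hypothetical 2x3 matrix: x -> A*x maps R^3 to R^2.
A = Matrix([[1, 2, 3],
            [2, 4, 6]])  # second row is twice the first, so rank is 1

# The null space is the set of solutions x to A*x = 0;
# sympy returns a basis for it.
null_basis = A.nullspace()
for v in null_basis:
    assert A * v == Matrix([0, 0])  # every basis vector solves A*x = 0

print(len(null_basis))  # dimension of the null space (here 2)
```

Since the rank is 1 and there are 3 columns, the null space is 2-dimensional.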
needed this, thanks for creating this. :)
Excellent presentation. Thanks.
You presented it in consideration of a homogeneous system. Could you please add some explanation of these topics for a non-homogeneous system? You are a great teacher!
Thank you for teaching. It helps me to solve my homework. And if you don't mind, could you please suggest a book on linear algebra?
Hey, I thought the video was great, but I think your definition of independence may be off. A matrix is independent if the subsets don't contain other subset variables. The first problem you said was independent was actually dependent, even though it spanned.
So for the column space I should use the corresponding column vectors in the original matrix, and for the row space I should use the row vectors in the RREF matrix?
Yes, correct
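That recipe can be sketched in code; a small sympy example (the matrix `A` here is made up purely for illustration):

```python
from sympy import Matrix

# Hypothetical matrix: column 2 is twice column 1, so it is not a pivot column.
A = Matrix([[1, 2, 1],
            [2, 4, 3],
            [3, 6, 4]])

R, pivots = A.rref()  # RREF of A and the indices of its pivot columns

# Basis for col(A): the ORIGINAL columns of A at the pivot positions.
col_basis = [A.col(j) for j in pivots]

# Basis for row(A): the nonzero rows of the RREF.
row_basis = [R.row(i) for i in range(A.rank())]

print(pivots)  # (0, 2): columns 0 and 2 of the original A form a basis of col(A)
```

Both bases have the same size, the rank of A (here 2).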
thank you! my endterm is tomorrow, u helped a lot!
Thanks for making it understandable.
Will the dimensions of the bases of col(A) and row(A) always be the same?
Does the dimension of the basis of null(A) hold any significance relative to col(A) and row(A)?
Thank you!
you're blessed.
dim row(A) = dim col(A) (both equal the rank of A), and dim row(A) + dim null(A) = # of columns (the rank-nullity theorem)
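Both facts can be verified numerically; a quick sketch with sympy and an arbitrary 3x4 matrix:

```python
from sympy import Matrix

# Hypothetical 3x4 matrix (rank 2: row 2 is twice row 1).
A = Matrix([[1, 2, 3, 4],
            [2, 4, 6, 8],
            [1, 1, 1, 1]])

rank = A.rank()               # dim row(A) = dim col(A)
nullity = len(A.nullspace())  # dim null(A)

# Rank-nullity: rank + nullity = number of columns.
assert rank + nullity == A.cols
print(rank, nullity)  # 2 2
```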
11:22 I think there is a mistake: it should be span{v1, v2, v3, v4} = span{v1, v2}, not span{v1, v2, v3} = span{v1, v2}, since there are four vectors we started with in col(A).
It was an example; span{v1, v2, v3, v4} = span{v1, v2} still stands, because span{v1, v2, v3} = span{v1, v2} is correct.
THANK YOU!!😀😀😀
thank you so much! btw your voice is super cool
thank you so much.....
you're so good man!
you are a legend thank you so much
If I perform row operations on a matrix, does it affect its column space? I am asking this because I used to perform row operations on the transposed matrix so that they are basically column operations.
After applying row operations to a matrix, the row space stays the same while the column space can change; applying row operations to its transpose keeps its column space the same but can change its row space.
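A concrete illustration of the first claim, using a throwaway rank-1 matrix and sympy:

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [2, 4]])  # rank 1
B = Matrix([[1, 2],
            [0, 0]])  # A after the row operation R2 -> R2 - 2*R1

# The row space survives the row operation: both are spanned by (1, 2).
assert A.rowspace() == B.rowspace()

# The column space does not: col(A) is spanned by (1, 2)^T,
# but col(B) is spanned by (1, 0)^T.
assert A.columnspace() != B.columnspace()
```

This is exactly why a basis for col(A) must come from the original matrix's columns, while a basis for row(A) can be read off the RREF.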
@theultimate2345 got it, thanks a lot!!
Thank you, buddy
would be cool if you shared the onenote document so that we could save it for notes :)
You'll learn more efficiently if you listen, understand, then write notes in your own way :) Good luck!
explained it better than my prof and my textbook combined. appreciate it man thank you
great video you deserve more likes and subscribes
thank you!
Please suggest any book where I can get all these things. Thanks
thanks!
Can I write the row basis with the original matrix, like we did with the columns? Thanks
I have the same question, and an exam in 5 days
No you can't; I don't know why, but I'm sure you can't write the row basis with the original matrix like we did with the columns
@kushaal1607 How about transposing the matrix and then taking the original vectors as the row space after finding the RREF? Is it still wrong?
@kushaal1607 You can write it that way though
@kushaal1607 In the video (14:39) he says that row(A) of the original matrix A is equal to row(A) of the RREF form, so you can use both. Only for columns it doesn't work, as you might end up with the standard basis vectors, which are not by definition the same as a basis of col(A) of the original matrix A.
thanks
Thank you so much sir
Wouldn't the column space be the span of all the column vectors, so literally every column is in the span? Whereas the basis is just the linearly independent columns
Yes
The column space is the linear span of the independent columns of the matrix. So sure, it contains all the columns in the matrix, but it's not limited to them.
But the question is asking for the column space of a polynomial. There isn't even a matrix given in the question.
i luv u
Video was lil bit helpful
plz replace my linear teacher 🙏🙏🙏
what is your instagram..
Can I write the row basis with the original matrix, like we did with the columns? Thanks
Same question, have an exam in 5 days
Yes you can
thanks!