I can hardly imagine of anyone else who can explain like you!! amazing!!
This is gold. I was reading multiple articles to understand multivariate linear functions and the concept of hyperplane and couldn’t fully grasp the concepts. You connected many puzzles so well and helped me clear so many doubts. Thanks a ton 🙏
I have searched and watched many videos to figure out a few things about vector spaces, and this video is by far the best. I like how he sequentially challenges his previous step with 'so what' or 'what's next' and follows it up as a logical evolution to the next equation or derivation. That's neat!!
You cleared every confusion I had about this topic in one video, you're the real MVP in this game. Thank you!
Happy that you got good clarity at the end of this video.
What a pellucid explanation! You are an awesome teacher; thanks a ton!
Very powerful sets of lectures. Thanks very much. Keep doing the good work!
My mind = "Blown" Excellent explanation. Wow !!!
I should have watched this video when I was in 10th; I would have scored 100/100 for sure. Awesome, really awesome explanation that no one else on the internet has done. Thank you so much.
My graduate teacher at UCSD fails to make this clear, thank you.
Fantastic lesson. Clear and concise, yet capable of quickly showcasing basic concepts to tie into the bigger picture.
Brilliant!
I am taking ML and this is rly helping me right now thank you. I am giving your videos a second watch !
Super clean and comprehensive explanation!
This should win an Emmy. It’s just beautiful.
Thanks mate, you explained this really well. Appreciate it
I don't know if you are still active, but if so, please make more on linear algebra for AI in general. I really find you the most suitable person to explain Gilbert Strang's book on linear algebra. A big project, yes, but I bet you would be excellent. Not everyone is good at teaching, but you are really one of the best. Thank you.
Bro, he is Shrikant Verma. He is a data scientist at Microsoft and also a teacher at Scaler Academy. To access his free courses, check out Scaler Topics, where you will find his free courses like DBMS, SQL, and more. Search on Google or any browser.
Excellent video, thanks for sharing
Brilliant Explanation. Thanks a lot.
Really well explained, thank you!!!
Nice video. You know your math. One person in some other video explained W^T as W + τ (tau). That didn't make any sense. Thank you very much for sharing.
Amazing, sir. I never understood the concept of a hyperplane before. Thanks for such a great video.
Nice explanation, thank youu
i will do anything to get my hands on this full lecture series.
Perfect, thanks for the explanation, but I've got a question: if w0 is not zero, we can suppose x0 = 1 and then write the equation W_transpose * X = 0, with W a row vector containing w0, w1, ..., wn. At this point, does the same conclusion you reached still hold? By the same conclusion, I mean that W is still a unit vector perpendicular to our hyperplane.
Love you use BEAUTY word here!
At 7:36, could you please explain why you considered w1, w2, ..., wn as a row vector and x1, x2, x3, ..., xn as a column vector?
Excellent explanation !
Brilliant!
Made things so simple 🎉✨
Fantastic, your explanation was Clear and concise. thank you
If the equation is w.x = b, then what is the geometric interpretation of this equation?
Then w0 = -b?
At 16:39, how is w.x = w^T . x?
At 17:07, why are you calling the w vector a point? Isn't w supposed to be the coefficient vector and x the point vector?
Such a cool explanation.
I can't explain how good it is.
Can you please tell me how this X relates to the training data? Normally the training data is an N-by-M matrix where each row corresponds to one sample, and each sample ends up being multiplied by the same coefficients, like w1?
You are good!
that was gold
this is a fucking great video, truly made my day!
I understood it now. After so much time😰
Sir, why have you taken "w" from the origin and not from any other point on the plane?
WOW!
But I didn't get one thing: what's the difference between a row vector and a column vector if we want to see them on a graph? Both have x and y axes.
Maybe those videos could help:
ruclips.net/video/LyGKycYT2v0/видео.html
ruclips.net/video/Eqi20D-b2sU/видео.html (I did not watch this one yet)
Nice one! Can anyone explain why w is a row vector and x a column vector? Thanks!
Just to get this equation back into its original linear form: w0 + w1x1 + w2x2 + w3x3 + ... + wnxn. Don't forget, by default we have column vectors. :) Hope it will help :)
Because to multiply, you need one as a row vector and the other as a column vector, according to the rules of matrix multiplication.
For example, an a×n matrix can be multiplied by an n×b matrix. Note: the first one's column count must equal the second one's row count. :-)
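The shape rule above can be checked directly in numpy. This is a minimal sketch with made-up numbers: w and x are column vectors, so the product only works once w is transposed into a row, and the resulting 1×1 matrix holds the ordinary dot product.

```python
import numpy as np

# Hypothetical 3-D example: w and x are both column vectors by default.
w = np.array([[1.0], [2.0], [3.0]])   # shape (3, 1)
x = np.array([[3.0], [0.0], [-1.0]])  # shape (3, 1)

# w @ x would fail: (3, 1) times (3, 1) is shape-incompatible.
# Transposing w gives a (1, 3) row vector, so w.T @ x is (1, 3) x (3, 1) -> (1, 1).
wT_x = w.T @ x

# The scalar inside equals the usual dot product w1*x1 + w2*x2 + ... + wn*xn.
dot = np.dot(w.ravel(), x.ravel())
print(wT_x.shape)       # (1, 1)
print(wT_x.item(), dot) # both 0.0 here, since this x happens to satisfy w.x = 0
```

So "w^T x" and "w dot x" are the same number; the transpose just makes the shapes line up.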
I am a little confused about one thing. I think I understand everything you are trying to say at the end of the video, but one result is confusing me. The dot product of vector w and vector x, in any number of dimensions, is the same thing as w transpose multiplied by the x vector, and if we are at the origin then w.x = 0 defines the plane. I get that this implies ||w|| ||x|| cos(theta) = 0, which means theta has to be 90, meaning w is perpendicular to the x vector. But does that mean w is ALSO perpendicular to the PLANE itself, and not just to the x vector? Or should I understand that the x vector always lies on the plane anyway? What does that mean for w? I get that it must be perpendicular to the x vector, but what about the plane? Is it always perpendicular to the plane?
I also know from Calc 3 that you need just a point and a perpendicular vector in order to define a plane. Somehow these ideas are definitely all connected.
To ask my question again: I just want to see if the w vector being perpendicular to the x vector will always necessarily mean that the w vector is also perpendicular to the plane. And will the x vector always lie on the plane?
Oh snap, I think I get it. That w vector is precisely the orthogonal (perpendicular) vector to the plane that I was saying I learned about in Calc 3. Of course, that's why we were told that you need a vector perpendicular to the plane and a point to define the plane!!
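That intuition can be verified numerically. A minimal sketch with a made-up w: take two points that satisfy w.x = 0 (so both lie on the plane through the origin). Their difference is a direction lying *in* the plane, and w is perpendicular to it too, which is exactly what "w is normal to the plane" means.

```python
import numpy as np

# Hypothetical plane through the origin in 3-D: all x with w . x = 0.
w = np.array([1.0, -2.0, 1.0])

# Two points that satisfy w . x = 0, so both lie on the plane:
x1 = np.array([2.0, 1.0, 0.0])   # 1*2 - 2*1 + 1*0 = 0
x2 = np.array([1.0, 1.0, 1.0])   # 1*1 - 2*1 + 1*1 = 0

# Any direction lying IN the plane is a difference of two plane points.
direction = x2 - x1

print(np.dot(w, x1), np.dot(w, x2))  # 0.0 0.0 -> both points are on the plane
print(np.dot(w, direction))          # 0.0 -> w is perpendicular to the plane itself
```

Since the plane passes through the origin, every point x on it is also a vector in the plane, so "w perpendicular to every such x" and "w perpendicular to the plane" coincide.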
Why is it that the line or plane equation equals 0?
Wow!
You have any further courses for linear algebra?
We cover all of the Linear Algebra needed for ML and Deep Learning through the rest of the course, wherever the need arises. In the initial chapters, we only cover the very basics. We don't have a separate course for Linear Algebra.
@@AppliedAICourse But let me tell you one thing: I never found any course like yours, so simple to understand and visualise.
You said that the idea of a line in 2D is equivalent to the idea of a plane in 3D, but I can also define a straight line in 3D space, right? We were taught that in 12th grade. So I want to know whether I am right or wrong!
Thank you,
Yes, there are lines in 3D and higher-dimensional spaces as well. We meant the equivalence from a mathematical perspective: a line in 2D, a plane in 3D, and a hyperplane in 4D can all be represented using one simple formula: w.x = 0.
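That dimension-independence is easy to see in code. A minimal sketch with made-up coefficients: the same expression np.dot(w, x) == 0 describes a line when the vectors have 2 components and a plane when they have 3; nothing about the formula changes with the dimension.

```python
import numpy as np

# 2-D: the line 2*x1 - x2 = 0 (i.e. x2 = 2*x1), written as w . x = 0.
w_2d = np.array([2.0, -1.0])
point_on_line = np.array([3.0, 6.0])
print(np.dot(w_2d, point_on_line))   # 0.0 -> the point satisfies the line equation

# 3-D: the plane x1 + x2 - x3 = 0, written with the SAME formula w . x = 0.
w_3d = np.array([1.0, 1.0, -1.0])
point_on_plane = np.array([2.0, 3.0, 5.0])
print(np.dot(w_3d, point_on_plane))  # 0.0 -> the point satisfies the plane equation

# In n dimensions, w . x = 0 defines a hyperplane through the origin;
# the code is identical, only the vector lengths grow.
```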
Sir, what are a, b, and c in the equation? Are they constants?
Yes, these are constants.
Can you please tell me what W (transpose) means?
And what are W1, W2, ..., Wn?
Sir, why is w a horizontal vector and x a vertical vector?
All vectors are column vectors by default unless specified otherwise. Here, W and X are both column vectors; W_T is a row vector.
1st PUC math: y = mx + c.
How to find W?
We will see that in the next few videos. This is typically done using ML models like linear regression
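As a preview of how that fitting works, here is a minimal sketch with made-up toy data: generate targets from a known w, append a column of ones so the intercept w0 is learned like any other weight, and recover w with ordinary least squares.

```python
import numpy as np

# Toy data (made up for illustration): 5 samples, 2 features each.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 3.0],
              [4.0, 1.0],
              [5.0, 2.0]])
# Targets generated from a known w = [2, 3] and intercept w0 = 1,
# so we can check that the fit recovers them.
y = 1.0 + X @ np.array([2.0, 3.0])

# Append a column of ones so w0 is learned as just another weight (the x0 = 1 trick).
X_aug = np.hstack([np.ones((X.shape[0], 1)), X])

# Ordinary least squares: minimize ||X_aug @ w - y||^2.
w_fit, *_ = np.linalg.lstsq(X_aug, y, rcond=None)
print(np.round(w_fit, 6))   # approximately [1. 2. 3.] -> recovered w0, w1, w2
```

Linear regression in libraries like scikit-learn solves essentially this problem (usually with regularization or gradient-based optimization on larger data).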
Why does your channel's name have "AI" in it?
Artificial Intelligence mate
Good job.