If anyone wants to learn ML, this is the best channel... the explanations are amazing for all the topics.
Sir, you are really one of the best teachers on YouTube. The way you teach, and especially the way you create examples, shows how much you know about the stuff you are teaching. Thank you so much, Sir. I wish one day I have this much knowledge so that I can also make a difference like you.
Sad to say, but it feels like some people just copied your YouTube content, put it in a course, made a hell of a lot of money, and launched their companies. They might have stolen my money, but you earned my respect!!!!
Which course are you talking about, bro?
Tell us which one... we will expose him.
@@ug1880 I won't name it, but you know the cheapest things often attract the most buyers.
Your way of teaching is very, very interesting.
Great video bro, excellent, perfect, no words to express my gratitude. You covered all the doubts I had w.r.t. polynomial regression.
6:03, the PolynomialFeatures class, for x1 and x2 with degree 3, also adds the columns (x1)(x2), (x1^2)(x2), and (x1)(x2^2). Generally, with PolynomialFeatures set to degree = d and n original features, we get a total of (n+d)! / (d! n!) features. Just thought this would be useful to know (see the sketch at the end of this thread).
thanks! I had the same doubt because at 5:52 sir didn't add the additional term x1*x2, so I got confused. Please correct me if I'm wrong.
@@ruchiagrawal6432 It generates the features' higher-order and interaction terms. For example, (x1, x2) becomes (1, x1, x2, x1^2, x1*x2, x2^2), and so on.
Where/how is the term (x1)(x2) coming from? Kindly elaborate.
@@subhajitdey4483 It comes from the expansion of (a+b)^n, where n is the degree of the polynomial.
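To make the feature count concrete, here is a minimal sketch (assuming a recent scikit-learn with get_feature_names_out(); the variable names are illustrative, not from the video) showing the 10 = (2+3)! / (3! 2!) columns produced for two inputs at degree 3:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[2.0, 3.0]])            # one sample with features x1, x2
poly = PolynomialFeatures(degree=3)   # include_bias=True by default
X_poly = poly.fit_transform(X)

print(poly.get_feature_names_out())
# ['1' 'x0' 'x1' 'x0^2' 'x0 x1' 'x1^2' 'x0^3' 'x0^2 x1' 'x0 x1^2' 'x1^3']
print(X_poly.shape[1])                # 10 == (2+3)! / (3! * 2!)
```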
1:35 --> 7:25 What is polynomial linear regression?
Great explanation!! Every student owes you a lot!!
Perfect explanation as always.
Your explanation is wonderful. I request you to kindly prepare a video to explain the code.
Watch this video with the subtitles provided by YouTube. You will find another story in the subtitles 😅
Sir, you generated data in the form of X and y. What if we have a real-life dataset, how do we plot the data distribution then?
Best, simple & easy explanation
Excellent. Very well explained. You should use real world data instead of random numerical values
Best of the best!
Why is polynomial regression with degree 2 not able to capture the bowl in the training dataset? I think the curve shown was for a higher-degree polynomial.
Yes, even I was confused, because the polynomial features should have fit better for degree 2.
Thanks for the Video Sirjiiii
Awesome Explanation.
We might have to use print("Input", poly.n_features_in_) instead of print("Input", poly.n_input_features_) in newer scikit-learn versions.
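For reference, a small sketch of the renamed attributes (assuming scikit-learn >= 1.0, where n_input_features_ was removed in favour of n_features_in_):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

poly = PolynomialFeatures(degree=2)
poly.fit(np.random.rand(5, 2))

print("Input ", poly.n_features_in_)      # number of input features: 2
print("Output", poly.n_output_features_)  # number of generated features: 6
```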
One question...
10:37 Why are we not using fit_transform() for X_test_trans as we did for X_train_trans?
fit() learns the parameters of the transformation from the data (for a scaler, the mean and variance), while transform() applies that transformation. We call fit_transform() on the training set so the parameters are learned from the 80% of the data, which is more reliable than the 20%, and then call only transform() on the test set so it is mapped with exactly the same parameters and never influences them. That's why we use only transform() for X_test_trans (a small sketch of this pattern follows this thread).
Thank you @@manishsinghmehra7555
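A minimal sketch of the fit-on-train / transform-on-test convention discussed above (the toy data and variable names here are illustrative, not the video's exact code):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures

# toy quadratic data standing in for the video's generated dataset
X = 6 * np.random.rand(100, 1) - 3
y = 0.5 * X[:, 0] ** 2 + X[:, 0] + 2 + np.random.randn(100)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

poly = PolynomialFeatures(degree=2)
X_train_trans = poly.fit_transform(X_train)  # fit uses the training split only
X_test_trans = poly.transform(X_test)        # reuse the same fitted transformer on the test split
```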
I understood the explanation well, thanks... but bhai, one doubt: with a real dataset, will I be able to tell we need polynomial regression just by looking at the data without plotting a graph? Also, if we add features like X^2, X^3, won't it end up with multicollinearity, since a dependency between the input features now exists?
Yes, introducing x^2 and x^3 features does add multicollinearity to your model. To overcome that, you can use orthogonal polynomial regression, which uses polynomial basis functions that are orthogonal to each other (see the sketch below).
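To illustrate the multicollinearity point, here is a quick sketch (not from the video): raw powers of a positive-valued feature are almost perfectly correlated, while an orthogonal basis such as NumPy's Legendre polynomials is not.

```python
import numpy as np

x = np.linspace(1, 10, 200)

# raw polynomial features: x, x^2, x^3
raw = np.column_stack([x, x**2, x**3])
print(np.round(np.corrcoef(raw, rowvar=False), 3))   # off-diagonal correlations close to 1

# Legendre (orthogonal) basis evaluated on x rescaled to [-1, 1]
x_scaled = 2 * (x - x.min()) / (x.max() - x.min()) - 1
leg = np.polynomial.legendre.legvander(x_scaled, 3)[:, 1:]  # drop the constant column
print(np.round(np.corrcoef(leg, rowvar=False), 3))   # off-diagonals near 0
```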
Hi Nitish bro, this audio is not clear... please do a new video on the same topic if possible. Also, please cover feature selection and XGBoost.
You can use earphones; it will be understandable.
Thank u nitish 🎉
Thank you sir
Can we apply the polynomial transformation on the dataframe (let's say df) first and split X into train_x and test_x afterwards? If yes, then why is everyone splitting first and then transforming x_train and x_test?
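For what it's worth, PolynomialFeatures learns nothing from the data beyond the number of input columns, so transforming first would give the same numbers here; splitting first is still the safe habit, because stateful preprocessors (scalers, imputers) fitted on the full data would leak test information. A minimal sketch using a Pipeline, which enforces the fit-on-train-only order automatically (names are illustrative, not the video's code):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

X = 6 * np.random.rand(200, 1) - 3
y = 0.8 * X[:, 0] ** 2 + 0.9 * X[:, 0] + 2 + np.random.randn(200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X_train, y_train)         # both steps are fitted on the training split only
print(model.score(X_test, y_test))  # R^2 on the untouched test split
```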
finished watching
Why do we create X_new and y_new when we already have X_test_trans and y_pred?
Same doubt
Infinite times thank you, sir.
nice
❤
You should have written the code from scratch for polynomial feature generation (a rough sketch is added below).
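A rough from-scratch sketch of what PolynomialFeatures(degree=d) generates (this is an illustration, not the video's code):

```python
import numpy as np
from itertools import combinations_with_replacement

def polynomial_features(X, degree):
    """Return columns [1, x1, x2, x1^2, x1*x2, x2^2, ...] up to the given degree."""
    n_samples, n_features = X.shape
    columns = [np.ones(n_samples)]                            # bias column
    for d in range(1, degree + 1):
        for idx in combinations_with_replacement(range(n_features), d):
            columns.append(np.prod(X[:, list(idx)], axis=1))  # product of the chosen features
    return np.column_stack(columns)

X = np.array([[2.0, 3.0], [1.0, 4.0]])
print(polynomial_features(X, degree=2))
# columns: 1, x1, x2, x1^2, x1*x2, x2^2
```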
Wow
❤️
The audio is not clear.
Please spend a little more time on code writing or code explanation 🙏