Polynomial Regression | Machine Learning

  • Published: 30 Dec 2024

Comments • 48

  • @MohitSingh-jb9tb · 8 months ago +9

    If anyone wants to learn ML, this is the best channel; the explanation is amazing for all the topics.

  • @deeksha-cm8kq · 18 days ago

    Sir, you are really one of the best teachers on YouTube. The way you teach, and especially the way you create examples, shows how much you know about the stuff you are teaching. Thank you so much, Sir. I wish that one day I have this amount of knowledge so that I can also make a difference like you.

  • @ankitchoudhary9302 · 2 months ago +3

    Sad to say, but it feels like some people just copied your YouTube content, put it in a course, made a hell of a lot of money, and launched their companies. They might have stolen my money, but you earned my respect!!!!

    • @stardusthere · 1 month ago

      Which course are you talking about, bro?

    • @ug1880 · 1 month ago +1

      Tell us which one... we will expose him.

    • @ankitchoudhary9302 · 1 month ago

      @ug1880 I won't name it, but you know, the cheapest things often attract the most buyers.

  • @rb4754 · 7 months ago +2

    Your way of teaching is very, very interesting.

  • @sanjaymange6904 · 1 year ago +6

    Great video bro, excellent, perfect, no words to express my gratitude. You covered all the doubts I had w.r.t. polynomial regression.

  • @0Fallen0 · 2 years ago +6

    6:03: the PolynomialFeatures class, for x1 and x2 with degree 3, also adds the columns (x1)(x2), (x1^2)(x2), and (x1)(x2^2). In general, for PolynomialFeatures set to degree = d on n original features, we get a total of (n+d)! / (d! n!) features (including the bias column). Just thought this'd be useful to know; see the sketch after this thread.

    • @ruchiagrawal6432 · 2 years ago +2

      Thanks! I had the same doubt, because at 5:52 sir didn't add the additional term x1*x2, so I got confused. Please correct me if I'm wrong.

    • @thatsfantastic313 · 2 years ago

      @ruchiagrawal6432 It generates the features' higher-order and interaction terms: for example, (x1, x2) becomes (1, x1, x2, x1^2, x1*x2, x2^2), and so on.

    • @subhajitdey4483 · 1 year ago

      Where/how does the term (X1)(X2) come from? Kindly elaborate.

    • @manishsinghmehra7555 · 1 year ago

      @subhajitdey4483 It comes from the expansion of (a+b)^n, bro, where n is the degree of the polynomial.

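A minimal sketch of the point in this thread, assuming a recent scikit-learn (get_feature_names_out needs version 1.0 or newer); the data values are made up:

```python
import numpy as np
from math import comb
from sklearn.preprocessing import PolynomialFeatures

# Two illustrative features x1 and x2 (values are made up)
X = np.array([[2.0, 3.0],
              [1.0, 5.0]])

poly = PolynomialFeatures(degree=3)            # include_bias=True by default
X_poly = poly.fit_transform(X)

# Columns: 1, x1, x2, x1^2, x1 x2, x2^2, x1^3, x1^2 x2, x1 x2^2, x2^3
print(poly.get_feature_names_out(["x1", "x2"]))

# Total feature count matches (n+d)! / (d! n!) = C(n+d, d)
n, d = 2, 3
print(X_poly.shape[1], comb(n + d, d))         # both are 10
```
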
  • @messi0510 · 1 year ago

    1:35-->7:25 What is polynomial Linear Regression?

  • @laxmimansukhani1825 · 2 months ago

    Great explanation!! Every student owes you a lot!!

  • @raj-nq8ke · 2 years ago +5

    Perfect explanation as always.

  • @saptarshisanyal6738 · 2 years ago +1

    Your explanation is wonderful. I request you to kindly prepare a video to explain the code.

  • @hari_thaniwal · 8 months ago +1

    Watch this video with the subtitles provided by YouTube. You will find another story in the subtitles 😅

  • @zainfaisal3153 · 9 months ago +1

    Sir, you generated the data in the form of X and y. What if we have a real-life dataset, then how do we plot the data distribution?

  • @alihasanmerchant9335 · 2 years ago

    Best, simple & easy explanation

  • @amitbaderia6385 · 2 years ago +2

    Excellent. Very well explained. You should use real-world data instead of random numerical values.

  • @balrajprajesh6473 · 2 years ago +1

    Best of the best!

  • @Ishant875 · 11 months ago

    Why is polynomial regression with degree 2 not able to capture the bowl in the training dataset? I think the curve shown was for a higher-degree polynomial.

    • @Tusharchitrakar · 11 months ago

      Yes, even I was confused, because the polynomial features should have fit better for degree 2.

  • @SonuK7895 · 2 years ago +1

    Thanks for the Video Sirjiiii

  • @colabwork1910 · 3 years ago +1

    Awesome Explanation.

  • @1981Praveer · 9 months ago +1

    In newer scikit-learn versions we might have to use print("Input", poly.n_features_in_) instead of print("Input", poly.n_input_features_); see the snippet below.
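
A quick check of the attribute rename mentioned above, assuming scikit-learn 1.0 or newer (where n_input_features_ was removed); the data here is made up:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[1.0], [2.0], [3.0]])           # one illustrative feature

poly = PolynomialFeatures(degree=2)
poly.fit(X)

# n_input_features_ was removed in scikit-learn 1.0; use n_features_in_ instead
print("Input", poly.n_features_in_)           # 1
print("Output", poly.n_output_features_)      # 3 -> columns 1, x, x^2
```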

  • @subhajitdey4483 · 1 year ago

    One question...
    10:37: Why are we not using fit_transform() for X_test_trans the way we did for X_train_trans?

    • @manishsinghmehra7555 · 1 year ago +2

      fit() learns the parameters of the transformation from the data (for a scaler, the mean and variance), while transform() applies that learned transformation. We call fit_transform() on the training set, so the parameters are learned from the 80% of the data, which is more reliable than the 20% held out for testing; by then calling only transform() on the test set, the same learned mapping is applied everywhere and nothing leaks from the test data. That's why we use only transform() for X_test_trans. (See the sketch after this thread.)

    • @rohinisingh6916 · 3 months ago

      Thank you @manishsinghmehra7555
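
A minimal sketch of the fit_transform-on-train / transform-on-test pattern described in this thread; the variable names mirror the thread, but the generated data is made up and is not the video's dataset:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import PolynomialFeatures

# Made-up quadratic data with noise
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = 0.8 * X[:, 0] ** 2 + 0.9 * X[:, 0] + 2 + rng.normal(0, 1, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=2)

poly = PolynomialFeatures(degree=2, include_bias=True)
X_train_trans = poly.fit_transform(X_train)   # fit: learn the feature mapping from the 80% train split
X_test_trans = poly.transform(X_test)         # transform only: reuse that mapping on the 20% test split
```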

  • @Sreedeviha · 2 years ago +3

    I understood the explanation well, thanks. But bhai, one doubt: in a real dataset, just by looking at the data without plotting a graph, will I be able to tell that we need to use polynomial regression? Also, if we add features like X^2 and X^3, won't it end up with multicollinearity, since dependency between the input features exists now?

    • @thatsfantastic313 · 2 years ago +1

      Yes, introducing x^2 and x^3 features does add multicollinearity to your model. To overcome that, you can use orthogonal polynomial regression, which builds polynomial terms that are orthogonal to each other (see the sketch below).
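
scikit-learn has no built-in orthogonal-polynomial transformer, so this sketch (an assumption, not the video's code) orthogonalises the raw power columns with a QR decomposition, roughly what R's poly() does, and compares the column correlations before and after:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 5, 300)

# Raw powers x, x^2, x^3 are strongly correlated with each other (multicollinearity)
raw = np.column_stack([x, x**2, x**3])
print(np.round(np.corrcoef(raw, rowvar=False), 3))

# Orthogonalise the centred columns via QR; the pairwise correlations
# of the new columns drop to ~0
centred = raw - raw.mean(axis=0)
Q, _ = np.linalg.qr(centred)
print(np.round(np.corrcoef(Q, rowvar=False), 3))
```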

  • @rafibasha4145 · 2 years ago +6

    Hi Nitish bro, this audio is not clear... please do a new video on the same topic if possible. Also, please cover feature selection and XGBoost.

    • @balrajprajesh6473 · 2 years ago +2

      You can use earphones; it will be understandable.

  • @JACKSPARROW-ch7jl · 1 year ago

    Thank u nitish 🎉

  • @heetbhatt4511 · 1 year ago

    Thank you sir

  • @sanjeevranjanmishra8811 · 3 years ago

    Can we apply the polynomial transformation on the dataframe (let's say df) first and only then split X into train_x and test_x? If yes, then why is everyone splitting first and then transforming x_train and then x_test?

  • @sandipansarkar9211 · 2 years ago

    finished watching

  • @manishsinghmehra7555 · 1 year ago

    Why do we create X_new and y_new, when we already have X_test_trans and y_pred?

  • @devendrasharma5567 · 1 year ago

    Infinite times thank you, sir.

  • @namanmodi7536 · 2 years ago

    nice

  • @WhySachi · 8 months ago

  • @anuragkothari8384 · 5 months ago

    You should have written the code from scratch for polynomial feature generation.

  • @krishnakanthmacherla4431 · 2 years ago

    Wow

  • @ahmadansari7708 · 2 years ago

    ❤️

  • @khushisaini2683 · 10 months ago

    The audio is not clear.

  • @rk_dixit · 4 months ago

    Please spend a little more time on code writing or code explanation 🙏