Gradient Boost Machine Learning | How Gradient Boosting Works in Machine Learning

  • Published: 23 Dec 2024

Comments • 418

  • @prasaddalvi3017
    @prasaddalvi3017 4 years ago +48

    By far the best theoretical explanation of Gradient Boosting. Now I am very clear on how Gradient Boosting works. Thank you very much for this detailed explanation.

  • @arjunp3574
    @arjunp3574 3 years ago +2

    This is the most simplified explanation of gradient boosting I've come across. Thank you, sir.

  • @denial4958
    @denial4958 2 years ago +3

    Thank you, sir. It's the day before my exam and this concept was unclear to me no matter how much I researched. Simply a life saver 👏👏

  • @shashankbajpai5659
    @shashankbajpai5659 4 years ago +2

    By far the simplest and best explanation I could have for AdaBoost.

  • @callingTardis
    @callingTardis 3 years ago +3

    This channel is my Bible!! Thank you for creating ML content, Aman Sir

  • @sriramapriyar6745
    @sriramapriyar6745 4 years ago +4

    I have no words to thank you for teaching this complex concept in a simple and effective way. My heartfelt thanks and keep going with the same spirit.

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago +1

      Hello Sri, thanks for your words. These are my motivation to create more content. Happy Learning. Tc :)

  • @warpathcucucu
    @warpathcucucu 3 years ago

    mate, that's literally the best explanation for this topic on youtube

  • @hvs147
    @hvs147 4 years ago +1

    This is by far the clearest/best explanation on Gradient Boosting. Thanks man. God bless!

  • @Atlas-ck9vm
    @Atlas-ck9vm 3 years ago

    Probably the greatest explanation of gradient boosting on the internet.

  • @GopiKumar-ny3xx
    @GopiKumar-ny3xx 4 years ago +4

    Nice presentation.... Useful information

  • @sandeepnayak2427
    @sandeepnayak2427 4 years ago +1

    It's excellent, very clearly explained step by step. Highly appreciable... You are awesome.

  • @preranatiwary7690
    @preranatiwary7690 4 years ago +2

    Gradient boost is clear now! Thanks.

  • @praveenkuthuru7439
    @praveenkuthuru7439 3 years ago +2

    Great work!!! It really helps a common person learn about the GB algorithm in action, in simple terms... Keep up your good work!!!!

  • @gg123das
    @gg123das 4 years ago

    Best Gradient Boosting video on YouTube!!!!

  • @divyanshumnit
    @divyanshumnit 3 years ago +1

    Thank you so much, sir. I have watched this in so many places, but the clarity I got from your video stands out. Just from watching this video I subscribed to your channel.

  • @praneethaluru2601
    @praneethaluru2601 3 years ago +1

    A very good and elegant explanation of GBoost, better than others on YouTube, sir...

  • @divyosmiley9
    @divyosmiley9 3 years ago +1

    Thank You, Sir. I read many papers but was so confused, but you made it clear.

  • @IRFANSAMS
    @IRFANSAMS 1 year ago

    @Unfold Data Science, Sir the way you explain complex topics in a simple manner is extraordinary

  • @kayodeoyeniran2862
    @kayodeoyeniran2862 1 year ago

    Thank you for demystifying such a confusing concept. This is the best explanation by far!!!

  • @OhSohomemade
    @OhSohomemade 4 years ago +1

    Hey Aman, very well explained... I am a beginner and was looking for an easy and practical way of learning these concepts, and you made it easy. Thanks very much, appreciate the good work. Cheers!

  • @RanjitSingh-rq1qx
    @RanjitSingh-rq1qx 2 years ago

    Super explanation in so little time, with mathematical intuition. Thank you, sir, for making this mind-blowing video ❤️🥰😊

  • @amalaj4988
    @amalaj4988 3 years ago +4

    Learning a lot from you sir! Crisp and clear points as usual :)

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago +1

      Thanks Amol. Your comments motivate me to create more content 😊

  • @dd1278
    @dd1278 1 year ago

    Legend you are for explaining this so simply.

  • @adithyaboyapati
    @adithyaboyapati 3 years ago +1

    Explanation is crisp and very clear.

  • @vivekbhatia8230
    @vivekbhatia8230 2 years ago

    Very nicely explained, sir. As you said, it was not very clear on the net; after your explanation I can understand the working of the gradient boost model.

  • @amithnambiar9818
    @amithnambiar9818 4 years ago

    Thank you ! Never seen a video so detailed yet understandable about Gradient Boosting

  • @sudhavenugopal3726
    @sudhavenugopal3726 4 years ago +2

    Well explained, where a beginner can understand this, thank you so much

  • @FarhanAhmed-xq3zx
    @FarhanAhmed-xq3zx 3 years ago +1

    Very, very simple and clear explanation. Really awesome 👌👌

  • @josephmart7528
    @josephmart7528 3 years ago

    You have made my day with these ensemble explanations.

  • @cedwin4
    @cedwin4 5 months ago

    Simple and best. Speechless! Thanks a lot :)

  • @naqiuddinadnan9156
    @naqiuddinadnan9156 3 years ago +2

    I need to study this by myself, but mostly the explanations are not so clear. You give a great explanation 👍🏼👍🏼👍🏼

  • @dorgeswati
    @dorgeswati 3 years ago +1

    You are awesome. The video shows the depth of your understanding of these algorithms. Keep it up!

  • @krishnab6444
    @krishnab6444 2 years ago

    That's a perfect explanation, Aman sir, in the simplest way. Thanks a lot, sir, your videos are really helpful.

  • @shashireddy7371
    @shashireddy7371 4 years ago +1

    Thanks for sharing your knowledge with a great explanation.

  • @snehasivakumar9542
    @snehasivakumar9542 4 years ago +2

    Easy to understand. 😊👍

  • @bangarrajumuppidu8354
    @bangarrajumuppidu8354 3 years ago +1

    Superb explanation, fantastic!!

  • @tejaspatil3978
    @tejaspatil3978 4 years ago

    Sir, it is really the best and easiest explanation.
    Waiting for more videos.

  • @sarthakgarg184
    @sarthakgarg184 4 years ago

    I have been searching for a better intuition on Gradient Boosting and this is the first video that gave me the best intuition. I am looking for research projects; can you help me with some topics in Machine Learning and Deep Learning which I could explore and ultimately go for a paper?
    I'm also reaching out to you on LinkedIn for better reach. Thank you for the video :)

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago

      Thanks Sarthak, let's connect on LinkedIn and we can discuss more. Stay Safe. Tc.

  • @KenzaLamnabhi-f8l
    @KenzaLamnabhi-f8l 1 year ago

    Thank you for this video! Really amazed by how you simplify complex concepts!
    Keep them coming, please!

  • @hashir3719
    @hashir3719 2 years ago

    It's crystal clear mahn..! thank you

  • @anilboppanna
    @anilboppanna 4 years ago

    Very nicely explained. Keep posting such quality videos to unfold the data science black box.

  • @abiramimuthu6199
    @abiramimuthu6199 4 years ago +4

    I was running around so many videos for gradient boosting. Thank you so much for your detailed explanation. How does it work for a classification problem?

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago +4

      Hi Abirami, thank you for the feedback. It's difficult to explain the classification problem through a comment. I'll probably create a video for the same :)

    • @shubhankarde4732
      @shubhankarde4732 4 years ago

      Please create one video for classification as well.

  • @srinikethshankar7525
    @srinikethshankar7525 4 years ago

    At 4:00, the error for the OLS method for linear regression is the vertical distance between the regression line and the data points and not the orthogonal distance between them. :)

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago +1

      Oops, did I say "orthogonal"? It should be vertical. Thanks for pointing that out. Happy learning. Tc
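A quick numeric check of this point (an illustrative NumPy sketch, not the video's code; the toy data is made up): the residual that ordinary least squares minimizes is the vertical gap y − ŷ, not the perpendicular distance to the line.

```python
import numpy as np

# Toy data roughly on the line y = 2x + 1 (illustrative values)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.1, 4.9, 7.2, 8.8])

# Ordinary least squares fit (degree-1 polynomial)
slope, intercept = np.polyfit(x, y, 1)

# OLS minimizes the VERTICAL residual y - y_hat,
# not the orthogonal distance to the fitted line.
residuals = y - (slope * x + intercept)

# With an intercept term, OLS residuals sum to (numerically) zero
print(residuals.sum())
```

Minimizing perpendicular distances instead would be total least squares, a different estimator.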

  • @Sagar_Tachtode_777
    @Sagar_Tachtode_777 4 years ago

    Thank you for sharing such valuable knowledge for free.
    May God bless you with exponential growth in the audience and genuine learners!!!

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago +1

      So nice of you, Sagar. Thanks for motivating me through your comment.

  • @jude-harrisonobidinnu3876
    @jude-harrisonobidinnu3876 1 year ago

    Amazing videos. Concepts are worth more than jumping into code. Well done, sir!

  • @ChengchengCai-v9k
    @ChengchengCai-v9k 2 months ago

    Thank you! Best explanation! I can understand it!

  • @mansibisht557
    @mansibisht557 3 years ago +1

    Best video so far ! :') Thank you!!!

  • @MinaGholami-e2u
    @MinaGholami-e2u 1 year ago

    Thank you. It was a perfect explanation of the gradient boosting algorithm.

  • @tradingdna5706
    @tradingdna5706 2 years ago

    Best teacher 👍🏻

  • @IRFANSAMS
    @IRFANSAMS 1 year ago

    Aman sir, Allah will give you more success in your life

  • @saravananbaburao3041
    @saravananbaburao3041 4 years ago +2

    One of the best videos that I have ever watched on GB. Thanks a lot for the video. Can you please cover Bayesian optimization in one video? I really find that topic difficult to understand. Thanks in advance.

  • @shikhar_anand
    @shikhar_anand 3 years ago +2

    Hi Aman, thank you very much for the video. It was by far the clearest explanation of the topic. Just one doubt, if you could clear it: how can we decide the number of iterations for any problem? You iterated for n=2, so how do we decide that?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago +1

      Hi Shikhar, we can pass it as a parameter while calling the function.
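For readers wondering where that parameter goes in practice, here is a minimal sketch with scikit-learn (the dataset and values are illustrative): `n_estimators` is the number of boosting iterations, i.e. the number of residual-fitting trees.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic regression data, for illustration only
X, y = make_regression(n_samples=200, n_features=4, noise=10.0, random_state=0)

# n_estimators is the number of boosting iterations;
# the n=2 walkthrough in the video would correspond to n_estimators=2.
model = GradientBoostingRegressor(n_estimators=100, learning_rate=0.1, random_state=0)
model.fit(X, y)

print(len(model.estimators_))  # one fitted stage per boosting iteration
```

In practice this number is tuned (e.g. via cross-validation) rather than fixed by hand.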

  • @adityasharma2667
    @adityasharma2667 3 years ago

    Very well explained... I could say it's the best video for understanding GB.

  • @poetryspeculation2344
    @poetryspeculation2344 1 month ago

    Brilliant videos man! Thank you so much!

  • @sachinmore8938
    @sachinmore8938 1 year ago

    You have got very good explanation skills!

  • @ricardocaballero6357
    @ricardocaballero6357 8 months ago

    This is awesome, excellent explanation, thanks a lot

  • @sandhya_exploresfoodandlife
    @sandhya_exploresfoodandlife 3 years ago +1

    hi Aman.. your explanations are so good! thanks a lot

  • @eric_bonucci_data
    @eric_bonucci_data 3 years ago +1

    Super clear, thanks a lot!

  • @19967747
    @19967747 4 years ago +1

    Very well explained! Please keep making such nice videos!
    Hope you reach 100k subscribers soon.

  • @parthsarthijoshi6301
    @parthsarthijoshi6301 3 years ago +1

    In gradient descent I read that new parameter = old parameter − step, but in your example of gradient boosting you have done base value + (lr × 1st residual). IS THE + CORRECT????

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      Hi Parthsarthi, yes, step size has the same meaning.

    • @parthsarthijoshi6301
      @parthsarthijoshi6301 3 years ago

      Sir, I mean to say that you have done addition, not subtraction. Is that correct?
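The sign question above can be settled with a two-line calculation. For squared loss, the negative gradient with respect to the current prediction is exactly the residual, so "minus the gradient" (gradient descent) and "plus the residual" (gradient boosting) are the same step. The numbers below are made up for illustration:

```python
# For squared loss L = 0.5 * (y - pred)^2, the gradient w.r.t. pred
# is -(y - pred), i.e. minus the residual. Gradient descent does
#   pred_new = pred - lr * gradient = pred + lr * (y - pred),
# so adding lr * residual IS the minus sign in disguise.
y = 75.0      # true value (hypothetical)
pred = 71.0   # current prediction, e.g. the base value (hypothetical)
lr = 0.1

residual = y - pred      # 4.0
gradient = -(y - pred)   # -4.0

step_boosting = pred + lr * residual   # boosting update
step_descent = pred - lr * gradient    # gradient-descent update

print(step_boosting, step_descent)     # identical results
```

So yes: the addition is correct; for squared loss the two updates coincide exactly.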

  • @sharmita220
    @sharmita220 8 months ago

    Thank you so much for all the videos. It's so clear.

  • @vithaln7646
    @vithaln7646 4 years ago +1

    This is a very clear explanation.

  • @pranjalgupta9427
    @pranjalgupta9427 3 years ago +1

    Awesome video 😍😍

  • @shubhankarde4732
    @shubhankarde4732 4 years ago +1

    great explanation...liked a lot

  • @peaceandlov
    @peaceandlov 3 years ago +1

    Super Awesome mate.

  • @sameerpandey5561
    @sameerpandey5561 3 years ago

    Thank you Aman, it was a very crisp and clear explanation.
    Just a request: please add a video explaining GBDT for classification problems. That would be very helpful :-)

  • @firstkaransingh
    @firstkaransingh 2 years ago

    Very good and clear explanation 👍

  • @alealejandroooooo
    @alealejandroooooo 3 years ago +1

    This was great man, thanks!

  • @mdshihabuddin4099
    @mdshihabuddin4099 4 years ago +1

    Thanks a ton for your spotless explanation. I have a question: how many residual models will we compute to get our expected model, and how will we know that we need to compute that many residual models?

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago +1

      Good question Shihab, that number is a hyperparameter that can be tuned; however, there will be a default value for the algorithm in R and Python.

    • @mdshihabuddin4099
      @mdshihabuddin4099 4 years ago

      Thanks for your response.

  • @preetibhatt5085
    @preetibhatt5085 4 years ago +2

    Great explanation... you said it right, I couldn't find the right material for boosting on the net. Could you please make a video on XGBoost as well? Thanks for your response in advance.

  • @MohitGupta-sz4bh
    @MohitGupta-sz4bh 3 years ago

    How does the algorithm decide the number of trees in gradient boosting? What are its advantages and disadvantages over adaptive boosting, and when should we choose which? Please explain or reply in the comments; your videos are very helpful for someone like me who wants to switch his career to the data science field.
    Also, can you please explain why we have leaf nodes in the range of 8-32 in gradient boosting and only one leaf node in adaptive boosting?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      # of trees — you can pass it as a parameter.
      AdaBoost vs GB, which to choose — depends on the scenario.
      I don't think there will be only one leaf node.

  • @kalluriramakrishna5732
    @kalluriramakrishna5732 1 year ago

    Excellent, Aman. Thank you so much.

  • @ranajaydas8906
    @ranajaydas8906 3 years ago

    The best explanation I ever saw!

  • @MbaliNene-kf6wk
    @MbaliNene-kf6wk 1 month ago

    Thank you so much, Aman, this really helps!

  • @nurulamin7982
    @nurulamin7982 3 years ago +1

    Awesome.

  • @jagannadhareddykalagotla624
    @jagannadhareddykalagotla624 2 years ago +1

    @aman How to choose the learning rate value and the number of trees?

  • @subhajit11234
    @subhajit11234 4 years ago +1

    If you keep on growing the trees, it will overfit. How do you stop that? Will the model stop automatically, or do we need to tune the hyperparameters? Also, it would be helpful if you could pick a record we want to predict after training and demonstrate what the output will be. Going by your theory, all records you want to predict will have the same prediction. :)

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago +1

      Hi Suvajit,
      We must prune the decision trees to avoid overfitting. Pruning can be done in multiple ways, like limiting the number of leaf nodes, limiting branch size, limiting the depth of the tree, etc.
      All these inputs can be passed to the model when we call gradient boost. For optimal values, we should tune the hyperparameters.
      Coming to part 2 of the question, all the records will not have the same prediction, as the error gets optimized in every iteration. In the same model, if I try to predict two different records, the predictions will be different based on the values of the independent columns.
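As a concrete sketch of those pruning inputs (illustrative data and parameter values, assuming scikit-learn's implementation): tree-size limits and early stopping are the usual levers against overfitting in gradient boosting.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic data, for illustration only
X, y = make_regression(n_samples=300, n_features=5, noise=15.0, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

model = GradientBoostingRegressor(
    n_estimators=500,        # upper bound on the number of trees
    learning_rate=0.05,      # shrinkage; smaller = more conservative
    max_depth=3,             # limit tree depth (pruning-style constraint)
    max_leaf_nodes=8,        # limit leaves per tree
    n_iter_no_change=10,     # early stopping: halt when the validation
    validation_fraction=0.2, # score stops improving
    random_state=1,
)
model.fit(X_tr, y_tr)

# With early stopping, training often halts well before n_estimators
print(model.n_estimators_)
```

So the model does not stop "automatically" by default; you opt in via these hyperparameters and tune them.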

  • @himanshupathak3090
    @himanshupathak3090 4 years ago

    Great explanation of gradient boosting... your videos make such complex algorithms look so easy :)

  • @aiuslocutius9758
    @aiuslocutius9758 2 years ago

    Thank you very much. Learning a lot from your videos!

  • @abhishekdas2640
    @abhishekdas2640 3 years ago +1

    Hello sir,
    I have a doubt:
    which type of model are we using while predicting the new residuals? (Is it simple linear regression or anything else?)

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      Decision trees.

    • @abhishekdas2640
      @abhishekdas2640 3 years ago

      @@UnfoldDataScience Thank you sir. It would be great if I could see it practically with an example. But the video is really helping me understand the core concepts. ❤️❤️❤️
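To make "decision trees fitted on residuals" concrete, here is a minimal hand-rolled sketch of the loop (illustrative data, learning rate, and tree settings; real implementations add much more):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Synthetic data, for illustration only
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X[:, 0] + rng.normal(0, 1, 200)

lr = 0.1
pred = np.full_like(y, y.mean())  # base prediction: the mean of y
trees = []

for _ in range(50):
    residual = y - pred                    # what is still unexplained
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residual)                  # each stage fits a TREE to residuals
    pred += lr * tree.predict(X)           # shrink the correction and add it
    trees.append(tree)

mse_base = np.mean((y - y.mean()) ** 2)
mse_boost = np.mean((y - pred) ** 2)
print(mse_boost < mse_base)  # boosting improves on the constant baseline
```

Each stage is a small regression tree, never a linear regression, which is why the answer above is simply "decision trees".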

  • @shreyaanand6323
    @shreyaanand6323 2 months ago

    simple and best

  • @bhargavdr
    @bhargavdr 3 years ago +1

    Twas very helpful thank you.

  • @surajsthomas
    @surajsthomas 3 years ago

    Awesome video. Very well explained.

  • @SuperShiva619
    @SuperShiva619 4 years ago

    You're awesome 🙂 The intent with which you explain concepts is mind-blowing.

  • @subhz1
    @subhz1 4 years ago

    Excellent video!!! Great work!!!!

  • @Gamezone-kq5sx
    @Gamezone-kq5sx 3 years ago

    best explanation....good going

  • @ajaykushwaha-je6mw
    @ajaykushwaha-je6mw 4 years ago

    Thank you so much for such a nice video. It helped me understand the concept of the GB algorithm.

  • @bhushanchaudhari378
    @bhushanchaudhari378 4 years ago

    Very well explained sir🎂.. thanks a ton

  • @RaviShankar-jm1qw
    @RaviShankar-jm1qw 3 years ago

    Awesome and super clear explanation. :)

  • @hirdeshkumar4069
    @hirdeshkumar4069 3 years ago +1

    How do you define your learning rate, and how did you arrive at the value 0.1?

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      This is just a number I took for explanation; however, this is a parameter that can be tuned.

  • @chaitanyakaushik6772
    @chaitanyakaushik6772 3 years ago

    Excellent explanation, sir.

  • @sudheerrao9820
    @sudheerrao9820 4 years ago +1

    Thank you for the video...very useful

  • @goelnikhils
    @goelnikhils 2 years ago

    Amazing Content. Thanks a lot

  • @dimitarnentchev1107
    @dimitarnentchev1107 4 years ago

    Very clearly explained ! Thank you.

  • @devikasathyan1333
    @devikasathyan1333 4 years ago

    Simply super, sir. You are indeed a life saver :)

    • @UnfoldDataScience
      @UnfoldDataScience  4 years ago +1

      Thanks Devika, good to know my efforts are helping you. Happy Learning. Stay safe. Tc

    • @devikasathyan1333
      @devikasathyan1333 4 years ago

      @@UnfoldDataScience Yes, of course... I was searching for the same, and only your video has the correct content 🙌😇

  • @pokabhanuchandar9140
    @pokabhanuchandar9140 1 year ago

    Hi Aman, thanks for explaining the concepts. Here I have one question for you: "will AdaBoost accept repetitive records like random forest?"

  • @prasadjayanti
    @prasadjayanti 3 years ago

    good work..

  • @soumyagupta9301
    @soumyagupta9301 3 years ago

    I understood how gradient boosting works but still haven't understood why it works. Actually, I am not getting the intuition behind why we are interested in training the model on the residual error rather than the true value of y. Can you please explain this in a bit more detail? Anyway, I am a big fan of your teaching; it's so simple and easy to understand. Thank you for teaching so well.

    • @UnfoldDataScience
      @UnfoldDataScience  3 years ago

      Thanks Soumya. As you work with data more, you will know.

  • @nikkitha92
    @nikkitha92 4 years ago

    This video is very awesome sir. I was able to understand very well.

  • @ymaheshwarreddy
    @ymaheshwarreddy 4 years ago

    Kudos!! Informative videos. "Data Science made Easy" is what I have to say :)