Forward Stepwise Selection Method

  • Published: 13 Dec 2024

Comments • 22

  • @LexieLin-fd5xp
    @LexieLin-fd5xp 1 year ago

    Nice explanation, I finally fully understand the forward & backward methods.

  • @erickkaviga6001
    @erickkaviga6001 3 years ago

    Most helpful thing in my studies 🙏🏿🙏🏿

  • @gulinasirova3257
    @gulinasirova3257 1 year ago

    Very good explanation!

  • @vikasbhardwaj455
    @vikasbhardwaj455 6 years ago +1

    Great Explanation.

    • @IQmates
      @IQmates  6 years ago

      Thanks Vikas Bhardwaj. I’m glad it made sense. Stay tuned for more videos!

  • @lizzy1138
    @lizzy1138 3 years ago +1

    Great video, but you should credit the book you took the method almost verbatim from: Introduction to Statistical Learning, chapter 6, algorithm 6.2. But again, really helpful, thank you!!! EDIT: IQmates actually credited several sources in an introduction video, my mistake.

    • @IQmates
      @IQmates  3 years ago

      Hi Dr. Lizard. In my introduction video, I do credit the sources I use. Most of the structure came from ISLR, and I mention sources such as the Towards Data Science blog, Analytics Vidhya, ELSR and so on. Thank you for the feedback though. I will edit the caption.

    • @lizzy1138
      @lizzy1138 3 years ago

      @@IQmates My apologies, I had not seen the introduction video; I will edit my comment. Thank you for your reply.

  • @saikishore1992
    @saikishore1992 5 years ago +1

    Really cool explanation. Please keep producing more videos. If I had some investment money, I would have put it right into your channel just to make it fancier; the content is superb!

    • @IQmates
      @IQmates  5 years ago +1

      Sai Kishore Subramaniam thank you for the positive feedback. I am back to recording now. About to finish the machine learning course and I’ll start on deep learning 😊

    • @Viewfrommassada
      @Viewfrommassada 5 years ago

      I feel the same too. Great work

  • @MichaelMeighu
    @MichaelMeighu 5 years ago

    This is an amazing site! Thank you, pal. If you have a Patreon account you should post it. The other thing: are you sure it's B1X2, or is it B2X2?

  • @sagaradoshi
    @sagaradoshi 2 years ago

    Thanks for the wonderful explanation. I have a doubt; I hope you get some time to clarify. For example, when we use RSS or R^2 to find the best model, what values of the parameters (beta) should we substitute? Normally we find the parameter values that best fit our data and keep adjusting until the RSS is reduced. But in the above case, where in each loop we check many models (i.e., by adding an additional argument), how is our beta calculated?

    • @yusufansari9159
      @yusufansari9159 1 year ago

      First we create a model. This model gives an estimate of the parameter beta. Then we check the RSS (or R^2). If its RSS is the lowest among the models, that model is selected, and hence the beta estimates for that model are selected.
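In other words, the betas are just the ordinary least-squares estimates, refit from scratch for each candidate model, and the RSS then compares the fitted models. A minimal sketch in plain NumPy (the toy data and the `fit_rss` helper are hypothetical, not from the video):

```python
import numpy as np

# Hypothetical toy data: y depends strongly on the first of three predictors.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=20)

def fit_rss(X_sub, y):
    """Least-squares fit of intercept + X_sub; returns (beta, RSS)."""
    A = np.column_stack([np.ones(len(y)), X_sub])   # add intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)    # beta is estimated here
    resid = y - A @ beta
    return beta, float(resid @ resid)

# Each candidate one-predictor model gets its own beta estimates;
# the models are then compared by their RSS.
rss_by_model = {j: fit_rss(X[:, [j]], y)[1] for j in range(3)}
best = min(rss_by_model, key=rss_by_model.get)  # the predictor that drives y
```

So beta is never guessed up front: every candidate model is fully fit, and only then are the fits ranked.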

  • @wahyuwisnuwardana9839
    @wahyuwisnuwardana9839 5 years ago +1

    Good explanation! But just wanna give a suggestion: why don't you use fewer than 10 features? Let's say 3 or 4 is enough. :)

    • @IQmates
      @IQmates  5 years ago

      Wahyu Wisnu Wardana thank you for the feedback. I will try to use fewer features in the explanations. My aim was to show how the method can become computationally difficult as we have more features 😊
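For context on that computational point (these are the standard model counts from ISLR, not figures stated in the video): best-subset selection fits 2^p models, while forward stepwise fits only 1 + p(p+1)/2. A quick check:

```python
# Number of models fit by each approach as the feature count p grows.
def best_subset_count(p):
    return 2 ** p  # one fit for every subset of the p predictors

def forward_stepwise_count(p):
    # null model, then p candidates, then p-1, ..., then 1
    return 1 + p * (p + 1) // 2

for p in (3, 10, 20):
    print(p, best_subset_count(p), forward_stepwise_count(p))
```

At p = 20 the gap is over a million fits versus 211, which is exactly why the number of features matters so much in the demonstration.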

  • @morikawadivinda2586
    @morikawadivinda2586 5 years ago

    It's a good explanation, but I'm still confused. When will the process stop?

    • @IQmates
      @IQmates  5 years ago

      The principle is that there is no defined stopping criterion for this. You test all models and compare them to get the best one. For example, when you are doing k = 2, you create all models that have two variables and then test them to get the best of those models; then you do the same for k = 3 (all models with 3 variables). You keep doing that, getting the best model for each k. After you are done getting the best models for each k value, you compare those best models using AIC or something similar to get the best of the best. So there is no stopping criterion; you look at all possible models.
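The forward-stepwise version of that loop can be sketched in Python as follows (toy data and function names are hypothetical; the greedy pass records the best model of each size k, and comparing those recorded models with AIC or similar is the separate final step the reply describes):

```python
import numpy as np

def rss(cols, X, y):
    """RSS of a least-squares fit of intercept + the chosen columns."""
    A = np.column_stack([np.ones(len(y)), X[:, cols]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    r = y - A @ beta
    return float(r @ r)

def forward_stepwise(X, y):
    """Greedy forward selection: at each step add the single predictor
    that most reduces RSS, recording the best model of each size k.
    Choosing among the recorded models (AIC, BIC, adjusted R^2) happens
    afterwards, so there is no stopping rule inside the loop itself."""
    p = X.shape[1]
    selected, best_per_size = [], []
    for _ in range(p):
        remaining = [j for j in range(p) if j not in selected]
        best_j = min(remaining, key=lambda j: rss(selected + [j], X, y))
        selected.append(best_j)
        best_per_size.append(tuple(selected))
    return best_per_size

# Hypothetical data: only predictors 0 and 2 actually matter.
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))
y = 3 * X[:, 2] - 2 * X[:, 0] + rng.normal(scale=0.2, size=30)
models = forward_stepwise(X, y)  # one candidate model per size k = 1..4
```

The loop runs all the way to k = p; the "stopping" question only arises later, when a criterion such as AIC picks one model out of `models`.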

    • @Viewfrommassada
      @Viewfrommassada 5 years ago +1

      I think it stops when adding any variable no longer improves the model, or when the additional variables are insignificant.

  • @muhammadfarhan-qe6gp
    @muhammadfarhan-qe6gp 5 years ago

    Hi, I'm Farhan. I cannot understand your main step when you build the 3rd model.

    • @IQmates
      @IQmates  5 years ago

      Which part exactly, muhammad farhan? Can you post the timestamp of the part you are questioning?

  • @muhammadfarhan-qe6gp
    @muhammadfarhan-qe6gp 5 years ago

    Please explain it.