13.0 Introduction to Feature Selection (L13: Feature Selection)

  • Published: 17 Oct 2024

Comments • 19

  • @emilwalleser4752
    @emilwalleser4752 2 years ago +3

    You taught my Introduction to Biostats and 451 at UW. You are one of the best professors I have ever encountered. Thank you for providing all of this extra material. It is greatly appreciated.

    • @SebastianRaschka
      @SebastianRaschka  2 years ago +1

      Nice hearing from you, Emil! And thanks so much for these very kind words! You can't imagine how motivating this is to hear :)

  • @pulkitmadan6381
    @pulkitmadan6381 2 years ago +6

    Commitment level: 1000.
    Thanks for adding additional topics and making the lecture videos publicly available :-)
    Really looking forward to the new edition of your book 👀

    • @SebastianRaschka
      @SebastianRaschka  2 years ago +3

      That's great to hear! It's a busy time, but I am hoping to record a video a day on a regular basis until all the additional topics are covered :). Hah, the new edition is also not too far out (I hope) -- currently editing the last chapter!

  • @zaynnicholas9151
    @zaynnicholas9151 2 years ago +6

    Absolutely love your content! Best machine learning lectures I've watched so far on YouTube :)

    • @SebastianRaschka
      @SebastianRaschka  2 years ago +1

      Wow, thanks!!!

    • @TrainingDay2001
      @TrainingDay2001 2 years ago +1

      I have to second that! Sebastian, please keep up the great work :)

  • @rahulpaul9432
    @rahulpaul9432 2 years ago +1

    I have been following your videos for the past month, and I can say with confidence that they are among the best I have seen so far on ML! Thanks a lot for all the effort you have put into making these lectures!!!
    Are there any lectures/books you have authored that are dedicated to forecasting (both time-series and using regression models)? It would be of great help if you could point me to them.
    Thanks again! :)

    • @SebastianRaschka
      @SebastianRaschka  2 years ago

      Thanks for the kind words! Regarding the textbooks, I actually don't do much/any forecasting and have limited experience in that realm. I would check out sktime (github.com/alan-turing-institute/sktime) and StumPy (github.com/TDAmeritrade/stumpy) and see if they have any good literature pointers.

  • @1UniverseGames
    @1UniverseGames 2 years ago +1

    Great to find your YT channel; it seems I'm following you on every platform. ❤️

  • @HannyDart
    @HannyDart 9 months ago

    Thank you so much for uploading!
    I have a dataset where feature selection/dimensionality reduction is vital, but in my class at university we were only scratching the surface of feature selection (and we didn't get these add-on lectures ;))

  • @keyushshah7295
    @keyushshah7295 2 years ago +1

    I have been enjoying your ML lectures; they are very well explained, with a deep understanding of the concepts. Firstly, thank you for putting these up on YouTube.
    I just had one question: I have heard that dimensionality reduction includes principal component analysis and anomaly detection. So how is principal component analysis different from feature selection?

    • @SebastianRaschka
      @SebastianRaschka  2 years ago

      Glad you find them useful! Both feature selection and feature extraction are ways to reduce the dimensionality. In feature selection, you usually select a subset of existing features; the features you select are still original features. In feature extraction, however, you create "new" features. In principal component analysis, these "new" features are linear transformations of the original features. For a simpler illustration, consider a dataset with 3 features, x1, x2, and x3. Feature selection would be, e.g., selecting feature x1. Feature extraction could be creating a new feature x' = x1 + x2 + 3*x3.
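
The distinction in the reply above can be sketched in code. A minimal, hedged illustration on toy random data; scikit-learn's PCA stands in for the linear-transformation idea and is an assumption of this sketch, not part of the original reply:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))  # 100 samples with features x1, x2, x3

# Feature selection: keep a subset of the original columns (here, x1 only).
X_selected = X[:, [0]]

# Feature extraction: build a "new" feature as a linear combination,
# e.g. x' = x1 + x2 + 3*x3 from the example in the reply.
x_new = X[:, 0] + X[:, 1] + 3 * X[:, 2]

# PCA also builds linear combinations, chosen to maximize variance.
X_pca = PCA(n_components=1).fit_transform(X)

print(X_selected.shape, x_new.shape, X_pca.shape)  # (100, 1) (100,) (100, 1)
```

Either way the dimensionality drops from 3 to 1; the difference is that the selected column is still an original feature, while the extracted one is a mixture of all three.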

  • @beautyisinmind2163
    @beautyisinmind2163 2 years ago

    Respected professor, your videos are guiding me very well, and recently I have been learning how to select features. I have one big point of confusion: ANOVA (F-test) is often used in the filter method for feature selection. Theory says ANOVA should be used for feature selection when the target is binary, but I have seen in practice that people also use ANOVA when the target is multiclass. So can the ANOVA F-test also be applied if our target is not binary and has multiple classes (say, data like Iris)?
    Another question: ANOVA assumes the features are normally distributed, but in practice we often encounter data that are not fully normal. In such cases, does it matter much when applying it for feature selection, or is transforming the data to some distribution compulsory? Please clear up my confusion by answering these two questions.
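
On the first question above, a hedged sketch: scikit-learn's `f_classif` implements the ANOVA F-test and accepts multiclass targets such as Iris (3 classes), so the multiclass usage the comment describes is at least supported in practice; the normality question is a statistical judgment this sketch does not settle.

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

# Iris has 4 features and a 3-class (multiclass) target.
X, y = load_iris(return_X_y=True)

# ANOVA F-test score for each feature against the multiclass target.
selector = SelectKBest(score_func=f_classif, k=2).fit(X, y)

print(selector.scores_)                    # per-feature F statistics
print(selector.get_support(indices=True))  # indices of the 2 highest-scoring features
```

For Iris, the two petal measurements score far higher than the sepal ones, which matches the intuition that they separate the three species best.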

  • @wolfisraging
    @wolfisraging 2 years ago +1

    Amazing content ❤️ keep it up, I have learnt so much from you mate 🙂

  • @bogdan3209
    @bogdan3209 2 years ago

    Hello, are the slides from all lectures available somewhere?

    • @SebastianRaschka
      @SebastianRaschka  2 years ago +1

      Yeah, they are under sebastianraschka.com/pdf/lecture-notes/stat451fs20/. Will add links to the video descriptions.

  • @wagutijulius4529
    @wagutijulius4529 2 years ago +1

    same same following you here as well, besides tweerrrra

  • @danieleboch3224
    @danieleboch3224 1 year ago

    So based