Feature Selection Wrapper and Embedded techniques | Feature Selection Playlist

  • Published: 28 Mar 2022
  • #FeatureSelectionTechniques #FeatureSelection #UnfoldDataScience
    Hello,
    My name is Aman and I am a Data Scientist.
    About this video:
    In this video, I explain feature selection techniques under the wrapper and embedded methods, and present a Python demo of these techniques as well. The topics below are covered in this video.
    1. Feature Selection Wrapper and Embedded techniques
    2. Feature Selection Playlist
    3. Feature Selection in python
    4. Feature Selection unfold data science
    5. rfe vs lasso
    6. rfe vs rfecv
    7. rfe vs ffe
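    The wrapper (RFE, RFECV) and embedded (Lasso) techniques listed above can be sketched with scikit-learn. This is a minimal illustration on the built-in diabetes dataset, not the exact demo from the video:

```python
# Wrapper vs. embedded feature selection: a minimal scikit-learn sketch.
from sklearn.datasets import load_diabetes
from sklearn.feature_selection import RFE, RFECV
from sklearn.linear_model import LinearRegression, Lasso

X, y = load_diabetes(return_X_y=True)  # 10 numeric features

# Wrapper: RFE eliminates features recursively down to a fixed count.
rfe = RFE(estimator=LinearRegression(), n_features_to_select=5).fit(X, y)
print("RFE keeps:", rfe.support_)

# Wrapper: RFECV uses cross-validation to decide how many features to keep.
rfecv = RFECV(estimator=LinearRegression(), cv=5).fit(X, y)
print("RFECV keeps", rfecv.n_features_, "features")

# Embedded: Lasso shrinks weak coefficients to exactly zero during training.
lasso = Lasso(alpha=0.1).fit(X, y)
print("Lasso keeps", (lasso.coef_ != 0).sum(), "features")
```

    The key difference: RFE and RFECV repeatedly refit a model and drop the weakest features (selection wraps around the model), while Lasso does the selection inside its own training objective.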
    About Unfold Data science: This channel helps people understand the basics of data science through simple examples. Anyone without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos are not very technical in nature, so they can be easily grasped by viewers from different backgrounds as well.
    If you need Data Science training from scratch, please fill in this form (Please Note: training is chargeable)
    docs.google.com/forms/d/1Acua...
    Book recommendation for Data Science:
    Category 1 - Must Read For Every Data Scientist:
    The Elements of Statistical Learning by Trevor Hastie - amzn.to/37wMo9H
    Python Data Science Handbook - amzn.to/31UCScm
    Business Statistics By Ken Black - amzn.to/2LObAA5
    Hands-On Machine Learning with Scikit Learn, Keras, and TensorFlow by Aurelien Geron - amzn.to/3gV8sO9
    Category 2 - Overall Data Science:
    The Art of Data Science By Roger D. Peng - amzn.to/2KD75aD
    Predictive Analytics By Eric Siegel - amzn.to/3nsQftV
    Data Science for Business By Foster Provost - amzn.to/3ajN8QZ
    Category 3 - Statistics and Mathematics:
    Naked Statistics By Charles Wheelan - amzn.to/3gXLdmp
    Practical Statistics for Data Scientists By Peter Bruce - amzn.to/37wL9Y5
    Category 4 - Machine Learning:
    Introduction to Machine Learning by Andreas C. Muller - amzn.to/3oZ3X7T
    The Hundred-Page Machine Learning Book by Andriy Burkov - amzn.to/3pdqCxJ
    Category 5 - Programming:
    The Pragmatic Programmer by David Thomas - amzn.to/2WqWXVj
    Clean Code by Robert C. Martin - amzn.to/3oYOdlt
    My Studio Setup:
    My Camera : amzn.to/3mwXI9I
    My Mic : amzn.to/34phfD0
    My Tripod : amzn.to/3r4HeJA
    My Ring Light : amzn.to/3gZz00F
    Join Facebook group :
    groups/41022...
    Follow on medium : / amanrai77
    Follow on quora: www.quora.com/profile/Aman-Ku...
    Follow on twitter : @unfoldds
    Get connected on LinkedIn : / aman-kumar-b4881440
    Follow on Instagram : unfolddatascience
    Watch Introduction to Data Science full playlist here : • Data Science In 15 Min...
    Watch python for data science playlist here:
    • Python Basics For Data...
    Watch statistics and mathematics playlist here :
    • Measures of Central Te...
    Watch End to End Implementation of a simple machine learning model in Python here:
    • How Does Machine Learn...
    Learn Ensemble Model, Bagging and Boosting here:
    • Introduction to Ensemb...
    Build Career in Data Science Playlist:
    • Channel updates - Unfo...
    Artificial Neural Network and Deep Learning Playlist:
    • Intuition behind neura...
    Natural Language Processing playlist:
    • Natural Language Proce...
    Understanding and building recommendation system:
    • Recommendation System ...
    Access all my codes here:
    drive.google.com/drive/folder...
    Have a different question for me? Ask me here : docs.google.com/forms/d/1ccgl...
    My Music: www.bensound.com/royalty-free...

Comments • 41

  • @UnfoldDataScience
    @UnfoldDataScience  2 years ago

    Access English, Hindi Course here - www.unfolddatascience.com/store
    Don't forget to register on the website, it's free🙂

  • @umasharma6119
    @umasharma6119 2 years ago

    Thank you, Sir, for this great explanation.

  • @leamon9024
    @leamon9024 2 years ago

    Hello Aman, thanks so much for the detailed explanation. Could you also talk about clustering-based feature selection techniques?

  • @israrobinson5175
    @israrobinson5175 1 month ago

    Thank you for this

  • @mansibisht557
    @mansibisht557 1 year ago

    Thank you Aman!! Such crisp explanation!

  • @subhajitroy4869
    @subhajitroy4869 2 years ago +1

    Awesome Sir!!!! Thanks a lot. You are a perfect Guru for any DS learner. Another request Sir, kindly make a detailed video on SVM. It would be really helpful for many of us.

  • @mohamedesmailelsalahaty6050
    @mohamedesmailelsalahaty6050 2 years ago

    Thanks

  • @Sagar_Tachtode_777
    @Sagar_Tachtode_777 2 years ago

    Great video Aman! Thanks for sharing!
    Can you please tell which algorithm to use for product recommendation using demographic data like age, salary, gender, occupation, etc.?

  • @sudhanshusoni1524
    @sudhanshusoni1524 2 years ago

    thanks for the awesome work!

  • @saharyarmohamadi9176
    @saharyarmohamadi9176 1 year ago

    Very good explanation, Aman, you are a good teacher. I follow your videos; very simple and understandable explanations. Good luck!

  • @FranklinKondum
    @FranklinKondum 2 years ago

    This is awesome!
    Please, I have a question:
    In the backward wrapper method of feature selection, how can I use my own "user-defined" model? I have an already existing model, but I want to reduce the features. It is a linear equation: Y = 0.22D + 0.19E + 0.16F + 0.15G + 0.16H + 0.12K
    I want to do feature elimination without changing the coefficients.

  • @pavansingara9408
    @pavansingara9408 1 year ago

    very good explanation of the concepts

  • @beautyisinmind2163
    @beautyisinmind2163 2 years ago

    Sir, is the combination of features you got in your result applicable to KNN only, or does the same combination work for other models as well?

  • @sriamani
    @sriamani 2 years ago

    Very informative video. I have some doubts regarding forward feature selection:
    1. PCA with forward feature selection
    2. For k_features we have to give exactly 3 or 4; then how will the algorithm select, and which features will it select?

  • @alishaparveen4735
    @alishaparveen4735 2 years ago

    Can we use the wrapper method for feature selection on unsupervised learning data?

  • @fahadnasir1605
    @fahadnasir1605 1 year ago

    Aman, you said that in RFE it is internally decided how the variables will be eliminated, while in backward selection we are passing a KNN model to remove the variables. BUT in RFE you are passing a Linear Regression model; please explain.

  • @akjasmin90
    @akjasmin90 2 years ago

    Hi, I really loved your video and appreciate your efforts in making such informative videos. I have 3 questions though.
    1. In the video you have used the methods on numerical data; can we use them on categorical data?
    2. Should we use it before or after feature engineering, like after making dummy variables and binning the data if required?
    3. In RFE-CV all the variables were showing as 1, i.e. important. Can you explain that a little bit? Or direct me to some video.

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago +1

      Thanks, Ayushi.
      1. Some tests can be used on numerical data only.
      2. Before only.
      3. Try with other data and this will change; here the difference is not that much.

  • @ajaykushwaha-je6mw
    @ajaykushwaha-je6mw 2 years ago

    Hi Aman, once we get the important features, we have to remove the unwanted features from X_train and X_test, right?

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago

      Yes, in both places. We don't need those features going forward; just keep track of what we removed, so that when new data comes we know what to keep/remove.
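      The idea in this reply can be sketched in pandas: keep one list of selected columns and apply it to both splits. The column names here are illustrative, not from the video.

```python
import pandas as pd

# One list of selected columns, e.g. the output of a feature selector.
selected = ["age", "bmi", "bp"]

# Toy train/test splits with an extra unwanted column.
X_train = pd.DataFrame({"age": [25, 32], "bmi": [21.0, 27.5],
                        "bp": [80, 90], "noise": [1, 2]})
X_test = pd.DataFrame({"age": [41], "bmi": [30.1],
                       "bp": [85], "noise": [3]})

# Apply the same column list to both splits (and to any future data).
X_train = X_train[selected]
X_test = X_test[selected]
print(list(X_train.columns), list(X_test.columns))
```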

  • @beautyisinmind2163
    @beautyisinmind2163 2 years ago

    Sir, do we need to apply all the techniques (filter, wrapper, embedded) and see which features are important?

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago +1

      Yes, if you have the infrastructure to support it, especially if your model is not doing well.

  • @zeeshankhanyousafzai5229
    @zeeshankhanyousafzai5229 2 years ago

    Hello sir
    What if features are categorical and discrete?

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago +1

      Tests like chi-square and some model-based techniques will be used.
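      A minimal sketch of the chi-square approach mentioned in this reply, using scikit-learn's SelectKBest; the toy count-style data here is illustrative, not from the video.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2

# Toy non-negative (count-style) feature matrix and a binary target;
# chi2 requires non-negative feature values.
rng = np.random.default_rng(0)
X = rng.integers(0, 5, size=(100, 4))
y = rng.integers(0, 2, size=100)

# Keep the 2 features with the highest chi-square scores against y.
selector = SelectKBest(score_func=chi2, k=2).fit(X, y)
print("Selected feature indices:", selector.get_support(indices=True))
```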

    • @zeeshankhanyousafzai5229
      @zeeshankhanyousafzai5229 2 years ago

      @@UnfoldDataScience Ok sir, can you make a video on that as well?

  • @skvali3810
    @skvali3810 2 years ago

    Can we do all these techniques inside a pipeline?

  • @umasharma6119
    @umasharma6119 2 years ago

    If we have domain knowledge, I think we don't need to perform feature selection techniques?

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago

      Even then we need to check; domain knowledge is what we know, but "Data must tell its own story".

    • @umasharma6119
      @umasharma6119 2 years ago

      @@UnfoldDataScience Okay Sir.

  • @Fatima-gw7sm
    @Fatima-gw7sm 2 years ago

    What is the cost of your data science course?

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago

      Please fill in the form attached in the description of the video.

  • @yachillababa9445
    @yachillababa9445 11 months ago

    Hello Aman, please may I have your personal email?

  • @hari_1357
    @hari_1357 2 years ago

    Thanks