Feature Selection Wrapper and Embedded techniques | Feature Selection Playlist
- Published: 28 Mar 2022
#FeatureSelectionTechniques #FeatureSelection #UnfoldDataScience
Hello,
My name is Aman and I am a Data Scientist.
About this video:
In this video, I explain feature selection techniques under the wrapper and embedded methods, describe how each of them works, and present a Python demo of these techniques as well. The topics below are covered in this video.
1. Feature Selection Wrapper and Embedded techniques
2. Feature Selection Playlist
3. Feature Selection in python
4. Feature Selection unfold data science
5. rfe vs lasso
6. rfe vs rfecv
7. rfe vs ffe
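As a quick illustration of the "rfe vs lasso" topic above, here is a minimal sketch (not the video's code; synthetic data and parameter values are assumptions) contrasting a wrapper method (RFE, which repeatedly refits a model and drops the weakest feature) with an embedded method (Lasso, whose L1 penalty shrinks unimportant coefficients to exactly zero) using scikit-learn:

```python
# Wrapper vs embedded selection on synthetic data (illustrative sketch).
from sklearn.datasets import make_regression
from sklearn.feature_selection import RFE
from sklearn.linear_model import Lasso, LinearRegression

X, y = make_regression(n_samples=200, n_features=10,
                       n_informative=3, noise=0.1, random_state=0)

# Wrapper: RFE refits the estimator and eliminates the weakest feature each round.
rfe = RFE(estimator=LinearRegression(), n_features_to_select=3).fit(X, y)
print("RFE keeps features:", [i for i, keep in enumerate(rfe.support_) if keep])

# Embedded: Lasso's L1 penalty zeroes out coefficients of unhelpful features.
lasso = Lasso(alpha=1.0).fit(X, y)
print("Lasso keeps features:",
      [i for i, c in enumerate(lasso.coef_) if abs(c) > 1e-6])
```

With only 3 informative features, both approaches typically converge on the same small subset, but by different mechanisms: RFE searches over models, while Lasso does the selection inside a single fit.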
About Unfold Data Science: This channel helps people understand the basics of data science through simple examples, in an easy way. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can get a high-level understanding of data science through this channel. The videos uploaded are not very technical in nature, so they can be easily grasped by viewers from different backgrounds as well.
If you need Data Science training from scratch, please fill out this form (please note: training is chargeable).
docs.google.com/forms/d/1Acua...
Book recommendation for Data Science:
Category 1 - Must Read For Every Data Scientist:
The Elements of Statistical Learning by Trevor Hastie - amzn.to/37wMo9H
Python Data Science Handbook - amzn.to/31UCScm
Business Statistics By Ken Black - amzn.to/2LObAA5
Hands-On Machine Learning with Scikit Learn, Keras, and TensorFlow by Aurelien Geron - amzn.to/3gV8sO9
Category 2 - Overall Data Science:
The Art of Data Science By Roger D. Peng - amzn.to/2KD75aD
Predictive Analytics By Eric Siegel - amzn.to/3nsQftV
Data Science for Business By Foster Provost - amzn.to/3ajN8QZ
Category 3 - Statistics and Mathematics:
Naked Statistics By Charles Wheelan - amzn.to/3gXLdmp
Practical Statistics for Data Scientist By Peter Bruce - amzn.to/37wL9Y5
Category 4 - Machine Learning:
Introduction to machine learning by Andreas C Muller - amzn.to/3oZ3X7T
The Hundred Page Machine Learning Book by Andriy Burkov - amzn.to/3pdqCxJ
Category 5 - Programming:
The Pragmatic Programmer by David Thomas - amzn.to/2WqWXVj
Clean Code by Robert C. Martin - amzn.to/3oYOdlt
My Studio Setup:
My Camera : amzn.to/3mwXI9I
My Mic : amzn.to/34phfD0
My Tripod : amzn.to/3r4HeJA
My Ring Light : amzn.to/3gZz00F
Join Facebook group :
groups/41022...
Follow on medium : / amanrai77
Follow on quora: www.quora.com/profile/Aman-Ku...
Follow on twitter : @unfoldds
Get connected on LinkedIn : / aman-kumar-b4881440
Follow on Instagram : unfolddatascience
Watch Introduction to Data Science full playlist here : • Data Science In 15 Min...
Watch python for data science playlist here:
• Python Basics For Data...
Watch statistics and mathematics playlist here :
• Measures of Central Te...
Watch End to End Implementation of a simple machine learning model in Python here:
• How Does Machine Learn...
Learn Ensemble Model, Bagging and Boosting here:
• Introduction to Ensemb...
Build Career in Data Science Playlist:
• Channel updates - Unfo...
Artificial Neural Network and Deep Learning Playlist:
• Intuition behind neura...
Natural Language Processing playlist:
• Natural Language Proce...
Understanding and building recommendation system:
• Recommendation System ...
Access all my codes here:
drive.google.com/drive/folder...
Have a different question for me? Ask me here : docs.google.com/forms/d/1ccgl...
My Music: www.bensound.com/royalty-free...
Access English, Hindi Course here - www.unfolddatascience.com/store
Don't forget to register on the website, it's free🙂
Thank you, Sir, for this great explanation.
Hello Aman, thanks so much for the detailed explanation. Could you also talk about clustering based feature selection technique?
Thank you for this
Thank you Aman!! Such crisp explanation!
My pleasure 😊
Awesome Sir!!!! Thanks a lot. You are a perfect Guru for any DS learner. Another request Sir, kindly make a detailed video on SVM. It would be really helpful for many of us.
Thanks a lot Subhajit, sure
Thanks
Great video Aman! Thanks for sharing!
Can you please tell which algorithm to use for product recommendation using demographic data like age, salary, gender, occupation, etc.?
thanks for the awesome work!
Glad you found it helpful.
Very good explanation Aman, you are a good teacher, I follow your videos, very simple and understanding explanation, good luck!
Your comments motivate me. Thank you so much.
This is awesome!
Please, I have a question:
In the backward wrapper method of feature selection, how can I use my own user-defined model? I have an existing model, but I want to reduce the features. It is a linear equation: Y = 0.22D + 0.19E + 0.16F + 0.15G + 0.16H + 0.12K
I want to do feature elimination without changing the coefficients.
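One way to approach this question (a hand-rolled sketch, not anything from the video; the data, the error metric, and the stopping point of 3 features are all assumptions) is a backward elimination loop over the fixed equation: the coefficients never change, and at each step we drop the feature whose removal hurts the fit the least.

```python
# Backward elimination over a fixed linear equation (coefficients frozen).
import numpy as np

rng = np.random.default_rng(0)
coefs = {"D": 0.22, "E": 0.19, "F": 0.16, "G": 0.15, "H": 0.16, "K": 0.12}

# Synthetic stand-in data generated from the equation itself plus noise.
X = {name: rng.normal(size=100) for name in coefs}
y = sum(c * X[n] for n, c in coefs.items()) + rng.normal(scale=0.05, size=100)

def sse(kept):
    """Sum of squared errors using only the kept features, fixed coefficients."""
    pred = sum(coefs[n] * X[n] for n in kept)
    return float(np.sum((y - pred) ** 2))

kept = set(coefs)
while len(kept) > 3:  # stop at 3 features, chosen arbitrarily for the sketch
    # Drop the feature whose removal increases the error the least.
    drop = min(kept, key=lambda n: sse(kept - {n}))
    kept.remove(drop)

print("kept features:", sorted(kept))
```

Because the coefficients stay fixed, this only answers "which terms can the equation live without", not "what is the best re-fitted smaller model"; refitting after elimination would generally give better predictions.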
very good explanation of the concepts
Thank you Pavan
Sir, is the combination of features you got in your result applicable for KNN only, or does the same combination work for other models as well?
Very informative video. I have some doubts regarding forward feature selection:
1. PCA with forward feature selection
2. For the features we have to select, we give k_features exactly as 3 or 4; then how will the algorithm select, and which features will it pick?
Can we do wrapper method for feature selection in unsupervised learning data?
Aman, you said that in RFE it is internally decided how the variables will be eliminated, and in backward selection we are passing a KNN model to remove the variables. BUT in RFE you are passing a Linear Regression model; please explain.
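On this point, a small sketch (assumed data and estimators, not the video's exact demo) may help: both RFE and backward sequential selection accept an estimator of your choice. The difference is that RFE needs a model exposing `coef_` or `feature_importances_` (e.g. Linear/Logistic Regression) to rank features internally, while sequential selection only needs a scoreable model, so KNN works there but would not work inside RFE.

```python
# RFE vs backward sequential selection with different estimators.
from sklearn.datasets import load_iris
from sklearn.feature_selection import RFE, SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# RFE ranks features by the fitted model's coefficients.
rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=2).fit(X, y)

# Backward selection only needs cross-validated scores, so KNN is fine.
sfs = SequentialFeatureSelector(KNeighborsClassifier(),
                                n_features_to_select=2,
                                direction="backward").fit(X, y)

print("RFE keeps:", rfe.get_support(indices=True))
print("Backward SFS keeps:", sfs.get_support(indices=True))
```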
Hi, I really loved your video and appreciate your efforts in making such informative videos. I have 3 questions though.
1. In the video you used the methods on numerical data; can we use them on categorical data?
2. Should we use them before or after feature engineering, i.e., after making dummy variables and binning the data?
3. In RFE-CV, all the variables were shown as 1, i.e., important. Can you explain that a little, or direct me to some video?
Thanks Ayushi.
1. Some tests can be used on numerical data only.
2. Before only.
3. Try with other data and this will change; here the difference is not that much.
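To make question 3 concrete, here is a sketch (assumed synthetic data, not the video's dataset): in RFECV, a ranking of 1 simply means "selected", so if the cross-validated score is best with every feature kept, all rankings come out as 1. With data containing clearly uninformative features, RFECV will usually assign them ranks greater than 1.

```python
# Why RFECV can rank every feature as 1: rank 1 just means "kept".
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=8, n_informative=3,
                           n_redundant=0, random_state=0)

rfecv = RFECV(LogisticRegression(max_iter=2000), cv=5).fit(X, y)
print("ranking per feature:", rfecv.ranking_)   # 1 == selected
print("optimal number of features:", rfecv.n_features_)
```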
Hi Aman, once we get the important features, we have to remove the unwanted features from X_train and X_test, right?
Yes, from both places. There is no need for those features going forward; just keep track of what we removed so that when new data comes we know what to keep/remove.
Sir, do we need to apply all the techniques (filter, wrapper, embedded) and see which features are important?
Yes, if you have the infrastructure to support it, especially if your model is not doing well.
Hello sir
What if the features are categorical and discrete?
Tests like chi-square and some model-based techniques can be used.
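For the chi-square route mentioned above, here is a minimal sketch (the iris dataset is just an assumed stand-in; chi-square expects non-negative features, typically counts or encoded categories) using scikit-learn's SelectKBest:

```python
# Chi-square feature selection: scores non-negative features against the target.
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, chi2

X, y = load_iris(return_X_y=True)  # iris values are non-negative, so chi2 applies

selector = SelectKBest(chi2, k=2).fit(X, y)
print("chi2 scores per feature:", selector.scores_)
print("kept feature indices:", selector.get_support(indices=True))
```

For truly categorical inputs, you would first encode them (e.g. with OrdinalEncoder or one-hot encoding) so the chi-square statistic is computed on non-negative numeric columns.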
@@UnfoldDataScience ok sir can you make a video on that as well?
Can we do all these techniques inside a pipeline?
You can.
And take a call in the end based on the results.
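A sketch of that pipeline idea (assumed dataset and step choices, not the video's code): putting the selector inside a Pipeline means it is refit within each cross-validation fold, which avoids leaking information from the validation data into the feature selection step.

```python
# Feature selection as a Pipeline step, evaluated with cross-validation.
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

pipe = Pipeline([
    ("scale", StandardScaler()),                                   # scale features
    ("select", RFE(LogisticRegression(max_iter=5000),
                   n_features_to_select=10)),                      # wrapper selection
    ("model", LogisticRegression(max_iter=5000)),                  # final estimator
])

scores = cross_val_score(pipe, X, y, cv=5)
print("mean CV accuracy:", round(scores.mean(), 3))
```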
@@UnfoldDataScience thank you man
If we have the domain knowledge, I think we don't need to perform feature selection techniques?
Even then we need to check; domain knowledge is only what we already know, and "data must tell its own story".
@@UnfoldDataScience Okay Sir.
Cost of your data science course?
Please fill the form attached in the description of the video.
Hello Aman, please can I have your personal email?
Thanks