7. Chi-Square Clearly Explained | Feature Selection Using Chi-Square | Chi-Square by hand |

  • Published: 11 Sep 2024
  • #chisquaretest #statistics #featureselection #chi2 #datapreprocessing #machinelearning #python #scikitlearn #selectkbest #modelperformance #correlatedfeatures #datacleansing #datawrangling
    Welcome to our video on feature selection using Chi-Square! In this tutorial, we'll be exploring a technique for identifying and selecting features that are strongly associated with the target variable. This method can be particularly useful when working with datasets that contain many irrelevant or redundant features.
    We'll be demonstrating how to apply this technique in Python using scikit-learn. We'll start by loading and preparing the dataset, then we'll use scikit-learn's chi2 function to calculate the Chi-Square statistic for each feature. Next, we'll use the SelectKBest selector to keep the top-performing features based on their Chi-Square scores. Finally, we'll evaluate the impact of feature selection on the model's performance.
    By the end of this video, you'll have a solid understanding of how to use Chi-Square to select relevant features and improve the performance of your machine learning models.
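
    A minimal sketch of the workflow described above; the dataset (scikit-learn's built-in breast-cancer data) and k=5 are illustrative assumptions, not necessarily what the video uses:

      # Minimal sketch: Chi-Square feature selection with scikit-learn.
      # Note: the chi2 score requires non-negative feature values.
      import pandas as pd
      from sklearn.datasets import load_breast_cancer
      from sklearn.feature_selection import SelectKBest, chi2

      # Illustrative dataset: non-negative numeric features, binary target.
      data = load_breast_cancer()
      X = pd.DataFrame(data.data, columns=data.feature_names)
      y = data.target

      # Chi-Square statistic (and p-value) for each feature vs. the target.
      chi2_scores, p_values = chi2(X, y)

      # Keep the k features with the highest Chi-Square scores (k=5 is arbitrary).
      selector = SelectKBest(score_func=chi2, k=5)
      X_selected = selector.fit_transform(X, y)
      print(X.columns[selector.get_support()])  # names of the selected features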

Comments • 30

  • @bariselmas4809
    @bariselmas4809 5 months ago +2

    This channel should have millions of subscribers!

  • @ParthGupta-my9ox
    @ParthGupta-my9ox 6 months ago +1

    Underrated channel man!!

  • @sohampavaskar4991
    @sohampavaskar4991 6 months ago +1

    Best video. Amazing and clear explanation.

  • @farrugiamarc0
    @farrugiamarc0 8 months ago +1

    Well done. Very well presented and explained.

  • @ParthGupta-my9ox
    @ParthGupta-my9ox 6 months ago +1

    This deserves a subscribe.

  • @ozan4702
    @ozan4702 1 year ago +2

    Excellent content!

  • @user-uc9wp8mx3j
    @user-uc9wp8mx3j 9 months ago +1

    Well explained, Vishnu. Much appreciated.

  • @ajaykryadav
    @ajaykryadav 10 months ago +1

    too good

  • @hitendrasingh01
    @hitendrasingh01 1 year ago +2

    good explanation

  • @vijaybudhewar7014
    @vijaybudhewar7014 1 year ago +1

    Very good man... Thanks for making this easy

  • @borhanmukto7770
    @borhanmukto7770 1 year ago +1

    Great presentation! Please upload videos on other statistical supervised models too. Thanks!!!

  • @bellion166
    @bellion166 10 months ago +1

    Thanks for the intuition! Helped a lot during my 0th review today!

  • @datadoctor10
    @datadoctor10 11 months ago +1

    Thanks so much. Wondering why you have so few views... Your channel deserves more subscribers.

  • @69nukeee
    @69nukeee 1 year ago +1

    Nicely explained, great job Vishnu!

  • @kseniiafilippova8540
    @kseniiafilippova8540 1 year ago +1

    Thank you!

  • @krishipathak474
    @krishipathak474 1 year ago +1

    Thank you, Vishnu

  • @thomascotter4141
    @thomascotter4141 1 year ago +2

    So the features with higher chi values are the ones you should be using in your ML model?

    • @learnwithvichu
      @learnwithvichu  1 year ago +1

      Exactly... A high chi value indicates a large difference between the observed and expected counts, which means the feature is unlikely to be independent of the target.

  • @danielyesiduribe2905
    @danielyesiduribe2905 10 months ago +1

    Hello, what technique would work for a situation where my independent variables are numerical and my target variable is categorical?

    • @learnwithvichu
      @learnwithvichu  10 months ago

      Forget about dependent and independent variables for a moment.
      Simply think of it like this: "You want to find the relationship between a numeric variable and a categorical variable".
      In that case you can use ANOVA (a short sketch follows this thread).
      I hope that is clear. Check out my video on ANOVA for a better understanding, and subscribe for more such content.

    • @danielyesiduribe2905
      @danielyesiduribe2905 10 months ago +1

      @learnwithvichu Thank you very much, I discovered your channel and I really like your content.
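
    Following up on the ANOVA suggestion above, a minimal sketch of ANOVA-based selection for numeric features with a categorical target, using scikit-learn's f_classif (ANOVA F-test); the iris dataset and k=3 are illustrative assumptions, not from the video:

      # Sketch: ANOVA F-test feature selection (numeric X, categorical y).
      from sklearn.datasets import load_iris
      from sklearn.feature_selection import SelectKBest, f_classif

      X, y = load_iris(return_X_y=True)   # numeric features, categorical target

      # One ANOVA F-statistic (and p-value) per feature.
      f_scores, p_values = f_classif(X, y)

      # Keep the k features with the highest F-scores (k=3 is arbitrary).
      selector = SelectKBest(score_func=f_classif, k=3)
      X_selected = selector.fit_transform(X, y)
      print(f_scores, p_values)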

  • @TheKumarAshwin
    @TheKumarAshwin 3 months ago

    Bro, I have a small doubt: the expected values table is calculated assuming there is no relationship between gender and results?

    • @learnwithvichu
      @learnwithvichu  3 months ago

      Yes, the expected table shows how the counts would look if there were no relationship between the variables.
      Then we can compare the observed values against the expected values.
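
    A short by-hand sketch of this idea; the 2x2 gender-vs-result counts below are made-up numbers, not the ones from the video:

      # Expected counts under "no relationship", then the Chi-Square statistic.
      import numpy as np
      from scipy.stats import chi2_contingency

      observed = np.array([[30, 20],    # hypothetical: male   -> pass, fail
                           [25, 25]])   # hypothetical: female -> pass, fail

      row_totals = observed.sum(axis=1, keepdims=True)
      col_totals = observed.sum(axis=0, keepdims=True)
      grand_total = observed.sum()

      # Expected count for each cell = row total * column total / grand total.
      expected = row_totals @ col_totals / grand_total

      # Chi-Square statistic: sum over cells of (observed - expected)^2 / expected.
      chi_square = ((observed - expected) ** 2 / expected).sum()

      # Cross-check with scipy (correction=False gives the plain Pearson formula).
      stat, p, dof, exp = chi2_contingency(observed, correction=False)
      print(chi_square, stat)   # the two values should match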

  • @siddharth4251
    @siddharth4251 1 year ago +2

    After wasting almost an hour, I finally found you and feel satisfied... Keep on doing such wonderful work... Nice presentation with animation and an excellent explanation.
    You earned one subscription bro!! Thanks