Regression analysis in R: backward selection

  • Published: 4 Oct 2024
  • Selecting variables for a linear model is dicey. Let's get started! If this vid helps you, please help me a tiny bit by mashing that 'like' button. For more #rstats joy, crush that 'subscribe' button!
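
    A minimal sketch of the technique, using the built-in mtcars data in place of the video's performance.csv (whose columns aren't shown here); the variable choices are illustrative:

        # Backward selection by hand: start from the full model and drop the
        # weakest predictor, refitting each time, until every remaining term
        # clears your p-value threshold.
        full <- lm(mpg ~ cyl + disp + hp + wt + qsec, data = mtcars)
        summary(full)                        # inspect the p-values

        # Suppose disp had the largest p-value: drop it and refit.
        reduced <- lm(mpg ~ cyl + hp + wt + qsec, data = mtcars)
        summary(reduced)

        # step() automates the search, ranking candidate models by AIC
        # rather than by individual p-values:
        step(full, direction = "backward")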

Comments • 17

  • @EquitableEquations · 1 year ago +3

    You can find materials supporting this vid (and others) at github.com/equitable-equations/youtube.

    • @louiseweschler1746 · 11 months ago

      Got it! Thank you very much. Now I can work through another great vid in your Regression series of vids (*all* your vids are excellent!).

  • 11 months ago

    Really interesting topic. Great video, really well explained!

  • @evanspencer3819 · 6 months ago

    Thanks for the great video. Really well done.

  • @Ammarsays · 11 months ago

    Great video. Please also make a video on selection in logistic regression, and on how we can automate the removal of variables based on a criterion.

  • @umagenetics3975 · 5 months ago

    Please make a video about forward stepwise selection.

  • @DrGKumar-me7bx · 11 months ago

    Thank you sir.

  • @brazilfootball · 11 months ago +2

    I heard you mention this wouldn't be devoted to inference, but wondered if you could help me understand the pitfalls of stepwise elimination as it's summarized in this sentence I found: "If you remove the insignificant terms and then refit, the inference results (p-values) would not include the 'effect' of the previous selection." I can't wrap my head around what this means practically and what the implications might be. Any corrections or thoughts? Thank you, I really enjoy your videos!

    • @EquitableEquations · 11 months ago +2

      Hi! Yes, that's a reasonable way of describing it. As a result of the process used to get them, the p-values of the remaining terms will be low after variable selection. You shouldn't use their lowness to draw conclusions about statistical significance (see the sketch after this thread).

    • @brazilfootball · 11 months ago

      Oh, ok! I don’t know why that was so hard for me to get. Thanks again! 😅
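
      A tiny simulation (not from the video) makes this concrete: feed a model twenty pure-noise predictors, screen on p-values, and refit. The surviving terms can look "significant" even though no predictor has any real effect. The threshold and sizes here are arbitrary:

          # y is unrelated to every one of the twenty predictors.
          set.seed(1)
          n <- 100
          dat <- as.data.frame(matrix(rnorm(n * 20), n, 20))
          dat$y <- rnorm(n)

          full <- lm(y ~ ., data = dat)
          p <- summary(full)$coefficients[-1, 4]   # p-values, intercept dropped
          keep <- names(p)[p < 0.25]               # a loose screening cut

          # Refitting on the survivors shrinks their p-values further, but
          # that "significance" is an artifact of the selection step.
          refit <- lm(reformulate(keep, response = "y"), data = dat)
          summary(refit)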

  • @DrGKumar-me7bx · 11 months ago

    My humble request: please make videos on ggplot2 covering geom_point, geom_bar, geom_line, pie charts, geom_area, and the other geom-related charts, using one single dataset and showing the full syntax for every chart. It would be useful for beginners. Thank you so much for your great effort. ❤❤❤❤

    • @EquitableEquations · 11 months ago

      Hi! I've got vids on most of those geoms, including point (ruclips.net/video/-k5pvxyyi8o/видео.html) and bar (ruclips.net/video/HvOQFQzIg5c/видео.html). You might also be interested in my ggplot overview (ruclips.net/video/McL9MMwmIZY/видео.html), which covers a lot of geoms using only a few data sets.

  • @louiseweschler1746 · 11 months ago +1

    Please give a link to the data set (performance.csv).

  • @7ooda2620 · 11 months ago

    Thank you for the very helpful video, and I have a question: Why do we need linear regression when there are machine learning methods?

    • @EquitableEquations · 11 months ago +3

      Regression *is* a machine learning method, in fact the most important one. Anyhow, my goal here is understanding, not just black-box prediction.