Multicollinearity - Explained Simply (part 2)

  • Published: 27 Oct 2024

Comments • 32

  • @JGCouryo9 · 13 years ago

    A really good, clear and above all PRACTICAL explanation, better than tons of other videos that get into the maths without explaining simply what it is and how to interpret it. THANK YOU very much!!!

  • @paofmacedon · 13 years ago

    Dear how2stats,
    Thank you so much for your very useful tutorial and all the information you provide on your website.
    Thank you again!

  • @alexmarvin3093 · 2 years ago

    thanks, useful for the current phase of my research project

  • @zengjustin3619 · 1 year ago

    Thank you very much, clear and copious!!!!

  • @johneunyoung · 12 years ago

    Great. Very helpful. You're a born teacher...

  • @robertpauljuster3545 · 5 years ago

    Very helpful! Thanks for all your efforts

  • @Han-ve8uh · 4 years ago

    What does "add more stability" at 5:40 mean? Isn't a regression analysis a one-time calculation? Does this mean there is randomness in repeated runs, and each run will give very different coefficient and standard error results? Or is this about stability across adding/removing predictors, i.e. that adding/removing predictors is assumed not to affect the coefficient and standard error results at all if there's no multicollinearity?
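
A note on what "stability" means in this context: for a given data set the estimates are indeed a one-time calculation, but when predictors are highly correlated, a coefficient and its standard error can change substantially depending on whether a correlated predictor is in the model. The sketch below is a minimal illustration of that idea, not anything from the video; it assumes Python with numpy and statsmodels, and the data and effect sizes are made up.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

def compare(r):
    """Fit y on x1 alone and on x1 + x2, where corr(x1, x2) = r."""
    cov = [[1.0, r], [r, 1.0]]
    x1, x2 = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    y = 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)   # true coefficients are both 1
    m1 = sm.OLS(y, sm.add_constant(x1)).fit()
    m2 = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
    print(f"corr = {r:.2f}: b(x1) alone = {m1.params[1]:.2f} (SE {m1.bse[1]:.2f}), "
          f"with x2 = {m2.params[1]:.2f} (SE {m2.bse[1]:.2f})")

compare(0.10)   # near-orthogonal predictors: b(x1) barely moves when x2 is added
compare(0.95)   # collinear predictors: b(x1) and its SE shift substantially
```

In that sense the coefficients of a collinear model are "unstable": they depend heavily on exactly which of the overlapping predictors happen to be in the equation.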

  • @MsWhiteIcy · 13 years ago

    How2stats Rocks!!!
    Thank You very much Sir for such an enriching video :)

  • @ezhumalaigem6034 · 3 years ago

    Very clearly explained. 👍👍👍

  • @sammcfarland539 · 9 years ago

    Very helpful overview. Thanks!

  • @kantemallikarjunarao · 3 years ago

    GOOD EXPLANATION....

  • @annikabrundyn8441 · 9 years ago

    Really great videos - thanks!

  • @naftalibendavid · 12 years ago

    Nicely done.

  • @AayanHayat · 9 years ago

    Perfect explanation.

  • @portialambright8068 · 10 years ago

    What is the link or address of your website? I would like to view the references you spoke about in your video.

    • @how2stats · 9 years ago · +1

      www.how2stats.net/2011/09/collinearity.html

  • @laurenwjackson9464 · 11 years ago

    I am having an issue with multicollinearity in a whole-plot factor of a split-plot design. N = 144, with 16 whole plots and 9 subplots in each whole plot. It is a 3x3x3x2 design with one hard-to-change factor (temp). Under this design, I've reached a predicted VIF of 12.37 for my whole-plot factor, while other designs have been between 15 and 45. I've been unable to find literature that deals specifically with this issue. Do you have any recommendations? I can't increase the number of runs much more!

  • @dechileen · 11 years ago

    Thank you for this, very clear explanation :)

  • @shubhankarpaul9420 · 2 years ago

    Thank You so much

  • @buyka4671 · 8 years ago · +1

    thank you so much for your statistical lessons
    I have a question: if X1 and X2 are highly correlated with each other, which variable should be dropped, X1 or X2?
    Please give an answer to my question.

    • @subratkumarsahoo4849 · 5 years ago · +1

      You can drop either one of them, it will work 👍

    • @hjh3337 · 3 years ago

      @@subratkumarsahoo4849 So would it be OK to produce a number of simple mixed-effects regression models instead of lumping all the IVs into one model? Since my IVs are all measuring the same thing (e.g., response times), I would assume this would be OK after watching your video. Thank you so much!
      Also, would you recommend trying to scale numerical values so they are all similar in value? You could, for example, use more decimal places to reduce the orders-of-magnitude differences that sometimes occur between DVs and IVs. Keep up the good work!
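
On the question in this thread of which of two highly correlated predictors to drop: either choice removes the collinearity, as the reply above says, and the usual advice is to keep whichever variable matters more to the research question (or relates more strongly to the outcome). The sketch below is a minimal, hypothetical illustration of checking VIFs before deciding; it assumes Python with numpy and statsmodels, and x1, x2, x3 are made-up variables, not anything from the video.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(1)
n = 300
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.2 * rng.normal(size=n)   # x2 is nearly a copy of x1
x3 = rng.normal(size=n)                    # an unrelated predictor
y = 2.0 * x1 + 0.5 * x3 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2, x3]))
for i, name in enumerate(["x1", "x2", "x3"], start=1):
    vif = variance_inflation_factor(X, i)    # VIF of predictor i within X
    r_y = np.corrcoef(X[:, i], y)[0, 1]      # zero-order correlation with y
    print(f"{name}: VIF = {vif:.1f}, corr with y = {r_y:.2f}")

# x1 and x2 share a high VIF; dropping either resolves it, and the choice
# between them is substantive (theory, measurement quality) rather than statistical.
```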

  • @phantomwizard · 8 years ago · +6

    really love all the mouth sounds..

  • @pateaugen5974 · 6 years ago

    If we consider three variables in a multiple regression and only one variable is significant, what does this mean?

    • @how2stats · 6 years ago

      That only one variable contributed to the regression equation in a statistically significant way.

  • @JT2012a · 10 years ago

    What's the rule about averaging VIFs, the one that says the average should be near one? What if the average of all the VIFs is about 1.7, but all the other rules are OK?
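
For what it's worth, the individual VIFs and their average come from the same numbers, so the per-predictor cut-offs and the "average VIF near 1" guideline can be checked together; both are textbook rules of thumb rather than hard thresholds. A minimal sketch, assuming Python with numpy and statsmodels and made-up data:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(2)
n = 250
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)   # moderately correlated with x1
x3 = rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2, x3]))
vifs = [variance_inflation_factor(X, i) for i in range(1, X.shape[1])]
print("individual VIFs:", [round(v, 2) for v in vifs])
print("mean VIF:       ", round(float(np.mean(vifs)), 2))
```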

  • @ygernkot · 11 years ago

    amazing! thank you!

  • @MRSNOOPDOGG · 11 years ago

    thank you!

  • @enundiasinnombre · 13 years ago

    thanks

  • @AK-fj4tb · 4 years ago

    Hello again. I'm going to comment on your 2nd multicollinearity video as well, just to point out that the VIF statistic (which you explain clearly and accurately) will not necessarily catch the kind of problem you illustrate with your v2 and v3: opposite directions, both statistically significant. I demonstrate in a peer-reviewed, published research paper that VIFs can be as low as 1.1 and you can still get the problem. So low VIFs should never be used to dismiss multicollinearity concerns. Please watch my video, which touches on the VIF issue, and if interested you can download my 2018 research paper for the mathematical basis for my claim regarding the VIF. I would welcome comments from you or your many viewers. /watch?v=iV8BLOix5KI

    • @Eng.Ali-M · 3 years ago

      The video is not available; the YouTube link does not work
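
To make the parent comment's narrower point concrete, that opposite-signed, individually significant coefficients can coexist with VIFs near 1.1, here is a minimal simulation sketch. It assumes Python with numpy and statsmodels, the data are made up (they are not from the cited paper), and the names v2 and v3 simply echo the video's example. A pairwise correlation of about 0.3 already gives VIF = 1 / (1 - 0.3**2), roughly 1.1.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 500
cov = [[1.0, 0.3], [0.3, 1.0]]                        # modest correlation of 0.3
v2, v3 = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
y = 0.5 * v2 - 0.5 * v3 + rng.normal(size=n)          # opposite-signed true effects

X = sm.add_constant(np.column_stack([v2, v3]))
fit = sm.OLS(y, X).fit()
print("coefficients:", np.round(fit.params[1:], 2))    # one positive, one negative
print("p-values:    ", np.round(fit.pvalues[1:], 4))   # both small
print("VIFs:        ", [round(variance_inflation_factor(X, i), 2) for i in (1, 2)])
```

In other words, a low VIF by itself does not rule this coefficient pattern out.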

  • @393Dan · 12 years ago

    And I thought macroeconomics was hard to understand