Regression Episode 8: Interactions Between Continuous Predictors

  • Published: 13 Oct 2024

Comments • 12

  • @joym1107 • 4 years ago • +2

    So helpful! Thank you.

  • @Xserum73 • 4 years ago • +1

    Thank you so much much much

  • @larissacury7714 • 2 years ago

    31:24: Would you expand on centering, standardizing, and rescaling in regression? Thank you for the amazing class!

    • @centerstat • 2 years ago • +1

      Hi Larissa -- Briefly, we can use linear transformations to rescale our predictors in any way we please because this doesn't change the correlations among the variables. If we center (that is, subtract the mean of the variable from each observation), this makes the mean equal to zero but keeps the original standard deviation. We could also standardize (subtract the mean and divide by the standard deviation), which not only makes the mean equal to zero but also makes the standard deviation equal to 1.0. Raw, centered, and standardized versions all fit exactly the same but simply rescale the parameters in ways that might be of use to us. Leona Aiken and Steve West have a wonderful book on testing interactions from 1991, and they go into these issues in detail. Hope that helps!
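      A minimal R sketch of this point, using simulated data and placeholder names (y, x, z) that are assumptions for illustration rather than variables from the lecture: raw, centered, and standardized predictors give identical model fit, only rescaled coefficients.

      # Simulate a small data set (placeholder names, illustration only)
      set.seed(1)
      n <- 200
      d <- data.frame(x = rnorm(n, 50, 10), z = rnorm(n, 5, 2))
      d$y <- 1 + 0.3 * d$x + 0.5 * d$z + 0.05 * d$x * d$z + rnorm(n)

      # Centered: mean 0, original standard deviation
      d$x_c <- d$x - mean(d$x)
      d$z_c <- d$z - mean(d$z)

      # Standardized: mean 0, standard deviation 1
      d$x_s <- scale(d$x)[, 1]
      d$z_s <- scale(d$z)[, 1]

      m_raw <- lm(y ~ x * z, data = d)
      m_cen <- lm(y ~ x_c * z_c, data = d)
      m_std <- lm(y ~ x_s * z_s, data = d)

      # Identical fit (same R-squared); only the coefficient scaling differs
      sapply(list(m_raw, m_cen, m_std), function(m) summary(m)$r.squared)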

    • @larissacury7714 • 2 years ago

      @@centerstat Thank you for the reply and for that reference, I'm definitely going to check it out! I'd love a video on that too. I really loved your explanations, thank you!

  • @larissacury7714 • 2 years ago

    Hi, thank you SO MUCH! At 11:00, you mention that our moderator variable is attractiveness. Would you expand on that? I.e., if attractiveness interacts with easiness, why is the former the moderator and not the latter? (If they're interacting, couldn't I interpret it the other way around too?) Oops, edit: I guess my question was answered at 16:00 :)

    • @centerstat • 2 years ago • +1

      Hi Larissa -- this is a great question. If you have two variables that interact, it is sometimes helpful to think of one as the moderator and one as the focal predictor. So if 'x' is the moderator and 'z' is the focal predictor, we would say that x moderates the relation between z and y. However, this is a completely arbitrary distinction -- we could equivalently reverse these and say that z moderates the relation between x and y. It changes nothing statistically and only affects how we think about the roles of the variables from a theoretical perspective.
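      A minimal R sketch of that symmetry, with simulated data and placeholder names (x, z, y) assumed for illustration: swapping which variable we call the moderator leaves the model unchanged.

      # Simulated data (illustration only)
      set.seed(2)
      n <- 200
      x <- rnorm(n)
      z <- rnorm(n)
      y <- 0.4 * x + 0.3 * z + 0.25 * x * z + rnorm(n)

      # "x moderates z" and "z moderates x" are the same statistical model:
      coef(lm(y ~ x * z))   # x, z, and x:z
      coef(lm(y ~ z * x))   # identical estimates, just listed in a different order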

    • @larissacury7714 • 2 years ago

      @@centerstat Thank you for the reply, I'm glad that was a great question! Just for further understanding: expanding it to categorical * continuous interactions (which is what I'm modelling right now with lmer in R), would it also be possible to go the other way around? I'm saying that the moderator is my continuous predictor; in this case, it doesn't seem to make much sense to say that the categorical predictor is a moderator, right? Or could it be?

  • @madelinerenee1168 • 3 years ago

    I found this video to be VERY helpful. Thanks for sharing! I do have a question, though: you mentioned that this type of interaction is a bilinear interaction. Are there other types of interactions that are nonlinear? And would this still be the best way to analyze a nonlinear interaction, or would you use a different approach?

    • @centerstat • 3 years ago • +2

      Hi Madeline -- Thanks for your note. (this is Patrick -- I'm replying on Dan's behalf). You are limited to bilinear interactions using traditional product approaches. However, there are cool ways you can approximate nonlinear relations using mixture models --
      Pek, J., Sterba, S. K., Kok, B. E., & Bauer, D. J. (2009). Estimating and visualizing nonlinear relations among latent variables: A semiparametric approach. Multivariate Behavioral Research, 44(4), 407-436.
      Good luck with your work -- patrick

    • @madelinerenee1168 • 3 years ago

      @@centerstat thanks Patrick! Looking forward to learning about multilevel modeling from y’all next week

    • @centerstat • 3 years ago • +1

      @@madelinerenee1168 We look forward to seeing you there! Just thought I'd add that the mixture approach in the paper Patrick posted is nice in that it's semiparametric, so you don't need to know the form of the interaction. It's letting the data speak. And Phil Chalmers and Jolynn Pek developed a nice R shiny package for generating the plots here: cran.r-project.org/web/packages/plotSEMM/plotSEMM.pdf
      If you have a sense of the form of the interaction, though, you might be able to capture that in some clever way within the regression framework. For instance, including x1, x2, x1^2, x1*x2, and x1^2*x2 would allow the effect of x2 to change quadratically rather than linearly with the value of x1 (so no longer a bilinear interaction). That would allow, for instance, for a big difference in the effect of x2 when x1 is small versus medium, but little difference in the effect of x2 when x1 is medium versus large (a quick sketch of that specification appears at the end of this reply).
      But, as Patrick said, this goes beyond the simple product-variable approach that is typical in regression; that is, you don't see much done beyond bilinear interactions (maybe because power is typically already low for the bilinear interaction, and would probably get even lower for higher-order effects).
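      A minimal R sketch of that quadratic-by-linear specification, with simulated data and placeholder names (x1, x2, y) assumed for illustration:

      # Simulated data in which the effect of x2 changes quadratically with x1
      # (illustration only)
      set.seed(3)
      n  <- 300
      x1 <- rnorm(n)
      x2 <- rnorm(n)
      y  <- 0.5 * x1 + 0.2 * x1^2 + 0.3 * x2 + 0.25 * x1 * x2 + 0.15 * x1^2 * x2 + rnorm(n)

      # Bilinear interaction only
      m_bilinear  <- lm(y ~ x1 + x2 + x1:x2)

      # Effect of x2 allowed to change quadratically with x1
      m_quadratic <- lm(y ~ x1 + I(x1^2) + x2 + x1:x2 + I(x1^2):x2)

      # Joint test of the added higher-order terms
      anova(m_bilinear, m_quadratic)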