Hierarchical Multiple Regression in SPSS with Assumption Testing
- Published: 30 Jul 2024
- This video demonstrates how to conduct and interpret a hierarchical multiple regression in SPSS including testing for assumptions. A hierarchical multiple regression determines the contribution of predictor variables to an outcome variable while controlling for one or more predictor variables.
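The hierarchical idea described above can be sketched outside SPSS. The sketch below (plain Python, made-up numbers; the variable names echo the video's example but the data are synthetic) fits Block 1 with the control variable only, then Block 2 with an added predictor, and reports the R² change — the added predictor's contribution after controlling for the first block.

```python
# Sketch of a hierarchical regression: compare R^2 before and after
# adding a predictor block. Pure-Python OLS via the normal equations.
# Synthetic illustrative data, not the video's dataset.

def solve(A, b):
    # Gaussian elimination with partial pivoting for a small linear system
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def r_squared(X, y):
    # OLS fit (X rows include an intercept column), then R^2 = 1 - SSE/SST
    n, p = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)] for a in range(p)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    beta = solve(XtX, Xty)
    yhat = [sum(X[i][a] * beta[a] for a in range(p)) for i in range(n)]
    ybar = sum(y) / n
    sse = sum((y[i] - yhat[i]) ** 2 for i in range(n))
    sst = sum((y[i] - ybar) ** 2 for i in range(n))
    return 1 - sse / sst

family   = [5, 4, 6, 2, 3, 7, 1, 4]   # control variable
anxiety  = [2, 3, 1, 5, 4, 1, 6, 3]   # predictor added in Block 2
symptoms = [3, 4, 2, 6, 5, 1, 7, 4]   # outcome

X1 = [[1, f] for f in family]                       # Block 1: control only
X2 = [[1, f, a] for f, a in zip(family, anxiety)]   # Block 2: control + predictor
r1, r2 = r_squared(X1, symptoms), r_squared(X2, symptoms)
print(round(r1, 3), round(r2, 3), round(r2 - r1, 3))  # delta R^2 = unique contribution
```

SPSS does the same comparison internally when you enter the control variable in Block 1 and the remaining predictors in Block 2; the "R Square Change" column in the Model Summary is the `r2 - r1` computed here.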
Thanks Dr Grande, I have been benefiting a lot from your videos. Keep enriching my statistical analysis skills.
This was a good video on HMR and how to run it in SPSS. I like that the similar regression models, multiple linear regression and ANCOVA, are mentioned. It was nice to see the comparison between the variance explained by all the predictor variables and the model controlling for the family support variable.
Thank you Dr Todd, your video saved me from the trouble of Hierarchical MR problems. Tons and tons of thanks and blessings for the free learning you provide. Amazing you are!
With all the different kinds of regressions, it can be a little confusing. This is a video that I will have to review when I need it! Great tool.
It was nice to see how to find Hierarchical Multiple Regression in SPSS. The model summary was helpful for getting a quick glance at the regression.
+Amanda Sutton I agree. The summary helped me understand the regression.
Well, I've gone through 4-5 explanations of hierarchical multiple regression here on YouTube. This is the best one.
Good to know that the hierarchical multiple regression is analogous to an ANCOVA. There are so many different regression types that the first challenge is getting an idea of what they all do differently.
No waaaay, Dr Grande! I watch your videos on the dark triad/narcissism. Didn't know you did stats as well
May I ask a question? In your video, the Beta and t values for family support are significant in Model 1 but no longer significant in Model 2. How can this be explained? Also, which value should I report, the significant one or the non-significant one? Thanks a lot.
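One common reason for this pattern (a possibility, not a diagnosis of the video's exact data): the control variable overlaps with the predictors added in Model 2, so once they enter, family support retains little unique variance. The stdlib-Python sketch below, with made-up numbers, shows the same idea via correlations: the zero-order correlation of family support with symptoms is strong, but the partial correlation controlling for anxiety is weaker.

```python
# Sketch: a predictor's association can shrink once a correlated
# predictor is controlled for. Synthetic illustrative data.
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial(x, y, z):
    # correlation of x and y, controlling for z
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz**2) * (1 - ryz**2))

family   = [1, 2, 3, 4, 5, 6]
anxiety  = [2, 2, 4, 4, 6, 6]   # strongly correlated with family support
symptoms = [2, 3, 4, 4, 6, 7]   # driven mainly by anxiety

r_zero = pearson(family, symptoms)
r_part = partial(family, symptoms, anxiety)
print(round(r_zero, 3), round(r_part, 3))  # partial is smaller than zero-order
```

On reporting: the usual convention is to report both models, since the point of a hierarchical regression is exactly this comparison.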
Dear Dr Grande, thank you so much for another great and informative video. I have a question about the correlation of the independent variables with the dependent variable (at about 7:00). I have coded my categorical variables with more than two levels as dummies, but now the correlation coefficients are not at all representative for this set of variables. Are there any criteria for including categorical variables in the analysis? For example, should I test the relationship with the dependent variable separately using t-tests or ANOVA?
Very informative and educational video. It would have been truly perfect if you had explained which assumptions of the MRA you were checking for as you did it. Great work.
agreed!
Very good explanation. Thank you.
Thanks a lot for the tutorial. I would like to ask a question: I have 4 control variables (ordinal and nominal categorical variables) and 4 independent variables (also ordinal and nominal categorical variables). Can I enter all of them at the same time, as you explained in the video? That is, put all the control variables in the first block and all the independent variables in the second. Could you please also make a video teaching us how to interpret and report the results for the effect of the control variables. Thanks in advance.
Thank you for the video!!
Thank you for the explanation. However, I was wondering whether I should use an HMR or a multiple linear regression when I have 1 independent variable and 1 dependent variable and would like to control for 1 variable (exact age)?
I do not know what Cook's distance is or how the residual statistics are involved in reading these results. I DO understand that the HMR is similar to an ANCOVA in that we have a covariate which can help us explain the differences in variables.
Say I have three independent variables, two of which do not meet the assumption of a linear relationship with the dependent variable, but all the other assumptions of normality and lack of multicollinearity are met. Can I still use hierarchical regression? Thank you so much.
How do we know whether the independent variable explains enough variance in the dependent variable to include or exclude it from the final model?
Great work. My question here: is 0.7 the cutoff for checking multicollinearity?
Hello Dr. Grande, could you explain the casewise diagnostics and Cook's distance in this example a little more?
You are my hero :)
Thank you so much!
Hi Dr. Todd, I have one question: the std. residual minimum is -5.212 and the maximum is .2969. However, both models are significant; does this say anything?
I wonder what happens if you run the same hierarchical multiple regression with the "stepwise" method instead of "enter". If the results show that Family Support is not significant and drops out of the regression model while Anxiety & Depression remain significant, can you interpret the results as: Symptoms are significantly predicted by Anxiety & Depression while Family Support is controlled for?
Dear Todd, first of all thanks for this great video; very helpful to me. However, I do have a question: is it required that all variables are oriented the same way (e.g. all positive or all negative) to perform a hierarchical multiple regression?
For example, for my research on the moderating effects of personality characteristics on individual readiness to change, I use the Big Five Inventory. In the BFI, four factors are positively oriented (Extraversion, Agreeableness, Conscientiousness, and Openness) and one factor is negatively oriented (Neuroticism). In confirming the factors (by PCA) I paid attention to the reverse-scored items and recoded them before constructing the variables. However, is it necessary for a hierarchical multiple regression to recode the Neuroticism variable into its (positively oriented) opposite, Emotional Stability? Thanks in advance.
I have no clue, and since this comment is 3 years old I'm sure you've sorted it out. The only change I think you'd see is that where a correlation is negative it would reverse sign to positive, which makes it easier to compare with the other correlations if they are all positive. I'm really not sure, and would be interested to know the answer.
What if the dependent variable is not normally distributed? Then what?
Are the assumptions different if we have 1 mediator and 1 moderating variable?
What if my DV is not normally distributed (Shapiro-Wilk p < .001)?
You would transform your variables. For instance, you could apply a log or square-root transformation (for positive skew), or square the variable (for negative skew).
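The transformations mentioned in the reply above can be sketched with only the Python standard library; the data here are made up for illustration. A log or square-root transform pulls in a long right tail, which a simple skewness statistic makes visible.

```python
# Sketch: transforming a right-skewed variable toward symmetry.
# Synthetic illustrative data (strictly positive, as log requires).
import math

skewed = [1, 1, 2, 2, 3, 5, 9, 20, 55]    # right-skewed example values

log_t  = [math.log(x) for x in skewed]    # requires all values > 0
sqrt_t = [math.sqrt(x) for x in skewed]   # requires all values >= 0

def skewness(xs):
    # population skewness: mean cubed deviation over cubed SD
    n = len(xs)
    m = sum(xs) / n
    sd = (sum((x - m) ** 2 for x in xs) / n) ** 0.5
    return sum((x - m) ** 3 for x in xs) / (n * sd ** 3)

print(round(skewness(skewed), 2),
      round(skewness(sqrt_t), 2),
      round(skewness(log_t), 2))  # skewness shrinks after each transform
```

Note the caveats that apply regardless of tool: log needs strictly positive values (a constant is often added first), and results must be interpreted on the transformed scale.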
very useful
Hello Dr. Grande. Please help me :( How do I interpret the models for an HMR?
Thanks
Sir, how do we use more than 3 moderators in hierarchical multiple regression models? Is it the same method as using an interaction term in simple linear regression analysis? Please, could someone explain this; I'm stuck on a very complex problem where I have to include 4 moderators.
Hello Dr, I need help with my analysis, please guide me.
Sir, if there is only 1 independent variable and more than 1 dependent variable, how can we use regression analysis? Please guide me, sir.
Canonical R?