Best video ever for multiple regression analysis. Thanks a lot!
Oh my! This was immensely helpful and I appreciate your explanations during the analysis. I'm working on analyzing wetland ecological data for my M.S. and I had only one stats class many years ago. I've been teaching myself and find your videos as some of the best. This video was a lot of help for what I'm needing to do. TY for sharing with others.
Fantastic! I have been using various resources to try to understand these concepts clearly and nothing has been nearly as helpful as this video. Clear, concise, logical explanations throughout. Thank you for making this tutorial!
The most comprehensive explanation that I've seen. Phenomenally useful.
Super job... this was an excellent tutorial and I found it very helpful.
Just a couple of follow up questions:
1. You commented that the Chi-Square critical value for 2 independent variables is 13.82. I can't seem to find that value in the Chi-Square table with DF = 1 (i.e., # of variables minus 1) and alpha of 0.05. Could you please elaborate on how you obtained that value?
2. If not all of the predictor variables have a linear relationship with the dependent variable, what other type of analysis should be run in SPSS? And is there a way to create a model in SPSS that combines linear and non-linear variables?
Would appreciate your feedback!
Great explanation! Really helpful for grad students.
Glad to hear it!
Thank you very much. This video helped me a lot with my data.
I have a question: I have 12 independent variables and one dependent variable, and I am carrying out everything just as you showed. Among the correlation coefficients I am getting very high values, even above .8, for 3 variables. What should I do? Should I remove all 3 variables together, or one by one, rechecking the model coefficients each time? Please reply.
I would try removing them one by one and recheck the coefficients.
Thank you, I will try removing the variables one at a time and check if it makes any difference.
Good guide, especially on the assumptions. Thanks!
Thank you for this presentation! Excellent explanations.
Such a useful and succinct presentation. Thank you so much for explaining this concept in a very easy and applicable way.
Glad it was helpful!
Thank you very much for this video and for sharing your knowledge! It really helped me to finish my dissertation data analysis! Good luck to you!
Thank you. That was an excellent video. You really helped me to understand how to analyze my SPSS data.
I'm glad it was helpful.
I notice that you used "exclude cases pairwise" and explained that by using this method, if an observation has a missing value for a variable, then that observation will be excluded from the analysis. I was under the impression that "exclude cases listwise" does this and that the method you used excludes the missing value and not the entire observation. Thanks!
I apologize if I wasn't clear; "exclude cases pairwise" eliminates the case (subject) if they are missing data from any variable included in a specific analysis. "Exclude cases listwise" will include cases (subjects) only if they have full data on all of the variables in the data set.
Ok wonderful. Thank you for clearing that up; I thought I had them backwards.
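The distinction explained above can be illustrated with a small pandas sketch (the data frame and variable names here are made up for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical data: subject 2 is missing a value for "stress" only.
df = pd.DataFrame({
    "anxiety": [3.0, 4.0, 2.0, 5.0],
    "stress":  [2.0, np.nan, 3.0, 4.0],
    "sleep":   [7.0, 6.0, 8.0, 5.0],
})

# Listwise: a case missing ANY variable is dropped from every computation.
n_listwise = len(df.dropna())                                # 3 cases remain

# Pairwise: a case is dropped only from computations involving the variable
# it is missing; the anxiety-sleep pair can still use all 4 cases.
n_pair_anx_sleep = len(df[["anxiety", "sleep"]].dropna())    # 4 cases
n_pair_anx_stress = len(df[["anxiety", "stress"]].dropna())  # 3 cases

print(n_listwise, n_pair_anx_sleep, n_pair_anx_stress)       # 3 4 3
```

Note that under pairwise deletion, different correlations in the same matrix can be based on different numbers of cases, which is why SPSS reports an N for each pair.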
Thanks a lot :)
A good, thoroughly steady explanation.
Kind regards from a thesis student from Denmark.
Excellent video... Thank you very much!
I am working on a research proposal and need SPSS, but I am kind of lost about where to start. Which of the videos would be best to begin with? I need something explained really simply, kindergarten level, as I have not done any research in over 30 years and things have changed a lot. :( Thank you.
+Ramona star Start with this playlist: ruclips.net/p/PLtx0cY9iho2_gBTWtWMvHkFdlCAwA_yid
Hi, thanks for the video, it helps a lot. Could you tell me how this would be presented in a report? Would I need to report the assumptions or just the evaluation?
+patrick williams I would report both.
I have 2 independent variables, 1 confounding variable, 1 mediator, and 1 dependent variable. Can I put the IVs, control, and mediator all in the "independent variable" box? Or the confounder in box 1, the two IVs in box 2, and then the mediator in box 3?
Thank you very much for the great videos man! You've helped me immensely in my data analysis. Keep up the good work! :)
Question on the critical values: at about 18:58 you determine the critical value of the IVs with chi-square. For 3 variables you said it was 16.27. I'm not following. Do you mind explaining or showing? Thanks
+Yvette Wiley These values come from a table in Tabachnick & Fidell (2007). See below (# of IVs: critical value): 2: 13.82; 3: 16.27; 4: 18.47; 5: 20.52; 6: 22.46; 7: 24.32.
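For anyone puzzled about why these don't match a standard chi-square table: they appear to be chi-square critical values at alpha = .001 rather than .05, the cutoff conventionally used when screening Mahalanobis distances for multivariate outliers (with df = number of IVs). A quick check with scipy, assuming that interpretation:

```python
from scipy.stats import chi2

# Chi-square critical values at alpha = .001 for df = 2..7,
# where df plays the role of the number of IVs.
for df in range(2, 8):
    print(df, round(chi2.ppf(1 - 0.001, df), 2))
# prints 13.82 for df=2 and 16.27 for df=3, matching the table above
```

Looking at the alpha = .05 column instead gives 5.99 (df = 2) and 7.81 (df = 3), which is the discrepancy several commenters ran into.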
Awesome vid. Thanks for sharing. Helped a lot!!
Thanks a ton for the useful video. I have a doubt: when I do stepwise, 2 of my 5 variables [Grad, LT20] show up, and the coefficient for model 2 = .56. I wanted to check the correlation of a 3rd variable [SFR], and it came out at .5, which is high enough. That gets me wondering what I have missed, since this variable didn't make it into the stepwise model. Could you advise, please?
When I tried the Enter method, the 5 standardised beta coefficients that show up are: SFR [-.052], LT20 [.254], GT50 [-.038], Grad [.291], FRR [.254]. I need help understanding the best way ahead.
+Piyu n I would suggest you investigate hierarchical multiple regression if you wish to determine the potential covariate effect of your SFR variable.
Thank you so much for this tutorial! :)
Must the independent variables be normally distributed? I did data transformations because 3 of my independent variables are skewed (some highly skewed) and another independent variable is skewed (but its skewness score is within acceptable limits). Must I ensure that both the independent and dependent variables are normally distributed before doing the multiple regression? And how do I check:
1) Multicollinearity between the independent variables?
2) How do I delete outliers?
3) How do I check that the relationships between the IVs and the DV are linear? I have 4 IVs.
4) Must the IVs be normally distributed too, and not just the DVs?
Please help!! My assignment is due soon but I can't seem to figure SPSS out despite reading and youtube-ing! :(
Liyana Hassim As 3 of my IVs and 1 DV (I have 4 DVs in total) were previously not normal, I did data transformations on all 4 prior to doing the multiple regression.
1) I checked the correlations table and one of my IVs correlates with the DV at -0.069.
2) One IV correlates with another IV at -0.059; the others correlate with one another at -0.30, -0.454, -0.101, 0.558, 0.606, 0.193, and 0.606. How do I know if these are good?
Yes, the assumption is that the independent and dependent variables are normally distributed. At the 11:00 mark of the video I describe how to assess multicollinearity. You can delete cases by hand, or choose "Exclude cases pairwise" under Missing Values in the Options section.
Liyana Hassim You would like the correlations to be between .20 and .70. If they fall in that range they are acceptable. Be sure to check the Tolerance and VIF values as well.
Hi :) Thanks for your prompt reply. So if they don't fall between 0.20 and 0.70, but their Tolerance and VIF values are > 0.10 and < 10 respectively, would that be okay? Would that be acceptable?
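For reference, the Tolerance and VIF figures discussed here come from regressing each predictor on all the other predictors: Tolerance = 1 - R^2 of that regression, and VIF = 1 / Tolerance. A numpy sketch with simulated predictors (all names and data are made up):

```python
import numpy as np

def tolerance_and_vif(X):
    """For each column of X, regress it on the remaining columns:
    Tolerance = 1 - R^2 of that regression, VIF = 1 / Tolerance.
    These are the same quantities SPSS reports under Collinearity
    Statistics."""
    n, k = X.shape
    results = []
    for j in range(k):
        y = X[:, j]
        A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2 = 1 - (y - A @ coef).var() / y.var()
        results.append((1 - r2, 1 / (1 - r2)))
    return results

# Simulated predictors: x3 is nearly a copy of x1, so x1 and x3 are
# collinear, while x2 is independent of both.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
x3 = x1 + rng.normal(scale=0.1, size=100)
for tol, vif in tolerance_and_vif(np.column_stack([x1, x2, x3])):
    print(f"tolerance={tol:.3f}  VIF={vif:.1f}")
```

With the common rule of thumb, a predictor like x3 above (VIF well over 10) would be flagged as collinear regardless of where its bivariate correlations fall.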
This is So Damn Perfect!!!!
Hi, I had a question, and maybe you could help me: when I run a linear regression with a continuous IV "A", the effect is significant at p = 0.04, R^2 = 15%; when I run a separate linear regression with another continuous IV "B", the effect is not significant, p = 0.35, R^2 = 3%. However, when I include both IVs in the "Independent(s)" box as you do in this clip, I get a significant effect for the overall ANOVA as well as for each IV separately, and the R^2 of the multiple regression explains 50% of the variability. I know I am using a small sample (N = 30) and I'm working on getting more, but I was wondering if you know why I get such big differences between the models. Thanks
When you combine predictors into a model you get the effect of the shared predictive ability of the IVs. One may not be a good predictor by itself but when combined with another they do a good job together of making a prediction.
Thanks for the quick reply. So basically it's not that unusual for those 2 predictors to explain so much of the variability in the DV when independently they do a poor job?
Correct.
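The situation described above can be simulated: below, x2 is nearly useless on its own but dramatically improves the model by cancelling the noise in x1 (a classic suppressor effect). All data are simulated; the r2 helper is just ordinary least squares:

```python
import numpy as np

def r2(X, y):
    """R^2 of an OLS fit of y on the columns of X (intercept included)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return 1 - (y - A @ coef).var() / y.var()

rng = np.random.default_rng(1)
signal = rng.normal(size=500)
noise = rng.normal(size=500)
x1 = signal + noise       # a noisy measure of the outcome's signal
x2 = noise                # correlates with y near zero on its own...
y = signal

print(round(r2(x1[:, None], y), 2))                # around .5: x1 alone
print(round(r2(x2[:, None], y), 2))                # near 0:    x2 alone
print(round(r2(np.column_stack([x1, x2]), y), 2))  # 1.0:       together
```

Here x2 carries no direct information about y, yet including it lets the model subtract the noise from x1, so the joint R^2 far exceeds the sum of the individual ones.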
Hello, I am having issues with my SPSS analysis. I'd like to run a multiple regression on my Likert-scale items, but my dependent variable has two measurement items. I am testing to see what factors affect mobile payment usage; for instance, the dependent variable "behavioural intention" has two questions. How do I group these into one variable so I can use it as my dependent variable? The same goes for all my independent variables, which were each measured using about 3 or 4 questions.
You could add the Likert scores for your two measurement items and treat the aggregate score as your outcome. You will have to check to be sure the new outcome meets the assumptions of the analysis.
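In SPSS this kind of aggregation is done with Transform > Compute Variable; as a sketch of the same operation in pandas (item names and scores are hypothetical):

```python
import pandas as pd

# Hypothetical survey: two 5-point Likert items measure "behavioural
# intention"; sum them into one aggregate outcome score per respondent.
df = pd.DataFrame({
    "intention_q1": [4, 5, 2, 3],
    "intention_q2": [5, 4, 1, 3],
})
df["intention_total"] = df[["intention_q1", "intention_q2"]].sum(axis=1)
print(df["intention_total"].tolist())   # [9, 9, 3, 6]
```

The same pattern works for each multi-item independent variable; some researchers prefer `.mean(axis=1)` so the aggregate stays on the original 1-5 scale.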
Hi there, I have a question regarding the model summary and ANOVA. My R square is 0.119 and my adjusted R square is .072. Does this mean my model is weak, and can I use it in my thesis? Another thing: my ANOVA F value is 2.537 and the Sig. value is .034. Can you tell me what this means? Thank you, this means a lot.
That R squared value would be considered small; whether it is useful or important depends to a certain extent on what it is you are trying to predict. I can't make a judgment on whether it is suitable for your thesis.
As for the ANOVA results, the Sig. level of .034 indicates that your model is a statistically significant predictor of the outcome at the p < .05 level.
Thanks for that, much appreciated.
Hi, I am facing big problems in SPSS. My situation is this: I have one dependent variable and five independent variables, all measured on a 5-point Likert scale, and each variable has at least three questions as measurement items. Can I ask how I could combine the three questions of the dependent variable so I can test it against the independent variables?
You could simply add the scores of the 3 dependent-variable questions to create an aggregate outcome score and then analyze it that way.
Roughly at 11:17, when you talk about the coefficients table, why is it 0.30 for correlation?
MCxM3 It's somewhat of an arbitrary criterion, but that value means there is more than a trivial relationship between the variables, which is an assumption of the analysis.
Hi, may I know if there is a way to transform the variables all in one step? I could not manage to do it separately.
What is the counterpart of multiple regression if your dependent variable is not continuous? Let's say I used a Likert scale for the level of preparedness.
+Ana Liza Dy You could use linear regression if the outcome is normally distributed, but if not, then logistic regression will work well.
Thank you very much.
Thank you very much
Hi, is it possible to do multiple regression with 2 independent variables and 3 dependent variables?
The multiple regression function in SPSS only allows you to use one dependent variable per analysis.
Thanks! :)
Hello, I would like to ask: if one of my items is removed because it has zero variance, does that mean it is not reliable and I should therefore remove the item from the regression analysis?
If you have a variable that has zero variance then it cannot serve as a predictor in regression analysis.
Even though it is just one item?
If the variable has no variance it does not have the ability to predict. Maybe I am confused as to what you mean by "one item"?
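The point is easy to see numerically: a zero-variance column is constant, so it is perfectly collinear with the intercept and cannot covary with any outcome. A minimal sketch with made-up numbers:

```python
import numpy as np

# A "predictor" with zero variance is constant, so it carries no
# information about the outcome and makes the regression design matrix
# rank-deficient (the constant column duplicates the intercept).
x = np.array([3.0, 3.0, 3.0, 3.0])    # zero-variance predictor
y = np.array([1.0, 2.0, 3.0, 4.0])

print(x.var())                         # 0.0
A = np.column_stack([np.ones(4), x])   # intercept + constant predictor
print(np.linalg.matrix_rank(A))        # 1, not 2: the columns are collinear
```

This is why SPSS (and most packages) drop such a variable automatically: there is no unique coefficient the model could estimate for it.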
It is very helpful :)
I have a question: where do you find your critical values of chi-square with a p-value = .05?
When I look up the critical value of chi-square with 2 degrees of freedom and a p-value of .05, I find the chi-square value to be 5.99, not 13.82. And when I look up the chi-square value with 3 degrees of freedom and a p-value of .05, it is 7.81, not 16.27.
Where do you find your critical chi-square values?
Hi, if one of your predictor variables has 4 levels, can you still run a standard multiple regression?
Yes, you can. The more levels the better, but it can be done.
Hi, can you please tell me where I can get the data set you are using?
+Sharmin begum If you message me your email, I can send it to you.