Such a life-saver!!!! Thank you sir!!! Big help to me!! I am studying for my academic research paper / undergraduate thesis right now. The onset of the pandemic in 2019 resulted in a halt to our semester, and so to our lectures too. This was one of the topics we were not able to cover, but now I have to self-study it because I am going to need it for my research. Thank you thank you very much!! You’re a gift & a blessing!! You don’t know how much anxiety and worry were released from me. Praise God for your life and knowledge and heart to share knowledge with others
Thanks, that’s great to hear!
Thank you so much. Your lecture helped me understand logistic regression better. It was quite easy to follow and makes my research analysis easier.
Thanks so much Dr.
Quality stuff
Boy, this is humbling. I am sure it is me, but I have now looked at half a dozen places, and they all say the same thing regarding the number 1 assumption for logistic regression. You said something about an example towards the end of the video. Is that coming out, or is it already out? I am not sure how to do this. Meaning, in the linear case, you check the ACTUAL xs and ys. But correct me if I am wrong: I assume in the logistic case, you check the MODEL log-odds against the xs, correct? Just not sure how to "bring that home". Thanks, John
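(For anyone wanting to see the question above made concrete: here is a minimal sketch of one common way to check the linearity-of-the-logit assumption, assuming Python with numpy and statsmodels and using simulated data. The Box-Tidwell-style x*log(x) term and all variable names are illustrative, not taken from the video.)

```python
# Hedged sketch: checking linearity of the logit on simulated data.
# Idea: you do not check y against x directly; you check that the log-odds
# of the outcome are (approximately) linear in the continuous predictor.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(1, 10, 500)                      # a positive continuous predictor
p = 1 / (1 + np.exp(-(-2 + 0.5 * x)))            # true model: logit(p) = -2 + 0.5*x
y = rng.binomial(1, p)

# Box-Tidwell-style check: add an x*log(x) term to the model.
# A clearly significant coefficient on that term suggests the logit is NOT linear in x.
X = sm.add_constant(np.column_stack([x, x * np.log(x)]))
fit = sm.Logit(y, X).fit(disp=0)
print(fit.summary())   # large p-value on the x*log(x) term -> linearity looks reasonable
```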
This is exemplary, well done. Quick question about the linearity assumption: what about x values that result in very high or very low probabilities? A logit function is relatively linear in the middle (say between probabilities of .2 and .8) but very steep at more extreme probabilities. In that case, if you have x values that span the whole log-odds function, wouldn't this naturally deviate from linearity? Or am I missing something about what a GLM logistic model is trying to accomplish? Thanks!
Actually, I think I figured it out. I was constraining my logit to x from 0 to 1... my bad.
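(To illustrate the point this exchange settles on: a logistic GLM assumes linearity on the log-odds scale, so logit(p(x)) is a straight line in x over its entire range, even where p(x) sits near 0 or 1. Below is a minimal sketch, assuming Python with numpy; the coefficients and x range are made up for illustration, not from the video.)

```python
# Hedged sketch: on the log-odds scale a logistic GLM is linear in x everywhere,
# even though p(x) is S-shaped and flattens out near 0 and 1.
import numpy as np

beta0, beta1 = -6.0, 1.5
x = np.linspace(-4, 12, 9)                  # spans very low to very high probabilities
p = 1 / (1 + np.exp(-(beta0 + beta1 * x)))  # sigmoid: nonlinear in probability space
log_odds = np.log(p / (1 - p))              # transform back to the logit scale

# The recovered log-odds match beta0 + beta1*x (up to floating-point error),
# i.e. a straight line in x across the whole range of probabilities.
print(np.allclose(log_odds, beta0 + beta1 * x))   # True
```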
I was wondering if I could get your advice on something related. If I were to do logistic regression on time series data and used time as a continuous predictor, and say I was measuring rates of disease that hovered around 50% but then dropped to
Best content