Much better than what my professor briefly presented as a derivation. Thanks
Thank you so much! This is the easiest explanation I have found of this exercise. Straight to the point and with a clear explanation.
Nice and short derivation. Kudos for the lecturer!
Anybody wondering what happened at the end, let me explain:
betahat = sum(from i = 1 to n) [ (yi - ybar)(xi - xbar) ] / sum(from i = 1 to n) [ (xi - xbar)^2 ]
Here the numerator is simply the sample covariance between y and x, and the denominator is the sample variance of x.
Then betahat = sample cov(y, x) / sample var(x), which is the slope estimate of the regression that minimizes the residuals.
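The identity described in the comment above can be checked numerically: the deviation-sum formula, the sample covariance over sample variance, and a least-squares fit all give the same slope. A minimal sketch with made-up sample data (the numbers are just for illustration):

```python
import numpy as np

# Hypothetical sample data, only for checking the identity
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

xbar, ybar = x.mean(), y.mean()

# Slope as sum of cross-deviations over sum of squared x-deviations
beta_hat = np.sum((y - ybar) * (x - xbar)) / np.sum((x - xbar) ** 2)

# Same value via sample covariance / sample variance (the 1/(n-1) factors cancel)
beta_cov = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# And via NumPy's built-in least-squares polynomial fit of degree 1
beta_lstsq = np.polyfit(x, y, 1)[0]

print(beta_hat, beta_cov, beta_lstsq)  # all three agree
```

All three computations return the same number because they are algebraically the same estimator; the covariance/variance form just divides numerator and denominator by n - 1.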
Very well-explained. Thank you so much. Please keep on making more videos!
What do you do with the summation at the end? I think the video cuts out early. Thank you for your help!
When you insert a^, why don't you factor in the -1 when rearranging the equation? @ 5:25
Could you please explain what you are doing with the sum operator?
Thank you for this video! I appreciate it
Hi. Great video. It helped me understand OLS estimator better. But the video is not complete.
Hi, I have a question: when you substituted alpha with yi, did you not make a mistake by forgetting to multiply the bracket replacing alpha by the negative sign before collecting like terms?
At 3:41 of the video
I didn't understand why you are using Yi instead of Y hat.
Could you kindly clear this up for me?
Thank you
Hi. I have a question. When you divide the equation by 2, what happens to the negative sign? (at 10:08 in the video)
Hi - apologies for the late response! For this part, imagine I had kept the negative sign instead of removing it: we could still rearrange the equation to get the same value for Beta at 12:26. Whether we remove the negative sign or keep it, we end up with the same outcome for Beta. The only difference, if we kept the negative sign, is that when rearranging we would not take B(xi - x) to the other side; instead we would take (yi - y)(xi - x) over to the other side (since it would have a negative sign in front of it, moving it to the other side makes it positive). So in the end, keeping the minus sign or not does not affect the outcome.
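The algebra in this reply can be written out explicitly. Starting from the first-order condition for the slope (assuming the form derived in the video, with alpha already substituted out):

```latex
% First-order condition, negative sign kept:
-2\sum_{i=1}^{n}\bigl[(y_i-\bar{y}) - \hat{\beta}(x_i-\bar{x})\bigr](x_i-\bar{x}) = 0
% Dividing by -2 and expanding:
\sum_{i=1}^{n}(y_i-\bar{y})(x_i-\bar{x}) - \hat{\beta}\sum_{i=1}^{n}(x_i-\bar{x})^2 = 0
% Moving either term to the other side gives the same solution:
\hat{\beta} = \frac{\sum_{i=1}^{n}(y_i-\bar{y})(x_i-\bar{x})}{\sum_{i=1}^{n}(x_i-\bar{x})^2}
```

Dividing the whole equation by -2 (or by 2) changes nothing, since the right-hand side is zero; that is why keeping or dropping the minus sign leads to the same beta.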
Hey @ezlearningyt, do you think you could post the rest of the vid? It just makes learning through visualization easier. Cheers
Excellent explanation.
Keep going, you are good at this.
I would be grateful if you could assist me.
This second part of the video series is really misleading as you did not clearly explain the summation at the end. Please reupload or remove this. It is not correct.