Derivation of OLS coefficients
- Published: 10 Aug 2015
- The simple maths of OLS regression coefficients for the simple (one-regressor) case.
This video screencast was created with Doceri on an iPad. Doceri is free in the iTunes app store. Learn more at www.doceri.com
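The formulas derived in the video can be checked with a short numerical sketch (illustrative only; function and variable names are my own, not from the video):

```python
# One-regressor OLS estimates, as derived in the video:
# b1 = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
# b0 = y_bar - b1 * x_bar
def ols_simple(x, y):
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
         sum((xi - x_bar) ** 2 for xi in x)
    b0 = y_bar - b1 * x_bar  # intercept from the first normal equation
    return b0, b1

# Exactly linear data y = 0 + 2x should be recovered perfectly
print(ols_simple([1, 2, 3, 4], [2, 4, 6, 8]))  # (0.0, 2.0)
```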
Clear and to the point. I'm going to be referencing your videos for my econometrics class! Thank you so much!
Sir you are great !!
Very easily explained such a complex derivation..
😍😍
ooooh god ... you saved my degree sir!!!! hats off!!
Excellent derivation!
awesome video sir. Thank you
This is the one that finally made it click. Thank you
Well explained! What books would you recommend for econometrics?
Thank you for this😊
So clear. God, you saved my life.
Best regards to Mr.Mathematics...thank you
Awesome 🙏🙏
Thank you legend
Very nice 👍
Thanks ... it's awesome :) .. thanks :)
Bravo.
Would you take the same steps for the OLS estimate of Beta-hat2 if there was another slope estimator of Beta-hat2 in the regression line?
Yes, only then we would have three equations in three unknowns, so the formulas get more complicated.
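The two-regressor case mentioned here can be sketched numerically: the three normal equations (for b0, b1, b2) can be written in matrix form as (X'X)b = X'y and solved directly. This is a sketch of the general idea, not the derivation promised in the video; the data values are made up for illustration.

```python
import numpy as np

# Exact linear data y = 1 + 2*x1 + 3*x2 (no noise), so OLS should
# recover the coefficients exactly.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = 1.0 + 2.0 * x1 + 3.0 * x2

# Design matrix with a column of ones for the intercept
X = np.column_stack([np.ones_like(x1), x1, x2])

# Solve the three normal equations (X'X) b = X'y for b = (b0, b1, b2)
b = np.linalg.solve(X.T @ X, X.T @ y)
print(b)  # [1. 2. 3.]
```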
Thank you
exam in 2 weeks. thank you 😀
At 12:00, how did sum(x_i * y_i) - n*mean(x)*mean(y) turn into the second-to-last row? Didn't quite get it. Thanks in advance.
If you take the LHS of the second-to-last row and expand it, you will see that some terms can be collected, and you get the LHS of the third-to-last row. The derivation of this equivalence is very similar to the derivation around 7:15.
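The identity behind this step can be checked numerically (a quick sketch with random data, not the video's notation):

```python
import numpy as np

# Check the step at ~12:00: expanding sum((x_i - x_bar)(y_i - y_bar))
# and collecting terms gives sum(x_i * y_i) - n * x_bar * y_bar.
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = rng.normal(size=50)
n, x_bar, y_bar = len(x), x.mean(), y.mean()

lhs = np.sum((x - x_bar) * (y - y_bar))
rhs = np.sum(x * y) - n * x_bar * y_bar
print(np.isclose(lhs, rhs))  # True
```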
How can we obtain the intercept B0 and slope B1 after shifting line l to l'?
Not exactly sure what you are asking, Jahid. The B0 and B1 derived in this video are estimates completely dependent on the data... so if we have new or different data that imply a different line, we will have different calculated values of B0 and B1.
I would like a derivation for multiple regression.
I will be posting a comprehensive proof and derivation of the multiple regression formulas in a couple of weeks; busy at the moment...
Very clear, brilliant 👍
Excellent explanation. There is just one step that I'm not 100% convinced about. No doubt this has to do with my own ignorance about summation. In the part where he solves for b_0, I can't quite see how he goes from Sigma(b_1 * X_i) to b_1 * n * mean(X). With my limited understanding, I would have thought it should be b_1 * n^2 * mean(X). He calculates n, but I'm thinking it should be n^2.
This follows from something mentioned around 6:50, namely by definition mean(X) = (1/n)sigma (X_i), so multiplying both sides by n we have sigma (X_i) = n*mean(X). Then note that because b_1 is a parameter we can factor it out of the sum, so Sigma (b_1 * X_i) = b_1 * Sigma (X_i) = b_1 * n*mean(X). There is no square involved because when we take the derivative with respect to b_0 in the minimization, the square term goes away.
@@billsundstrom8948 Thanks, but I'm still not 100% clear. I understood the derivative with respect to b_0. That is not where I got the n squared from. This is my thinking which I know is wrong but I don't quite understand why. If Sigma(b_1) = n*b_1, and Sigma(X_i) = n * mean(X) then wouldn't Sigma(b_1 * mean(X)) be [n * b_1 * n * mean(X)] = n^2 * b_1 * mean(X)?
@@paultoronto42 It helps to write out the sum: Sigma (b_1 * X_i) = b_1 * X_1 + b_1 * X_2 + b_1 * X_3... + b_1 * X_n and now factor out b_1 to get b_1 * (X_1 + X_2 + X_3... + X_n) = b_1 * n* mean(X)
@@billsundstrom8948 Thanks, that does help!
@@billsundstrom8948 Thanks also for you video on Summation Notation. I should have watched that one first.
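The identity discussed in this thread can be verified with a few lines of code (a minimal sketch; the numbers are arbitrary):

```python
# Sigma(b1 * x_i) = b1 * Sigma(x_i) = b1 * n * mean(x) -- no n^2,
# because b1 is factored out of the sum, not summed n times separately.
b1 = 3.0
x = [2.0, 4.0, 6.0]
n = len(x)
x_mean = sum(x) / n

lhs = sum(b1 * xi for xi in x)  # 3*2 + 3*4 + 3*6 = 36
rhs = b1 * n * x_mean           # 3 * 3 * 4  = 36
print(lhs == rhs)  # True
```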