Never in my life have I made the connection between squaring a value and the area of a square. Explaining least squares this way is so smart! Great job!
I know me too
Halfway through the video I'm like holy shit. This is super intuitive!!!! Thanks a lot!!
That's a really good intuitive explanation! Good job! :)
Very informative and concise! Thank you for your hard work!
Excellent explanation, thank you so much Ritvik. You have a real talent for teaching.
This was 7 years ago, and I can still tell this is a top-quality tutorial even though the video quality was not that good.
Great presentation!
Loved it man, this is the stuff I was looking for. Thanks a lot 🙏🙏
Very helpful, thanks so much for making this series. The only thing that threw me was that your Qs look a lot like Zs!
Amazing!! You're the GOD!!
Thanks for the clear explanation
Greetings sir, I would like to know whether it is possible to make a video-manual on time series statistics and its models (Such as AR, MA, ARIMA, ARMAX, so on). It would be great. So far I liked your videos about shrinkage methods (the best ones I have seen so far). Cheers from Russia
Squares are used to equalize negative error with positive error, not for the “area”.
Great explanation, it all starts with a good geometric visual and example similar to how 3blue1brown does it.
Crazy helpful!
Best explanation.
And this is basically the core machine learning concept: minimizing the sum of squared errors, which means minimizing the loss/cost function. Machine learning is just statistics and optimization.
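To make the comment above concrete, here is a minimal NumPy sketch (toy data and coefficients are made up for illustration) of minimizing the sum of squared errors for a simple linear fit:

```python
import numpy as np

# Toy data roughly following y = 2x + 1 with Gaussian noise
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 1, size=x.size)

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# OLS minimizes the loss ||y - X b||^2 (the sum of squared errors);
# np.linalg.lstsq returns the minimizing coefficients.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

sse = np.sum((y - X @ beta) ** 2)  # the minimized loss
print(beta, sse)
```

Any other choice of coefficients gives a larger sum of squared errors, which is exactly the "minimize the loss function" idea.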
A great one.
Very useful video, much appreciated. May I know why OLS cannot be used to solve for Moving Average coefficients?
holy shit, this is brilliant wow
Could you put a video out for MLE regression?
Amazing
And how is it equal?
perfect.
Very well explained, sir. Now I hope I will be able to pass my exam.
what exam were you preparing for ?
nice tutorial
cool
Why not write R^2 = SSE/SST instead of R^2 = 1 - SSR/SST? Both represent the same thing: the total variation explained by the model.
As Sir explained at the end, SST = SSR + SSE is only true in OLS with a linear model. If we want to calculate R^2 for a quadratic model, we cannot assume SST = SSR + SSE, so the two forms are no longer equivalent. Hence we cannot generalize R^2 as SSE/SST, and the 1 - SSR/SST form is used instead.
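A quick NumPy check of the identity discussed in this thread, assuming (as the comments here do) that SSR denotes the residual sum of squares and SSE the explained sum of squares. For a linear OLS fit with an intercept, SST = SSR + SSE holds, so the two R^2 forms agree:

```python
import numpy as np

# Toy data from an assumed linear model y = 3x - 2 plus noise
rng = np.random.default_rng(1)
x = np.linspace(0, 5, 40)
y = 3 * x - 2 + rng.normal(0, 1, size=x.size)

# Linear OLS fit with an intercept column
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
ssr = np.sum((y - y_hat) ** 2)         # residual sum of squares
sse = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares

# SST = SSR + SSE for linear OLS with an intercept,
# so 1 - SSR/SST equals SSE/SST here.
print(1 - ssr / sst, sse / sst)
```

For models where that decomposition fails (e.g. no intercept, or a non-OLS fit), the two values diverge, which is why 1 - SSR/SST is the general definition.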