FRM: Standard error of estimate (SEE)
- Published: 30 Jan 2008
- A linear regression gives us a best-fit line for a scatterplot of data. The standard error of estimate (SEE) is one of the metrics that tells us about the fit of the line to the data. The SEE is the standard deviation of the errors (or residuals). For more financial risk videos, visit our website! www.bionicturtle.com
This was grrreeeeaaat... taking a research stats class and have been trying to wrap my head around what the SEE really means... Thank you.
Beautifully done, easy to follow and to reproduce in excel without any loss of detail. Thank you David.
Crazy that this was recorded a couple months before the downturn of the market.
You're the best! I recently destroyed my exam after struggling so much with the material; can't express how much the videos helped. Thanks!
Thank you for creating this video. It's helped clear up a few questions I had concerning SEE. Thanks again.
Thank you for this video. Your explanations were very clear and easy to understand.
Good video! Very helpful for understanding the SSE/SEE concept! Thank you for making it!
You're welcome, and thank you for watching! We are happy to hear that our video was so helpful :)
Thank u! I love the way that you clearly explain the content!
Thank you for your support, it is nice to get affirmation on youtube
Perfect video, informative and easy to understand
@dinas0r (n-2) is the degrees of freedom, which is the sample size (n) but reduced by the number of coefficients that need to be estimated: in the univariate regression, that is two, the slope and the intercept. So, it's n-k, where k is the number of coefficients or, equivalently and easier to remember, the number of variables (e.g., one independent + one dependent)
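The n-2 denominator described above can be sketched with a small NumPy example (the data here is made up purely for illustration): fit the two coefficients (slope and intercept), sum the squared residuals, and divide by n minus the two estimated coefficients before taking the square root.

```python
import numpy as np

# Hypothetical data: one independent variable (x), one dependent (y)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 11.9])
n = len(x)

# Univariate regression: two coefficients are estimated (slope, intercept)
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

sse = np.sum(residuals ** 2)     # sum of squared errors (SSE)
see = np.sqrt(sse / (n - 2))     # SEE: divide by n - 2 degrees of freedom
```

With only two coefficients to estimate, two data points would fit the line exactly, which is why only n - 2 residuals carry independent information.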
Intuitively and fundamentally, the Sum of Squared Errors (SSE) is similar to a variance: it sums the squared differences between the observed and fitted values. The Standard Error of Estimate (SEE) is similar to a standard deviation: it measures the typical distance of an observed value from the regression line.
Hence SSE and SEE are different but related.
Hopefully this explanation helps.
Good job turtle. Just to clear up one thing, the denominator in the SEE formula is (n-k-1), the number of samples minus the number of independent variables minus one.
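The general (n - k - 1) denominator mentioned above can be sketched as follows, assuming k independent variables plus an intercept (the data here is randomly generated for illustration only). With k = 1 this reduces to the n - 2 of the univariate case.

```python
import numpy as np

# Hypothetical multivariate data: n observations, k = 2 independent variables
rng = np.random.default_rng(0)
n, k = 30, 2
X = rng.normal(size=(n, k))
y = 1.5 * X[:, 0] - 0.7 * X[:, 1] + rng.normal(scale=0.5, size=n)

# Design matrix with an intercept column: k + 1 coefficients in total
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ coef

sse = np.sum(residuals ** 2)
see = np.sqrt(sse / (n - k - 1))   # n - k - 1 degrees of freedom
```

The denominator subtracts one degree of freedom per estimated coefficient: k slopes plus one intercept, hence n - k - 1.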
Thank you David!!, you are the BEST.
Thanks David. You've been a big help.
It helped! Thank you for making this video!
Thank you for this video !
Well done. Thanks for creating this video.
Thank you! I was so lost in my damn stats homework until I found this!
Awesome presentation!!! Thank you!!
Thank you for watching! :)
Thank you so much!
To clarify: is this different from the sum of squares due to error? In other words, with the prediction error, are we determining the average expected distance from an observed y value to its corresponding y value on the regression line? So if one were to guess the y value for an x value that is not on the graph, we would extend the regression line to find the corresponding y value; but we would expect that, on average, the error (the distance between this new y value and the regression line) would equal the standard error of estimate (prediction error), which in this case, as mentioned in the video, is 8.6?
THANK YOU!!!!
I finally get it.
Dude, thanks so much, you're so helpful!
Thank you, but please explain how an x point can have more than one or two y points.
holy heck THANKYOU!!
why divide by n-2 and not n-1 or n?
Awesome!
Why is it divided by "n-2" and not "n-1" or just "n"?
good !!!
I love the sexy jazz music
You look like a young Kevin Spacey!
sorry for long comment