Best explanation of R2 I have ever seen. So simple!
Some of the simplicity comes from not covering the myriad of different statistics that are all referred to as "r-squared". I think Dustin did a good job of introducing one of them, including covering the fact that the one he covered can take on negative values. Just be aware that people may be talking about different things while using the same term.
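To make the negative-values point concrete, here is a minimal sketch with made-up numbers, using one common definition, R2 = 1 - SS_res / SS_tot (not necessarily the exact variant in the video):

```r
# One common R-squared definition: R2 = 1 - SS_res / SS_tot.
# It goes negative whenever the model's predictions fit worse than
# simply predicting mean(y) for every observation.
y      <- c(2, 4, 6, 8)
y_hat  <- c(9, 1, 9, 1)        # deliberately bad predictions
ss_res <- sum((y - y_hat)^2)   # residual sum of squares
ss_tot <- sum((y - mean(y))^2) # total sum of squares around the mean
1 - ss_res / ss_tot            # -4.8, i.e. worse than the mean-only model
```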
Hi, I was just wondering if you had a book or a single source for all the tutorials on regression models in R and/or Bayesian statistics. Thanks! Your explanations are very clear and concise.
I don't recall whether it covers the topics in this video, but you may wish to know that Dustin has a free online textbook. For Bayesian statistics I would suggest "Bayesian Data Analysis" by Gelman and others, or, for a different style, McElreath's "Statistical Rethinking".
This is a great video. Thanks. If I may add, as alluded to, there are valid problems with "standard" R-squared for mixed models. In my lowly statistics-padawan opinion, this is better handled with the Nakagawa R2, as described in Nakagawa and Schielzeth (2013), "A general and simple method for obtaining R2 from generalized linear mixed-effects models", and in later papers by the lead author. I point you and your viewers to the available R functions in library(performance), e.g. performance::r2_nakagawa().
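As a minimal sketch (using lme4's built-in sleepstudy data rather than anything from the video), the call looks roughly like this:

```r
# Fit a simple mixed model and compute the Nakagawa R2 with the
# performance package. The sleepstudy data ships with lme4.
library(lme4)
library(performance)

fit <- lmer(Reaction ~ Days + (1 | Subject), data = sleepstudy)
r2_nakagawa(fit)  # reports marginal R2 (fixed effects only) and
                  # conditional R2 (fixed + random effects)
```

The marginal value reflects the fixed effects alone, while the conditional value also credits the random effects.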
If by "rule of thumb" we mean "arbitrary decision thresholds", I don't like them either. However, "rule of thumb" can also refer to practices that are approximately good in practice, which I don't strongly object to.
Nice
To repeat, I'm not a statistician; I'm only sharing what I do and don't understand as a layperson.
I totally grasped VARIANCE EXPLAINED, but got lost after that.
I am convinced that I got lost because I never fully comprehended MIXED MODELS, and gave up.
Let me struggle to explain where I got lost with MIXED MODELS even though I'll run the risk of sounding totally silly (but I'm brave enough to accept my shortcomings).
A non-mixed model is somewhat straightforward to follow, but when a model is MIXED, it seems "LAYERS of MODELS" are being simultaneously overlaid, and that overlay happens fast and gets more complex with the number of "LAYERS" being modeled.
Apart from the "LAYERS" in a Mixed Model, it seems like there's some sort of "INTERACTIONS" going on between the "LAYERS". So the "LAYERS" are performing "COMPARISONS" WITHIN AND BETWEEN themselves; hence a "CLUSTER MEAN" WITHIN each "LAYER", and an "OUTSIDE THE CLUSTER MEAN" BETWEEN "LAYERS".
The DOTS in the graph 📉📈 that show "VARIANCE" between the different colored lines are therefore showing "VARIANCE WITHIN EACH LAYER" as well as "VARIANCE BETWEEN LAYERS".
I'll stop making a fool of myself, but I think that FIRST AND FOREMOST, an understanding of what MIXED MODELS are doing is needed before one can understand the material you've presented here.
In other words, what you've presented is completely understandable once one understands what MIXED MODELS are doing, and I clearly and admittedly don't understand the basics of Mixed Models so I got lost.
Please excuse me if I haven't articulated my own personal challenge well.
Keep up the good work in 2025 and beyond! 👏👍✨
I don't think you're that far off from explaining mixed models. Well done! It's good to try to understand them in a more applied sense, though, rather than through dots and graphs. I'd recommend watching some of Dr. Fife's other videos on them if you're still confused. He uses a really good example, something like SES as a predictor of exam grades across multiple schools. To incorporate the inherent differences between grades at different schools, school is modelled as a random effect, and we are able to see how much this inter-school difference influences the SES effect.
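In code, that example looks roughly like the sketch below (simulated data and made-up effect sizes, just to show the model structure):

```r
# Rough sketch of the school example: SES predicting exam grades,
# with a random intercept per school to absorb between-school differences.
library(lme4)

set.seed(1)
n_schools <- 10
d <- data.frame(
  school = factor(rep(seq_len(n_schools), each = 30)),
  ses    = rnorm(300)
)
school_shift <- rnorm(n_schools, sd = 5)   # between-school variation
d$grade <- 60 + 3 * d$ses + school_shift[as.integer(d$school)] + rnorm(300, sd = 4)

fit <- lmer(grade ~ ses + (1 | school), data = d)
summary(fit)  # compare the school-intercept variance with the residual variance
```

In the summary, the random-intercept variance is the "between layers" part of the picture, and the residual variance is the "within each layer" part.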
Explaining that you do not know something does not make you a fool. You are de facto wiser than people who simply pretend to know.
What unraveled much of statistics for me was studying the mathematics without cheating myself by settling for an intuitive "gist". There have been many occasions where I was genuinely confused by someone talking about a "blah blah blah analysis", only to look up a formal mathematical specification of the method and have a deflationary "ah, so that's all that is" moment. A remarkable number of methods are slight variations on linear algebra, calculus, and measure theory.
@galenseilis5971 I sincerely appreciate your words of wisdom, and your encouragement to persevere. 👍 ✨
Plus, I totally agree with you that lots of folks pretend, and many of them actually fake knowing more than others.
In my humble view, the first step to knowing is saying that you don't know. I also think that it's important to try to articulate what it is exactly you don't know rather than simply stating that you don't understand.
I'm certain you got my point, and I want to reiterate my appreciation for taking the time to encourage me. Kudos to you. 👏 ✨
@galenseilis5971 You've again hit the nail on the head by saying that it's not enough to be satisfied with the "gist" of something. 👏
I also appreciate how you described what's involved in turning that "gist" into real knowledge.
Fantastic insight into Learning and Mastering ANYTHING! 👏👍
Thanks again for taking time to share your thoughts.