
Decomposition of Variability: Sum of Squares | Statistics Tutorial

  • Published: 7 Aug 2024
  • 👉🏻 Sign up for Our Complete Data Science Training with 57% OFF: bit.ly/2TR8Keb
    👉🏻 Download Our Free Data Science Career Guide: bit.ly/3oUAnBs
    In this tutorial, we explore the determinants of a good regression: the sum of squares total, the sum of squares regression, and the sum of squares error. We dig into these concepts to define their role within the regression model and look at how they are connected.
    The sum of squares total (SST) measures the total variability of the dataset, while the sum of squares regression (SSR) describes how well your regression line fits the data. The sum of squares error (SSE), on the other hand, is the sum of the squared differences between the observed and predicted values. And what is the connection between these three concepts? Watch the whole video to find out! A short Python sketch illustrating the relationship follows this description.
    ► Consider hitting the SUBSCRIBE button if you LIKE the content: ruclips.net/user/365DataScie...
    ► VISIT our website: bit.ly/365ds
    🤝 Connect with us on LinkedIn: / 365datascience
    365 Data Science is an online educational career website that offers you the opportunity to find your way into the data science world, no matter your previous knowledge and experience. We have prepared numerous courses that suit the needs of aspiring BI analysts, data analysts, and data scientists.
    We at 365 Data Science are committed educators who believe that curiosity should not be hindered by the inability to access good learning resources. This is why we focus all our efforts on creating high-quality educational content that anyone can access online.
    Check out our Data Science Career guides: • How to Become a... (Da...
    #SumOfSquares #Statistics #DataScience
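
To make the connection described above concrete, here is a minimal Python/NumPy sketch (not part of the video) that fits a simple least-squares line to a few made-up points and checks the decomposition; the data values are invented purely for illustration.

```python
import numpy as np

# Illustrative data (made up for this sketch)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Fit a simple linear regression y = b0 + b1*x by ordinary least squares
b1, b0 = np.polyfit(x, y, 1)   # polyfit returns [slope, intercept]
y_hat = b0 + b1 * x            # predicted values
y_bar = y.mean()               # mean of the observed values

sst = np.sum((y - y_bar) ** 2)      # total variability of the data
ssr = np.sum((y_hat - y_bar) ** 2)  # variability explained by the regression line
sse = np.sum((y - y_hat) ** 2)      # unexplained (residual) variability

print(f"SST       = {sst:.4f}")
print(f"SSR       = {ssr:.4f}")
print(f"SSE       = {sse:.4f}")
print(f"SSR + SSE = {ssr + sse:.4f}")        # equals SST (up to rounding)
print(f"R^2 = SSR / SST = {ssr / sst:.4f}")  # goodness of fit
```

With an intercept in the model, ordinary least squares guarantees SST = SSR + SSE, and the ratio SSR / SST is the familiar R-squared goodness-of-fit measure.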

Comments • 17

  • @365DataScience
    @365DataScience  3 years ago +3

    👉🏻 Sign up for Our Complete Data Science Training with 57% OFF: bit.ly/2TR8Keb

  • @korchageen
    @korchageen 2 years ago +5

    Probably the best visual explanation on the Internet for a non-statistical person. I wish more people would find this channel.

  • @reuelkarani6284
    @reuelkarani6284 2 years ago +1

    This is the most underrated video ever!!

  • @RonMagsamen
    @RonMagsamen 4 months ago

    Short and precise 🫡

  • @larissacury7714
    @larissacury7714 2 years ago

    Loved this video! Do we have preset parameters for these terms? For example, what would be a good sum of squares error?

  • @dewantahmid95
    @dewantahmid95 2 years ago +3

    You made it very, very clear. Thank you...

  • @polarequatorial709
    @polarequatorial709 3 years ago

    Is it ok to speculate on what explains the residuals?

  • @ojassaxena2891
    @ojassaxena2891 2 years ago

    What is an orthogonal sum of squares?

  • @mariosanchez5392
    @mariosanchez5392 21 days ago

    my head is still scrambled lol

  • @hariharansubramanian8988
    @hariharansubramanian8988 11 months ago

    How is it a measure of dispersion or variability? Suppose I have a difference of 2 (upwards), i.e. Y_actual - Y_mean = 2, and a difference of 3 (downwards), i.e. Y_actual - Y_mean = -3. Then 2 squared is 4 and 3 squared is 9, that is, 4 + 9 = 13?! So how does this imply variability or dispersion? Yeah, a lot of books talk about converting negative values to positive, but that is not the correct logic. Errors are magnified, in fact, and that may introduce distortion.
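
A clarifying note on the arithmetic above (added here; it is not a reply from the original thread): the sum of squares is meant to be read on the squared scale. Dividing it by n - 1 gives the sample variance, and taking the square root returns to the original units as the standard deviation, so a larger sum does indicate greater dispersion rather than distortion; squaring removes the sign and deliberately weights large deviations more heavily. With the deviations from the comment:

```latex
\[
(+2)^2 + (-3)^2 = 4 + 9 = 13,
\qquad
\mathrm{SST} = \sum_{i=1}^{n} (y_i - \bar{y})^2,
\qquad
s^2 = \frac{\mathrm{SST}}{n-1},
\qquad
s = \sqrt{s^2}.
\]
```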

  • @timacorn2536
    @timacorn2536 5 months ago

    You sound like this: 🤓

  • @dukeworthington483
    @dukeworthington483 3 years ago +9

    Thanks for nothing. I got nothing

  • @EdoardoMarcora
    @EdoardoMarcora 11 months ago

    This is plainly wrong! Regression does not imply causation