Lecture43 (Data2Decision) Comparing Models

  • Published: 13 Dec 2024

Comments • 22

  • @HarpreetSingh-ke2zk
    @HarpreetSingh-ke2zk 7 years ago +10

    This is what a seeker needs: short details, straight to the point, with examples, model behavior, and more. Thanks.

  • @Вадим-е2ш2л
    @Вадим-е2ш2л 1 year ago

    Could you provide a link to the literature where the Akaike criterion is calculated using the formula you cite?

  • @jualiet2212
    @jualiet2212 5 years ago

    you teach so much better than my prof... thank you!!

  • @Double_M_gl_SJ
    @Double_M_gl_SJ 3 years ago

    Can anyone explain where the 'n' comes from (14:07, second row)?
    It is my understanding that the likelihood inside the log should be the maximum likelihood under the assumption of a Gaussian distribution.
    Then the maximum likelihood estimator equals the least-squares estimator, leading to -2 ln(L) = n ln(SSE) - n ln(n) + 2p. I am not sure where the extra n comes from.
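
One standard way to see where an extra n can arise (a generic derivation under Gaussian errors, not necessarily the lecture's exact algebra): plugging the MLE of the variance, \(\hat\sigma^2 = SSE/n\), back into the Gaussian log-likelihood leaves a constant term proportional to n from the exponent.

```latex
% Maximized Gaussian log-likelihood with \hat\sigma^2 = SSE/n:
\ln L_{\max} = -\frac{n}{2}\ln(2\pi) - \frac{n}{2}\ln\!\left(\frac{SSE}{n}\right) - \frac{n}{2}
% so
-2\ln L_{\max} = n\ln(2\pi) + n\ln(SSE) - n\ln(n) + n
```

The trailing \(+n\) (like the \(n\ln(2\pi)\) term) is constant across models fit to the same data, so different texts quote \(AIC = -2\ln L + 2p\) with or without these constants; that is the "extra n".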

  • @danielmurillo1954
    @danielmurillo1954 5 years ago

    Hello,
    I was wondering: if my statistical analysis of the residuals reveals that they are normally distributed but lack constant variance, can I still assume that L can be calculated from SSE/n?

  • @krishnaiyer2556
    @krishnaiyer2556 3 years ago

    Sir, good work, but I'm confused by your explanation at 11:37/17:19. Can you elaborate, sir?

  • @MrSaeedAta
    @MrSaeedAta 6 years ago

    Great lecture.
    One question: where do these parsimony metrics have an advantage over 'noise-sensitivity' metrics like the PRESS statistic?
    The standard guide that my work prescribes is the PRESS statistic. I've been going on a crusade against all the prescribed methods that come without a justification and comparison of alternatives.

    • @ChrisMack
      @ChrisMack 6 years ago +1

      I use PRESS as well. There is no one metric that is best at all things. I typically compute many metrics when comparing models. When they agree, of course there is no question. When the "best" few models are ranked differently by the different metrics, I tend to use judgment based on my knowledge of the problem (that is, non-statistical criteria).
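
For readers unfamiliar with PRESS: it is the sum of squared leave-one-out prediction errors, and for linear least squares it can be computed without refitting n times via the leverage identity e_(i) = e_i / (1 - h_ii). A minimal sketch with illustrative data (all names here are made up for the example):

```python
# PRESS for a straight-line fit, computed from the hat-matrix diagonal
# instead of n separate leave-one-out refits.
import numpy as np

rng = np.random.default_rng(1)
n = 40
x = np.linspace(0, 2, n)
y = 3.0 - 1.5 * x + rng.normal(scale=0.2, size=n)

X = np.column_stack([np.ones(n), x])            # intercept + slope
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sse = resid @ resid

# Hat matrix H = X (X'X)^{-1} X'; its diagonal holds each point's leverage
H = X @ np.linalg.solve(X.T @ X, X.T)
press = np.sum((resid / (1 - np.diag(H))) ** 2)

print(f"SSE = {sse:.4f}, PRESS = {press:.4f}")
```

Since every leverage satisfies 0 < h_ii < 1 here, each leave-one-out residual is at least as large as the ordinary residual, so PRESS is always at least SSE; the gap grows when high-leverage points drive the fit, which is what makes PRESS sensitive to overfitting.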

  • @frankh2141
    @frankh2141 5 years ago

    Hello Sir, can I use AIC or BIC with robust standard errors (Newey-West)? How can I apply this in R? Thanks 😉

  • @ankushjamthikar9780
    @ankushjamthikar9780 6 years ago

    Can you discuss any numerical examples using these criteria? I am a biomedical student and I want to compare two risk prediction models. From your video, it is clear that AIC or BIC can be used to select a good model. But can you share any numerical example related to this video?
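
A small numerical sketch of the kind of comparison being asked about, using the Gaussian-error forms AIC = n ln(SSE/n) + 2p and BIC = n ln(SSE/n) + p ln(n) (additive constants dropped); the data-generating model is illustrative, not from the lecture:

```python
# Compare a linear and a quadratic model with AIC and BIC.
# The simulated data truly contain an x^2 term, so the quadratic
# model should score better (lower) on both criteria.
import numpy as np

rng = np.random.default_rng(42)
n = 60
x = np.linspace(-1, 1, n)
y = 0.5 + 2.0 * x + 1.0 * x**2 + rng.normal(scale=0.3, size=n)

def info_criteria(design):
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    sse = np.sum((y - design @ beta) ** 2)
    p = design.shape[1]                      # number of fitted parameters
    aic = n * np.log(sse / n) + 2 * p
    bic = n * np.log(sse / n) + p * np.log(n)
    return aic, bic

linear = np.column_stack([np.ones(n), x])
quadratic = np.column_stack([np.ones(n), x, x**2])

print("linear    AIC, BIC:", info_criteria(linear))
print("quadratic AIC, BIC:", info_criteria(quadratic))
```

Lower values are better; here the drop in SSE from adding the x² term far outweighs the extra-parameter penalty, so both criteria prefer the quadratic model.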

  • @kalyanasundaramsp8267
    @kalyanasundaramsp8267 6 years ago

    super sir...u r great

  • @PedroRibeiro-zs5go
    @PedroRibeiro-zs5go 5 years ago

    Thanks for the video! ✌🏻

  • @MrHugosky1
    @MrHugosky1 7 years ago

    Nice lecture. Thank you very much!

  • @ЛюдмилаКшнясева-я5ш

    I prefer the Bozdogan IC = -2LL + K(1 + ln(N)) for the strongest penalty term and the most "stretching" among c- models!
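
For context, the penalty terms of the criteria mentioned in this thread can be compared directly; a small sketch using the formulas as usually stated (AIC penalty 2K, BIC penalty K ln N, Bozdogan's CAIC penalty K(1 + ln N)):

```python
# Penalty terms of AIC, BIC, and Bozdogan's CAIC for a model with
# K parameters fit to N observations.
import math

def penalties(K, N):
    return {
        "AIC": 2 * K,                   # 2K, independent of N
        "BIC": K * math.log(N),         # K ln N
        "CAIC": K * (1 + math.log(N)),  # K (1 + ln N)
    }

for N in (10, 100, 1000):
    print(N, penalties(3, N))
```

Since ln N > 2 once N ≥ 8, the ordering is CAIC > BIC > AIC for any realistic sample size, which is the "strongest penalty" point the comment makes: CAIC punishes extra parameters hardest and so favors the most parsimonious models.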

  • @carlosacosta1453
    @carlosacosta1453 6 years ago

    Thanks a lot

  • @yangddo
    @yangddo 6 years ago

    Thank you~!

  • @ricardog4459
    @ricardog4459 4 years ago

    very well explained. thanks for sharing!!