Proof Gauss Markov Theorem (Regression - OLS)

  • Published: 22 Oct 2024

Comments • 12

  • @peterwestfall6924
    @peterwestfall6924 6 years ago +2

    The "no multicollinearity" assumption is not needed. With perfectly correlated regressors (e.g., when you leave all dummy variables in), the (unique) OLS estimate of any estimable function of the parameters is BLUE under the G-M conditions.
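
A small numerical sketch of this point (editor's illustration in NumPy; the two-group design and all numbers are made up). With an intercept plus both group dummies the design matrix is rank-deficient, so individual coefficients are not identifiable, yet estimable functions such as the group means are still uniquely estimated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-group design with an intercept AND both group dummies:
# columns are [1, d1, d2] with d1 + d2 = 1, so X is rank-deficient.
n = 50
d1 = (rng.random(n) < 0.5).astype(float)
d2 = 1.0 - d1
X = np.column_stack([np.ones(n), d1, d2])

beta_true = np.array([1.0, 2.0, 3.0])   # individual entries are NOT identifiable
y = X @ beta_true + rng.normal(scale=0.1, size=n)

# Minimum-norm least-squares solution via the pseudoinverse.
beta_hat = np.linalg.pinv(X) @ y

# The group means mu1 = b0 + b1 and mu2 = b0 + b2 are estimable
# (c'beta with c in the row space of X); their estimates are the
# same for every least-squares solution, collinearity or not.
mu1_hat = beta_hat[0] + beta_hat[1]
mu2_hat = beta_hat[0] + beta_hat[2]
print(mu1_hat, mu2_hat)  # close to 3.0 and 4.0
```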

  • @dennisomarcoz8320
    @dennisomarcoz8320 2 years ago

    Well understood, thank you so much.

  • @chunhochan7174
    @chunhochan7174 1 year ago

    At 14:24, why does DXB = 0 follow from the argument, rather than just DXB's product being zero for one particular B? Since B is a column vector, we cannot simply make this deduction. Can anyone explain this further? Thanks.
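
A possible resolution (editor's note, not from the thread): unbiasedness must hold for every parameter vector B, so DXB = 0 for all B; taking B equal to each standard basis vector e_i picks out the i-th column of DX, which forces DX = 0. A tiny NumPy check of that logic:

```python
import numpy as np

# A nonzero matrix can annihilate ONE particular beta ...
M = np.array([[1.0, -2.0],
              [2.0, -4.0]])
beta = np.array([2.0, 1.0])
print(M @ beta)                 # [0. 0.]  -- yet M is not zero

# ... but if M @ beta = 0 for EVERY beta, M must be zero:
# beta = e_i recovers the i-th column of M exactly.
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
cols = np.column_stack([M @ e1, M @ e2])
print(np.allclose(cols, M))     # True
```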

  • @sreekanthkrishna6678
    @sreekanthkrishna6678 7 years ago +2

    While substituting the expression for beta hat at 10:47, why is there no X transpose? You initially wrote it correctly and then dropped the transpose. Am I missing something here?

    • @perfectscorespremium
      @perfectscorespremium 7 years ago

      It doesn't matter whether you carry the transpose or not, because it eventually cancels out.

    • @xiranwang1068
      @xiranwang1068 6 years ago +1

      Same question.
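
The cancellation in question can be checked numerically (editor's sketch, assuming the standard OLS setup; the data here are simulated). Substituting y = X beta + eps into beta_hat = (X'X)^{-1} X' y gives beta_hat = beta + (X'X)^{-1} X' eps, because (X'X)^{-1} X' X = I, which is why the transpose terms collapse:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
eps = rng.normal(size=n)
y = X @ beta + eps

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y

# (X'X)^{-1} X'X = I, so the X-transpose terms cancel when
# y = X beta + eps is substituted into the OLS formula.
print(np.allclose(XtX_inv @ (X.T @ X) @ beta, beta))          # True
print(np.allclose(beta_hat, beta + XtX_inv @ X.T @ eps))      # True
```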

  • @omarlaham4101
    @omarlaham4101 3 years ago

    Thanks!

  • @mehdiali1694
    @mehdiali1694 6 years ago

    Nice explanation, thank you.

  • @hellosoupy
    @hellosoupy 3 years ago

    Where's the random sampling assumption at? :)

  • @rainardmutuku8476
    @rainardmutuku8476 6 years ago

    Thanks

  • @jhansiraniemmadi351
    @jhansiraniemmadi351 5 years ago

    Perfect explanation, but the consideration part is very confusing.

  • @rohitpant6473
    @rohitpant6473 2 years ago

    senseless approach