ECO375F - 2.3 - Unbiasedness of the Slope Estimator (β1)

  • Published: 29 Sep 2015
  • To review the derivation of the OLS Estimator:
    • ECO375F - 1.0 - Deriva...

Comments • 22

  • @clara___3374 · 4 years ago

    I love all your videos. I have been struggling with some concepts

  • @shehryar221 · 4 years ago

    I genuinely love you for this

  • @loveconomics · 6 months ago

    Why did you assume \sum(x_i - x_bar) = 0 at 5:00 but then take a conditional expectation of the same variables at 8:00? I would have just assumed they both equal zero and had the \sum(x_i - x_bar)^2 cancel, which would leave beta_1_hat = beta_1.
    Great video either way. I am just not sure why you didn't assume \sum(x_i - x_bar) = 0 in both cases.
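
    A sketch of the step in question, following the standard textbook derivation (the video's own notation is assumed here, not quoted):

        \hat{\beta}_1 = \frac{\sum_i (x_i - \bar{x}) y_i}{\sum_i (x_i - \bar{x})^2}
                      = \beta_1 + \frac{\sum_i (x_i - \bar{x}) u_i}{\sum_i (x_i - \bar{x})^2}

    \sum_i (x_i - \bar{x}) = 0 holds identically (it is an algebraic fact, not an assumption), and it is what removes the \beta_0 and \beta_1 \bar{x} pieces from the numerator. The remaining term \sum_i (x_i - \bar{x}) u_i is not zero in general, because each deviation is weighted by a different u_i; it only vanishes in conditional expectation, under the assumption E[u_i | x] = 0:

        E[\hat{\beta}_1 | x] = \beta_1 + \frac{\sum_i (x_i - \bar{x}) E[u_i | x]}{\sum_i (x_i - \bar{x})^2} = \beta_1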

  • @gmayank32 · 4 years ago

    Thank you for this video, really helpful. Just a small doubt: in the video, the summation of (x_i - x_bar) is a constant because, conditional on x (each value fixed, say c), (x_i - x_bar) = (c - x_bar), and thus we are left with E[u|x].
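
    A quick way to see this numerically is to hold x fixed, redraw the errors many times, and average the OLS slope across draws. A minimal simulation sketch in Python (the parameter values below are made up for illustration, not taken from the video):

        # Monte Carlo check of E[beta1_hat | x] = beta1: the same x is reused in
        # every replication, and only the errors u are redrawn.
        import numpy as np

        rng = np.random.default_rng(0)
        n, reps = 50, 20000
        beta0, beta1 = 2.0, 0.7            # illustrative "true" parameters
        x = rng.uniform(0, 10, size=n)     # fixed regressor values (conditioned on)
        xd = x - x.mean()                  # deviations x_i - x_bar, fixed given x

        slopes = np.empty(reps)
        for r in range(reps):
            u = rng.normal(0.0, 1.0, size=n)    # errors with E[u | x] = 0
            y = beta0 + beta1 * x + u
            slopes[r] = (xd @ y) / (xd @ xd)    # sum((x_i - x_bar) y_i) / sum((x_i - x_bar)^2)

        print(slopes.mean())               # comes out close to beta1 = 0.7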

  • @nithyamayananda6964 · 4 years ago

    Does this apply to multivariate regressions as well as univariate? Thank you, great videos!

    • @RemiDav · 4 years ago

      I am not sure I understand your question. Are you asking if the proof applies to multivariate regression?
      For multivariate regression there is a similar proof, but using matrix algebra.
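
      For reference, that matrix-algebra argument runs along these lines (a standard result, sketched here rather than quoted from any video): with y = X\beta + u and E[u | X] = 0,

          \hat{\beta} = (X'X)^{-1} X' y = (X'X)^{-1} X' (X\beta + u) = \beta + (X'X)^{-1} X' u

          E[\hat{\beta} | X] = \beta + (X'X)^{-1} X' E[u | X] = \beta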

    • @TheVista255 · 4 years ago

      @RemiDav Hi, have you made any video applying this method to multivariate regression, for example a regression with 2 variables? I think we need to use matrices as you suggest, but the calculation is too complicated for me. Thank you
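
      One way to avoid doing that calculation by hand is to let a linear-algebra library evaluate \hat{\beta} = (X'X)^{-1} X'y directly. A minimal sketch with two made-up regressors (nothing here comes from the video):

          # OLS with an intercept and two regressors via beta_hat = (X'X)^{-1} X'y.
          import numpy as np

          rng = np.random.default_rng(1)
          n = 200
          x1 = rng.normal(size=n)
          x2 = rng.normal(size=n)
          y = 1.0 + 0.5 * x1 - 2.0 * x2 + rng.normal(size=n)   # illustrative coefficients

          X = np.column_stack([np.ones(n), x1, x2])      # design matrix: intercept, x1, x2
          beta_hat = np.linalg.solve(X.T @ X, X.T @ y)   # solves (X'X) b = X'y
          print(beta_hat)                                # roughly [1.0, 0.5, -2.0]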

    • @RemiDav · 4 years ago

      @TheVista255 I didn't make any matrix version, sorry.

  • @razvanpaun8629 · 5 years ago

    6:12 How do you get to the last line from the previous one? What is the operation?

    • @RemiDav · 5 years ago

      Distribute the 1/sum(...) into the parentheses and simplify.
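
      Spelled out, the step is presumably the following (standard derivation; the video's exact lines aren't reproduced here). Substituting y_i = \beta_0 + \beta_1 x_i + u_i and distributing the sum gives

          \frac{\sum_i (x_i - \bar{x}) y_i}{\sum_i (x_i - \bar{x})^2}
            = \frac{\beta_0 \sum_i (x_i - \bar{x}) + \beta_1 \sum_i (x_i - \bar{x}) x_i + \sum_i (x_i - \bar{x}) u_i}{\sum_i (x_i - \bar{x})^2}

      The first term in the numerator vanishes because \sum_i (x_i - \bar{x}) = 0, and the second simplifies because \sum_i (x_i - \bar{x}) x_i = \sum_i (x_i - \bar{x})^2, leaving \beta_1 + \sum_i (x_i - \bar{x}) u_i / \sum_i (x_i - \bar{x})^2.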

    • @razvanpaun8629 · 5 years ago

      @RemiDav Thanks for that! It was under my nose, really! Just got it 5 minutes ago after a closer look.
      I really enjoy your videos!
      Cheers!

  • @hipolitdobrohna8279 · 5 years ago

    Can we consider the elements in the equation of the estimator b1 as random variables?

    • @RemiDav · 5 years ago

      I don't understand your question. You would have to tell me which elements exactly you are referring to.
      Note that if you are a Bayesian statistician, everything can be considered a random variable.

    • @hipolitdobrohna8279 · 5 years ago

      @RemiDav I mean that in the general case every observation from random sampling is a random variable; so in the calculation, can we treat the observations as random variables?

    • @hipolitdobrohna8279 · 5 years ago

      @RemiDav I meant exactly what this guy did in his video ruclips.net/video/5tMMESxjDBg/видео.html :)

  • @leeneshila1642 · 7 years ago

    You made me understand, thank you... do you have any video on LM statistics to share, please?

    • @RemiDav · 7 years ago

      Are you talking about the Lagrange Multiplier test?

    • @leeneshila1642 · 7 years ago

      yes sir

  • @user-yp1rg2jr5z · 1 year ago

    Why is the sum of (x_i - x_bar) constant? Here, x_i takes different values!
    And why do you take the conditional expectation?

  • @yilingshen7753 · 5 years ago

    Why is the sample y equal to the population Y?

    • @RemiDav · 5 years ago

      Please indicate a time in the video.