Ordinary Least Squares Estimators - derivation in matrix form - part 2

  • Published: 30 Jan 2025

Comments • 36

  • @malikfayiz8259
    @malikfayiz8259 11 years ago +12

    I'm a legally blind student who does not see the board in the regular classroom. I'd like to thank you for sharing these nice videos. I'd recommend them to all of my classmates!

  • @SpartacanUsuals
    @SpartacanUsuals  11 years ago +11

    Hi, many thanks for your message. Glad to hear it helped! Best, Ben

  • @aurachii
    @aurachii 4 years ago

    God Bless You, Ben Lambert

  • @sushantvaidik
    @sushantvaidik 6 years ago +1

    Great explanation. You forgot to mention that X'X is indeed a symmetric matrix, which completes the analogy with the symmetric matrix A from the previous video. That's needed for going from the second-to-last line to the last. Cheers!

  • @shunxu
    @shunxu 8 years ago +1

    Phew, you are awesome, thank you very much!

  • @AJ_42
    @AJ_42 5 years ago

    Great material put together!

  • @jjj-tr9hi
    @jjj-tr9hi 6 years ago

    Indeed, really helpful content! Thanks from Korea

  • @BalajiSankar
    @BalajiSankar 5 years ago

    Thank you for explaining in such detail

  • @Athens1992
    @Athens1992 3 years ago +1

    Hello,
    I have a question about the x'Ax matrix: its derivative equals 2x'x only when the matrix is symmetric. But if, for example, we have 4 features and 1k test observations, how is this matrix symmetric when it is not even square? I'm a bit confused about how this works in a real-life ML example.

    • @hnull99
      @hnull99 10 days ago

      X might not be symmetric, but X^T X is always symmetric. Just build your design matrix X as n x p, with n being the number of observations and p = k + 1, where k is the number of regressors.
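
The point in the reply above is easy to check numerically: X'X is square and symmetric even when X itself is tall and rectangular. A minimal numpy sketch with made-up dimensions (n = 1000 observations, k = 4 regressors):

```python
import numpy as np

# Hypothetical design matrix: n = 1000 observations, k = 4 regressors,
# so p = k + 1 = 5 columns once the intercept column of ones is added.
rng = np.random.default_rng(0)
n, k = 1000, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # n x p, far from square

XtX = X.T @ X                    # p x p: square even though X is not
print(XtX.shape)                 # (5, 5)
print(np.allclose(XtX, XtX.T))   # True: X'X is always symmetric
```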

  • @TheGodSaw
    @TheGodSaw 8 years ago

    Outstanding work you helped me a lot!

  • @davidarnold4149
    @davidarnold4149 10 years ago +1

    Isn't Beta-Hat=[Beta-Hat_0, Beta-Hat_1, Beta-Hat_2, ... , Beta_Hat_p]', so aren't the dimensions 1 x (p+1) ?
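
On the dimensions question above: since Beta-Hat is the transpose of a row of p+1 entries, it is a (p+1) x 1 column vector rather than 1 x (p+1). A minimal numpy sketch with made-up data (p = 3 slope coefficients plus an intercept):

```python
import numpy as np

# beta_hat = (X'X)^{-1} X'y: with X of shape n x (p+1) and y of shape n x 1,
# X'X is (p+1) x (p+1) and X'y is (p+1) x 1, so beta_hat is (p+1) x 1.
rng = np.random.default_rng(1)
n, p = 50, 3                                                 # 3 slopes + intercept
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])   # n x (p+1)
y = rng.normal(size=(n, 1))                                  # n x 1

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat.shape)   # (4, 1): a column vector
```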

  • @MrLunchun
    @MrLunchun 9 years ago +1

    Thanks, these are so helpful! But why is this considered graduate material when I'm doing this in an undergraduate course?

  • @modupeolukanni2027
    @modupeolukanni2027 6 years ago +1

    Is there a simpler way to remember the rules when differentiating? Doing all of this in an exam just to differentiate would take a long time, so is there a simple rule to follow?
    Thank you for your videos!

  • @kevinyong932
    @kevinyong932 6 years ago

    Brilliant! This is very helpful!

  • @shujiezhang3494
    @shujiezhang3494 1 year ago

    Very helpful.

  • @Uu-hs3os
    @Uu-hs3os 5 years ago

    Really helpful! Thanks a bunch!

  • @antoniaaguilera1669
    @antoniaaguilera1669 6 years ago

    You da bomb Ben!

  • @nickmillican22
    @nickmillican22 3 years ago

    Hi Ben, I would love to see the analogous derivation for multiple correlated output variables, i.e. multiple-output regression.

  • @asifstat6683
    @asifstat6683 8 years ago

    It's very helpful, sir... thanks for uploading!

  • @Paskovitchz
    @Paskovitchz 5 years ago

    So for the last one - Is the derivative of B'X'X the same as the derivative of X'XB?

  • @arlenalem
    @arlenalem 8 years ago

    Hi, I have a doubt. I found in theory that w = y'Ax, so dw/dy = x'A'. Hence, in this video the third term is S = -B'x'y, so shouldn't dS/dB have been -y'x? Sorry if my question is too basic.

  • @akarshagrawal2831
    @akarshagrawal2831 8 years ago

    Thank you so much for your video!

  • @abderrahimba7390
    @abderrahimba7390 2 years ago

    Can we use the same matrix form when we have categorical variables?

  • @purduetrafficlab
    @purduetrafficlab 11 years ago

    In this video, you say that dS/dB' for the -B'X'y term is just -X'y because dy/dX' = a. But didn't we find in the previous video that dy/dX' = a'? In that case, shouldn't the derivative be -y'XB? I realize that in the end, it doesn't make a difference, but I'm just trying to clear up my understanding...

    • @SpartacanUsuals
      @SpartacanUsuals  11 years ago +1

      Hi, thanks for your message. I think I say in the video that the derivative of y with respect to x is a, which makes sense dimensionally. Hence, hopefully everything else should follow. Hope that helps. Best, Ben
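
The dimensional argument in the reply can also be checked numerically: if y = a'x is a scalar, each partial dy/dx_i equals a_i, so the gradient stacked as a column is exactly a. A small finite-difference sketch (the vectors are made up for illustration):

```python
import numpy as np

# y = a'x is a scalar; central differences recover dy/dx component by component.
a = np.array([2.0, -1.0, 3.0])
f = lambda x: a @ x
x0 = np.array([0.5, 1.5, -2.0])

eps = 1e-6
grad = np.array([(f(x0 + eps * e) - f(x0 - eps * e)) / (2 * eps)
                 for e in np.eye(3)])
print(np.allclose(grad, a))   # True: d(a'x)/dx = a
```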

  • @LuMaZZ93
    @LuMaZZ93 7 years ago

    Hi, thanks a lot for your videos! They are helping a lot! :)
    But could someone tell me why he underlines some of the variables? Is there a meaning to the underlining (straight line versus wavy line)?

    • @SpartacanUsuals
      @SpartacanUsuals  7 years ago +4

      Hi, thanks for your comment. The variables with straight lines below them are vectors. Those with curvy lines are matrices. Best, Ben

  • @josephknutson1700
    @josephknutson1700 7 years ago

    Saving my life

  • @wlsl8529
    @wlsl8529 6 years ago

    Thanks a lot!!

  • @charliebaker8071
    @charliebaker8071 6 years ago

    Is beta 0 (the intercept) included in the beta hat vector?

    • @JMRG2992
      @JMRG2992 4 years ago

      Beta 0 is included in the vector of beta estimates.

  • @ireneisme8747
    @ireneisme8747 5 years ago

    Thank you so much, but I still could not understand why the derivative of x'a with respect to x is a.
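
To the last question: x'a expands to a_1*x_1 + ... + a_n*x_n, so the partial derivative with respect to each x_i is just a_i, and stacking these gives back a. A minimal symbolic check with sympy (the vector [2, 5, 7] is made up for illustration):

```python
import sympy as sp

# x'a = 2*x1 + 5*x2 + 7*x3, a scalar that is linear in each x_i
x1, x2, x3 = sp.symbols('x1 x2 x3')
x = sp.Matrix([x1, x2, x3])
a = sp.Matrix([2, 5, 7])

s = (x.T * a)[0]                                # the scalar x'a
grad = sp.Matrix([sp.diff(s, xi) for xi in x])  # differentiate component-wise
print(list(grad))   # [2, 5, 7] -- exactly the entries of a
```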