Partial Least Squares Regression 1 Introduction (3/4)

  • Published: 11 Sep 2024

Comments • 8

  • @thrustin64 4 years ago

    Thanks for this! It helped a lot with understanding how to make predictions using the regression (B) matrix.
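The prediction step this comment refers to can be sketched in plain NumPy. This is a hypothetical PLS1 (single-y) NIPALS implementation, not the software shown in the video: the regression vector b plays the role of the B matrix, computed as B = W(P'W)^{-1}q from the weights W, loadings P, and y-loadings q.

```python
import numpy as np

def pls1_nipals(X, y, n_components):
    """PLS1 via NIPALS. Returns the regression vector b and the means,
    so predictions are y_hat = (X_new - x_mean) @ b + y_mean."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xa, ya = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xa.T @ ya
        w /= np.linalg.norm(w)          # unit-norm weight vector
        t = Xa @ w                      # score vector
        tt = t @ t
        p = Xa.T @ t / tt               # X-loading
        qa = ya @ t / tt                # y-loading (inner coefficient)
        Xa = Xa - np.outer(t, p)        # deflate X
        ya = ya - qa * t                # deflate y
        W.append(w); P.append(p); q.append(qa)
    W, P, q = np.column_stack(W), np.column_stack(P), np.array(q)
    b = W @ np.linalg.solve(P.T @ W, q)  # B = W (P'W)^{-1} q
    return b, x_mean, y_mean

# Hypothetical noiseless data: with all 5 components, PLS reproduces y exactly
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 4.0
b, x_mean, y_mean = pls1_nipals(X, y, n_components=5)
y_hat = (X - x_mean) @ b + y_mean       # prediction using the B (here: b) vector
```

With fewer components than variables, the same b gives the usual regularized PLS prediction rather than an exact fit.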

  • @HIZIBIZI987 6 years ago +1

    Which software is this?

    • @RasmusBroJ 6 years ago

      There's not much software in this presentation, but the ones most commonly used in chemometrics would be Unscrambler, Simca, and PLS_Toolbox. There are many other nice packages as well.

  • @ayoubmarah4063 4 years ago

    What does the R matrix stand for?

    • @RasmusBroJ 4 years ago

      That's the matrix containing the inner-relation regression coefficients on the diagonal.

  • @Novatures 9 years ago

    I don't understand why T=X*P, since X = T*P'+E :(

    • @nanfengliu1027 9 years ago +3

      +Novatures I guess it is because P is an orthogonal matrix, i.e., P*P' = P'*P = I. This is a constraint imposed when optimizing P (the projection matrix).
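For the PCA-style case the slides lean on, the identity T = XP follows from orthonormal loadings (P'P = I) plus the residual E being orthogonal to P, so XP = (TP' + E)P = TP'P + EP = T. A small NumPy check on hypothetical data, with loadings taken from the SVD:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((30, 6))
X -= X.mean(axis=0)                 # center, as in PCA

# PCA via SVD: the columns of P are orthonormal loadings
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
P = Vt[:k].T                        # loadings (6 x 2), with P'P = I
T = X @ P                           # scores
E = X - T @ P.T                     # residual after k components

# P'P = I, and the residual is orthogonal to P, so (T P' + E) P = T
assert np.allclose(P.T @ P, np.eye(k))
assert np.allclose(E @ P, 0)
assert np.allclose((T @ P.T + E) @ P, T)
```

Strictly speaking P here is column-orthonormal (P'P = I) rather than a square orthogonal matrix, but that is all the identity needs.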

    • @thelittlekid2358 4 years ago +2

      If you check out the line in gray font at the upper right corner of the slide at 9:00, it says there is a simplification between the weights (also called rotations) and the loadings. For simplicity, the tutorial uses the same symbol P to denote both the loadings and the rotations. In reality (for example in scikit-learn.org/stable/modules/generated/sklearn.cross_decomposition.PLSRegression.html), it should be X = TP' + E and T = XW(P'W)^{-1}, in which W denotes the weights for X. The difference between a loading and a weighting is documented at wiki.eigenvector.com/index.php?title=Faq_difference_between_a_loading_and_a_weighting. I would agree with the tutorial on simplifying those symbols when introducing PLSR for the first time (especially right after PCA), since it makes PLSR easier to understand at a high level.
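The weights-versus-loadings distinction can be illustrated with a minimal NIPALS sketch (hypothetical code, not scikit-learn's implementation). Because X is deflated between components, each weight vector in W applies to the deflated data, so X @ W does not reproduce the scores beyond the first component; the rotations R = W(P'W)^{-1} (exposed as `x_rotations_` in scikit-learn) map the original X straight to the scores T.

```python
import numpy as np

def nipals_pls_scores(X, y, n_components):
    """Collect X-block weights W (defined on deflated X),
    loadings P, and scores T from NIPALS."""
    Xa = X.copy()
    W, P, T = [], [], []
    for _ in range(n_components):
        w = Xa.T @ y
        w /= np.linalg.norm(w)
        t = Xa @ w
        p = Xa.T @ t / (t @ t)
        Xa = Xa - np.outer(t, p)    # deflation: later w's refer to deflated X
        W.append(w); P.append(p); T.append(t)
    return np.column_stack(W), np.column_stack(P), np.column_stack(T)

rng = np.random.default_rng(2)
X = rng.standard_normal((25, 4)); X -= X.mean(axis=0)
y = rng.standard_normal(25); y -= y.mean()

W, P, T = nipals_pls_scores(X, y, n_components=3)

# X @ W matches only the first score column; the rotations recover them all
R = W @ np.linalg.inv(P.T @ W)      # rotations R = W (P'W)^{-1}
assert np.allclose(X @ W[:, 0], T[:, 0])
assert not np.allclose(X @ W, T)
assert np.allclose(X @ R, T)
```

P'W is unit upper triangular for NIPALS, so the inverse always exists and the identity T = XR is exact, not approximate.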