Eigenvalues and Eigenvectors (PCA): Dimensionality Reduction Lecture 15 @ Applied AI Course

  • Published: 19 Nov 2024

Comments • 14

  • @rajmaheshwarreddy2859
    @rajmaheshwarreddy2859 6 years ago +2

    Nice explanation of eigenvectors and their corresponding eigenvalues; also the geometric picture of the spread of information from 2D to 1D!!

  • @AJ-fo3hp
    @AJ-fo3hp 3 years ago

    Thank you very much.
    It covers:
    1. Column-standardized matrix
    2. Covariance matrix
    3. Eigenvectors
    4. Eigenvalues
    5. Orthogonal matrix of v1, v2, ..., vd
    6. Projection
    7. Projection retaining 100%, 75%, 60%, 50% of the variance
    In general, for any matrix, the eigenvectors are NOT always orthogonal. But for a special type of matrix, a symmetric matrix, the eigenvalues are always real and the corresponding eigenvectors are always orthogonal.
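
    A minimal NumPy sketch of the steps listed above (column standardization, covariance matrix, eigendecomposition, orthogonality check, projection); the toy data and variable names are illustrative assumptions, not taken from the lecture:

    ```python
    import numpy as np

    # Toy data: n points in d dimensions (illustrative values only)
    X = np.random.default_rng(0).normal(size=(200, 3))

    # 1. Column-standardize: zero mean, unit variance per column
    X_std = (X - X.mean(axis=0)) / X.std(axis=0)

    # 2. Covariance matrix S (d x d), symmetric by construction
    n = X_std.shape[0]
    S = (X_std.T @ X_std) / n

    # 3-4. Eigenvalues and eigenvectors; eigh is meant for symmetric matrices
    eigvals, eigvecs = np.linalg.eigh(S)
    order = np.argsort(eigvals)[::-1]            # sort by decreasing variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    # 5. Because S is symmetric, its eigenvectors form an orthonormal set
    assert np.allclose(eigvecs.T @ eigvecs, np.eye(3))

    # 6. Project onto the top direction (d-D -> 1D)
    X_proj = X_std @ eigvecs[:, :1]

    # 7. Fraction of the total variance retained by that projection
    print(eigvals[0] / eigvals.sum())
    ```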

  • @MuctaruKabba
    @MuctaruKabba 3 years ago

    Thank you sir. Brilliant methods and explanations to help a near beginner trying to grasp the nuance of eigenvalues and eigenvectors as applied in PCA and unsupervised machine learning. I'm not there yet, but I sure feel like I can and will after listening to you. Who needs machines ... lol?

  • @rishabhshirke1175
    @rishabhshirke1175 3 years ago +1

    7:57 Always true only if it's a symmetric matrix

  • @debanjandas4877
    @debanjandas4877 5 years ago +3

    Applied AI Course
    At 1:57 the covariance matrix is given as Cov of X (S) = Transpose(X).X, but in the previous video we saw Cov of X (S) = 1/n (Transpose(X).X). Why this difference?
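
    For what it's worth, the 1/n factor only rescales the eigenvalues and leaves the eigenvectors (the principal directions) unchanged, so the projection PCA produces is the same either way. A small check, with assumed toy data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 2))
    X = X - X.mean(axis=0)                     # column-centered data

    S_no_n = X.T @ X                           # Transpose(X).X
    S_with_n = (X.T @ X) / X.shape[0]          # 1/n (Transpose(X).X)

    w1, V1 = np.linalg.eigh(S_no_n)
    w2, V2 = np.linalg.eigh(S_with_n)

    # Eigenvalues differ exactly by the factor 1/n ...
    print(np.allclose(w1 / X.shape[0], w2))        # True
    # ... while the eigenvectors (up to sign) are identical,
    # so the PCA directions do not change.
    print(np.allclose(np.abs(V1), np.abs(V2)))     # True
    ```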

  • @msravya7694
    @msravya7694 6 years ago +1

    Sir, what is the purpose of using the covariance matrix?

    • @AppliedAICourse
      @AppliedAICourse  5 years ago +1

      The goal of PCA is to find projection directions along which the variance of the projected data points is maximum. If you write this goal in mathematical terms and simplify the resulting objective, a matrix appears in the resulting eigenvalue equation. This matrix is nothing but the covariance matrix. Hence, the covariance matrix naturally arises from the goal/objective of PCA. Please go through the link:
      www.visiondummy.com/2014/04/geometric-interpretation-covariance-matrix/
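
      A small numeric illustration of this point (the toy data and names below are my own assumptions): the variance of the data projected onto a unit vector u is u.T @ S @ u, and maximizing that quadratic form leads to the eigenvalue equation S v = lambda v, so no direction should beat the top eigenvector of the covariance matrix:

      ```python
      import numpy as np

      rng = np.random.default_rng(2)
      X = rng.multivariate_normal([0.0, 0.0], [[3.0, 1.0], [1.0, 1.0]], size=500)
      X = X - X.mean(axis=0)
      S = (X.T @ X) / X.shape[0]                 # covariance matrix

      # The maximizer of u.T @ S @ u over unit vectors is the top eigenvector of S
      eigvals, eigvecs = np.linalg.eigh(S)       # eigenvalues in ascending order
      v_top = eigvecs[:, -1]

      # Sanity check: no random unit direction gives higher projected variance
      U = rng.normal(size=(1000, 2))
      U /= np.linalg.norm(U, axis=1, keepdims=True)
      random_vars = np.einsum('ij,jk,ik->i', U, S, U)
      print((v_top @ S @ v_top) >= random_vars.max())   # True
      ```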

  • @akhilkrishna8521
    @akhilkrishna8521 4 years ago

    Sir, why can't we take v2 as our vector having maximum variance in a particular direction?

    • @AppliedAICourse
      @AppliedAICourse  4 years ago +1

      V1 has higher variance than V2, so if you were to pick a single dimension with maximum variance, you would pick V1 and not V2.
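
      As a quick sketch of that choice (toy data assumed): the eigenvalue attached to each eigenvector is exactly the variance of the data projected onto it, so picking the single best direction means picking the eigenvector with the largest eigenvalue:

      ```python
      import numpy as np

      rng = np.random.default_rng(3)
      X = rng.multivariate_normal([0.0, 0.0], [[4.0, 0.0], [0.0, 1.0]], size=1000)
      X = X - X.mean(axis=0)
      S = (X.T @ X) / X.shape[0]

      eigvals, eigvecs = np.linalg.eigh(S)       # ascending eigenvalues
      v1, v2 = eigvecs[:, -1], eigvecs[:, -2]    # largest, second-largest

      # Projected variance along each eigenvector equals its eigenvalue
      print(np.var(X @ v1), eigvals[-1])         # ~4: keep this direction (v1)
      print(np.var(X @ v2), eigvals[-2])         # ~1: dropped when going to 1D
      ```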

  • @gurudakshin6420
    @gurudakshin6420 5 years ago +1

    I didn't understand anything from this. Do I need to watch any prior videos related to PCA in order to understand this video's concepts?

    • @AppliedAICourse
      @AppliedAICourse  5 years ago

      Yes, these lectures are in a sequence. You need to first watch the videos on linear algebra and then watch the PCA videos for a clear understanding. You can watch these videos in order on our website, as these are our sample/free videos. Please log in to AppliedAICourse.com and check out the sample videos, which are in a clean order starting from the basics of Python.

  • @prativadas4794
    @prativadas4794 5 years ago

    Can we get the answer for why u1 = v1?

    • @questforprogramming
      @questforprogramming 5 years ago

      Check the optimization videos... which are paid, on their website...