Principal Components Analysis Using R - P2

  • Published: 18 Sep 2024

Comments • 35

  • @timerwentoff · 11 years ago

    Thank You. Certainly one of the better tutorials on PCA here.

  • @sternblume · 11 years ago

    This is my very first introduction to PCA, which I need to learn by myself for my thesis. I found it extremely useful!! Very well explained, thank you so much :-)

  • @toletasah · 10 years ago

    very useful breakdowns, part I and II, rare and clear explanations.

  • @danielalejandrocordovadela657 · 3 years ago

    Amazing video, I was not sure about the relationship between scores and loading vectors, and finally, I found this video which corroborates my idea. Thanks a lot

  • @ofdomejean · 9 years ago

    Very good explanation. Now I understand PCA. Thank you!!!

  • @meshackamimo1945 · 10 years ago

    Prof,
    I also find your examples quite easy to follow...
    Kindly do a talk on Markov chain Monte Carlo in r....mcmc...
    And if u oblige, the kalman filters....in r.
    Once again, thank u for the intuitive/ingenious postings on R.
    God bless u.

  • @careyduryea6010 · 9 years ago

    Thanks Steve!!! I love that you use R!

  • @StevePittard · 12 years ago

    I recommend using prcomp instead of princomp - If you look in the help pages for princomp you will see the following: "The calculation is done using eigen on the correlation or covariance matrix, as determined by cor. This is done for compatibility with the S-PLUS result. A preferred method of calculation is to use svd on x, as is done in prcomp." Hence my decision to focus on prcomp.
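The equivalence the author describes can be checked directly. This is a minimal sketch (not from the video), using the built-in iris data as a stand-in: the two functions produce the same components, and their reported standard deviations differ only by the divisor convention (princomp uses n, prcomp uses n-1).

```r
# Compare prcomp (SVD-based, preferred) with princomp (eigen-based)
data(iris)
x <- iris[, 1:4]

p1 <- prcomp(x)      # SVD on the centred data; variance divisor n - 1
p2 <- princomp(x)    # eigen on the covariance matrix; variance divisor n

# The component standard deviations differ only by sqrt((n-1)/n)
n <- nrow(x)
all.equal(as.numeric(p2$sdev), p1$sdev * sqrt((n - 1) / n),
          check.attributes = FALSE)
```

With `scale. = TRUE` / `cor = TRUE` the correlation matrix is used instead, and the same correspondence holds.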

  • @bubiloo11 · 11 years ago

    Hello,
    Fantastic explanation of PCA, appreciate it...
    Babak

  • @abdourakibmama4274 · 4 months ago

    Very interesting ❤

  • @ssniegula81 · 12 years ago

    good stuff; recommended to anyone interested in PCA

  • @StevePittard · 11 years ago

    Check the "more information" section for the URL

  • @Mrnka218 · 12 years ago

    very good video. recommended.

  • @corcho1956 · 9 years ago

    Great tutorial. However, I believe line 49 is incomplete. The calculation of the singular value "sd" should include an adjusting factor. The line should be written as follows:
    sd = sqrt(my.eigen$values)*sqrt(nrow(my.scaled.classes)-1). The Covariance approach uses this factor, which has to be removed when calculating the singular values.
    I used the Singular Value Decomposition approach to double check the singular values and found the discrepancy in the results.
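The relationship this comment relies on can be verified with a small sketch (my own illustration, with random data): for a centred matrix X, the singular values of X equal the square roots of the covariance-matrix eigenvalues multiplied by sqrt(n - 1), because cov(X) = X'X / (n - 1).

```r
# Singular values vs. covariance eigenvalues for a centred matrix
set.seed(1)
X <- scale(matrix(rnorm(100), 20, 5), center = TRUE, scale = FALSE)

sv  <- svd(X)$d                         # singular values of X
ev  <- eigen(cov(X))$values             # eigenvalues of the covariance matrix
sv2 <- sqrt(ev * (nrow(X) - 1))         # reconstructed singular values

all.equal(sv, sv2)                      # numerically identical
```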

  • @Actanonverba01 · 4 years ago

    Excellent breakdown of PCA, Thanks
    Are you going to be doing any more?

  • @alkobut6642 · 9 years ago

    Great video! My question concerns the slope of pc's. For two variables the slope is calculated thus: pc1.slope = my.eigen$vectors[1,1]/my.eigen$vectors[2,1]. How do you calculate the slope when there are more than two variables and therefore more eigenvectors (for instance, three or four)? Thanks!
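One way to think about this question (my own sketch, not from the video): with more than two variables a principal component no longer has a single slope. The eigenvector is a direction in p-dimensional space, and a slope is only defined once you pick a pair of variables to project onto.

```r
# With p > 2 variables, a PC is a direction vector; slopes are pairwise.
# For variable i on the x axis and variable j on the y axis, the slope of
# PC k in that 2-D plane is vectors[j, k] / vectors[i, k].
set.seed(1)
X <- matrix(rnorm(90), 30, 3)      # three variables, so three eigenvectors
v <- eigen(cov(X))$vectors

slope_12 <- v[2, 1] / v[1, 1]      # PC1 viewed in the var1-var2 plane
slope_13 <- v[3, 1] / v[1, 1]      # PC1 viewed in the var1-var3 plane
```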

  • @ncamposa · 12 years ago

    Thanks for the video.
    The question is: how do you do a PCA when more than two variables are involved, without using prcomp or princomp?
    For example using chemical analysis of water or soil samples including SiO2, Al2O3, Fe2O3, MgO, MnO, etc.
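The manual eigen-decomposition approach from the video generalises to any number of variables. A minimal sketch (my own, with random data standing in for the oxide measurements): centre the data, decompose the covariance matrix, and project.

```r
# Manual PCA for p variables without prcomp/princomp
set.seed(1)
X  <- matrix(rnorm(50 * 4), 50, 4)          # stand-in for e.g. SiO2, Al2O3, ...
Xc <- scale(X, center = TRUE, scale = FALSE)

e      <- eigen(cov(Xc))
scores <- Xc %*% e$vectors                  # one score column per component
round(e$values / sum(e$values), 3)          # proportion of variance explained
```

Scaling each column as well (`scale = TRUE`) is usually advisable when the variables are in different units, as oxide percentages and trace concentrations would be.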

  • @JosefHabdank · 10 years ago

    sad that I can only click thumbs up once. Great videos!

  • @ehsannajafi635 · 7 years ago

    Thank you, clear and well explained!

  • @PhucNguyen-fx1vg · 8 years ago

    Thank you for the great tutorial! So if there were data on characteristics of these students, could PCA be used to find out which characteristics contribute most to PC2?
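The answer to this question is yes: the loadings (the rotation matrix) show how much each variable contributes to each component. A minimal sketch (my own, with iris standing in for a student dataset):

```r
# The loadings column for PC2 shows each variable's contribution;
# the largest absolute value contributes most.
data(iris)
p <- prcomp(iris[, 1:4], scale. = TRUE)
round(p$rotation[, "PC2"], 2)
```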

  • @JermanJackass · 9 years ago

    That guy has 2542 unread emails... Must be a professor

  • @shaoxinluan3391 · 7 years ago

    Prof,
    I have found that your "pc1.slope" is calculated as "my.eigen$vectors[1,1]/my.eigen$vectors[2,1]". Why is the slope calculated as such? Should the slope be the inverse of the value you used (i.e., my.eigen$vectors[2,1]/my.eigen$vectors[1,1])? Thanks in advance.

  • @ysiquita65 · 12 years ago

    Thank you for all the videos!!! Could you upload a video on LDA? It would be so helpful!!!!

  • @m.noelserra5551 · 8 years ago

    Hi. Just one question: how can I change the x axis of a DCA? Thanks!

  • @therimalaya · 11 years ago

    Is it the same to run prcomp on a "correlation matrix for a dataset" as on the "raw dataset with scale and center both TRUE"?
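One note on this question (my own sketch): prcomp expects raw data, not a correlation matrix, so the two inputs are not interchangeable. The connection is that the squared sdev values from prcomp on centred, scaled data are exactly the eigenvalues of the correlation matrix.

```r
# prcomp on centred/scaled raw data vs. eigen() on the correlation matrix
data(iris)
x <- iris[, 1:4]

p <- prcomp(x, center = TRUE, scale. = TRUE)
all.equal(p$sdev^2, eigen(cor(x))$values)   # numerically identical
```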

  • @meshackamimo1945 · 10 years ago

    Thanks for great tutorial, prof.
    In part 1, there's a command I didn't follow....
    Text(xy, 12, 10)...........I don't understand the origin of 12 and 10....
    Kindly explain.
    Otherwise, I don't have much problems following the commands in this part 2!
    God bless.

  • @alejandrogonzaleztrevino8990 · 10 years ago

    "lengthlwd" is not a graphical parameter

  • @ysiquita65 · 12 years ago

    this video really helped me .. thank you !!!

  • @RefUser · 12 years ago

    great video. could you demo the princomp R function please?

  • @jesbuddy07 · 10 years ago

    OH MY GOD!!! THANK YOU!!!

  • @DoiLamung · 8 years ago

    Thank you very much

  • @eak092 · 11 years ago

    Thanks a lot.

  • @abdourakibmama4274 · 4 months ago

    Can we get the scripts please?

  • @ferniize · 11 years ago

    Hi. Where can I access the code? Thanks .

  • @abdullahallah6231 · 3 years ago

    Amazing! Can you provide me the code?