Dude, don't stop making videos like this! You are a very talented teacher! Your explanations are extremely useful! Thanks a lot!!!
Why do we need the covariance of x? (8:20)
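An editor's note on this question (a sketch, not from the video): the covariance matrix appears because the variance of the data projected onto any unit direction w is exactly the quadratic form w' C w, so "find the direction of maximum variance" becomes an eigenvalue problem on C. A NumPy check of that identity:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4)) @ rng.normal(size=(4, 4))  # correlated data

Xc = X - X.mean(axis=0)                  # center the data
C = np.cov(Xc, rowvar=False)             # covariance matrix

w = rng.normal(size=4)
w /= np.linalg.norm(w)                   # a unit direction

# Variance of the data projected onto w equals w' C w
proj_var = np.var(Xc @ w, ddof=1)
print(np.allclose(proj_var, w @ C @ w))  # True
```

Maximizing w' C w over unit vectors w is solved by the top eigenvector of C, which is why the tutorial diagonalizes the covariance matrix.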
The clearest explanation of PCA I've found! Thank you Yarpiz
Thanks Mostapha. The video was quite helpful. I have a short question: when I use the MATLAB built-in command [vec, scores] = pca(X), the values of scores are different from the values of the variable z in your code. Aren't they supposed to be the same? Isn't scores from the MATLAB function the projection of the data onto each component? I would appreciate it if you could respond to my question. Cheers.
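An editor's sketch of the likely answer, since this question recurs: MATLAB's built-in pca() centers the data before projecting (and the sign of each eigenvector column is arbitrary), so its scores can differ from a manual projection of the raw data by a constant offset and/or per-column sign flips. The centering part, illustrated in NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) + 5.0   # data with a nonzero mean

# Manual PCA, as in the video: eigenvectors of the covariance matrix
Xc = X - X.mean(axis=0)               # MATLAB's pca() centers the data first
C = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]     # sort by descending variance
coeff = eigvecs[:, order]

scores_centered = Xc @ coeff          # what pca() returns as `score`
scores_raw = X @ coeff                # projecting the *uncentered* data

# The two differ only by a constant shift: the projected mean
offset = X.mean(axis=0) @ coeff
print(np.allclose(scores_raw - offset, scores_centered))  # True
```

So if z in the tutorial was computed from uncentered data, or with eigenvectors of opposite sign, it will not match scores element-for-element even though both are valid PCA projections.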
Thank you very much for the tutorial on PCA.
Great video! What I missed was you comparing the clusters at the end of the video (the pca() function) and finding which component separates the two clusters, so that you could track it over time and identify the exact moment the clusters switch.
This video is very interesting, but I don't know how to get digit data.csv, or what the contents of digit data.csv are. Thank you.
Excellent tutorial! If we write pca = PCA(0.95) in Python, then 95% of the variance is retained. How can we do the same thing in MATLAB? I don't want to specify the number of components; I want to fix the retained variance instead.
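A possible answer (editor's sketch): MATLAB's pca returns the per-component variances as its third output, latent, so you can pick the smallest m whose cumulative share of sum(latent) reaches 0.95 and keep score(:, 1:m). The same selection rule in NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5)) @ rng.normal(size=(5, 5))  # correlated features

Xc = X - X.mean(axis=0)
# Eigenvalues of the covariance matrix, sorted descending
eigvals = np.linalg.eigvalsh(np.cov(Xc, rowvar=False))[::-1]

ratio = np.cumsum(eigvals) / np.sum(eigvals)
m = int(np.searchsorted(ratio, 0.95) + 1)  # smallest m with >= 95% variance
print(m, ratio[m - 1])
```

In MATLAB the same idea is roughly: r = cumsum(latent)/sum(latent); m = find(r >= 0.95, 1);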
Thank you for the great tutorial. How can I plot my PCA in 3d in MATLAB (do you have the code)?
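Editor's sketch of an answer: in MATLAB, once you have the projected data z with at least three columns, scatter3(z(:,1), z(:,2), z(:,3)) gives the 3-D plot. Computing those first three scores, shown here in NumPy for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(150, 6))

Xc = X - X.mean(axis=0)
# SVD of the centered data: rows of Vt are the principal axes
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:3].T        # scores on the first three components
print(Z.shape)           # (150, 3)
```

In Python the analogous plot is matplotlib's 3-D axes (add_subplot(projection='3d') plus scatter).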
Hello Mostapha,
Thanks for the video.
Why do we have three colours in the figure, even though m = 2?
Which two of them are the principal components?
Thank you in anticipation of your reply.
Two video requests:
1. Use of PCA to analyse/reduce a dataset for regression (numerical variables only) instead of iris/digits (classification).
2. Biplot-based interpretation/analysis of PCA/SVD.
Hey, this was an amazing video with really clear explanations. However, around 32:14, you mix up the terms eigenvalues and eigenvectors. Please correct me if I'm wrong! :)
What an explanation! The concept is explained thoroughly and neatly.
Thank you very much..
How can I rotate principal components in MATLAB?
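A sketch of one possible answer (editorial, not from the video): MATLAB's Statistics and Machine Learning Toolbox has rotatefactors for this, with varimax as the default method. The classic SVD-based varimax iteration is short enough to sketch in NumPy; the helper name varimax below is made up for illustration:

```python
import numpy as np

def varimax(A, n_iter=100):
    """Orthogonal varimax rotation of a loadings matrix A (p x k)."""
    p, k = A.shape
    R = np.eye(k)
    for _ in range(n_iter):
        L = A @ R
        # SVD-based update of the rotation (Kaiser's varimax criterion)
        u, _, vt = np.linalg.svd(
            A.T @ (L**3 - L @ np.diag((L**2).sum(axis=0)) / p)
        )
        R = u @ vt
    return A @ R

# Rotate the first two principal axes of some random data
rng = np.random.default_rng(4)
X = rng.normal(size=(100, 4))
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
rotated = varimax(Vt[:2].T)          # 4 x 2 rotated loadings

# The rotation is orthogonal, so the columns stay orthonormal
print(np.allclose(rotated.T @ rotated, np.eye(2)))  # True
```

In MATLAB the one-liner equivalent would be rotatefactors(coeff(:, 1:2)).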
Really, it is amazing!
Your lesson helped me a lot. Thank you, sir.
Sir, please upload videos on the economic dispatch problem.
Hello, can you show how to do curvilinear component analysis in MATLAB?
57:30 Sorry, but can you tell me why we look for the z matrix and then deduce the PCA y matrix from there? I read the theory and only understood the step of finding the covariance matrix and the eig command. The theory just says: 'find the image of the matrix A^T. X^ of vector X^....' I don't understand the parts after that. I'm not good at English and have to use Google Translate, so it's really hard for me. I hope you answer soon; this PCA is homework my team needs to earn points for the year. :(
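An editor's sketch of what happens at this step: z holds the coordinates of each (centered) point in the basis of the chosen eigenvectors W, and y maps those coordinates back into the original space, y = W z + mean, so the projection can be viewed as ordinary data again. With all components kept, the reconstruction is exact:

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(50, 3))
mu = X.mean(axis=0)
Xc = X - mu

# W's columns are the eigenvectors of the covariance matrix
_, W = np.linalg.eigh(np.cov(Xc, rowvar=False))

Z = Xc @ W                  # coordinates in the eigenvector basis
Y = Z @ W.T + mu            # map back to the original space

print(np.allclose(Y, X))    # True: exact with all components kept
```

Keeping only the top m columns of W makes Y an approximation of X, which is the dimensionality reduction the video builds toward.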
Welcome back Yarpiz, could you please do more videos about deep learning?
Thanks a lot for your video man, it helped me a lot. Press F for respect.
8:03 It should be capital X-bar, not lowercase x.
great video, thanks
What is the g value, and why do we have to calculate it?
Sorry, I meant z, not g.
Thanks a lot, man! Really well explained.
Thank you, sir. This video helped a lot.
This video really helps me, thanks a lot :)
Hey man, I really like your videos, so informative and nice :)
Please send me the program listing.