Many thanks to you, Dr. God bless you.
You are most welcome!
This is the best PCA explanation I have seen anywhere so far. Thank you for sharing your knowledge.
Thanks for the feedback!
I revisited your video for interpretation of biplots in PCA. Many thanks.
You are welcome!
Thank you so much Professor🙏
You are very welcome!
Awesome video. Every R enthusiast needs to keep an eye on your channel. Thank you and keep up with great work!
+Model Michael thanks👍
Sir,
Can we get the code file?
The biplot was explained very clearly, thank you Dr. Rai!
You are welcome!
Awesome Explanation
Make sure you run the following before installing:
library(devtools)
To install ggbiplot, the code is now (17, Jan, 2020):
library(devtools)
install_github("vqv/ggbiplot")
source: github.com/vqv/ggbiplot
Excellent video, and these concepts are explained well. Thanks.
Thanks for the update!
Thanks a lot, Sir, for your nice presentation. You saved me a lot of time. Earlier I used your R code on Kohonen NN, and now for PCA, in my training lectures. Your explanation is so lucid. I appreciate your noble service of sharing knowledge.
You are most welcome!
Thank you!! Best explanation of the biplot on RUclips.
Glad it was helpful!
Thank you for this extremely helpful and easily understood tutorial, particularly the clear interpretation of the biplot. Much appreciated.
You're very welcome!
Great video! Excellent walk-through of PCA and how it can be useful for actual classification. Thanks for the upload.
+theeoddname thanks for the feedback!
Great Explanation....
Thanks!
This is great. I was looking for PCA and you have done it. Many many thanks to you sir.
One really good video I have found. After watching a few of your videos, they are becoming a "turn to" when required. Thanks.
Glad to hear that!
I really like your explanations in your videos. Keep them coming! Thanks
Thanks for the feedback!
Fantastic session. Perfectly understood the biplot.
Thanks for comments!
One of the best PCA videos I have ever seen. Thank you, Mr. Rai.
Thanks for comments!
Thanks for the video! It helped me a lot in forecasting future values using PCA.
Very welcome!
You are too good sir. An absolute treat for ML enthusiasts.
Thanks for your comments!
The R PCA package is very good and very helpful.
Yes, I agree!
Fabulous work on PCA! Keep it up.
Thanks for the feedback!
Thank you for this amazing video. Better than my university lectures
Thanks for comments!
Really, really great explanation, sir. Thank you so much for making it so simple.
Thanks for comments!
Thank you so much Dr. Rai. Detailed teaching
Thanks for comments!
This video is worth its weight in gold
Your videos have been constant companions during the last months of my master's thesis. It seemed as if every time I had to switch to another analysis technique you were already waiting here. So thank you a lot for your guidance and clear explanations!
The only thing I would appreciate is if you could provide the basic R scripts. Even though the copying process might help with understanding each command through step-by-step application, typing text from a tiny YouTube screen shown in one half of my monitor into RStudio in the other half is troublesome. Thanks!
Thanks for the feedback!
Seriously awesome explanations! Thank you again.
Thanks!
Thank you for the material. It is very clear and actually very relevant to my current work.
As I understand it, the conversion of the data amounts to summing the products of the normalized predictors and the loadings.
Maybe you would have time to post a PLS regression video, please? The intriguing part is the explanation of the model itself.
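To make that concrete, here is a minimal sketch, assuming (as in the video) that pc was built with prcomp(training[, -5], center = TRUE, scale. = TRUE): the scores in pc$x are exactly the scaled predictors multiplied by the loading matrix.
# Minimal sketch; assumes pc <- prcomp(training[, -5], center = TRUE, scale. = TRUE)
z <- scale(training[, -5])                          # normalized (centered and scaled) predictors
scores <- z %*% pc$rotation                         # sums of products of normalized predictors and loadings
all.equal(scores, pc$x, check.attributes = FALSE)   # TRUE (up to floating-point tolerance)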
Wonderful job explaining the material.
Thanks for your comments and finding it useful!
Very useful video, sir. Could you explain why we need to partition the data into training and testing sets?
You may review this:
ruclips.net/video/aS1O8EiGLdg/видео.html
@@bkrai thank you sir.
Great video! Thanks for sharing your knowledge.
Thanks for comments!
your videos are great :)
Thank you!
Thank you. Learned a lot from your channel
Thanks!
Thank you Dr. Bharatendra Rai for explaining PCA in detail. Can you please explain how to find the weights of the variables with PCA for making a composite index? Are they the rotation values for PC1, PC2, etc.? For example, if I have I = w1*X + w2*Y + w3*Z, how do I find w1, w2, w3 with PCA?
For calculations you can refer to any textbook.
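As a rough sketch of the idea (the variables X, Y, Z and data frame dat below are hypothetical, not from the video): the loadings in the rotation matrix can serve as the weights, with the first column giving w1, w2, w3 for an index based on PC1 of the standardized variables.
# Hypothetical example: PC1 loadings as weights for a composite index
dat <- data.frame(X = rnorm(100), Y = rnorm(100), Z = rnorm(100))
p   <- prcomp(dat, center = TRUE, scale. = TRUE)
w   <- p$rotation[, 1]            # w1, w2, w3 = loadings of X, Y, Z on PC1
idx <- scale(dat) %*% w           # composite index I = w1*X + w2*Y + w3*Z (standardized scale)
head(idx)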
Very informative and nice presentation, sir. Sir, can we estimate PCA for a factor (e.g., species) with an unequal number of observations?
And if we want to see the correlations for each species, i.e., for setosa or the other two, how do we do it? Please explain. Thank you.
sir, please make a session on factor analysis with prediction
Thanks for the suggestion!
great lecture..please share your thoughts on machine learning introduction too
For machine learning methods such as random forest, neural networks, support vector machines, and extreme gradient boosting, you can refer to the following:
ruclips.net/p/PL34t5iLfZddu8M0jd7pjSVUjvjBOBdYZ1
Too good!! Please make more such videos, please!
Thanks for comments! You may find this useful too:
ruclips.net/p/PL34t5iLfZddu8M0jd7pjSVUjvjBOBdYZ1
Thank you for sharing. I get the error "Error in plot_label(p = p, data = plot.data, label = label, label.label = label.label, : Unsupported class: prcomp" when I try to run ggbiplot. Would you please advise how to fix it?
Awesome video, thank you. As time permits, can you do a video on using the caret package? Thank you.
Saw this today. Thanks for comments!
19:12 It is only to show another way to get the principal components for the training data, because:
identical(pc$x, predict(pc, training)) gives TRUE, meaning that pc$x is the same as predict(pc, training).
That's correct!
Thank you, this video will be really helpful to complete my thesis :)
Good luck!
Good evening.
If you want to show the first dimension (Dim1) and the third dimension (Dim3),
what should you do? Or could you provide the code for that?
Thanks
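No code was posted in reply, but as a hedged sketch: ggbiplot has a choices argument for selecting which components to plot, so with the pc and training objects from the video something like this should show PC1 against PC3 (grouping by Species is an assumption here).
library(ggbiplot)
# choices picks the components for the x- and y-axes
ggbiplot(pc, choices = c(1, 3), groups = training$Species, ellipse = TRUE)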
Thanks, sir. Why does this video use linear regression? Can I use k-means to cluster on PC1 and PC2?
Which line are you referring to?
Sorry, I mean the logistic regression in line 59.
Great video. Do you have a suggested package for running binary logistic regression? From a brief scan of nnet it appears to only have arguments for multinomial response variables. Thank you.
You can refer to this:
ruclips.net/video/AVx7Wc1CQ7Y/видео.html
@@bkrai sorry I was unclear in my message. I was hoping for a suggested package to run a binary logistic regression using PCA components as predictors - similar to what you have done here with multinomial. Any suggestions are welcome.
Yes, you can use the PCA components as predictors and run binary logistic regression as shown in the link that I sent earlier.
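For a two-class problem, a minimal sketch with base R's glm could look like this (the data frame trg2 and response column y are hypothetical stand-ins for a score data frame built the same way trg is built in the video):
# Sketch: binary logistic regression on principal component scores
# trg2 is assumed to hold PC1, PC2, ... plus a two-level factor y
fit <- glm(y ~ PC1 + PC2, data = trg2, family = binomial)
summary(fit)
p    <- predict(fit, newdata = trg2, type = "response")           # predicted probabilities
pred <- ifelse(p > 0.5, levels(trg2$y)[2], levels(trg2$y)[1])     # class with probability > 0.5
table(Predicted = pred, Actual = trg2$y)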
Great video. What if we want to include a factor like "Control and Heat" for genotypes? Please suggest.
It should work fine.
Thank You - this was extremely useful.
Very nice channel you have here - easy sub.
Thanks for comments!
Thanks for this video, sir, very good class, but I can't get it to work because of the error "could not find function 'ggbiplot'". Excuse me, which R version are you using?
Try this:
library(devtools)
install_github("vqv/ggbiplot")
Can you please help with a combined PCA and ANN model?
I'm adding it to the list of future videos.
If I just use addEllipses =TRUE, what determines the size of those ellipses? Also, if I specify ellipse.type = “confidence”, what confidence level is used to generate the ellipses? I used factoextra if that helps.
Thank you for this nice video, Dr. Rai.
I have a doubt: why was the predict function used multiple times? After the prcomp function, all the principal component data were already available in:
pc$x.
Why do we have to do:
trg
In R you can get the same thing in multiple ways. This is just for illustration.
@@bkrai Thank you Sir. That makes it clear.
@@abhishek894 You are welcome!
Great lecture. Thanks.
Thanks!
Awesome video! Could you plz add Partial least squares regression and principal components regression to your playlist! That would be of great help. Thanks in advance!
Thanks for suggestions!
Nice video and very helpful. I have challenges while installing the ggbiplot and nnet packages (I am using R version 3.6.3). Please, any advice on how to overcome this?
OK, the nnet package was successfully installed, but I am still struggling with ggbiplot (despite using your code). Thanks.
Do you have a video on PCA for unsupervised learning via clustering and similarity ranking?
not yet.
Excellent demonstration of PCA, really helpful. I just don't understand why, for the pc object, you use only the training data instead of the entire dataset.
We only use the training data so that we can later use the test data to assess the prediction model.
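In code terms, the workflow looks roughly like this (a sketch reconstructed from the comments; the exact lines in the video may differ slightly):
# Fit PCA on the training data only ...
pc  <- prcomp(training[, -5], center = TRUE, scale. = TRUE)
# ... then apply the same centering, scaling, and rotation to both sets via predict()
trg <- data.frame(predict(pc, training), training[5])   # training scores + Species
tst <- data.frame(predict(pc, testing),  testing[5])    # test scores + Species, using training's PCA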
Hi Sir, could you do one session on SVD in R, along with some theoretical explanation of it? I'm finding it very difficult to understand with most of the material available on the net.
Brilliant, sir. Simple and sweet, thanks. Nice music. If I have 10 discrete variables, how do I reduce them to 2 or 3 components? Please explain.
Thanks for comments! Note that this method is only for numeric variables.
Please add a video on non-negative matrix factorization, like intNMF.
Thanks, I've added it to my list of future videos.
Can you upload a video describing independent component analysis in R
I've added it to my list.
Hi Dr., how do I use PCA to generate a score based on several variables? Regards
Can you please show the backpropagation algorithm in R?
Refer to this:
ruclips.net/video/-Vs9Vae2KI0/видео.html
Sir, can I use the Boruta function instead of PCA in R?
Yes certainly. Here is the link:
ruclips.net/video/VEBax2WMbEA/видео.html
@@bkrai Sir, which do you prefer between R and Python? I find R code easier to understand and write.
In universities, business students usually use R and computer science students mostly use Python. If you are mainly looking to apply various machine learning and statistical methodologies, R is perfect.
Great video, thanks for uploading.
Thanks for comments!
Is there any alternative package to ggbiplot?
Try this for a biplot (I just ran this in RStudio Cloud, and it worked fine):
library(devtools)
install_github("fawda123/ggord")
library(ggord)
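A quick usage sketch, assuming the pc and training objects from the video:
ggord(pc, training$Species)   # biplot from the prcomp object, grouped by Species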
Sir, why have you predicted the training and test data with respect to the PCs? Can we use the trg data for making a neural network model, test it using the tst data set, and find the correlation between actual and predicted values?
When there are many variables, the chance of a multicollinearity problem increases, and PCA helps to solve that problem. And yes, you can use a neural network model.
@@bkrai Sir, can you please explain the significance of the lines under the heading "prediction with principal components"? I am unable to understand why we are predicting twice on the test data set. Please explain, sir.
To avoid over-fitting, where you get a very good result from the training data but not from the testing data.
Orthogonality of principal components - 10:17
Thx
thank you for the amazing video!
Thanks for comments!
Hello. I don't know anything about Principal Component Analysis in R: Example with Predictive Model & Biplot Interpretation, and I will never need to, since that's not in my line of work. I appreciate your intro music though. You are a true champ, Bharatendra, and enrich this world with your presence. Also, that intro music fucking slaps.
Thanks for comments!
Awesome explanation, sir...👍👍 Can you make a video on independent component analysis using R in the same way, sir?
Thanks, I've added it to my list.
Sir, can you please suggest how I can perform PCA on my Panel Data? -Regards
Great work! Thank you
Sir, I am doing a PCA analysis on the DJ 30 stocks, and when I view pca$loadings for 30 variables, I noticed that some were not displayed. For example, Component 1 shows -0.218 for Apple but shows nothing for JPM. What does this mean?
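No reply was posted here, but for context: if pca comes from princomp() (or another function returning a loadings object), the print method simply blanks loadings below a cutoff (0.1 by default) for readability; the values are still there. A sketch:
print(pca$loadings, cutoff = 0, digits = 3)   # show all loadings, including small ones
unclass(pca$loadings)                         # or inspect the raw matrix directly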
Firstly, thank you for your helpful video. I have a problem adding the ellipse to the plot. I have 30 variables; the first 29 are numeric and the last one is a factor variable. But I can't plot the ellipse in the PCA plot. How can I solve this? Please help.
Thanks for the video.
Please publish a video on Exploratory Factor Analysis and Confirmatory Factor Analysis applied in a model.
Also, please explain the difference from PCA.
Thanks for the suggestion, I've added this to my list.
Can a dataset consisting of the principal components and the target variable be used to perform machine learning techniques?
Yes, this video shows an example of doing it.
It was a fruitful video. Can you please share the code?
Hi, I want to know where I can get the iris example data? Thank you!
It's built into R itself. You can access it by running the first 3 lines shown in the video.
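For example (iris ships with base R's datasets package):
data("iris")      # load the built-in iris data
str(iris)         # 150 obs. of 5 variables: 4 numeric measurements plus the Species factor
summary(iris)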
Awesome video, sir. Kudos. :)
One doubt though: at 20:48, why are we using only 2 components? How do we know how many principal components to use? (Species ~ PC1 + PC2)
2 PCs capture more than 95% of the variability in the data; the other 2 only add about 5%. So you can choose to keep enough PCs to capture over 80% or 90% of the variability.
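A sketch of how to read that off in R, assuming the pc object from the video:
summary(pc)                              # see the "Cumulative Proportion" row
prop <- pc$sdev^2 / sum(pc$sdev^2)       # proportion of variance per component
cumsum(prop)                             # cumulative proportion
which(cumsum(prop) >= 0.90)[1]           # smallest number of PCs capturing at least 90%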
Great job, same as always. Can I use PCA for 2 or more categorical variables? Can I define those variables as 0 and 1 in PCA?
You can only use numeric variables. You can try coding them as 0 and 1 and see if it works OK.
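If you do try it, one common approach (not shown in the video, so treat this as an assumption on my part) is to expand the factors into 0/1 dummy columns first, for example with model.matrix; whether that is statistically sensible depends on your data.
# Hypothetical sketch: df is a data frame with numeric columns plus factor columns
X <- model.matrix(~ . - 1, data = df)              # factors become 0/1 indicator columns
pca_d <- prcomp(X, center = TRUE, scale. = TRUE)   # note: scale. errors if any column is constant
summary(pca_d)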
How do I know the exact names of the variables after doing PCA, like they were before?
Each PC is a combination of all variables, and all variables retain their original names.
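You can see this directly in the loadings, assuming the pc object from the video:
pc$rotation                      # rows keep the original variable names; columns are PC1, PC2, ...
round(pc$rotation[, 1:2], 3)     # contribution of each original variable to PC1 and PC2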
Thank you
Welcome!
Sir, my data is showing [ reached getOption("max.print") -- omitted 10 rows ]. The last 10 rows are omitted; how do I fix this, please?
That's just how much gets printed. But all data still remains intact.
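If you do want everything to print, the limit is a base R option (not specific to PCA); a sketch:
getOption("max.print")           # current limit on the number of printed entries
options(max.print = 1000000)     # raise the limit so the remaining rows also print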
Scatter plot and correlation - 2:04
Thx
ggbiplot is not getting installed when I tried the way shown in the video; please advise how to install it.
You can try this:
library(devtools)
install_github("vqv/ggbiplot")
@@bkrai
Try this:
install.packages("remotes")
remotes::install_github("vqv/ggbiplot")
It will help.
Thanks!
Cool video! Can you do a video about Multiple Correspondence Analysis (MCA) for qualitative data? It would help me a lot.
Thanks, I've added this to my list.
Thanks, sir. Can you please tell me how to start learning R from the beginning?
You can start with this playlist:
ruclips.net/p/PL34t5iLfZddv8tJkZboegN6tmyh2-zr_T
Dear Respected Sir,
I wanted to install ggbiplot using the command you provided, but it gives me another message. The message is: (Installation failed: SSL certificate problem: self signed certificate in certificate chain
Warning message:
Username parameter is deprecated. Please use vqv/ggbiplot). I used vqv/ggbiplot as well, but with no good results.
Please guide me; what shall I do?
Not sure what went wrong; maybe a typo or something else. You can probably try running the commands using my R file.
Dr. Rai,
Thanks for this informative video. I am having a problem getting the predict function to work with the model created on the training dataset. I am getting two errors (paraphrased): 1. NAs not allowed in subscripted assignments; 2. newdata has 1900 rows but variables found have 8100 rows. I think it is looking for the same number of rows as in the test dataset. Is there something I am doing wrong? I appreciate any feedback.
NAs occur when there is missing data. For handling missing values, refer to:
ruclips.net/video/An7nPLJ0fsg/видео.html
Hi, good day Bharatendra. I want to replace one of my columns with the value 1 for all its elements. What is the code in RStudio? Thanks for your time.
Suppose you are using the following data:
data(iris)
To add what you indicated as a "new" column, you can use:
iris$new <- 1
Thanks for your answer. I already have a column with different values; I want to replace all the values in that column with just 1.
So for the iris data, if you want to change all values of the Sepal.Length variable to 1, you can use:
iris$Sepal.Length <- 1
Hello, you put training[5] to reference the column in the trg variable.
Shouldn't it be training[ , 5]?
It is training[ , 5] in the video.
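For what it is worth, both forms work here: training[5] keeps a one-column data frame (so the Species name is preserved), while training[ , 5] drops to a bare factor vector. A small sketch:
class(training[5])      # "data.frame" - one column, keeps the column name Species
class(training[, 5])    # "factor"     - the same values as a bare vector
# either can be passed to data.frame() alongside the PC scores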
Hello, great video as always! However, one question I had (even though you warned that the results are hard to interpret) relates to how to interpret the coefficients. If we look at the coefficient table and read the first line (after the intercept), does that mean that with every unit increase in Sepal.Length there is an increase of 14.05 in the log odds of categorizing the species as Versicolor, relative to Setosa? Thanks!
Your interpretation is correct.
Thank you! Keep up the good work! Your r videos are great!
Sir, ggbiplot is not installed, hence I can't work on this, though I followed the video thoroughly.
Great video. I have one doubt. What does the stddev attribute of pc contain? The variables are already scaled, so what does stddev represent? Thanks a lot.
At what point in time do you see this?
Bharatendra Rai, sorry, it's the sdev attribute of pc, and at 9:48, while showing the summary of pc, I would like to know what the standard deviation row denotes. Thanks a lot.
It is the standard deviation of each principal component. It helps to estimate what percentage of the variability is captured by each principal component.
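In code, the relationship is (assuming the pc object from the video):
pc$sdev                        # standard deviation of each principal component
pc$sdev^2                      # variance of each component (the eigenvalues)
pc$sdev^2 / sum(pc$sdev^2)     # share of total variability captured by each PC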
Bharatendra Rai thanks a lot. I understand this now
Thank you so much for this video. Will you please make a video on Broken-line regression in R?
Thanks for the suggestion, I've added this to my list.
I'm using Stata; are there any specific commands for principal component analysis (PCA) on panel data, or do I just run PCA after standardizing the variables?
I've not used stata, so difficult to say what command will be correct.
Dear Teacher, I can't install ggbiplot from GitHub; is there any other way to install it?
My R version is 3.6.0.
you can try this:
library(devtools)
install_github("vqv/ggbiplot")
Thanks, I also found other way to plot the PCA:
library(ggfortify)
autoplot(pc, data = training_set, colour = 'Species',
loadings = TRUE, loadings.colour = 'blue',
loadings.label = TRUE, loadings.label.size = 3)
Thanks for the update!
Dear Sir, thanks for a wonderful video. I have some questions.
1) At 20:18, why did you choose to reorder by setosa?
2) Why did you choose to use trg rather than training as the data to build mymodel, given that trg has predictions from training?
3) Can PCA be used to choose k in k-means? If so, how do I go about it?
Thanks again.
Regards
Interesting
thanks!
Hi Bharatendra, nice video. I have got a couple of queries. If there are a large number of numeric variables, and through PCA we find that they are highly correlated, then before going for model building:
1) Do we need to remove the highly correlated variables?
2) Which ones should we remove? Thanks
You don't need to remove them if you are using the components for developing a prediction model. This video provides a similar example.
thanks
Principal components are orthogonal to each other, saying differently they are uncorrelated and can be used as is in model building.
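A quick check, assuming the pc object from the video:
round(cor(pc$x), 4)   # correlations among PC scores are (numerically) zero, so no multicollinearity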
Thanks!