Principal Component Analysis in R: Example with Predictive Model & Biplot Interpretation

  • Published: 21 Aug 2024

Comments • 356

  • @shawnmckenzie8699
    @shawnmckenzie8699 4 years ago +1

    To install ggbiplot, the code is now (as of 17 Jan 2020):
    library(devtools)
    install_github("vqv/ggbiplot")
    source: github.com/vqv/ggbiplot
    Excellent video, and these concepts are well explained. Thanks.

    • @bkrai
      @bkrai  4 years ago +1

      Thanks for the update!
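
    A minimal sketch of the full install-and-load sequence from the comment above (ggbiplot lives only on GitHub, so devtools is assumed):
    library(devtools)                # provides install_github()
    install_github("vqv/ggbiplot")   # install ggbiplot from GitHub
    library(ggbiplot)                # load it before calling ggbiplot()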

  • @philipabraham5600
    @philipabraham5600 6 years ago +2

    This is the best PCA explanation I have seen anywhere so far. Thank you for sharing your knowledge.

    • @bkrai
      @bkrai  6 years ago

      Thanks for the feedback!

  • @ramram2utube
    @ramram2utube 2 years ago +1

    Thanks a lot, Sir, for your nice presentation. You saved me a lot of time. Earlier I used your R code on Kohonen NN, and now for PCA, for my training lectures. Your explanation is so lucid. I appreciate your noble service of sharing knowledge.

    • @bkrai
      @bkrai  2 years ago

      You are most welcome!

  • @ramram2utube
    @ramram2utube 9 months ago +1

    I revisited your video for interpretation of biplots in PCA. Many thanks.

    • @bkrai
      @bkrai  9 months ago

      You are welcome!

  • @jacklu1611
    @jacklu1611 2 years ago +1

    The biplot was explained very clearly, thank you Dr. Rai!

    • @bkrai
      @bkrai  2 years ago

      You are welcome!

  • @modelmichael1972
    @modelmichael1972 7 years ago +6

    Awesome video. Every R enthusiast needs to keep an eye on your channel. Thank you and keep up with great work!

    • @bkrai
      @bkrai  7 years ago +1

      +Model Michael thanks👍

    • @padhanewalaullu
      @padhanewalaullu 7 years ago

      Sir,
      Can we get the code file?

  • @saurabhkhodake
    @saurabhkhodake 7 years ago +2

    This video is worth its weight in gold

  • @Dejia_Space
    @Dejia_Space 4 years ago +1

    Thank you!! Best explanation of the biplot on RUclips.

    • @bkrai
      @bkrai  4 years ago

      Glad it was helpful!

  • @NIKHILESHMNAIK
    @NIKHILESHMNAIK 4 years ago +1

    You are too good sir. An absolute treat for ML enthusiasts.

    • @bkrai
      @bkrai  4 years ago +1

      Thanks for your comments!

  • @jonm7272
    @jonm7272 4 years ago +3

    Thank you for this extremely helpful and easily understood tutorial, particularly the clear interpretation of the biplot. Much appreciated.

    • @bkrai
      @bkrai  4 years ago

      You're very welcome!

  • @babadrammeh656
    @babadrammeh656 2 years ago +1

    PCA in R is a very good package and very helpful.

    • @bkrai
      @bkrai  2 years ago

      Yes, I agree!

  • @flamboyantperson5936
    @flamboyantperson5936 7 years ago +2

    This is great. I was looking for PCA and you have done it. Many many thanks to you sir.

  • @nyatonkitnya4267
    @nyatonkitnya4267 3 years ago +1

    One really good video I have found. After watching a few of your videos, they are becoming my "turn to" resource when required. Thanks.

    • @bkrai
      @bkrai  3 years ago

      Glad to hear that!

  • @galk32
    @galk32 5 years ago +1

    One of the best PCA videos I have ever seen. Thank you, Mr. Rai.

    • @bkrai
      @bkrai  5 years ago

      Thanks for comments!

  • @srujananeelam6547
    @srujananeelam6547 4 years ago +1

    Fantastic session. Perfectly understood the biplot.

    • @bkrai
      @bkrai  4 years ago

      Thanks for comments!

  • @theeoddname
    @theeoddname 7 years ago +2

    Great video! Excellent walk-through on PCA and how it can be useful for actual classifications. Thanks for the upload.

    • @bkrai
      @bkrai  7 years ago

      +theeoddname thanks for the feedback!

  • @abdullahmohammed8521
    @abdullahmohammed8521 3 years ago +1

    Many thanks to you, Dr. God bless you.

    • @bkrai
      @bkrai  3 years ago

      You are most welcome!

  • @affyy04
    @affyy04 2 years ago +1

    Thank you for this amazing video. Better than my university lectures

    • @bkrai
      @bkrai  2 years ago

      Thanks for comments!

  • @ConeliusC33
    @ConeliusC33 6 years ago +3

    Your videos have been constant companions during the last months of my master's thesis. It seemed as if every time I had to switch to another analysis technique, you were already waiting here. So thank you a lot for your guidance and clear explanations!
    The only thing I would appreciate is if you could provide the basic R scripts. Even though the copying process might help with understanding each command through step-by-step application, typing text off a tiny YouTube screen shown in one half of my monitor into RStudio in the other half is troublesome. Thanks!

    • @bkrai
      @bkrai  6 years ago

      Thanks for the feedback!

  • @bucklasek1
    @bucklasek1 2 years ago +1

    Thanks for the video! It helped me a lot with forecasting future values using PCA.

    • @bkrai
      @bkrai  2 years ago

      Very welcome!

  • @jonimatix
    @jonimatix 7 years ago +2

    I really like your explanations in your videos. Keep them coming! Thanks

    • @bkrai
      @bkrai  7 years ago

      Thanks for the feedback!

  • @koparka112
    @koparka112 2 years ago

    Thank you for the material. It is very clear and actually very relevant to my current work.
    As I understand it, the converted data are sums of products of the normalized predictors and the loadings.
    Maybe you would have time to post a PLS regression video, please? The intriguing part is the explanation of the model itself.
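
    A small sketch checking the point above that the converted data are sums of products of the normalized predictors and the loadings, assuming a centered and scaled prcomp fit on the numeric iris columns:
    data(iris)
    pc <- prcomp(iris[, 1:4], center = TRUE, scale. = TRUE)
    scores <- scale(iris[, 1:4]) %*% pc$rotation   # normalized data times loadings
    all.equal(unname(scores), unname(pc$x))        # TRUE: matches the stored component scores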

  • @eldrigeampong8573
    @eldrigeampong8573 4 years ago +1

    Thank you so much Dr. Rai. Detailed teaching

    • @bkrai
      @bkrai  4 years ago

      Thanks for comments!

  • @WahranRai
    @WahranRai 2 years ago +2

    19:12 It is only there to show another way to get the principal components for the training data, because
    identical(pc$x, predict(pc,training)) gives TRUE, meaning that pc$x is the same as predict(pc,training).

    • @bkrai
      @bkrai  2 years ago

      That's correct!
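
    A self-contained way to verify the equivalence noted above (a sketch; the seed and split proportions are illustrative, not the video's exact values):
    data(iris)
    set.seed(123)
    ind <- sample(2, nrow(iris), replace = TRUE, prob = c(0.8, 0.2))
    training <- iris[ind == 1, 1:4]
    pc <- prcomp(training, center = TRUE, scale. = TRUE)
    identical(pc$x, predict(pc, training))   # TRUE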

  • @Rutvi_patel_1111
    @Rutvi_patel_1111 7 years ago +2

    Fabulous work on PCA! Keep it up.

    • @bkrai
      @bkrai  7 years ago +1

      Thanks for the feedback!

  • @ashishsangwan5925
    @ashishsangwan5925 5 years ago +1

    Awesome Explanation

    • @bkrai
      @bkrai  5 years ago

      Make sure you run the following before installing:
      library(devtools)

  • @upskillwithchetan
    @upskillwithchetan 4 years ago +2

    Really really great explanation sir, Thank you so much for making it very simple

    • @bkrai
      @bkrai  4 years ago

      Thanks for comments!

  • @donne4real
    @donne4real 4 years ago +1

    Wonderful job explaining the material.

    • @bkrai
      @bkrai  4 years ago

      Thanks for your comments and finding it useful!

  • @PrimoSchnevi
    @PrimoSchnevi 3 years ago +1

    Hello. I don't know anything about Principal Component Analysis in R: Example with Predictive Model & Biplot Interpretation, and I will never need to, since that's not in my line of work. I appreciate your intro music though. You are a true champ, Bharatendra, and you enrich this world with your presence. Also, that intro music fucking slaps.

    • @bkrai
      @bkrai  3 years ago

      Thanks for comments!

  • @siddharthadas86
    @siddharthadas86 6 years ago +2

    Seriously awesome explanations! Thank you again.

    • @bkrai
      @bkrai  4 years ago

      Thanks!

  • @jinnythomas9815
    @jinnythomas9815 3 years ago +1

    Great Explanation....

    • @bkrai
      @bkrai  3 years ago

      Thanks!

  • @asiangg
    @asiangg 6 years ago +2

    Thank you. Learned a lot from your channel

    • @bkrai
      @bkrai  6 years ago

      Thanks!

  • @murilocintra180
    @murilocintra180 6 years ago +2

    Excellent demonstration of PCA, really helpful. I just don't understand why, in the pc object, you use only the training data instead of the entire dataset.

    • @bkrai
      @bkrai  6 years ago

      We only use the training data so that we can later use the test data to assess the prediction model.
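
    A sketch of the point above: the PCA is fit on the training rows only, and the same centering, scaling, and rotation are then applied to the test rows (the seed and split are illustrative):
    data(iris)
    set.seed(111)
    ind <- sample(2, nrow(iris), replace = TRUE, prob = c(0.8, 0.2))
    training <- iris[ind == 1, ]
    testing  <- iris[ind == 2, ]
    pc  <- prcomp(training[, 1:4], center = TRUE, scale. = TRUE)  # fit on training only
    trg <- predict(pc, training)   # training scores
    tst <- predict(pc, testing)    # test rows projected with the training loadings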

  • @LlamaFina
    @LlamaFina 5 years ago +1

    Great video! Thanks for sharing your knowledge.

    • @bkrai
      @bkrai  5 years ago

      Thanks for comments!

  • @jinnythomas9815
    @jinnythomas9815 3 years ago +1

    Thanks for the video.
    Please publish a video on Exploratory Factor Analysis and Confirmatory Factor Analysis application in a model.
    Also, please explain the difference from PCA.

    • @bkrai
      @bkrai  3 years ago

      Thanks for the suggestion, I've added this to my list.

  • @siddharthabingi
    @siddharthabingi 7 years ago +2

    Great lecture. Thanks.

    • @bkrai
      @bkrai  4 years ago

      Thanks!

  • @maf4421
    @maf4421 3 years ago +1

    Thank you, Dr. Bharatendra Rai, for explaining PCA in detail. Can you please explain how to find the weights of the variables by PCA for making a composite index? Are they the rotation values for PC1, PC2, etc.? For example, if I have (I = w1*X + w2*Y + w3*Z), then how do I find w1, w2, w3 by PCA?

    • @bkrai
      @bkrai  2 years ago

      For calculations you can refer to any textbook.
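
    A hedged sketch of one common way to get such weights: use the loadings of the first component (the first column of the rotation matrix) as w1, w2, w3, so the composite index is the PC1 score. The column names X, Y, Z are made up for illustration:
    set.seed(1)
    dat <- data.frame(X = rnorm(50), Y = rnorm(50), Z = rnorm(50))
    pc <- prcomp(dat, center = TRUE, scale. = TRUE)
    w  <- pc$rotation[, 1]             # w1, w2, w3
    I  <- scale(dat) %*% w             # composite index from standardized variables
    all.equal(as.numeric(I), as.numeric(pc$x[, 1]))   # TRUE: same as the PC1 scores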

  • @vishalaaa1
    @vishalaaa1 3 years ago +1

    Hi, most of the people who are watching these videos are new to data science. Please explain the parameters of each function; just typing won't help them. The more detail and the slower the pace, the better the number of views. I myself was a trainer.
    The difference between 1000 views and a million views is clarity and completeness.

    • @bkrai
      @bkrai  3 years ago

      Thanks for the feedback!

  • @andreafiore8373
    @andreafiore8373 3 years ago +1

    Thank you, this video will be really helpful to complete my thesis :)

    • @bkrai
      @bkrai  3 years ago +1

      Good luck!

  • @adityapatnaik7078
    @adityapatnaik7078 6 years ago +2

    Too good!! Please make more such videos... please!

    • @bkrai
      @bkrai  6 years ago

      Thanks for comments! You may find this useful too:
      ruclips.net/p/PL34t5iLfZddu8M0jd7pjSVUjvjBOBdYZ1

  • @souvikmukherjee7977
    @souvikmukherjee7977 1 year ago +1

    Sir, please make a session on factor analysis with prediction.

    • @bkrai
      @bkrai  1 year ago

      Thanks for the suggestion!

  • @sebvangeli
    @sebvangeli 7 years ago +2

    Great work! Thank you

  • @mohammadj.shamim9342
    @mohammadj.shamim9342 7 years ago +1

    Dear Respected Sir,
    I wanted to install ggbiplot using the command you provided us, but it gives me another message. The message is (Installation failed: SSL certificate problem: self signed certificate in certificate chain
    Warning message:
    Username parameter is deprecated. Please use vqv/ggbiplot). I used vqv/ggbiplot as well, but with no good results.
    Please guide me on what I should do.

    • @bkrai
      @bkrai  7 years ago

      Not sure what went wrong. Maybe a typo or something else. You could probably try running the commands using my R file.

  • @numitayogesh9280
    @numitayogesh9280 6 years ago +2

    Great lecture. Please share your thoughts on a machine learning introduction too.

    • @bkrai
      @bkrai  6 years ago

      For machine learning methods such as random forest, neural networks, support vector machines, and extreme gradient boosting, you can refer to the following:
      ruclips.net/p/PL34t5iLfZddu8M0jd7pjSVUjvjBOBdYZ1

  • @kashgarinn
    @kashgarinn 5 years ago +1

    Great video, thanks for uploading.

    • @bkrai
      @bkrai  5 years ago

      Thanks for comments!

  • @sainandankandikattu9077
    @sainandankandikattu9077 5 years ago +1

    Awesome video! Could you please add partial least squares regression and principal components regression to your playlist? That would be of great help. Thanks in advance!

    • @bkrai
      @bkrai  4 years ago

      Thanks for suggestions!

  • @samdavepollard
    @samdavepollard 7 years ago +2

    Thank You - this was extremely useful.
    Very nice channel you have here - easy sub.

    • @bkrai
      @bkrai  4 years ago

      Thanks for comments!

  • @manpreetkaur7716
    @manpreetkaur7716 1 year ago +1

    Please add a video on non-negative matrix factorization, like intNMF.

    • @bkrai
      @bkrai  1 year ago

      Thanks, I've added it to my list of future videos.

  • @deepikachandrasekaran3554
    @deepikachandrasekaran3554 3 years ago +1

    Very useful video, sir. Could you explain to me why we need to partition the data into training and testing sets?

    • @bkrai
      @bkrai  3 years ago

      You may review this:
      ruclips.net/video/aS1O8EiGLdg/видео.html

    • @deepikachandrasekaran3554
      @deepikachandrasekaran3554 3 years ago

      @@bkrai thank you sir.

  • @rainbowdu509
    @rainbowdu509 7 years ago +2

    Thanks, much appreciated.
    It worked.

  • @saifsplaka
    @saifsplaka 7 years ago +1

    Hi Sir, could you do one session on SVD in R, with some theoretical explanation of it as well? I'm finding it very difficult to understand with most of the material available on the net.

  • @MinhasA
    @MinhasA 5 years ago +1

    thank you for the amazing video!

    • @bkrai
      @bkrai  5 years ago

      Thanks for comments!

  • @sidraghayas8583
    @sidraghayas8583 4 years ago +2

    Can you please help with a combined PCA and ANN model?

    • @bkrai
      @bkrai  4 years ago

      I'm adding it to the list of future videos.

  • @raisulalam6051
    @raisulalam6051 4 years ago +1

    Thank you

    • @bkrai
      @bkrai  4 years ago

      Welcome!

  • @katherinechau5594
    @katherinechau5594 3 years ago +1

    your videos are great :)

    • @bkrai
      @bkrai  3 years ago

      Thank you!

  • @nyadav378
    @nyadav378 10 months ago

    Very informative and nice presentation, sir. Sir, can we estimate PCA for a factor (e.g. species) with an unequal number of observations?
    And if we want to see the correlations for each species, e.g. for setosa or the other two, how do we do it? Please explain... Thank you.

  • @jayashriraghunath3210
    @jayashriraghunath3210 4 years ago +1

    Awesome explanation, sir... 👍👍 Can you make a video for independent component analysis using R in the same way, sir?

    • @bkrai
      @bkrai  4 years ago

      Thanks, I've added it to my list.

  • @sathishrs3
    @sathishrs3 7 years ago +2

    Hi Sir, your materials are simple and wonderful. Please do one video on xgboost; that would be great.

    • @bkrai
      @bkrai  7 years ago +1

      Thanks for the suggestion!

    • @sathishrs3
      @sathishrs3 7 years ago

      Bharatendra Rai Thanks a lot sir.

    • @flamboyantperson5936
      @flamboyantperson5936 7 years ago

      I agree with sathish ravi. Sir, please make a video on xgboost. You are a one-stop solution for every problem, and I will remember you all my life.

  • @ramp2011
    @ramp2011 7 years ago +1

    Awesome video. Thank you. As time permits, can you do a video on the use of the caret package? Thank you.

    • @bkrai
      @bkrai  4 years ago

      Saw this today. Thanks for comments!

  • @Pankajjadwal
    @Pankajjadwal 7 years ago +2

    It was a fruitful video. Can you please share the code?

  • @tesfayewoldesemayate4506
    @tesfayewoldesemayate4506 1 year ago +1

    Nice presentation. When you were coding line 8, you said a sample of size 2; which size are you referring to? Thanks.

    • @bkrai
      @bkrai  1 year ago

      For partitioning the data into two sets, training and testing.
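
    A sketch of the partitioning line being asked about: the 2 means "sample from the labels 1 and 2", one label per row, not a sample of two rows (the probabilities are illustrative):
    set.seed(111)
    ind <- sample(2, nrow(iris), replace = TRUE, prob = c(0.8, 0.2))  # a 1 or a 2 for every row
    training <- iris[ind == 1, ]   # rows labelled 1
    testing  <- iris[ind == 2, ]   # rows labelled 2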

  • @SaranathenArun11E214
    @SaranathenArun11E214 6 years ago +2

    Brilliant, sir. Simple and sweet, thanks. Nice music. If I have 10 discrete variables, how do I reduce them to 2 or 3 components? Please explain.

    • @bkrai
      @bkrai  6 years ago

      Thanks for comments! Note that this method is only for numeric variables.

  • @abiani007
    @abiani007 4 years ago +1

    Can you upload a video describing independent component analysis in R?

    • @bkrai
      @bkrai  4 years ago +1

      I've added it to my list.

  • @janardhankadari3286
    @janardhankadari3286 2 years ago +1

    Interesting

    • @bkrai
      @bkrai  2 years ago

      thanks!

  • @anuraratnasiri5516
    @anuraratnasiri5516 4 years ago +1

    Thank you so... much!

    • @bkrai
      @bkrai  4 years ago +1

      Thanks for comments!

  • @abhiagni242
    @abhiagni242 7 years ago +2

    thanks for the video sir... helped a lot :)

    • @bkrai
      @bkrai  7 years ago

      Thanks for the feedback!

  • @bindumadhavi6259
    @bindumadhavi6259 5 years ago +1

    So much love to you, sir. This helped me perfectly.

    • @bkrai
      @bkrai  5 years ago

      Thanks for comments!

  • @sunilbobb
    @sunilbobb 6 years ago +1

    Sir - requesting you to kindly give a lecture on advanced R programming, like the H2O package, etc.

    • @bkrai
      @bkrai  6 years ago

      Thanks for the suggestion, I've added this to my list.

  • @wani212
    @wani212 6 years ago +1

    Thank you so much for this video. Will you please make a video on Broken-line regression in R?

    • @bkrai
      @bkrai  6 years ago

      Thanks for the suggestion, I've added this to my list.

  • @abhishek894
    @abhishek894 2 years ago +1

    Thank you for this nice video, Dr. Rai.
    I have a doubt: why was the predict function used multiple times? After the prcomp function, all the principal component data were already available in:
    pc$x.
    Why do we have to do:
    trg <- predict(pc, training)

    • @bkrai
      @bkrai  2 years ago

      In R you can get the same thing in multiple ways. This is just for illustration.

    • @abhishek894
      @abhishek894 2 years ago +1

      @@bkrai Thank you Sir. That makes it clear.

    • @bkrai
      @bkrai  2 years ago

      @@abhishek894 You are welcome!

  • @statistician2856
    @statistician2856 2 years ago +1

    Sir, my data is showing [ reached getOption("max.print") -- omitted 10 rows ]. The last 10 rows are omitted; how do I fix this, please?

    • @bkrai
      @bkrai  2 years ago +1

      That's just how much gets printed, but all the data remains intact.
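
    A minimal sketch for raising the console print limit in the current session, using the base R max.print option (the value is arbitrary):
    options(max.print = 10000)   # allow more rows to be printed before truncation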

  • @seaatm
    @seaatm 5 years ago +2

    Cool video! Can you do a video about Multiple Correspondence Analysis (MCA) for qualitative data? It would help me a lot.

    • @bkrai
      @bkrai  5 years ago

      Thanks, I've added this to my list.

  • @prithvivasireddy5564
    @prithvivasireddy5564 4 years ago +1

    Awesome video, sir... kudos... :)
    One doubt though... 20:48 - why are we using 2 components only? How do we know how many principal components to use? (Species ~ PC1 + PC2)

    • @bkrai
      @bkrai  4 years ago

      The first 2 PCs capture more than 95% of the variability in the data; the other 2 only add about 5%. So you can choose to keep enough PCs to capture over 80% or 90% of the variability.
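
    A short sketch of how those variance figures can be read off, assuming a centered and scaled prcomp fit on the four numeric iris columns:
    pc <- prcomp(iris[, 1:4], center = TRUE, scale. = TRUE)
    summary(pc)                          # "Cumulative Proportion" row shows variance captured
    cumsum(pc$sdev^2) / sum(pc$sdev^2)   # same numbers; keep enough PCs to pass your threshold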

  • @inesceciliacardonadevoz5072
    @inesceciliacardonadevoz5072 4 years ago +1

    Thanks for this video, sir, a very good class, but I can't get it to work because of: Error ... could not find function "ggbiplot". Excuse me, which R version are you using?

    • @bkrai
      @bkrai  4 years ago

      Try this:
      library(devtools)
      install_github("vqv/ggbiplot")

  • @anigov
    @anigov 6 years ago

    Dear Sir, thanks for a wonderful video. I have some questions.
    1) At 20:18, why did you choose to reorder by setosa?
    2) Why did you choose to use trg as the data, and not training, to build mymodel, given that trg has the predictions from training?
    3) Can PCA be used to choose k in k-means? If so, how do I go about it?
    Thanks again.
    Regards

  • @azzeddinereghais7494
    @azzeddinereghais7494 3 years ago

    Good evening.
    If you want to show the first dimension (Dim1) and the third dimension (Dim3),
    what should you do? Or could you provide the code for that?
    Thanks
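
    A hedged sketch for the question above, assuming the ggbiplot() function used in the video; its choices argument selects which components are plotted (factoextra users can do the same with the axes argument of fviz_pca_biplot()):
    library(ggbiplot)
    pc <- prcomp(iris[, 1:4], center = TRUE, scale. = TRUE)
    ggbiplot(pc, choices = c(1, 3), groups = iris$Species, ellipse = TRUE)   # component 1 vs component 3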

  • @alessandrorosati969
    @alessandrorosati969 1 year ago +1

    Can a dataset consisting of the principal components and the target variable be used to perform machine learning techniques?

    • @bkrai
      @bkrai  1 year ago

      Yes, this video shows an example of doing it.

  • @ainli4125466
    @ainli4125466 2 years ago

    Thank you for sharing. I get the error "Error in plot_label(p = p, data = plot.data, label = label, label.label = label.label, : Unsupported class: prcomp" when I try to run ggbiplot. Would you please advise how to fix it?

  • @johnstevenson6458
    @johnstevenson6458 2 years ago +1

    Great video. Do you have a suggested package for running binary logistic regression? From a brief scan of nnet it appears to only have arguments for multinomial response variables. Thank you.

    • @bkrai
      @bkrai  2 years ago

      You can refer to this:
      ruclips.net/video/AVx7Wc1CQ7Y/видео.html

    • @johnstevenson6458
      @johnstevenson6458 2 years ago +1

      @@bkrai sorry I was unclear in my message. I was hoping for a suggested package to run a binary logistic regression using PCA components as predictors - similar to what you have done here with multinomial. Any suggestions are welcome.

    • @bkrai
      @bkrai  2 years ago

      Yes, you can use the PCA components as predictors and run binary logistic regression as shown in the link that I sent earlier.
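
    A hedged sketch of the binary case discussed above, using base R's glm() with a binomial family on the first two component scores; the two-class subset of iris is purely illustrative:
    dat <- droplevels(subset(iris, Species != "setosa"))   # keep two classes only
    pc  <- prcomp(dat[, 1:4], center = TRUE, scale. = TRUE)
    d   <- data.frame(Species = dat$Species, pc$x[, 1:2])
    fit <- glm(Species ~ PC1 + PC2, data = d, family = binomial)
    summary(fit)
    head(predict(fit, type = "response"))   # predicted probabilities of the second class level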

  • @garykuleck1320
    @garykuleck1320 2 years ago +1

    Dr. Rai,
    Thanks for this informative video. I am having a problem getting the predict function to work with the model created on the training dataset. I am getting two errors (paraphrased): 1. NAs not allowed in subscripted assignments; 2. newdata has 1900 rows but variables found have 8100 rows. I think it is looking for the same number of rows as in the test dataset. Is there something I am doing wrong? I appreciate any feedback.

    • @bkrai
      @bkrai  2 years ago

      NAs occur when there is missing data. For handling missing values, refer to:
      ruclips.net/video/An7nPLJ0fsg/видео.html

  • @dejunli6417
    @dejunli6417 1 year ago +1

    Hi, I want to know where I can get the iris example data? Thank you!

    • @bkrai
      @bkrai  1 year ago

      It's built into R itself. You can access it by running the first 3 lines shown in the video.
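
    For reference, a minimal sketch of pulling up the built-in dataset (not the video's exact script):
    data(iris)    # load the built-in iris data
    str(iris)     # 150 rows: four numeric measurements plus the Species factor
    head(iris)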

  • @md.tabibulislam9740
    @md.tabibulislam9740 6 years ago

    Firstly, thank you for your helpful video. I have a problem adding the ellipse to the plot. I have 30 variables; the first 29 are numeric and the last one is a factor variable. But I can't plot the ellipse in the PCA plot. How can I solve this? Please help.

  • @earlymorningcodes6100
    @earlymorningcodes6100 4 years ago +1

    Orthogonality of principal components - 10:17

  • @sonalichakrabarty1618
    @sonalichakrabarty1618 2 years ago +1

    Can you please show the backpropagation algorithm in R?

    • @bkrai
      @bkrai  2 years ago

      Refer to this:
      ruclips.net/video/-Vs9Vae2KI0/видео.html

  • @harishnagpal21
    @harishnagpal21 5 years ago +1

    Hi Bharatendra, nice video. I have a couple of queries. If there are a large number of numeric variables and through PCA we find that they are highly correlated, then before going for model building:
    1) Do we need to remove the highly correlated variables?
    2) Which ones to remove? Thanks

    • @bkrai
      @bkrai  5 years ago +1

      You don't need to remove them if you are using the components for developing a prediction model. This video provides a similar example.

    • @harishnagpal21
      @harishnagpal21 5 years ago

      thanks

    • @desert00200
      @desert00200 5 years ago +2

      Principal components are orthogonal to each other; said differently, they are uncorrelated and can be used as-is in model building.

    • @bkrai
      @bkrai  4 years ago

      Thanks!
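
    A quick sketch of the point made above: the component scores are (numerically) uncorrelated even when the raw variables are highly correlated:
    round(cor(iris[, 1:4]), 2)   # raw variables: several large correlations
    pc <- prcomp(iris[, 1:4], center = TRUE, scale. = TRUE)
    round(cor(pc$x), 2)          # component scores: identity matrix, zero off-diagonals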

  • @golumworks
    @golumworks 2 years ago

    If I just use addEllipses = TRUE, what determines the size of those ellipses? Also, if I specify ellipse.type = "confidence", what confidence level is used to generate the ellipses? I used factoextra, if that helps.
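
    A hedged sketch for the factoextra question above, assuming fviz_pca_ind() and its ellipse arguments; the ellipse.level argument (typically 0.95 by default) controls how large the ellipses are drawn:
    library(factoextra)
    pc <- prcomp(iris[, 1:4], center = TRUE, scale. = TRUE)
    fviz_pca_ind(pc, habillage = iris$Species,
                 addEllipses = TRUE, ellipse.type = "confidence", ellipse.level = 0.95)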

  • @BbakMs
    @BbakMs 6 years ago

    Sir, I am doing PCA on the DJ 30 stocks, and when I view pca$loadings for the 30 variables, I noticed that some were not displayed. For example, Component 1 has -0.218 for Apple but then shows nothing for JPM; what does this mean?

  • @francisattahegwumah2047
    @francisattahegwumah2047 5 years ago +1

    Thank you very much for the video... I am interested in learning R programming from the basics. Please, can you teach me using some of your videos?

    • @bkrai
      @bkrai  5 years ago

      Here are some playlists that you can choose from based on your interest:
      Machine Learning videos: goo.gl/WHHqWP
      Becoming Data Scientist: goo.gl/JWyyQc
      Introductory R Videos: goo.gl/NZ55SJ
      Deep Learning with TensorFlow: goo.gl/5VtSuC
      Image Analysis & Classification: goo.gl/Md3fMi
      Text mining: goo.gl/7FJGmd
      Data Visualization: goo.gl/Q7Q2A8
      Playlist: goo.gl/iwbhnE

  • @mukeshchoudhary2842
    @mukeshchoudhary2842 3 years ago +1

    Great video. What if we want to include a factor like "Control and Heat" for genotypes? Please suggest.

    • @bkrai
      @bkrai  2 years ago

      It should work fine.

  • @earlymorningcodes6100
    @earlymorningcodes6100 4 years ago +1

    scatter plot & correlation coefficients 2:05

  • @earlymorningcodes6100
    @earlymorningcodes6100 4 years ago +1

    Scatter plot and correlation - 2:04

  • @soumyanayak445
    @soumyanayak445 5 years ago +1

    Sir, why have you predicted the training and test data with respect to the PCs? Can we use the trg data for making a neural model, test using the tst dataset, and find the correlation between actual and predicted values?

    • @bkrai
      @bkrai  5 years ago +1

      When there are many variables, the chance of a multicollinearity problem increases, and PCA helps to solve that problem. And yes, you can use a neural network model.

    • @soumyanayak445
      @soumyanayak445 5 years ago +1

      @@bkrai Sir, can you please explain the significance of the lines under the heading "prediction with principal components"? I am unable to understand why we are predicting twice on the test dataset. Please explain, sir.

    • @bkrai
      @bkrai  4 years ago

      To avoid over-fitting, where you get a very good result from the training data but not from the testing data.

  • @dioagusnofrizal9773
    @dioagusnofrizal9773 3 years ago +1

    Thanks, sir. Why does this video use linear regression? Can I use k-means to cluster from PC1 and PC2?

    • @bkrai
      @bkrai  3 years ago

      Which line are you referring to?

    • @dioagusnofrizal9773
      @dioagusnofrizal9773 3 years ago

      Sorry, I mean logistic regression in line 59.

  • @Jubo256
    @Jubo256 5 years ago +1

    Hello, you put training[5] to reference the column for the trg variable...
    shouldn't it be training[ , 5]?

      @bkrai  4 years ago
      @bkrai  4 года назад

      It is training[ , 5] in the video.

  • @indranipal8131
    @indranipal8131 4 years ago +1

    Do you have a video on PCA for unsupervised learning via clustering and similarity ranking?

    • @bkrai
      @bkrai  4 years ago

      Not yet.

  • @indian-de
    @indian-de 3 years ago +1

    Thank you a lot for this support, sir.
    If you could provide further guidance, it would be very helpful. I am trying to build models for metastasis prediction using single-cell gene expression levels.
    Kindly let me know if it would be possible for you. Thanks again.

    • @bkrai
      @bkrai  3 years ago +1

      You may find this useful:
      ruclips.net/video/Uil2GZa8gbg/видео.html

  • @nahalhoghooghi8575
    @nahalhoghooghi8575 5 years ago +1

    Great job, same as always. Can I use PCA for 2 or more categorical variables? Can I define those variables as 0 and 1 in PCA?

    • @bkrai
      @bkrai  5 years ago

      You can only use numeric variables. You can try using 0 and 1 and see if it works OK.
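
    A hedged sketch of the 0/1 idea from the reply above, using model.matrix() to expand a factor into dummy columns before prcomp; whether this is statistically appropriate depends on the data, and methods such as MCA or FAMD are often preferred for categorical variables:
    set.seed(1)
    df <- data.frame(x1 = rnorm(30), x2 = rnorm(30),
                     grp = factor(sample(c("a", "b"), 30, replace = TRUE)))
    X  <- model.matrix(~ . - 1, data = df)   # grp becomes 0/1 indicator columns
    pc <- prcomp(X, center = TRUE, scale. = TRUE)
    summary(pc)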

  • @parametersofstatistics2145
    @parametersofstatistics2145 4 years ago +1

    Thanks, sir... can you please tell me how to start learning R from the beginning?

    • @bkrai
      @bkrai  4 years ago

      You can start with this playlist:
      ruclips.net/p/PL34t5iLfZddv8tJkZboegN6tmyh2-zr_T

  • @k5555-b4f
    @k5555-b4f 7 years ago +1

    Hello, great video as always! However, one question I had (even though you warned about the hard interpretability of the results) relates to how to interpret the coefficients. If we look at the coefficient table and read the first line (after the intercept), does that mean that with every unit increase of Sepal.Length there is a log-odds increase of 14.05 in the probability of categorizing the species as Versicolor, relative to Setosa? Thanks!

    • @bkrai
      @bkrai  7 years ago

      Your interpretation is correct.

    • @k5555-b4f
      @k5555-b4f 7 years ago +1

      Thank you! Keep up the good work! Your R videos are great!

    • @VenkateshDataScientist
      @VenkateshDataScientist 6 years ago

      Sir, ggbiplot is not installed, hence I can't work on this, though I followed the video thoroughly.
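
    A minimal sketch related to the coefficient-interpretation question earlier in this thread, assuming nnet::multinom() as the model: each coefficient is a log-odds change relative to the baseline class, and exponentiating gives odds ratios:
    library(nnet)
    pc  <- prcomp(iris[, 1:4], center = TRUE, scale. = TRUE)
    d   <- data.frame(Species = iris$Species, pc$x[, 1:2])
    fit <- multinom(Species ~ PC1 + PC2, data = d)
    coef(fit)        # rows: each non-baseline class vs the baseline (setosa)
    exp(coef(fit))   # odds ratios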

  • @rainbowdu509
    @rainbowdu509 7 years ago +1

    Hi, good day Bharatendra. I want to replace one of my columns with the value 1 for all its elements; what is the code in RStudio? Thanks for your time.

    • @bkrai
      @bkrai  7 years ago

      Suppose you are using the following data:
      data(iris)
      To add what you indicated to a "new" column, you can use:
      iris$new <- 1

    • @rainbowdu509
      @rainbowdu509 7 years ago

      Thanks for your answer. I already have a column with different values; I want to replace all values in that column with just 1.

    • @bkrai
      @bkrai  7 years ago +1

      So for the iris data, if you want to change all values of the Sepal.Length variable to 1, you can use:
      iris$Sepal.Length <- 1