1 Principal Component Analysis | PCA | Dimensionality Reduction in Machine Learning by Mahesh Huddar

  • Published: 4 Oct 2024

Comments • 97

  • @kenway346
    @kenway346 1 year ago +128

    I noticed that your channel contains the entirety of Data Mining taught at the Master's level! Thank you very much, subscribing immediately!

    • @MaheshHuddar
      @MaheshHuddar  1 year ago +12

      Welcome
      Do like share and subscribe

  • @ishu_official
    @ishu_official 9 months ago +27

    Super explanation. Today is my machine learning paper.

  • @junaidahmad218
    @junaidahmad218 9 months ago +5

    This man has deep knowledge of this topic.

    • @MaheshHuddar
      @MaheshHuddar  9 months ago

      Thank You
      Do like share and subscribe

  • @gamingaddaaa2141
    @gamingaddaaa2141 14 days ago +1

    Thank you sir for the information

  • @VDCreatures-kc6uf
    @VDCreatures-kc6uf 10 months ago +7

    Super explanation. The best channel on YouTube to learn machine learning and ANN topics ❤❤

    • @MaheshHuddar
      @MaheshHuddar  10 months ago +1

      Thank You
      Do like share and subscribe

  • @TrueTalenta
    @TrueTalenta 1 year ago +23

    Amazing step-by-step outline!
    I love it💌, so I subscribe!

    • @MaheshHuddar
      @MaheshHuddar  1 year ago

      Thank You
      Do like share and subscribe

  • @varshabiradar1482
    @varshabiradar1482 1 day ago

    You have explained it very well. It would be good if you also explained what the use of calculating the PCA is in the geometrical representation you showed at the end.

  • @User22_2g
    @User22_2g 4 days ago

    Great work, sir! Really very helpful for my exams. Very grateful to you 🙏
    One suggestion, sir: kindly share practice problems in the description or in the comment box. Could you please do it as soon as possible?
    Thanks a lot, sir 😀

  • @venkateshwarlupurumala6283
    @venkateshwarlupurumala6283 1 year ago +4

    Very clear explanation, sir. Thank you so much.

    • @MaheshHuddar
      @MaheshHuddar  1 year ago

      Welcome
      Please do like share and subscribe

  • @jambulingamlogababu8914
    @jambulingamlogababu8914 11 months ago +4

    Excellent teaching. Salute to you, sir.

    • @MaheshHuddar
      @MaheshHuddar  11 months ago +1

      Welcome
      Do like share and subscribe

  • @NandeeshBilagi
    @NandeeshBilagi 6 months ago +2

    Clear and nice explanation. Thanks for the video

    • @MaheshHuddar
      @MaheshHuddar  6 months ago

      Welcome
      Do like share and subscribe

  • @rodrigorcbb
    @rodrigorcbb 1 year ago +5

    Thanks for the video. Great explanation!

  • @radhay4291
    @radhay4291 11 months ago +2

    Thank you very much. Very clear explanation, and it is easy to understand.

    • @MaheshHuddar
      @MaheshHuddar  11 months ago

      Welcome
      Do like share and subscribe

  • @Dinesh-be8ys
    @Dinesh-be8ys 1 year ago +4

    Thank you for uploading videos like this.

    • @MaheshHuddar
      @MaheshHuddar  1 year ago +1

      Welcome
      Do like share and subscribe

  • @RaviShankar-gm9of
    @RaviShankar-gm9of 8 months ago +4

    Super Bhayya ...

  • @krishnachaitanya3089
    @krishnachaitanya3089 1 year ago +3

    That's the clearest explanation I have seen.

    • @MaheshHuddar
      @MaheshHuddar  1 year ago +1

      Thank you
      Do like share and subscribe

  • @priyalmaheta690
    @priyalmaheta690 6 months ago +1

    The content and teaching are very good. Please also provide notes; they would be helpful.

    • @MaheshHuddar
      @MaheshHuddar  6 months ago

      Thank You
      Do like share and subscribe

  • @thilagarajthangamuthu2935
    @thilagarajthangamuthu2935 1 year ago +3

    Thank you sir. Clear and easy to understand. Thank you.

  • @PRANEETHAMALAKAPALLI
    @PRANEETHAMALAKAPALLI 9 months ago +2

    Thank you, sir, for this wonderful concept.

    • @MaheshHuddar
      @MaheshHuddar  9 months ago

      Welcome
      Do like share and subscribe

  • @Straight_Forward615
    @Straight_Forward615 6 months ago +1

    thanks a lot for this wonderful lecture.

    • @MaheshHuddar
      @MaheshHuddar  6 months ago

      Welcome!
      Do like share and subscribe

  • @priya1912
    @priya1912 9 months ago +2

    Thank you so much.

    • @MaheshHuddar
      @MaheshHuddar  9 months ago

      Welcome
      Do like share and subscribe

  • @kapras711
    @kapras711 1 year ago +1

    Super explanation, very easy to understand without any hiccups, sir.
    Thanks... Inspr KVV.Prasad

    • @MaheshHuddar
      @MaheshHuddar  1 year ago

      Thank You
      Do like share and subscribe

  • @Blackoutfor10days
    @Blackoutfor10days 5 months ago +2

    Can you add the concept of hidden Markov models to your machine learning playlist?

  • @yashtiwari4696
    @yashtiwari4696 1 year ago +3

    Sir, please upload content on ensemble methods: bagging, boosting, and random forest.

    • @MaheshHuddar
      @MaheshHuddar  1 year ago +3

      Ensemble Learning: ruclips.net/video/eNyUfpGBLts/видео.html
      Random Forest: ruclips.net/video/kPq328mJNE0/видео.html

  • @srinivas664
    @srinivas664 2 months ago

    Nice presentation. Thank you, sir.

  • @yuva_india123
    @yuva_india123 1 year ago +1

    Thanks sir for your explanation 🎉

    • @MaheshHuddar
      @MaheshHuddar  1 year ago +1

      Welcome
      Do like share and subscribe

  • @ashishlal7212
    @ashishlal7212 5 months ago

    Thank you so much! Today is my data mining and ML paper.

    • @MaheshHuddar
      @MaheshHuddar  5 months ago

      Welcome
      Do like share and subscribe

  • @shubhangibaruah3940
    @shubhangibaruah3940 9 months ago +1

    Thank you sir, you were amazing 🤩

    • @MaheshHuddar
      @MaheshHuddar  9 months ago

      Welcome
      Please do like share and subscribe

  • @Husain8570
    @Husain8570 1 month ago

    Best explanation

  • @SaifMohamed-de8uo
    @SaifMohamed-de8uo 8 months ago

    Thank you so much, you are a great professor.

    • @MaheshHuddar
      @MaheshHuddar  8 months ago +1

      You are very welcome
      Do like share and subscribe

  • @sinarezaei4288
    @sinarezaei4288 7 months ago

    Thank you very much, Master Huddar ❤

    • @MaheshHuddar
      @MaheshHuddar  7 months ago

      Welcome
      Do like share and subscribe

  • @waidapapa1514
    @waidapapa1514 1 year ago +2

    Why are we not dealing with e2? I mean, why don't we compute e2^T · [cov matrix]?

    • @rohanshah8129
      @rohanshah8129 1 year ago +1

      Here, 2 dimensions were taken as the "high-dimensional" data just for the example.
      One of the main use cases of PCA is dimensionality reduction.
      So, if you want, you can use e2 and get the second PC. But then think about it:
      from 2 variables we would again get 2 variables. That's why he has shown only PC1.
      In practice, we generally use 2 PC axes (it mostly depends on your data). If the data has a lot of variables, then 3 or 4 can also be good, but we don't generally go beyond that; in that case you would need e2, e3, and e4 as well. That is how it works (see the sketch after this thread).

    • @aravind25
      @aravind25 3 days ago

      @rohanshah8129 Should we consider e2 in place of e1?
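
      A minimal NumPy sketch of this thread's point, assuming the video's worked example is X1 = (4, 8, 13, 7) and X2 = (11, 4, 5, 14): projecting onto e1 alone reduces each sample to one PC1 score, while projecting onto e2 as well simply returns two coordinates per sample again, so nothing is reduced.

          import numpy as np

          # Assumed example data: 4 samples, 2 features
          X = np.array([[4.0, 11.0],
                        [8.0, 4.0],
                        [13.0, 5.0],
                        [7.0, 14.0]])
          Xc = X - X.mean(axis=0)            # mean-center the data
          C = np.cov(Xc, rowvar=False)       # 2x2 covariance matrix
          vals, vecs = np.linalg.eigh(C)     # eigenvalues in ascending order
          e1 = vecs[:, -1]                   # eigenvector of the largest eigenvalue
          print(Xc @ e1)                     # PC1 only: one score per sample (reduced to 1-D)
          print(Xc @ vecs[:, ::-1])          # e1 and e2 together: 4x2 again (no reduction)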

  • @putridisperindag6986
    @putridisperindag6986 10 months ago +1

    Thank you very much, sir, for your explanation in this video. I am still confused, so I would like to ask how to get the values [-4.3052, 3.7361, 5.6928, -5.1238]. I still don't get it. Thank you, sir.

    • @jvbrothers5454
      @jvbrothers5454 9 months ago

      Yeah, I'm also confused about how he got those; I'm getting different values: 0.3761, 5.6928, -5.128.
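
      A short NumPy check of where these four numbers come from, assuming the worked example X1 = (4, 8, 13, 7) and X2 = (11, 4, 5, 14): each value is a mean-centered sample projected onto the first eigenvector, PC1_i = e1^T (x_i - mean). The sign of an eigenvector is arbitrary, so all four values may come out negated.

          import numpy as np

          # Assumed example data
          X = np.array([[4.0, 11.0],
                        [8.0, 4.0],
                        [13.0, 5.0],
                        [7.0, 14.0]])
          Xc = X - X.mean(axis=0)            # means are (8, 8.5)
          C = np.cov(Xc, rowvar=False)       # [[14, -11], [-11, 23]]
          vals, vecs = np.linalg.eigh(C)
          e1 = vecs[:, np.argmax(vals)]      # ~(0.5574, -0.8303), up to sign
          print(Xc @ e1)                     # ~[-4.3052, 3.7361, 5.6928, -5.1238], up to sign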

  • @HamidAli-ff2zn
    @HamidAli-ff2zn 1 year ago

    Thank you so much, sir, amazing explanation ♥♥♥

  • @nikks9969
    @nikks9969 6 months ago

    Hello sir, thank you for your explanation. I have a doubt: at 08:17, why have you considered only the first equation?

    • @MaheshHuddar
      @MaheshHuddar  6 months ago

      You will get the same answer with the second equation.
      You can use either the first or the second, no issues (see the check below).
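
      A quick numerical check of this reply, assuming the example covariance matrix [[14, -11], [-11, 23]]: since det(C - λI) = 0 at an eigenvalue, the two rows of C - λI are proportional, so either equation yields the same eigenvector direction.

          import numpy as np

          C = np.array([[14.0, -11.0],
                        [-11.0, 23.0]])      # assumed example covariance matrix
          lam = np.linalg.eigvalsh(C)[-1]    # largest eigenvalue, ~30.3849
          # ratio v1/v0 from equation 1: (C00 - lam) v0 + C01 v1 = 0
          print(-(C[0, 0] - lam) / C[0, 1])  # ~-1.4895
          # ratio v1/v0 from equation 2: C10 v0 + (C11 - lam) v1 = 0
          print(-C[1, 0] / (C[1, 1] - lam))  # same ~-1.4895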

  • @RaviShankar-gm9of
    @RaviShankar-gm9of 8 months ago

    Please make a video on linear discriminant analysis, bhayya.

  • @advancedappliedandpuremath
    @advancedappliedandpuremath 1 year ago +1

    Sir, the book name, please?

  • @ManjupriyaR-p9r
    @ManjupriyaR-p9r 11 months ago

    Thank you very much sir

    • @MaheshHuddar
      @MaheshHuddar  11 months ago

      Welcome
      Do like share and subscribe

  • @Ateeq10
    @Ateeq10 8 months ago

    Thank you

    • @MaheshHuddar
      @MaheshHuddar  8 months ago

      Welcome
      Do like share and subscribe

  • @MadaraUchiha-wj8sl
    @MadaraUchiha-wj8sl 8 months ago

    Thank you, sir.

    • @MaheshHuddar
      @MaheshHuddar  8 months ago

      Welcome
      Do like share and subscribe

  • @mango-strawberry
    @mango-strawberry 4 months ago

    thanks a lot

    • @MaheshHuddar
      @MaheshHuddar  4 months ago

      You are most welcome
      Do like share and subscribe

  • @zafar151
    @zafar151 7 months ago

    Excellent

    • @MaheshHuddar
      @MaheshHuddar  7 months ago

      Thank You
      Do like share and subscribe

  • @Abhilashaisgood
    @Abhilashaisgood 4 months ago

    Thank you, sir. How do I calculate the 2nd PC?

    • @MaheshHuddar
      @MaheshHuddar  4 months ago

      Select the second eigenvector and multiply it with the (mean-centered) feature matrix, as sketched below.
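
      A minimal sketch of this reply, again assuming the example data X1 = (4, 8, 13, 7) and X2 = (11, 4, 5, 14): PC2 is obtained exactly like PC1, just projecting onto the eigenvector of the second-largest eigenvalue.

          import numpy as np

          # Assumed example data
          X = np.array([[4.0, 11.0],
                        [8.0, 4.0],
                        [13.0, 5.0],
                        [7.0, 14.0]])
          Xc = X - X.mean(axis=0)
          vals, vecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
          e2 = vecs[:, np.argmin(vals)]      # eigenvector of the smaller eigenvalue
          print(Xc @ e2)                     # PC2 score for each sample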

  • @demodemo-o4z
    @demodemo-o4z 1 year ago +2

    Hi sir, great explanation of PCA. But when I searched how to build the covariance matrix for more than 2 variables, everything showed covariance computed between only 2 variables.
    How do you calculate the covariance matrix if a dataset has more than 2 variables? Could you please give an explanation of that?

    • @fintech1378
      @fintech1378 1 year ago +2

      You need to do it for all pairwise combinations.

    • @shahmirkhan1502
      @shahmirkhan1502 1 year ago +3

      @fintech1378 is right. You need to do pairwise combinations. For example, for 4 variables, your covariance matrix will be 4x4 with the following combinations:
      cov(a, a)  cov(a, b)  cov(a, c)  cov(a, d)
      cov(b, a)  cov(b, b)  cov(b, c)  cov(b, d)
      cov(c, a)  cov(c, b)  cov(c, c)  cov(c, d)
      cov(d, a)  cov(d, b)  cov(d, c)  cov(d, d)

    • @rohanshah8129
      @rohanshah8129 1 year ago +1

      If there are n variables, the covariance matrix will be of n × n shape.

    • @parthibdey6005
      @parthibdey6005 7 months ago

      @shahmirkhan1502 Is this covariance matrix for reducing 4 variables to 1?
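
      A small sketch of this thread's answer (illustrative random data, not the video's): for n variables, the covariance matrix collects every pairwise covariance into an n × n symmetric matrix, which np.cov builds directly.

          import numpy as np

          rng = np.random.default_rng(0)
          X = rng.normal(size=(100, 4))      # 100 samples of 4 variables a, b, c, d
          C = np.cov(X, rowvar=False)        # C[i, j] = cov(variable i, variable j)
          print(C.shape)                     # (4, 4)
          print(np.allclose(C, C.T))         # True: cov(a, b) == cov(b, a)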

  • @muhammadsaad3793
    @muhammadsaad3793 10 months ago

    Nice!

    • @MaheshHuddar
      @MaheshHuddar  10 months ago

      Thank You
      Do like share and subscribe

  • @aefgfaefafe
    @aefgfaefafe 1 year ago +1

    بحبككككككككككككككككككككككككككككككككككككككككك يا سوسو

    • @MaheshHuddar
      @MaheshHuddar  1 year ago +1

      What does it mean?

    • @abishekraju4521
      @abishekraju4521 1 year ago +2

      @MaheshHuddar According to Google Translate: _"I love you sooo"_

  • @jameykeller5055
    @jameykeller5055 9 months ago

    You are God, sir.

    • @MaheshHuddar
      @MaheshHuddar  9 months ago

      Do like share and subscribe

  • @brucewayne.64
    @brucewayne.64 10 months ago

    Thanks Sir

    • @MaheshHuddar
      @MaheshHuddar  10 months ago

      Welcome
      Do like share and subscribe