Linear discriminant analysis explained | LDA algorithm in python | LDA algorithm explained

  • Published: 11 Sep 2024

Comments • 50

  • @UnfoldDataScience
    @UnfoldDataScience  2 years ago

    Access English, Hindi Course here - www.unfolddatascience.com/store
    Don't forget to register on the website, it's free🙂

  • @kalam_indian
    @kalam_indian 2 years ago +2

    Aman sir, please make a video covering all the topics from the different subjects one should prepare to become a machine learning engineer and data scientist.
    Please include every topic from A to Z: mathematics, programming, which database language to learn, and so on,
    and also the different frameworks and packages.
    I mean, please include everything; that would be very helpful to everyone.
    Thank you, sir.

  • @souravbiswas6892
    @souravbiswas6892 2 years ago +5

    I think another difference between PCA and LDA is that PCA is an unsupervised ML technique whereas LDA is supervised.

    • @vinaykannam3629
      @vinaykannam3629 1 year ago

      PCA can be used in both supervised and unsupervised settings,
      since it simply ignores the target feature.
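A minimal sketch of this distinction, assuming scikit-learn and the built-in iris dataset: `PCA.fit_transform` only ever sees the features, while `LinearDiscriminantAnalysis.fit_transform` requires the class labels.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)  # 150 samples, 4 features, 3 classes

# PCA is unsupervised: it is fit on X alone and never sees the labels y.
X_pca = PCA(n_components=2).fit_transform(X)

# LDA is supervised: fitting requires the class labels y
# (and n_components is capped at n_classes - 1 = 2 here).
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

print(X_pca.shape, X_lda.shape)  # both project 4 features down to 2
```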

  • @johnsonkachidza9621
    @johnsonkachidza9621 2 years ago +1

    You are brilliant, man! Great teacher, explainer.

  • @Annasupari
    @Annasupari 1 year ago +1

    Sir, please explain the math behind the algorithms in detail, because that is more crucial to understanding than the Python implementation.
    Also, the sklearn documentation for LDA mentions that it uses Bayes' rule, which you have not covered. Sir, I request you to create videos with full information.

  • @anu.s3283
    @anu.s3283 2 years ago +1

    Thank you so much, Aman, for another insightful video. All your videos are really good; I have seen almost all of them, and the concepts become clear when I learn from your channel. Thanks for your great help. In this series, can you please make videos on SVD and t-SNE as well? Though I search other channels, you give the clearest explanations. Will be waiting for more videos.

  • @navoditmehta8833
    @navoditmehta8833 2 years ago +1

    Great video Aman 💯💯

  • @dsavkay
    @dsavkay 5 months ago

    Excellent explanation!

  • @sadhnarai8757
    @sadhnarai8757 2 years ago

    Very good Aman

  • @sahil2pradhan
    @sahil2pradhan 2 years ago

    I learned a lot from your videos; they really help in my AI/ML course.

  • @PramodKumar-su8xv
    @PramodKumar-su8xv 4 months ago

    Great video. Can we have a similar one on t-SNE?

  • @sachinahankari
    @sachinahankari 5 months ago

    Your simplicity is appreciated.

  • @CodeExplorers5464
    @CodeExplorers5464 1 year ago

    Thank you, bro. You are a great teacher. Please make a video on SVM too.

  • @prajaktawarghat1850
    @prajaktawarghat1850 1 year ago

    Thank you so much, sir. All your videos are very informative.🙏🙏

  • @vinaykannam3629
    @vinaykannam3629 1 year ago

    How do we identify which feature extraction method to apply
    when we first see a dataset?

  • @HariomKumar-wi1mk
    @HariomKumar-wi1mk 2 years ago +1

    Thank you so much, sir 😇🙏

  • @maryamazari1758
    @maryamazari1758 2 years ago

    What great training videos you have! I hope you make a tutorial video on genetic algorithms.

  • @AmanKumar-gq7li
    @AmanKumar-gq7li 2 years ago

    Good video.

  • @VikashKumar-je6fb
    @VikashKumar-je6fb 2 years ago

    I gave an interview with a product-based company. They asked why lasso regression penalizes coefficients all the way to zero. It would be great if you could explain this.

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago +1

      The constraint region in lasso is diamond-shaped; when the optimization solution touches a corner of the diamond, the corresponding coefficient becomes exactly zero. Please read a little about it; it's not difficult to understand.
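A small demonstration of this zeroing effect, as a sketch assuming scikit-learn; the synthetic data and the `alpha` value are made up for illustration.

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
# Only the first three features actually influence y.
y = 3 * X[:, 0] + 2 * X[:, 1] + X[:, 2] + rng.normal(scale=0.1, size=200)

ols = LinearRegression().fit(X, y)
lasso = Lasso(alpha=0.5).fit(X, y)

# Ordinary least squares leaves every coefficient nonzero (just small),
# while the L1 penalty drives the irrelevant coefficients to exactly 0.0.
print(np.sum(ols.coef_ != 0))    # all 10 nonzero
print(np.sum(lasso.coef_ == 0))  # the irrelevant ones are exactly zero
```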

  • @rishigupta2342
    @rishigupta2342 2 years ago

    Very good explanation. Could you make a video on the differences between LDA and QDA?

  • @indiannationalist07
    @indiannationalist07 2 years ago

    Aman, would you explain eigenvalue decomposition, eigenspaces, and eigenfaces in detail 🙏🙏

  • @sarans3185
    @sarans3185 2 years ago +1

    So how is the kernel trick different from LDA, sir?

  • @indiannationalist07
    @indiannationalist07 2 years ago +1

    You didn't mention one difference: PCA is unsupervised learning and LDA is supervised learning.

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago +1

      Thanks for pointing it out. Yes, maybe I missed it. I love it when you interact through the comments. Keep rocking!

  • @ayushsengar4153
    @ayushsengar4153 2 years ago

    How do we decide whether to use LDA, PCA, or SVD?

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago

      We can't decide in advance; depending on what the objective is and how the data is distributed, we need to take a call.

  • @shakhawathossainsajal1859
    @shakhawathossainsajal1859 2 years ago

    I am really confused! What is the difference between LDA (linear discriminant analysis) and LDA (latent Dirichlet allocation)?

    • @muhammadzubair440
      @muhammadzubair440 1 year ago +1

      LDA (Linear Discriminant Analysis) and LDA (Latent Dirichlet Allocation) are two different techniques used in different areas of data analysis.
      LDA (Linear Discriminant Analysis) is a supervised learning technique used for dimensionality reduction and classification. It aims to find a linear combination of features that maximizes the separation between classes in a dataset. In other words, it tries to project the data onto a lower-dimensional space such that the classes are well-separated. LDA is commonly used in pattern recognition, machine learning, and data analysis.
      LDA (Latent Dirichlet Allocation), on the other hand, is an unsupervised learning technique used in natural language processing and text mining. It is a probabilistic model that is used to discover the latent topics that underlie a collection of documents. LDA assumes that each document is a mixture of several topics, and each topic is a distribution over words. The goal of LDA is to learn the topic distribution for each document and the word distribution for each topic.
      So, while both techniques are referred to as "LDA," they are used for different purposes in different areas of data analysis.

    • @shakhawathossainsajal1859
      @shakhawathossainsajal1859 1 year ago

      @@muhammadzubair440 Thanks a lot for the explanation!
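The two techniques even live in different scikit-learn modules, which makes the contrast easy to see. A minimal sketch with made-up toy data: the discriminant analysis needs class labels, while the topic model works on unlabeled word counts.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.decomposition import LatentDirichletAllocation

# Linear Discriminant Analysis: supervised, needs class labels y.
X = np.array([[1.0, 2.0], [1.5, 1.8], [5.0, 8.0], [6.0, 9.0]])
y = np.array([0, 0, 1, 1])
clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.predict([[1.2, 1.9]]))  # near the first class's points

# Latent Dirichlet Allocation: unsupervised, works on a
# document-term count matrix with no labels at all.
counts = np.array([[3, 0, 1, 0], [2, 0, 2, 0], [0, 4, 0, 3], [0, 3, 0, 4]])
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print(topics.transform(counts).shape)  # (4, 2): a topic mixture per document
```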

  • @phanikumar3136
    @phanikumar3136 2 years ago

    Please explain t-SNE.

  • @Abhinavkumar-og3xd
    @Abhinavkumar-og3xd 5 months ago

    Please explain in Hindi.

  • @Lord31325
    @Lord31325 2 years ago +2

    Very poor explanation.

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago

      Thanks for your feedback. Most people found it useful, though.

    • @Lord31325
      @Lord31325 2 years ago

      @@UnfoldDataScience At 3:58 the decision boundary is wrong, or the data symbols used are incorrect. The purpose stated at 7:35 is laughable. At 13:13 I don't see any separation boundary. You don't talk about how it works with data with more than 2 variables, or about the general cost function; maximizing the cost function is just a matter of differentiation. Anyone can import sklearn and use it, so nothing useful there either. I hope you know it is used for classification as well, not just dimension reduction, but you never even mention that in the explanation part, even though you show a classification example in your code. How is this video useful to anyone?

  • @sadhnarai8757
    @sadhnarai8757 2 years ago

    Very nice Aman