Linear discriminant analysis explained | LDA algorithm in python | LDA algorithm explained

  • Published: 28 Nov 2024

Comments • 50

  • @UnfoldDataScience
    @UnfoldDataScience  2 years ago

    Access the English and Hindi courses here - www.unfolddatascience.com/store
    Don't forget to register on the website, it's free 🙂

  • @souravbiswas6892
    @souravbiswas6892 2 years ago +6

    I think another difference between PCA and LDA is that PCA is an unsupervised ML technique, whereas LDA is supervised.

    • @vinaykannam3629
      @vinaykannam3629 1 year ago

      PCA can be used in both supervised and unsupervised settings, by ignoring the target feature.
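
      For example, a minimal sketch of that point (my own illustration, not from the video; assuming scikit-learn): PCA is fit on X alone and ignores the target, while LDA requires the labels y.

      # Hypothetical example: PCA ignores y, LDA needs y.
      from sklearn.datasets import load_iris
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      X, y = load_iris(return_X_y=True)

      X_pca = PCA(n_components=2).fit_transform(X)                            # unsupervised: y is never used
      X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)  # supervised: y is required

      print(X_pca.shape, X_lda.shape)   # (150, 2) (150, 2)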

  • @kalam_indian
    @kalam_indian 2 years ago +2

    Aman sir, please make a video that lists the different topics across subjects one needs to prepare to become a machine learning engineer and data scientist.
    Include every topic from A to Z across subjects like mathematics, programming, which database language to learn, and so on,
    and also the different frameworks and packages.
    I mean, please include everything.
    That will be very helpful to everyone.
    Thank you, sir.

  • @navoditmehta8833
    @navoditmehta8833 2 years ago +1

    Great video Aman 💯💯

  • @dsavkay
    @dsavkay 8 months ago

    Excellent explanation!

  • @johnsonkachidza9621
    @johnsonkachidza9621 2 years ago +1

    You are brilliant, man! A great teacher and explainer.

  • @Annasupari
    @Annasupari 2 years ago +1

    Sir, please explain the math behind the algorithms in detail, because that is more crucial for understanding than the Python implementation.
    Also, the sklearn documentation for LDA mentions that it uses Bayes' probabilistic rule, which you have not covered. Sir, I request you to create videos with full information.

  • @sachinahankari
    @sachinahankari 8 months ago

    Your simplicity is appreciated.

  • @sahil2pradhan
    @sahil2pradhan 2 years ago

    I learned a lot from your videos; they really help in my AI/ML course.

  • @sadhnarai8757
    @sadhnarai8757 2 years ago

    Very nice Aman

  • @prajaktawarghat1850
    @prajaktawarghat1850 2 years ago

    Thank you so much sir, all the videos are very informative. 🙏🙏

  • @AmanKumar-gq7li
    @AmanKumar-gq7li 2 years ago

    Good video.

  • @rajaneeshkumar4518
    @rajaneeshkumar4518 2 days ago

    Summary for [Linear discriminant analysis explained | LDA algorithm in python | LDA algorithm explained](ruclips.net/video/2EjGJXzsC2U/видео.html)
    Title: "Understanding Linear Discriminant Analysis (LDA) and Its Implementation in Python with HDFC Bank Example"
    [00:01](ruclips.net/video/2EjGJXzsC2U/видео.html&t=1) Introduction to Linear Discriminant Analysis (LDA)
    - LDA stands for Linear Discriminant Analysis and focuses on the discriminating factors between two categories or classes.
    - The video covers the mathematics, the differences from PCA, the Python implementation, and best practices for using LDA.
    [02:01](ruclips.net/video/2EjGJXzsC2U/видео.html&t=121) LDA simplifies classification using one or more axes.
    - By projecting data onto a single axis, LDA creates a clear boundary for classification.
    - Adding more dimensions requires visualizing data points in multi-dimensional space to find accurate decision boundaries.
    [03:56](ruclips.net/video/2EjGJXzsC2U/видео.html&t=236) LDA helps create a new axis for better decision boundaries.
    - LDA generates a new axis onto which data is projected for easier classification and boundary creation.
    - LDA differs from PCA by creating an axis for class separation rather than for maximizing variance.
    [05:57](ruclips.net/video/2EjGJXzsC2U/видео.html&t=357) LDA vs PCA: difference in purpose
    - LDA creates components that provide a separation boundary for easier projection.
    - PCA creates principal components that capture the maximum variance of the data.
    [08:02](ruclips.net/video/2EjGJXzsC2U/видео.html&t=482) LDA maximizes the difference between class means and minimizes variance for effective categorization.
    - LDA works by maximizing the difference between the means of the two groups.
    - Within-class variance should be minimized for effective categorization.
    [09:57](ruclips.net/video/2EjGJXzsC2U/видео.html&t=597) LDA cost function and data projection
    - The cost function of LDA is what gets optimized.
    - Data is projected onto a new axis using eigenvalues and eigenvectors.
    [11:48](ruclips.net/video/2EjGJXzsC2U/видео.html&t=708) LDA projects data onto a new axis for clear separation.
    - Using LDA in Python involves importing the necessary libraries, loading data with distinct classes, and fitting the model.
    - The new axes generated by LDA provide clear boundaries for classification, but may not work for every dataset.
    [13:41](ruclips.net/video/2EjGJXzsC2U/видео.html&t=821) LDA requires distinct classes for effective use.
    - LDA works effectively when there is a clear distinction in the features of the underlying classes.
    - Feature engineering may be needed to make the data distinct enough for LDA to create boundaries easily.
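
    A minimal sketch of the projection step summarized above (my own illustration, not the video's code; assuming scikit-learn's LinearDiscriminantAnalysis and the built-in iris data). For two classes, LDA's criterion is roughly J(w) = (m1 - m2)^2 / (s1^2 + s2^2), i.e. maximize the gap between the projected class means while minimizing the within-class variance; the new axes come from an eigenvalue problem on the class scatter matrices.

    # Hypothetical sketch, not from the video.
    from sklearn.datasets import load_iris
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X, y = load_iris(return_X_y=True)              # 4 features, 3 classes

    lda = LinearDiscriminantAnalysis(n_components=2)
    X_new = lda.fit_transform(X, y)                # project onto at most n_classes - 1 axes

    print(X_new.shape)                             # (150, 2)
    print(lda.explained_variance_ratio_)           # between-class variance captured per axis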

  • @vinaykannam3629
    @vinaykannam3629 1 year ago

    How do we identify which feature extraction method to apply?
    I mean, when we first see a dataset?

  • @CodeExplorers5464
    @CodeExplorers5464 2 years ago

    Thank you bro, you are a great teacher. Please make a video on SVM.

  • @HariomKumar-wi1mk
    @HariomKumar-wi1mk 2 years ago +1

    Thank you so much sir 😇🙏

  • @PramodKumar-su8xv
    @PramodKumar-su8xv 7 months ago

    Great video. Can we have a similar one on t-SNE?

  • @anu.s3283
    @anu.s3283 2 years ago +1

    Thank you so much Aman for another insightful video. All your videos are really good; I have seen almost all of them, and the concepts become clear when I learn from your channel. Thanks for your great help. In this series, can you please make videos on SVD and t-SNE as well? Even though I search other channels, you give the clearest explanations. Will be waiting for more videos.

  • @VikashKumar-je6fb
    @VikashKumar-je6fb 2 years ago

    I gave an interview with a product-based company. They asked why lasso regression penalizes coefficients to zero. It would be great if you could explain this.

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago +1

      The shape of the L1 constraint region in lasso is a diamond; when the optimization touches a corner, the corresponding coefficient becomes exactly zero. Please read a little about it, it's not difficult to understand.
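
      A minimal sketch of that point (my own illustration, assuming scikit-learn): with a comparable penalty strength, lasso's L1 penalty drives several coefficients exactly to zero, while ridge's L2 penalty only shrinks them.

      # Hypothetical example, not from the video.
      import numpy as np
      from sklearn.datasets import make_regression
      from sklearn.linear_model import Lasso, Ridge

      X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                             noise=10.0, random_state=0)

      lasso = Lasso(alpha=1.0).fit(X, y)
      ridge = Ridge(alpha=1.0).fit(X, y)

      print("lasso zero coefficients:", np.sum(lasso.coef_ == 0))   # several exact zeros
      print("ridge zero coefficients:", np.sum(ridge.coef_ == 0))   # typically none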

  • @maryamazari1758
    @maryamazari1758 2 years ago

    What great trainings you have! I hope you make a tutorial video on genetic algorithms.

  • @indiannationalist07
    @indiannationalist07 2 years ago

    Aman, would you explain eigenvalue decomposition, eigenspace, and eigenfaces in detail? 🙏🙏

  • @sarans3185
    @sarans3185 2 years ago +1

    So how is the kernel trick different from LDA, sir?

  • @ayushsengar4153
    @ayushsengar4153 2 years ago

    How do we decide whether to use LDA, PCA, or SVD?

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago

      We can't decide in advance; based on what the objective is and how the data is distributed, we need to take a call.

  • @rishigupta2342
    @rishigupta2342 2 years ago

    Very good explanation. Could you make a video on the differences between LDA and QDA?

  • @shakhawathossainsajal1859
    @shakhawathossainsajal1859 2 years ago

    I am really confused! What is the difference between LDA (linear discriminant analysis) and LDA (latent Dirichlet allocation)?

  • @indiannationalist07
    @indiannationalist07 2 years ago +1

    You didn't mention one difference: PCA is unsupervised learning and LDA is supervised learning.

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago +1

      Thanks for suggesting. Yes, maybe I missed it. I love it when you interact through the comments. Keep rocking.

  • @phanikumar3136
    @phanikumar3136 2 years ago

    Please explain t-SNE.

  • @Abhinavkumar-og3xd
    @Abhinavkumar-og3xd 7 months ago

    Please say it in Hindi.

  • @Lord31325
    @Lord31325 2 years ago +2

    Very poor explanation.

    • @UnfoldDataScience
      @UnfoldDataScience  2 years ago

      Thanks for your feedback. Most people found it useful, though.

    • @Lord31325
      @Lord31325 2 years ago

      @@UnfoldDataScience At 3:58 the decision boundary is wrong, or the data symbols used are incorrect. At 7:35 the stated purpose is laughable. At 13:13 I don't see any separation boundary. You don't talk about how it works with data with more than 2 variables, or about the general cost function; maximizing the cost function is just a matter of differentiating. Anyone can import sklearn and use it, so there is nothing useful there either. I hope you know it is used for classification as well, not just dimension reduction, but you never even mention that in the explanation part, even though you show a classification example in your code. How is this video useful to anyone?
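
      For reference, a minimal sketch (my own, assuming scikit-learn) of LDA used as a classifier on data with more than 2 features and more than 2 classes, not only for dimensionality reduction:

      # Hypothetical example, not from the video.
      from sklearn.datasets import load_wine
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import train_test_split

      X, y = load_wine(return_X_y=True)               # 13 features, 3 classes
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
      print(clf.predict(X_te[:5]))                    # predicted class labels
      print(clf.score(X_te, y_te))                    # classification accuracy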

  • @sadhnarai8757
    @sadhnarai8757 2 years ago

    Very good Aman