Access English, Hindi Course here - www.unfolddatascience.com/store
Don't forget to register on the website, it's free🙂
I think another difference between PCA and LDA is that PCA is an unsupervised ML technique whereas LDA is supervised.
PCA can be used for both supervised and unsupervised problems,
by ignoring the target feature.
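To make that difference concrete, here is a minimal scikit-learn sketch (the iris dataset stands in for any example data; it is not from the video): PCA is fit on the features alone, while LDA also needs the class labels.

```python
# Minimal sketch (iris as stand-in data): PCA is unsupervised, LDA is supervised.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

X_pca = PCA(n_components=2).fit_transform(X)                             # no labels used
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)   # labels required

print(X_pca.shape, X_lda.shape)  # both (150, 2), but found with different objectives
```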
Aman sir, please make a video covering the different topics across the different subjects one needs to prepare to become a machine learning engineer and data scientist. Include every topic, A to Z, from subjects like mathematics, programming, which database language to learn, and so on, and also the different frameworks and packages. I mean, please include everything. That would be very helpful to everyone. Thank you, sir.
Great video Aman 💯💯
Thanks a lot.
Excellent explanation!
You are brilliant, man! Great teacher, explainer.
Sir, please explain the math behind the algorithms in detail, because that is more crucial to understanding than the Python implementation.
Also, the sklearn documentation for LDA mentions that it uses Bayes' probabilistic rule, which you have not covered. Sir, I request you to create videos with complete information.
Your simplicity is appreciated.
Thanks Sachin.
I learned a lot from your videos; they really help in my AI/ML course.
Very nice Aman
Thank you so much sir , all videos are very informative.🙏🙏
good video.
Thank you Aman 😊
Summary for [Linear discriminant analysis explained | LDA algorithm in python | LDA algorithm explained](ruclips.net/video/2EjGJXzsC2U/видео.html)
Title: "Understanding Linear Discriminant Analysis (LDA) and Its Implementation in Python with HDFC Bank Example"
[00:01]([00:01](ruclips.net/video/2EjGJXzsC2U/видео.html&t=1)) Introduction to Linear Discriminant Analysis (LDA)
- LDA stands for Linear Discriminant Analysis, focusing on discriminating factors between two categories or classes.
- Video covers the mathematics, differences from PCA, Python implementation, and best practices for using LDA.
[02:01]([02:01](ruclips.net/video/2EjGJXzsC2U/видео.html&t=121)) Linear Discriminant Analysis simplifies classification using one or more axes.
- By projecting data onto a single axis, LDA creates a clear boundary for classification.
- Adding multiple dimensions requires visualizing data points in multi-dimensional space for accurate decision boundaries.
[03:56]([03:56](ruclips.net/video/2EjGJXzsC2U/видео.html&t=236)) LDA helps in creating a new axis for better decision boundaries
- LDA generates a new axis onto which to project data for easier classification and boundary creation
- LDA differs from PCA by focusing on creating an axis for classification rather than maximizing variance
[05:57]([05:57](ruclips.net/video/2EjGJXzsC2U/видео.html&t=357)) LDA vs PCA purpose difference
- LDA creates components that provide a separation boundary for easier projection
- PCA creates principal components to capture maximum variance of the data
[08:02]([08:02](ruclips.net/video/2EjGJXzsC2U/видео.html&t=482)) Linear discriminant analysis (LDA) maximizes mean difference and minimizes variance for effective categorization
- LDA works by maximizing the difference between the means of the two groups
- The within-group variance should be minimized for effective categorization in LDA
[09:57]([09:57](ruclips.net/video/2EjGJXzsC2U/видео.html&t=597)) LDA cost function and data projection
- The cost function of LDA is what gets optimized
- Data is projected onto a new axis using eigenvalues and eigenvectors (see the numpy sketch below)
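A rough numpy sketch of that idea, under the assumption of two classes and synthetic 2-D data (this is not the video's code): the Fisher criterion J(w) = (wᵀ S_B w) / (wᵀ S_W w) is maximized, which leads to an eigenvalue problem whose leading eigenvector is the new axis.

```python
# Rough two-class sketch with synthetic data: maximize the separation of the
# class means while minimizing within-class scatter (Fisher criterion).
import numpy as np

rng = np.random.default_rng(42)
X0 = rng.normal(loc=[0.0, 0.0], size=(50, 2))   # class 0
X1 = rng.normal(loc=[3.0, 3.0], size=(50, 2))   # class 1

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
S_B = np.outer(m1 - m0, m1 - m0)                          # between-class scatter
S_W = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)   # within-class scatter

# Maximizing J(w) gives the generalized eigenvalue problem S_B w = lambda * S_W w;
# the new LDA axis is the eigenvector with the largest eigenvalue.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_W) @ S_B)
w = eigvecs[:, np.argmax(eigvals.real)].real
print("LDA projection direction:", w / np.linalg.norm(w))
```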
[11:48]([11:48](ruclips.net/video/2EjGJXzsC2U/видео.html&t=708)) Linear discriminant analysis (LDA) projects data onto a new axis for clear separation.
- Using LDA in Python involves importing the necessary libraries, loading data with distinct classes, and fitting the model (see the sklearn sketch below).
- New axes generated by LDA provide clear boundaries for classification, but may not work for all datasets.
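A minimal sketch of that workflow with scikit-learn (the iris dataset stands in for the video's example data, which is not reproduced here): import, fit on labeled data, project onto the new axes, and, since LDA is also a classifier, predict labels.

```python
# Minimal sklearn workflow sketch (iris stands in for the video's data).
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)                # 4 features, 3 distinct classes
lda = LinearDiscriminantAnalysis(n_components=2)
X_new = lda.fit_transform(X, y)                  # at most (n_classes - 1) new axes
print(X_new.shape)                               # (150, 2)

# LDA is also a classifier, not only a dimensionality-reduction technique.
print(lda.predict(X[:5]), lda.score(X, y))
```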
[13:41]([13:41](ruclips.net/video/2EjGJXzsC2U/видео.html&t=821)) LDA requires a clear distinction between classes for effective use.
- LDA works effectively when there is a clear distinction in the features of the underlying classes.
- Feature engineering is necessary to make the data distinct so that LDA can create boundaries easily.
How do we identify which feature extraction method to apply? I mean, when we first look at a dataset?
Thank you bro, you are a great teacher. Please make a video on SVM.
Thankyou so much sir 😇🙏
Welcome Hariom.
Great video. Can we have a similar one on t-SNE?
Thank you so much Aman for another insightful video. All your videos are really good; I have seen almost all of them, and the concepts become clear when I learn from your channel. Thanks for your great help. In this series, can you please make videos on SVD and t-SNE as well? Though I search other channels, you give the clearest explanations. Will be waiting for more videos.
Thanks Anu. Sure.
I gave an interview with a product-based company. They asked why lasso regression penalizes coefficients to zero. It would be great if you could explain this.
The shape of the L1 constraint region in lasso is a diamond; when the optimization touches a corner of the diamond, the corresponding coefficient becomes exactly zero (see the sketch below). Please read a little about it, it's not difficult to understand.
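A small illustration of that effect with scikit-learn on synthetic data (my own sketch, not part of the original comment): as the L1 penalty grows, the diamond-shaped constraint region shrinks, the solution lands on a corner, and some coefficients become exactly zero rather than just small.

```python
# Illustration on synthetic data: lasso's L1 penalty drives irrelevant
# coefficients to exactly zero as alpha increases.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=200)  # only 2 features matter

for alpha in [0.01, 0.1, 1.0]:
    coefs = Lasso(alpha=alpha).fit(X, y).coef_
    print(alpha, np.round(coefs, 3))   # larger alpha -> more coefficients exactly 0
```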
What great trainings you have!!! I hope you have a tutorial video on Genetic Algorithm.
Great suggestion! Thank you
Aman, would you explain eigenvalue decomposition, eigenspace, and eigenfaces in detail 🙏🙏
Ok, let me add that as a topic to the upcoming list.
So how is the kernel trick different from LDA, sir?
In LDA, data is projected onto a new axis.
The kernel trick is used in SVM, with different types of kernels like RBF, linear, etc.
In SVM we don't reduce dimensionality (see the sketch below).
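A small illustrative sketch of that contrast (iris data assumed, not from the video): LDA returns a lower-dimensional projection, while an RBF-kernel SVM classifies in the original feature space and returns no reduced features.

```python
# Contrast sketch (iris assumed): LDA reduces dimensionality, an RBF-kernel SVM
# classifies via the kernel trick without producing lower-dimensional features.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)
print(X_lda.shape)                    # (150, 2): data projected onto new axes

svm = SVC(kernel="rbf").fit(X, y)     # kernel trick, implicit feature space
print(svm.predict(X[:5]))             # class labels only, no reduced representation
```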
How do we decide whether to use LDA, PCA, or SVD?
We can't decide in advance; based on the objective and the data distribution, we need to take a call.
Very good explanation. Could you make a video on the differences between LDA and QDA?
Yes, sure. Thanks for suggesting
I am really confused! What is the difference between LDA (linear discriminant analysis) and LDA (latent Dirichlet allocation)?
@Muhammad Zubair Thanks a lot for the explanation!
You didn't mention one difference: PCA is unsupervised learning and LDA is supervised learning.
Thanks for suggesting. Yes, maybe I missed it. I love it when you interact through the comments. Keep rocking.
Please explain t-SNE.
Please explain it in Hindi.
Very poor explanation.
Thanks for your feedback. Most people found it useful, though.
@UnfoldDataScience At 3:58 the decision boundary is wrong, or the data symbols used are incorrect. At 7:35 the stated purpose is laughable. At 13:13 I don't see any separation boundary. You don't talk about how it works with data of more than 2 variables, or about the general cost function; maximizing the cost function is reduced to "just differentiate". Anyone can import sklearn and use it, so nothing useful there either. I hope you know it is used for classification as well, not just dimensionality reduction, but you never even mention that in the explanation part, yet you show a classification example in your code. How is this video useful to anyone?
Very good Aman