StatQuest: Linear Discriminant Analysis (LDA) clearly explained.

  • Published: 17 Nov 2024

Comments • 913

  • @statquest
    @statquest  4 years ago +29

    NOTE: The StatQuest LDA Study Guide is available! statquest.gumroad.com
    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

    • @realcirno1750
      @realcirno1750 4 years ago +1

      woohoo

    • @WIFI-nf4tg
      @WIFI-nf4tg 3 years ago +1

      Can you please do something on canonical analysis ?

    • @statquest
      @statquest  3 years ago +1

      @@WIFI-nf4tg I'll keep that in mind.

    • @falaksingla6242
      @falaksingla6242 2 years ago

      Hi Josh,
      Love your content. It has helped me learn a lot and grow. You are doing awesome work; please continue to do so.
      I wanted to support you, but unfortunately your PayPal link seems to be broken. Please update it.

    • @aayushtheapple
      @aayushtheapple 2 years ago

      The website shows "Error establishing a database connection"!

  • @yuniprastika7022
    @yuniprastika7022 4 years ago +517

    The funny thing is, so many materials from this channel are for university students (like me), but he keeps treating us like kindergarten children. Haha, it feels like I'll never grow up watching your videos, sir! QUADRO BAAM SIR, THIS WORLD HAS GONE TOO SERIOUS, THANK YOU FOR BRINGING BACK THE JOY

    • @statquest
      @statquest  4 years ago +31

      Thank you very much! :)

    • @daisy-fb5jc
      @daisy-fb5jc 2 years ago +28

      I am a kindergarten kid in this subject :(

    • @andrejalabama1204
      @andrejalabama1204 1 year ago +7

      @@daisy-fb5jc Same here; I need someone to explain it to me like I'm a little kid.

    • @SinoLegionaire
      @SinoLegionaire 11 months ago +4

      Remember: Us Adults are just big children

  • @bokai5829
    @bokai5829 5 years ago +378

    Every time I hear the intro music, I know my assignment is due in 2 days.

    • @statquest
      @statquest  5 years ago +12

      LOL! :)

    • @bokai5829
      @bokai5829 5 years ago +1

      @@statquest Thank you very much!

    • @KonesThe
      @KonesThe 4 years ago +6

      Hahaha, I'm in the same boat right now.

    • @HenriqueEC1
      @HenriqueEC1 4 years ago +2

      Good to know I'm not alone.

    • @PJokerLP
      @PJokerLP 1 year ago +3

      10.5 hours till my machine learning exam. Thank you so much; I feel way better prepared than if I had watched all of my class material.

  • @beccalynch4407
    @beccalynch4407 3 years ago +83

    Just spent hours confused watching my lectures, where the professor used only linear algebra and not a single picture. Watched this video and understood it right away. Thank you so much for what you do!

  • @robinduan1985
    @robinduan1985 6 years ago +82

    This is amazing! A 15-minute video does way better than my lecturer did in a 2-hour class.

    • @elise3455
      @elise3455 3 years ago +6

      While these 15 min videos are excellent for gaining intuition, you still often need those 2-hour classes to get familiar with the mathematical rigor.

    • @NoahElRhandour
      @NoahElRhandour 2 years ago +8

      @@elise3455 No you don't; the math follows quickly and easily once you understand what it is about.

    • @GaganSingh-zz9el
      @GaganSingh-zz9el 2 years ago +1

      @@NoahElRhandour yeah brother

    • @Jacob-t1j
      @Jacob-t1j 1 year ago

      @@elise3455 No you don't. Math becomes super easy once you understand what you're doing.

  • @seifeldineslam
    @seifeldineslam 4 years ago +29

    This was honestly helpful. I am an aspiring behavioral geneticist (aspiring because I am still a biotechnology undergraduate) with really disrupted fundamentals of math, especially statistics. Your existence as a YouTube channel is a treasure discovery for me!

  • @awesomebroification
    @awesomebroification 2 days ago +1

    Amazingly, my professor did not even discuss projecting data onto new axes that maximize the linear separability of the groupings. Thank you so much for the core intuition, so I can dig in a little further.

  • @haydo8373
    @haydo8373 7 years ago +159

    Hey what is the intro track called? I couldn't find it on Spotify. . . :D

    • @hiteshjoshi3061
      @hiteshjoshi3061 2 months ago +1

      It's their own

    • @hiteshjoshi3061
      @hiteshjoshi3061 2 months ago +1

      Listen carefully: the channel name is in it, and it's cool 😂😂👌

    • @solar0wind
      @solar0wind 7 hours ago

      @@hiteshjoshi3061 I think they know that and it was a joke^^

  • @Muzik2hruRain
    @Muzik2hruRain 5 years ago +5

    You, sir, are a life saver. Now for every complicated machine learning topic I look for your explanation, or at least wonder how you would have approached it. Thank you, really.

    • @statquest
      @statquest  5 years ago +1

      Awesome! Thank you! :)

  • @PV10008
    @PV10008 5 years ago +3

    I really like the systematic way you approach each topic and anticipate all the questions a student might have.

  • @rachelstarmer9835
    @rachelstarmer9835 8 years ago +86

    Awesome! Even I get it and love it! I'm going to share one of your StatQuest posts as an example of why simple explanations in everyday language are far superior to using academic jargon in complex ways to argue a point. It's also a great example of how to develop an argument. You've created something here that's useful beyond statistics! Three cheers for the liberal arts education!!!! Three cheers for StatQuest!!

  • @seahseowpeh8278
    @seahseowpeh8278 4 days ago +2

    Another great video. Thank you so much. You are definitely one of the best educators in the world.

  • @neelkhare6934
    @neelkhare6934 4 years ago +4

    Wow, that is one of the best explanations of LDA.
    It helped me get an intuitive idea of LDA and what it actually does in classification.
    Thank you!

    • @statquest
      @statquest  4 years ago

      Hooray! Thank you! :)

    • @neelkhare6934
      @neelkhare6934 4 years ago

      Can you make a video on quadratic discriminant analysis?

    • @statquest
      @statquest  4 years ago

      @@Sachin-vr4ms Which part? Can you specify minutes and seconds in the video?

    • @statquest
      @statquest  4 years ago

      @@Sachin-vr4ms I'm sorry that it is confusing, but let me try to explain: At 9:46, imagine rotating the black line a bunch of times, a few degrees at a time, and using the equation shown at 8:55 to calculate a value at each step. The rotation that gives us the largest value (i.e. there is a relatively large distance between the means and a relatively small amount of scatter in both clusters) is the rotation that we select. If we have 3 categories, then we rotate an "x/y-axis" a bunch of times, a few degrees each time, and calculate the distances from the means to the central point and the scatter for each category, and then calculate the ratio of the squared distances to the scatter. Again, the rotation with the largest value is the one that we will use. Does that help?

    • @statquest
      @statquest  4 years ago

      @@Sachin-vr4ms I'm glad it was helpful, and I'll try to include more "how to do this in R and python" videos.
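
The rotation-and-ratio procedure Josh describes a few replies up can be sketched in code. This is an illustrative brute-force version (not code from the video; the function name and toy data are our own): rotate a candidate axis through many angles, project both labeled clusters onto it, and keep the angle that maximizes the squared distance between the projected means divided by the total within-cluster scatter.

```python
import numpy as np

def best_lda_direction(cluster_a, cluster_b, n_angles=360):
    """Brute-force search for the 1-D projection axis that maximizes
    (distance between projected means)^2 / (total within-cluster scatter)."""
    best_angle, best_ratio = 0.0, -np.inf
    for theta in np.linspace(0.0, np.pi, n_angles, endpoint=False):
        w = np.array([np.cos(theta), np.sin(theta)])   # unit direction of the candidate line
        pa, pb = cluster_a @ w, cluster_b @ w          # project every point onto the line
        between = (pa.mean() - pb.mean()) ** 2         # squared distance between the means
        within = ((pa - pa.mean()) ** 2).sum() + ((pb - pb.mean()) ** 2).sum()
        ratio = between / within
        if ratio > best_ratio:
            best_angle, best_ratio = theta, ratio
    return best_angle, best_ratio

# Two toy clusters separated along the x-axis, so the best axis is (nearly) the x-axis
rng = np.random.default_rng(0)
a = rng.normal([0.0, 0.0], 1.0, size=(50, 2))
b = rng.normal([5.0, 0.0], 1.0, size=(50, 2))
angle, ratio = best_lda_direction(a, b)
```

In practice the optimum is found in closed form rather than by sweeping angles, but the sweep makes the "rotate and re-score" picture concrete.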

  • @AtharvaShirode-ff8es
    @AtharvaShirode-ff8es 3 months ago +1

    You are just superb!! 8 years on, and still the most concise and best explanation.

  • @rohil1993
    @rohil1993 4 years ago +3

    This explains the beauty of LDA so well! Thank you so much!

    • @statquest
      @statquest  4 years ago

      Awesome! Thank you very much! :)

  • @merida3975
    @merida3975 2 years ago +2

    The song at the beginning made my day, even though I had picked the wrong linear discriminant analysis tutorial for my data science course. Just awesome. Love it a lot. We need more and more funny teachers like you.

  • @Azureandfabricmastery
    @Azureandfabricmastery 4 years ago +4

    Hi Josh, this was helpful for understanding the differences between PCA and LDA and how LDA actually works internally. You're indeed making life easier with visual demonstrations for students like me :) God bless, and thank you!

    • @statquest
      @statquest  4 years ago

      Glad it was helpful!

  • @laurading7012
    @laurading7012 4 years ago +2

    I just graduated from high school, but your videos helped me understand many research papers. Thank you very much!!!!!

  • @alexhoneycutt8352
    @alexhoneycutt8352 5 years ago +4

    This helped me understand LDA before my midterm! I could not wrap my head around how the functions worked and what they did, but I got an "ah-hah!" moment at 6:49 and I totally understand it now. Thank you for explaining this!

    • @statquest
      @statquest  5 years ago +1

      Hooray! Good luck with your midterm. :)

  • @RaghuMittal
    @RaghuMittal 7 years ago +1

    Great video! I initially couldn't understand LDA looking at the math equations elsewhere, but when I came across this video, I was able to understand LDA very well. Thanks for the effort.

  • @alis5893
    @alis5893 3 years ago +3

    Josh, you are an amazing teacher. I have learned so much from you. A big thank you from the bottom of my heart. God bless you.

  • @arpitagupta4474
    @arpitagupta4474 2 years ago +2

    I was able to grasp this topic without being scared. Kudos to this channel!

  • @jinyog5276
    @jinyog5276 2 years ago +1

    I didn't understand what the professor talked about in the lecture until I watched your videos. Thanks Josh, you saved me!

  • @aneeshmenon12
    @aneeshmenon12 8 years ago +3

    Wow... too good, dear Starmer... nothing to say, you are incredible... I am eagerly waiting for your next video...

  • @lifeislarge
    @lifeislarge 3 years ago +1

    Never thought anyone could explain things this easily. I appreciate the effort. Thank You

  • @hlatse98
    @hlatse98 7 years ago +47

    Brilliant video! Very helpful. Thank you.

  • @sridharyamijala4739
    @sridharyamijala4739 2 years ago +2

    Another excellent video, just as great as the one on PCA. I read a professor's view on ML models and algorithms in which he recommended understanding the concepts well, so that we know where to apply them, rather than worrying too much about the actual computation at that stage. The great thing about your videos is that you explain the concepts very well.

    • @statquest
      @statquest  2 years ago +1

      Thank you very much! :)

  • @whasuklee
    @whasuklee 5 years ago +5

    Came for my midterm tomorrow, stayed for the intro track.

  • @adejumobiidris2892
    @adejumobiidris2892 2 years ago +1

    Thank you so much for quickly resolving the confusion that had taken control of my head for 72 hours.

  • @Anmolmovies
    @Anmolmovies 6 years ago +5

    Absolutely brilliant. Kudos to you for making it seem so simple. Thanks!

  • @nishisaxena4831
    @nishisaxena4831 4 years ago +1

    Much better than my university lecture, which I listened to twice but couldn't understand... this was awesome, thanks!

    • @statquest
      @statquest  4 years ago

      Hooray! I'm glad the video was helpful. :)

  • @republic2033
    @republic2033 6 years ago +3

    Thank you, very educational and entertaining!

  • @leeamraa
    @leeamraa 3 months ago +1

    Among the best 15 minutes you can spend on YouTube! Thank you.

  • @armansh7978
    @armansh7978 5 years ago +4

    Awesome. All I can say is bravo, man, bravo. Thank you very much.

  • @sassmos008
    @sassmos008 7 years ago

    Wow... my professor has been trying to teach me these concepts for weeks, and now I finally understand. Thank you so much. I will refer my mates to this.

  • @MartinUToob
    @MartinUToob 5 years ago +6

    When's the StatQuest album coming out? (Here come the Grammies!)
    🎸👑
    Actually, the only reason I watch your videos is for the music.
    😍🎶🎵

  • @elizabeths3989
    @elizabeths3989 3 years ago +2

    You are about to be the reason I pass my qualifying exam in bioinformatics 🙏🙏

    • @statquest
      @statquest  3 years ago

      Good luck!!! BAM! :)

  • @hannav7125
    @hannav7125 3 years ago +3

    fact: none of you skipped the intro

    • @statquest
      @statquest  3 years ago

      This is one of my favorites. :)

  • @mahdimantash313
    @mahdimantash313 2 years ago +2

    I really can't thank you enough for that... you did in 16 minutes what I couldn't do in 4 hours. Keep up the good work, and thank you again!!

  • @bonleofen6722
    @bonleofen6722 6 years ago +7

    4:14 was waiting for the "sound"

    • @statquest
      @statquest  6 years ago

      :)

    • @bonleofen6722
      @bonleofen6722 6 years ago +1

      Hey Josh, I am really thankful for the videos you are making and posting. I am very motivated to learn machine learning, and most other sources don't give such a fundamental explanation of how things work.

    • @bonleofen6722
      @bonleofen6722 6 years ago +1

      And, while I'm not promising anything, I do look forward to buying your song and gifting it to one of my friends, who shares my music taste and also happens to be an expert in Python.

    • @statquest
      @statquest  6 years ago +1

      @@bonleofen6722 You're welcome!!! I'm really happy to hear that you like my videos and they are helping you.

    • @bonleofen6722
      @bonleofen6722 6 years ago

      They are helping me loads.

  • @scottsun9413
    @scottsun9413 1 year ago +1

    Really great videos; they saved me in my data science classes. I'm applying for a graduate program at UNC; I hope I can have the opportunity to meet the content creators sometime in the future.

  • @maverickstclare3756
    @maverickstclare3756 5 years ago +7

    "Dr, those cancer pills just make me feel worse"
    presses red button "wohp waaaaaaaa"
    "next patient please"
    :)

  • @peterantley
    @peterantley 7 years ago

    You are my hero. I am a senior hoping to get into data science and your videos are great and very helpful. Keep up the good work.

  • @ravihammond
    @ravihammond 6 years ago +4

    This guy is amazing.

  • @jialingzhang1341
    @jialingzhang1341 3 years ago

    Thanks for this brilliant video! One thing worth mentioning or emphasizing is that LDA is supervised and PCA is unsupervised.

  • @saiakhil1997
    @saiakhil1997 4 years ago +1

    I really liked how you compared the PCA and LDA procedures. This video showed me a different way to view LDA.

  • @LEK-0000
    @LEK-0000 6 years ago +3

    Why is he always singing at the beginning of the video?? Lolol

    • @statquest
      @statquest  6 years ago +3

      Can't stop, won't stop! ;)

    • @GaneshKumar-bv2td
      @GaneshKumar-bv2td 6 years ago +3

      Honestly, I'm not complaining... it shows he is funny and true to himself :)

    • @statquest
      @statquest  6 years ago

      Ganesh Kumar Thank you!!

  • @SeqBioMusic
    @SeqBioMusic 7 years ago +1

    Awesome! It would be good to give some differences between PCA and LDA. For example, PCA studies X, while LDA studies X -> Y.

  • @JuneSiyu
    @JuneSiyu 7 years ago +5

    Nice singing

  • @phoenixflames6019
    @phoenixflames6019 2 months ago +1

    10/10 intro song
    10/10 explanation
    using PCA, I can reduce these two ratings to just one: 10/10 is enough to rate the whole video
    using LDA, the YouTube chapters feature maximizes the separation between these 2 major components (intro and explanation) of the video

  • @datoubi
    @datoubi 4 years ago +1

    I recommended all your videos to my fellow students in the data analysis course.

    • @statquest
      @statquest  4 years ago

      Thank you very much! :)

  • @neillunavat
    @neillunavat 4 years ago +1

    I am so glad this channel has grown to around 316k subscribers. Very well explained. The best of the best.

  • @oklu_
    @oklu_ 3 years ago +2

    thank you for your kind, slow, and detailed explanation😭

    • @statquest
      @statquest  3 years ago

      You’re welcome 😊!

  • @rodrigolivianu9531
    @rodrigolivianu9531 4 years ago +1

    Great video! Just wanted to point out that LDA is a classifier, which involves a few more steps than the procedure described here, such as the assumption that the data is Gaussian. The procedure described here is only the feature extraction/dimensionality reduction phase of LDA.

    • @statquest
      @statquest  4 years ago +1

      You are correct! I made this video before I was aware that people had adapted LDA for classification. Technically, we are describing "Fisher's linear discriminant". That said, using LDA for classification is robust to violations of the Gaussian assumptions. For more details, see: sebastianraschka.com/Articles/2014_python_lda.html

    • @rodrigolivianu9531
      @rodrigolivianu9531 4 years ago

      StatQuest with Josh Starmer That said, I must admit I am having a really hard time understanding how the Fisherian and Bayesian approaches lead to the same conclusion via completely different routes. If you have any source on that, it would be of enormous help for my sanity haha
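
For readers who want to poke at the two roles discussed in this thread, scikit-learn's LinearDiscriminantAnalysis exposes both: Fisher-style dimensionality reduction via transform(), and Bayes-rule classification via predict(). A minimal sketch (assuming scikit-learn is installed; this is not code from the video):

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)                # 4 features, 3 classes
lda = LinearDiscriminantAnalysis(n_components=2)

# Role 1: dimensionality reduction (at most n_classes - 1 new axes)
X_2d = lda.fit_transform(X, y)                   # shape (150, 2)

# Role 2: classification via Bayes' rule with Gaussian class densities
labels = lda.predict(X)
accuracy = (labels == y).mean()                  # training accuracy on the same data
```

The n_components limit (number of classes minus one) is exactly the "2 axes for 3 categories" situation the video describes.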

  • @Illinoise888
    @Illinoise888 4 years ago +1

    Thanks for the video! I have an exam next week, and even though it's open book, I still didn't feel comfortable going into it. This video definitely helped!

    • @statquest
      @statquest  4 years ago

      Good luck and let me know how it goes. :)

  • @Dr.CandanEsin
    @Dr.CandanEsin 4 years ago +1

    A lot of time and effort went into this, but it was worth it. The best explanation I've watched after six weeks of searching. I cordially thank you.

  • @chemicalbiomedengine
    @chemicalbiomedengine 5 years ago +2

    Always excited when I look for a topic and it's available on StatQuest.

  • @rmiliming
    @rmiliming 2 years ago +1

    Very clearly explained, and the video is very enjoyable to watch too! StatQuest has everything needed to learn machine learning algorithms and stats well.

  • @DaveGogerly
    @DaveGogerly 2 years ago +1

    I love your stuff; you have a knack for explaining things better than most!

    • @statquest
      @statquest  2 years ago

      Thank you!

    • @meng-laiyin2198
      @meng-laiyin2198 2 years ago

      @@statquest Thank you so much for this video. I tried to understand LDA by reading lots of materials (books, papers, etc.), but none of them explain things as clearly as you do. Really appreciate it!

    • @statquest
      @statquest  2 years ago

      @@meng-laiyin2198 Thanks! :)

  • @ChaminduWeerasinghe
    @ChaminduWeerasinghe 3 years ago +1

    Best explanation I've ever seen on ML. This is the first time I've watched an ML YouTube video without rewinding.
    Keep it up, bro.

  • @hanaibrahim1563
    @hanaibrahim1563 2 years ago +1

    Amazing. Thank you for this excellent video. It explained everything super clearly and concisely, without all the academic jargon getting in the way.

    • @statquest
      @statquest  2 years ago

      Glad it was helpful!

  • @jahanvi9429
    @jahanvi9429 2 years ago +1

    The song in the introduction is always awesome. Thanks lol! And a very useful video.

  • @balexander28
    @balexander28 7 years ago +1

    All of your StatQuest videos are awesome! Thanks for using your time to help others! Much appreciated!

  • @worldofbrahmatej2023
    @worldofbrahmatej2023 6 years ago +2

    Excellent! You are a better teacher than many overrated professors out there :)

  • @wtfJonKnowNothing
    @wtfJonKnowNothing 10 months ago +1

    These days I'm more inclined to listen to (and love) the song than the lecture :)

  • @onurdemir353
    @onurdemir353 5 years ago +1

    You are awesome. Thanks to you, I was eventually able to reach the point of understanding machine learning stuff.

  • @gptty
    @gptty 5 years ago +1

    I get it! You, sir, are the best lecturer in statistics.

  • @saharafox2360
    @saharafox2360 2 years ago +1

    That helped me a lot! Thank you sooo much! Now I'm ready for my exam tomorrow :)

  • @vivekmankar9643
    @vivekmankar9643 4 years ago +1

    This channel deserves millions of subscribers!!!!

  • @arungandhi5612
    @arungandhi5612 4 years ago +1

    You are very cool, bro. I aced my work at my research institute because of youuuuuuuu

    • @statquest
      @statquest  4 years ago

      That's awesome!!! So glad to hear the videos are helpful. :)

  • @hamzaghandi4807
    @hamzaghandi4807 2 years ago +2

    Besides this wonderful explanation, your music is very good!

  • @seant7907
    @seant7907 4 years ago +1

    Subscribed just because the way you described this topic is so simple and understandable. Nice job!

    • @statquest
      @statquest  4 years ago

      Thank you very much! :)

  • @daisy-fb5jc
    @daisy-fb5jc 2 years ago +1

    I wish I could show this video to my professor and teach her how to give understandable lectures.
    Just a wish.

  • @amrit20061994
    @amrit20061994 3 years ago +2

    "But what if we used data from 10k genes?"
    "Suddenly, being able to create 2 axes that maximize the separation of three categories is 'super cool'."
    Well played, StatQuest, well played!

  • @mohammadadnan8248
    @mohammadadnan8248 7 years ago

    Tomorrow is my exam; this might be helpful.
    Thanks a lot from India.

  • @muskanjhunjhunwalla8505
    @muskanjhunjhunwalla8505 6 years ago +1

    It was a very helpful video. I understood it on the first attempt. Thanks a lot for this video, sir.

    • @statquest
      @statquest  6 years ago

      Hooray!!! I'm glad the video was so helpful! :)

  • @LifeofTF
    @LifeofTF 4 years ago +1

    Loved the explanation. Your channel has been a truly invaluable resource for studying ML. I was wondering whether you could make a video on the differences/similarities, along with use cases, for KNN/LDA/PCA.

    • @statquest
      @statquest  4 years ago

      I'll keep that in mind.

  • @weixu553
    @weixu553 7 years ago

    Great video; you make all the academic terms very understandable. Cheers from China!

  • @nuttapatchaovanapricha
    @nuttapatchaovanapricha 11 months ago +1

    Very useful and intuitive, also sick intro music right there as usual! xD

    • @statquest
      @statquest  11 months ago

      I think this might be my favorite intro.

  • @paulhamacher773
    @paulhamacher773 4 years ago +2

    This channel is pure gold!

  • @alphabetadministrator
    @alphabetadministrator 8 months ago +1

    Hello Josh. As always, thank you for your super intuitive videos. I won't survive college without you.
    I do have an unanswered conundrum about this video, however. For Linear Discriminant Analysis, shouldn't there be at least as many predictors as the number of clusters? Here's why. Say p=1 and I have 2 clusters. In this case, there is nothing I can do to further optimize the class separations. The points as they are on the line already maximize the Fisher criterion (between-class scatter / within-class scatter). While I do not have the second predictor axis to begin with, even if I were to apply a linear transformation on the line to find a new line to re-project the data on, it would only make the means closer together. Extending this reasoning to the 2D case where you used gene x and gene y as predictors and 3 classes: if the 3 classes exist on a 2D plane, there is nothing we can do to further optimize the separation of the means of the 3 classes, because re-projecting the points on a new tilted 2D plane will most likely reduce the distances between the means. Now, if each scatter lay perfectly vertically, such that as Gene Y goes up the classes are separated distinctly, then we could re-project the points on a new line (one parallel to the invisible vertical class-separation line) to further minimize each class's scatter, but this kind of case is very rare.
    Given my reasoning, my intuition is that an implicit assumption for LDA is that there needs to be at least as many predictors as the number of classes to separate. Is my intuition valid?

    • @statquest
      @statquest  8 months ago

      I believe your question might be answered in this video on PCA tips: ruclips.net/video/oRvgq966yZg/видео.html

  • @sanketbadhe3572
    @sanketbadhe3572 5 years ago +2

    I just watched all your videos for the intro tracks :P ... awesome tracks and nicely explained videos.

  • @cnbmonster1042
    @cnbmonster1042 3 years ago +1

    Amazing! I subscribed after watching your video only twice!

  • @Pmarmagne
    @Pmarmagne 3 years ago +1

    Another clearly explained video by StatQuest!

  • @vishwanathg8083
    @vishwanathg8083 6 years ago

    Thank you, the explanation of LDA & PCA is very clear...

  • @RaviShankar-jm1qw
    @RaviShankar-jm1qw 4 years ago +2

    Simply superb! Awesome Josh!!!!

    • @statquest
      @statquest  4 years ago

      Thank you very much! :)

  • @duchuyho7027
    @duchuyho7027 7 years ago

    Great video, Joshua! Looking forward to learning more from you!
    Cheers from Japan!

  • @JalerSekarMaji
    @JalerSekarMaji 6 years ago

    Wow!
    At first: "wt.f is StatQuest?"
    Then, at the end of the video: STATQUEST! I checked the description, and it's a great website!
    Thanks

  • @mrdoerp
    @mrdoerp 6 years ago +2

    These videos are incredible; I would pay for them if I had money.

    • @statquest
      @statquest  6 years ago +1

      Sometimes it's the thought that counts! I'm glad you enjoy the videos. :)

  • @sakhawat3003
    @sakhawat3003 4 years ago

    Hey man! That was a nice, clear-cut explanation. I have been doing machine learning using LDA, but I never knew what LDA actually does; I only had a vague idea. By the way, you wrote "seperatibility" instead of "separability" at 5:26...

    • @statquest
      @statquest  4 years ago +1

      That's embarrassing. One day, when StatQuest is making the big bucks, I will hire an editor and my poor spelling will no longer be a source of great shame.

  • @nintishia
    @nintishia 1 year ago +1

    Once again, a fantastic job. Thanks, StatQuest.

  • @wei-tingko7871
    @wei-tingko7871 3 years ago +1

    I really like your channel; the explanations of concepts are clear and precise!!

  • @amanzholdaribay9871
    @amanzholdaribay9871 5 years ago +1

    Thanks, as always! The best explanations of complicated things!

  • @chieftainsupreme9387
    @chieftainsupreme9387 1 year ago +1

    You're an excellent teacher. Thank you so much.

  • @sureshmakwana8709
    @sureshmakwana8709 1 year ago

    The best explanation on the whole internet 💯

  • @thenkprajapati
    @thenkprajapati 5 years ago +1

    Indeed clearly explained. Please also make videos on Independent Component Analysis and Singular Value Decomposition.

    • @statquest
      @statquest  5 years ago

      OK. I'll put those on the to-do list.

    • @thenkprajapati
      @thenkprajapati 5 years ago +1

      @@statquest Thanks. People like me are waiting.

    • @statquest
      @statquest  5 years ago

      @@thenkprajapati Cool! However, just so you know, it could still be a while before I make the video. I get about 3 requests every day, but I can only make about 2 videos a month. The more people ask for a specific topic, the more priority I give that topic. So if you know a lot of people interested in Independent Component Analysis or SVD, tell them to put in their requests as well so that I'll prioritize these subjects.

  • @AnirudhJas
    @AnirudhJas 6 years ago +1

    Very nicely explained! Thank you very much Josh!

  • @jennysspiceoflife8581
    @jennysspiceoflife8581 3 years ago +1

    Thank you for the illustration, it's very clear!

    • @statquest
      @statquest  3 years ago

      Glad it was helpful!

  • @康朵朵-d1r
    @康朵朵-d1r 5 years ago

    Can I say that the most brilliant thing about StatQuest is the silly song??? Love it. Super fan.

  • @karannchew2534
    @karannchew2534 3 years ago

    Like PCA, LDA "compresses" the data into lower dimensions. But unlike PCA, it does so while keeping/maximizing the separability of the data according to the given class labels, as much as possible.
    The data must already be labeled to use LDA.
    Find the line (the new lower dimension) such that the distance between the means of the two classes is maximized while, at the same time, the variance among the data of the same class is minimized.
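
The two-class recipe in the comment above has a well-known closed form: the best projection direction is proportional to S_W⁻¹(μ₁ − μ₂), where S_W is the within-class scatter matrix. A quick sketch (the function name, variable names, and toy data are our own, not from the video):

```python
import numpy as np

def fisher_direction(x1, x2):
    """Closed-form Fisher discriminant direction for two labeled clusters."""
    mu1, mu2 = x1.mean(axis=0), x2.mean(axis=0)
    # Within-class scatter matrix S_W: sum of each class's scatter around its mean
    s_w = (x1 - mu1).T @ (x1 - mu1) + (x2 - mu2).T @ (x2 - mu2)
    w = np.linalg.solve(s_w, mu1 - mu2)   # w proportional to S_W^{-1} (mu1 - mu2)
    return w / np.linalg.norm(w)          # normalize to a unit-length direction

rng = np.random.default_rng(1)
c1 = rng.normal([0.0, 0.0], 1.0, size=(100, 2))
c2 = rng.normal([3.0, 3.0], 1.0, size=(100, 2))
w = fisher_direction(c1, c2)

# On the projected line, the class means sit far apart relative to the spread
gap = abs((c1 @ w).mean() - (c2 @ w).mean())
spread = max((c1 @ w).std(), (c2 @ w).std())
```

Maximizing the between-class distance while minimizing the within-class variance, as the comment describes, is exactly what this direction achieves in one linear-algebra step.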