The KL Divergence : Data Science Basics

  • Published: 28 Dec 2024

Comments • 281

  • @zafersahinoglu5913
    @zafersahinoglu5913 1 year ago +25

    I am a research scientist. You provide a clear and concise treatment of KL-Divergence. The best I have seen to date. Thanks.

  • @szymonk.7237
    @szymonk.7237 1 year ago +145

    Wow... 😳 I've never seen a more ingenious, easy, and intuitive explanation of KL-div 😳👏👏👏👏👏 Big thanks, good man! ❤️

  • @DS-vu5yo
    @DS-vu5yo 1 year ago +21

    That was the best description of why we use log that I have ever seen. Good work, man.

  • @murkyPurple123
    @murkyPurple123 1 year ago +71

    Your bottom-up (instead of top-down) approach that you mentioned in the beginning of the video would be really great to see for all kinds of different concepts!

  • @usethisforproductivity-tg7xq
    @usethisforproductivity-tg7xq 1 month ago

    you are probably the best teacher I've ever seen, and I've learned from tons of people online like Andrew Ng, Andrej Karpathy, MIT lecture series, Brad Traversy, Statsquest.

  • @KippSchoenwald-m1u
    @KippSchoenwald-m1u 1 year ago +4

    I'm in the middle of a $2,500 course, BUT → YouTube → your video... 👏🏻👏🏻👏🏻👏🏻👏🏻 Thank you for starting with the "why", and appealing to my brain's desire to understand, not just do.

  • @varadpuntambekar8895
    @varadpuntambekar8895 8 months ago +8

    I don't think I'm ever going to forget this. Thanks so much.

  • @nileshchandrapikle1946
    @nileshchandrapikle1946 1 month ago

    Excellent way to explain the concept of KLD. I landed on this video after checking 4-5 other tutorials, but none of those match the ease of this one. Thanks.

  • @elevenyhz
    @elevenyhz 1 month ago

    I am a postdoc studying information theory and language. This is the best KL divergence explanation I've heard. I don't think I am going to forget it. :) Thanks!

  • @steamedbean180
    @steamedbean180 3 months ago +1

    This is the most intuitive explanation for any statistics problem.

  • @Oliprod123
    @Oliprod123 2 months ago

    Clear, simplified, the best approach to explaining why we use a formula. Thank you!!

    • @ritvikmath
      @ritvikmath  2 months ago

      Glad it was helpful!

  • @trungphan9137
    @trungphan9137 1 year ago +7

    This is mind-blowing... I love the way you go from the problem to the solution; it's a clever way to understand this KL divergence.

  • @marka5968
    @marka5968 1 year ago +11

    Wow. This is the best explanation of KL-divergence I've ever heard. So much over-complicated stuff out there, but yours is absolutely genius.

  • @joker345172
    @joker345172 24 days ago

    Amazing explanation. Why isn't math usually explained like this?? I usually find myself having to crack open formulas to figure out what they mean, and 99% of the time, it's not clear at all.
    The work you're doing is absolutely amazing!!

  • @brandonkim4675
    @brandonkim4675 1 year ago +1

    I recently got interested in learning machine learning and stumbled upon Stable Diffusion, the current state-of-the-art open-source image generation AI.
    That's where I encountered the KL divergence. The more I try to understand it, the more complicated concepts and formulas are thrown at me.
    I managed to find some videos that explain how to derive it, but none of them explained why the logarithm is present in it, for God's sake!
    And here you are, explaining every detail missing from other videos and blog posts, in a way that a person who knows very little about the subject can understand, in a very satisfying and easy-to-follow way. Hats off to you, sir. I wish every teacher were like you.

    • @ritvikmath
      @ritvikmath  1 year ago +1

      Thanks, and godspeed on your journey through machine learning!

  • @julianwebb9222
    @julianwebb9222 8 months ago +1

    That was great! Not just dumping the formula on you but walking you through its logic with simple steps. Loved it! ❤

  • @eagermage3157
    @eagermage3157 1 year ago +4

    Best math teacher ever. So clearly explained the design and thinking process of how the algo comes out. Many videos just explain the formula, which left me confused about why we should do it this way... Thank you!

  • @JBoy340a
    @JBoy340a 1 year ago +21

    That was great. I have struggled to understand certain aspects of KL Divergence, and this is a great way to think about it without getting bogged down in symbology.

  • @raafeyazher5687
    @raafeyazher5687 2 months ago

    Man, you are amazing.
    I am gonna binge-watch all the videos for better intuitive understanding.

  • @SSJVNN
    @SSJVNN 1 year ago

    The comments didn't lie; you actually explained this so well. I watched the ads all the way through, btw.

  • @NinoGodoradze-j7l
    @NinoGodoradze-j7l 1 month ago

    You have the best explanations for understanding data science concepts. Thank you so much!

  • @AdeOlubummo
    @AdeOlubummo 6 months ago +2

    Just fantastic! Even if I forget the formula for KL divergence, I can "re-engineer" it on demand.

  • @LunaGamingUpdates
    @LunaGamingUpdates 2 months ago

    Damn, this is so sick. Thanks for creating this content; it helped me a lot to understand KL divergence, as you showed it in a step-by-step process that makes it intuitive and easy to understand. Keep making such videos, thank you.

  • @asimosman3428
    @asimosman3428 1 year ago +3

    This video is absolutely mind-blowing! The way it breaks down such a complex concept into an intuitive understanding is truly remarkable.
    Thank you!

  • @vorushin
    @vorushin 11 months ago

    Thank you for the great explanation! I totally agree that math is not given from above, but invented by people. And showing how the invention can be done is the best way to teach the new concepts. Thanks a lot!

  • @shamarbauyrzhan7997
    @shamarbauyrzhan7997 1 year ago +3

    Let's celebrate a new video on this amazing channel!!! Love your work!

  • @tayyibulhassan6227
    @tayyibulhassan6227 1 year ago +1

    One of the BEST tutorials for sure

  • @thankyouthankyou1172
    @thankyouthankyou1172 1 year ago

    I've found this professor is very good at explaining every tough concept! Respect and much appreciation!

  • @godlyradmehr2004
    @godlyradmehr2004 8 months ago

    The best explanation I've ever seen about KL divergence ❤

  • @andrashorvath2411
    @andrashorvath2411 1 year ago +1

    Fantastically clearly explained, congrats.

  • @Mars.2024
    @Mars.2024 11 months ago

    Every time I have a math question, your channel is my first choice! Amazing ✅ Thanks a million 🎉

  • @tampopo_yukki
    @tampopo_yukki 1 year ago +1

    I love how you approach the KL divergence!

  • @somdubey5436
    @somdubey5436 1 year ago

    Superb... I believe this is the best explanation I have ever come across for KL Divergence. Thanks a ton.

  • @aizazkhan5439
    @aizazkhan5439 5 months ago

    You sir win. Simply the best explanation. I can’t thank you enough.

  • @yhoang6674
    @yhoang6674 1 year ago +1

    In the 'Motivation for log' section, you said that taking a simple average is not the right way to go, and then you look for a function that makes f(4) and f(1/4) have opposite signs. That means you are making two very different distributions have the smallest possible distance (canceling each other out), which contradicts what we expected: we expected the distance to be large.
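A quick numeric check of the concern above (all numbers made up for illustration): individual log ratios do take opposite signs, but KL weights each ratio by p(x), so the terms don't cancel for genuinely different distributions; the weighted sum is zero only when p = q (Gibbs' inequality).

```python
import math

# Two hypothetical discrete distributions over the same three outcomes.
p = [0.5, 0.4, 0.1]
q = [0.1, 0.4, 0.5]

# The raw log ratios have mixed signs, as the comment notes...
ratios = [math.log(pi / qi) for pi, qi in zip(p, q)]  # [+1.609, 0.0, -1.609]

# ...but KL weights each ratio by p(x), so cancellation doesn't happen:
# outcomes that p considers likely dominate the sum.
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

print(ratios)
print(kl)  # ~0.644, strictly positive
```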

  • @seyitbahar3259
    @seyitbahar3259 5 months ago

    Omg, that was such a brilliant way of delivering information! Thank you so much for this video.

  • @AKSHATJAIN-g1v
    @AKSHATJAIN-g1v 11 days ago

    Thanks... KL Divergence always seemed like a hurdle in understanding diffusion models. Thanks a lot for the clarification.

  • @mehmetozkan1479
    @mehmetozkan1479 1 year ago

    I have never seen complex math explained this well. Thank you very much!

  • @somethingness
    @somethingness 6 months ago

    What an excellent teacher you are! Thank you for all your videos.

  • @joesavage9077
    @joesavage9077 7 months ago

    Wow!!! This approach to explaining was mind "opening". I got it! Thanks so much

  • @RakshithReddy5555
    @RakshithReddy5555 1 year ago

    Blew my mind. I wanted to understand what KL divergence is in order to follow the recent Gen AI papers, and couldn't. This video helped me a lot.

  • @ringo8530
    @ringo8530 1 year ago

    You are way better than my school's professor. Thank you.

  • @mantische
    @mantische 1 year ago +1

    It was the easiest explanation I’ve ever seen.

  • @majdmrawed1132
    @majdmrawed1132 2 months ago

    signed in just to like and comment. thanks man, you're a life saver

  • @mrcaljoe1
    @mrcaljoe1 1 year ago +1

    I think your channel and teaching style are brilliant. I wish I had known about this channel when I was doing my undergrad.

  • @VisweswaraSriadibhatla
    @VisweswaraSriadibhatla 11 months ago

    Not knowing some of the fundamentals, I still found that your explanation made a lot of sense, and I felt I understood the concept well. I am willing to watch your videos more often.

  • @momcilomrkaic2214
    @momcilomrkaic2214 10 months ago

    Your videos are great, just keep going. I've watched you for a few years already.

  • @LifeKiT-i
    @LifeKiT-i 9 months ago +1

    Wahh... I am studying for a master's degree in computer science. Your video really helps me a lot! Please keep doing such great work for us!

  • @trentbolt2006
    @trentbolt2006 1 year ago +2

    You've really made my day with your explanation. Thank you so much :D

  • @komuna5984
    @komuna5984 1 year ago

    Thanks a lot for sharing the underlying motivation behind the K-L divergence! I really needed such deep insights! JAJAKALLAH...

  • @paigecarlson1742
    @paigecarlson1742 9 months ago

    Outstanding. Really helping me through this info retrieval course!

  • @andrew-qf4xl
    @andrew-qf4xl 1 year ago

    The thing you said in the first minute is something I've been saying for a while now. As students, we aren't told what problem drove scientists or engineers to construct new formulas or ways of thinking.

  • @benjaminbear138
    @benjaminbear138 3 months ago +1

    This is how math should be taught

  • @sandipmehta2950
    @sandipmehta2950 1 year ago +1

    amazing explanation. not many can do this. well done.

  • @yo-yoyo2303
    @yo-yoyo2303 2 months ago

    This explanation is gold! 🏅

  • @anujadassanayake6202
    @anujadassanayake6202 1 year ago

    Great explanation, this is the first time I'm learning about KL divergence and it was very easy to grasp because of the way you taught it

  • @Erolctak
    @Erolctak 2 months ago

    Great for understanding the steps clearly! Good work!

  • @MagmaMusen
    @MagmaMusen 6 months ago

    Thanks!

  • @MrCEO-jw1vm
    @MrCEO-jw1vm 6 months ago

    Thank you for a good explanation of a seemingly weird looking formula. It would be hard to forget this formula now that I got it from here.

  • @chaochaisit
    @chaochaisit 1 year ago

    Math should all be taught this way, and to go one step further, we should teach people how to make sense of math themselves in the long run.
    Thanks for the explanation of KL divergence though ;)

  • @tanvirazhar
    @tanvirazhar 1 year ago

    Amazing. The pace at which you explained, the approach... everything is just top-notch.

  • @jayalekshmi936
    @jayalekshmi936 1 year ago +1

    I am a master's student in data science and machine learning, and I have to tell you that this is the best explanation one can get for concepts like this... Hope you make more videos on these types of concepts.

  • @kasyaci
    @kasyaci 1 year ago

    That was one of the best explanations I have ever heard! Great job and many thanks!

  • @Andy-qi5nh
    @Andy-qi5nh 1 year ago

    Amazing teaching. It helps a lot with my covariate shift detection project. Thanks.

  • @gingerderidder8665
    @gingerderidder8665 9 months ago

    Taking the MITx Stats class, but I find that you explain the concepts so much better!

  • @jackritwik09
    @jackritwik09 1 year ago

    mad respect for Ritvik from Ritwik for acing the subtle art of intuitive explanation:)) If only professors could master the same art.

  • @akrylic_
    @akrylic_ 1 year ago

    Universities should fire their math professors and get you to teach their classes. Well done!

  • @Justin-zw1hx
    @Justin-zw1hx 1 year ago

    dude, the explanation is so good, you rock!

  • @re4ct0r
    @re4ct0r 8 months ago

    Why do we want to prioritize popular x values in our current distribution? (See 11:46)

  • @НиколайНовичков-е1э

    Thank you! This is the best explanation of KL divergence which I've seen.

  • @luisgoogle8098
    @luisgoogle8098 1 year ago

    How can this guy only have 8,000 views on such a good video... Very nice way of explaining!

  • @khaleda.s761
    @khaleda.s761 2 months ago

    OMG. Thanks for the intuitive explanation, really appreciate that 👏

  • @qiguosun129
    @qiguosun129 1 year ago

    Thanks for the lecture, your work is always so intuitive.

  • @fh3652
    @fh3652 1 year ago

    Great stuff... Learning a way to teach maths to my kid... A constructivist method... While learning about stats... I really appreciate your work.

  • @Hobbies_forkids
    @Hobbies_forkids 1 year ago

    Excellent way to explain it. Makes maths sound logical and approachable 🎉

  • @midnightwanders5876
    @midnightwanders5876 1 year ago

    Great work! I've been a fan of your material for some time, and in this video you have truly mastered your craft.

  • @ResilientFighter
    @ResilientFighter 1 year ago +2

    Love this homie! Better than university.

  • @razgaon3680
    @razgaon3680 1 year ago

    Best video I've seen in a while!

  • @winstongraves8321
    @winstongraves8321 1 year ago +1

    This was awesome. Really helpful to think through it backwards and “redevelop” our own function

  • @markozege
    @markozege 1 year ago

    Thank you for this, the best explanation of KL divergence that I have seen. Love how you approach it building gradually, really inspiring for how to learn math.

  • @lbognini
    @lbognini 1 month ago

    17:15: I was just thinking to myself: can't KLD, in that case, be considered a measure of likelihood (of Q1 and Q2 given the observed distribution)?
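For what it's worth, that intuition can be made precise: for a fixed observed distribution p, ranking candidate models Q by KL(p || Q) is equivalent to ranking them by the average log-likelihood they assign to the data, since the two differ only by the entropy of p, which does not depend on Q. A sketch with made-up counts (Q1 and Q2 are hypothetical candidate models):

```python
import math

# Hypothetical observed counts over three outcomes.
counts = [50, 30, 20]
n = sum(counts)
p = [c / n for c in counts]   # empirical (observed) distribution

q1 = [0.5, 0.3, 0.2]          # candidate model matching the data
q2 = [0.1, 0.1, 0.8]          # candidate model far from the data

def kl(p, q):
    # KL(p || q) for discrete distributions; no zero-probability terms here.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

def avg_loglik(counts, q):
    # Average log-likelihood per observation under model q.
    return sum(c * math.log(qi) for c, qi in zip(counts, q)) / sum(counts)

# Lower KL <=> higher average log-likelihood: the rankings agree.
print(kl(p, q1), avg_loglik(counts, q1))
print(kl(p, q2), avg_loglik(counts, q2))
```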

  • @manducchuc915
    @manducchuc915 1 year ago

    Thanks, exactly the explanation I have been looking for!

  • @ChocolateMilkCultLeader
    @ChocolateMilkCultLeader 1 year ago

    This is the perfect math video. Love it. Shared with all my readers.

  • @Fire_ous
    @Fire_ous 1 month ago +1

    Excellent.

  • @adaslesniak
    @adaslesniak 11 months ago

    Great video. In my opinion, two pieces are missing for it to be perfect. First, if you could actually calculate the sum of log(p(x)/q(x)) and show us what's wrong with that number, exactly as you did with the simple p(x)/q(x) and why it isn't a good solution. Second, on the last slide, if you could give numbers: you talk about quantifying something that is visually clear, but leaving out the numbers is kind of a missed opportunity to explain how it works :)
    Again - thank you a lot for the explanation. Great work.
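In the spirit of the first suggestion, the unweighted sum of log(p(x)/q(x)) can be computed directly, and it exposes the problem: mirrored ratios cancel to zero even for two clearly different distributions, which is exactly the failure the p(x)-weighting in KL repairs (numbers below are illustrative):

```python
import math

p = [0.8, 0.2]
q = [0.2, 0.8]

# Unweighted sum: log(4) + log(1/4) = 0, wrongly suggesting p and q match.
unweighted = sum(math.log(pi / qi) for pi, qi in zip(p, q))

# KL weights each log ratio by p(x), so the terms no longer cancel.
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

print(unweighted)  # 0.0 (up to floating point)
print(kl)          # ~0.832
```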

  • @jrlearnstomath
    @jrlearnstomath 2 months ago

    Truly. Bravo, this was awesome

  • @rohitramaswamy8131
    @rohitramaswamy8131 4 months ago

    Great vid man, god damn your pedagogy is incredible

  • @seansullivan6986
    @seansullivan6986 7 months ago

    Excellent intuitive explanation!

  • @hpp496videos
    @hpp496videos 1 year ago

    This was incredibly illustrative!

  • @dancox272
    @dancox272 3 months ago

    Absolutely beautiful explanation! Thank you

    • @ritvikmath
      @ritvikmath  3 months ago

      Glad it was helpful!

    • @dancox272
      @dancox272 3 months ago

      @@ritvikmath One question: suppose one were to use the average of the absolute value of the difference between the 2 distributions; why would this not be a good metric?
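For what it's worth, the average absolute difference is a legitimate distance (proportional to total variation distance), but it treats a gap of 0.01 the same whether it sits at a probability of 0.5 or of 0.001, whereas KL responds to the ratio and so penalizes the second case far more. A quick sketch (illustrative numbers only):

```python
import math

def mean_abs_diff(p, q):
    # Average absolute difference between the two distributions.
    return sum(abs(pi - qi) for pi, qi in zip(p, q)) / len(p)

def kl(p, q):
    # KL(p || q) for discrete distributions.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Same absolute gap (0.01) in two settings, very different likelihood ratios.
p1, q1 = [0.500, 0.500], [0.510, 0.490]   # ratios near 1
p2, q2 = [0.001, 0.999], [0.011, 0.989]   # ratio of 11 on the rare outcome

print(mean_abs_diff(p1, q1), mean_abs_diff(p2, q2))  # both ~0.01
print(kl(p1, q1), kl(p2, q2))  # KL distinguishes the two cases
```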

  • @PrajwalSingh15
    @PrajwalSingh15 1 year ago

    Thank you so much for this explanation; I also got a new insight about the log :)

  • @orenkoriat
    @orenkoriat 1 year ago +1

    great explanation!

  • @0hexe
    @0hexe 1 year ago

    Amazing video, love the format!

  • @TheFirebolt2010
    @TheFirebolt2010 9 months ago

    It would be interesting to have a video on how you study to understand a topic, what resources you use and the materials you look for

  • @vzinko
    @vzinko 1 year ago +3

    Thank you as always for sharing your brilliant teachings, Ritvik. Could you please do a video on the Gram-Schmidt process and how orthonormal basis matrices are relevant to data science?

  • @chrisschrumm6467
    @chrisschrumm6467 4 months ago

    Great job!!! Love this explanation

  • @yb801
    @yb801 1 year ago

    This is an amazing explanation, thanks!

  • @mariusschmidt6883
    @mariusschmidt6883 1 year ago

    Wow. Just wow! This is brilliant🤩

  • @barbaraalexandrova6680
    @barbaraalexandrova6680 7 months ago

    Thank you for the best explanation on this topic.

  • @danscherb4130
    @danscherb4130 1 year ago

    Another amazing video! Please keep them coming!