I am a research scientist. You provide a clear and concise treatment of KL-Divergence. The best I have seen to date. Thanks.
Wow... 😳 I've never seen a more genius, easy, and intuitive explanation of KL-div 😳👏👏👏👏👏 Big thanks, good man! ❤️
Glad you liked it!
I agree.
That was the best description of why we use log that I have ever seen. Good work, man.
Your bottom-up (instead of top-down) approach that you mentioned at the beginning of the video would be really great to see for all kinds of different concepts!
Great idea!
You are probably the best teacher I've ever seen, and I've learned from tons of people online like Andrew Ng, Andrej Karpathy, the MIT lecture series, Brad Traversy, and StatQuest.
I'm in the middle of a $2,500 course, BUT → YouTube → your video... 👏🏻👏🏻👏🏻👏🏻👏🏻 Thank you for starting with the "why", and appealing to my brain's desire to understand, not just do.
I don't think I'm ever going to forget this. Thanks so much.
Excellent way to explain the concept of KLD. I landed on this video after checking 4-5 other tutorials, but none of those match the ease of this one. Thanks.
I am a postdoc studying information theory and language. This is the best KL divergence explanation I've heard. I don't think I am going to forget it. :) Thanks!
This is the most intuitive explanation for any statistics problem.
Thanks!
Clear, simplified, the best approach to lead to why to use a formula. Thank you!!
Glad it was helpful!
This is mind-blowing... I love the way you go from the problem to the solution; it's a clever way to understand KL divergence.
thanks!
Wow. This is the best explanation of KL-divergence I've ever heard. So much over-complicated stuff out there, but yours is absolutely genius.
Glad it was helpful!
Amazing explanation. Why isn't math usually explained like this?? I usually find myself having to crack open formulas to figure out what they mean, and 99% of the time, it's not clear at all.
The work you're doing is absolutely amazing!!
I recently got interested in learning machine learning and stumbled upon Stable Diffusion, the current state-of-the-art open-source image generation AI.
That's where I encountered KL divergence. The more I try to understand it, the more complicated concepts and formulas are thrown at me.
I managed to find some videos that explain how to derive it, but none of them explained why the hell the logarithm is present in it, for god's sake!
And here you are, explaining every missing detail from other videos and blog posts in a way that a person who knows very little about the subject can understand, in a very satisfying and easy-to-follow way. Hats off to you, sir. I wish every teacher were like you.
Thanks, and godspeed on your journey through machine learning!
That was great! Not just dumping the formula on you but walking you through its logic with simple steps. Loved it! ❤
Best math teacher ever. So clearly explained the design and thinking process behind how the algorithm comes out. Many videos just explain the formula, which left me confused about why we do it this way... Thank you!
That was great. I have struggled to understand certain aspects of KL Divergence, and this is a great way to think about it without getting bogged down in symbology.
Glad it was helpful!
Man, you are amazing.
I am gonna binge-watch all the videos for a better intuitive understanding.
The comments didn't lie; you actually explained this so well. I watched the ads all the way through, btw.
You have the best explanations for understanding data science concepts. Thank you so much!
Just fantastic! Even if I forget the formula for KL divergence, I can "re-engineer" it on demand.
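For reference, the formula being re-engineered here is the standard definition (take the ratio, take its log, then weight by p and sum):

$$D_{\mathrm{KL}}(P \,\|\, Q) \;=\; \sum_x p(x)\,\log\frac{p(x)}{q(x)} \;=\; \mathbb{E}_{x \sim P}\!\left[\log\frac{p(x)}{q(x)}\right]$$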
Damn, this is so sick. Thanks for creating this content; it helped me a lot to understand KL divergence, as you showed it in a step-by-step process that makes it intuitive and easy to understand. Keep making such videos, thank you.
This video is absolutely mind-blowing! The way it breaks down such a complex concept into an intuitive understanding is truly remarkable.
Thank you!
Thank you for the great explanation! I totally agree that math is not given from above, but invented by people. And showing how the invention can be done is the best way to teach the new concepts. Thanks a lot!
Let's celebrate a new video on this amazing channel!!! Love your work!
🎉
One of the BEST tutorials for sure
I found out this professor is very good at explaining every tough concept! Respect and much appreciation!
The best explanation I've ever seen about KL divergence ❤
Fantastically clearly explained, congrats.
Every time I have a math question, your channel is my first choice! Amazing ✅ Thanks a million 🎉
I love how you approach the KL divergence!
Superb... I believe this is the best explanation I have ever come across for KL divergence. Thanks a tonne.
You sir win. Simply the best explanation. I can’t thank you enough.
In the 'Motivation for log' section, you said that taking a simple average is not the right way to go, and then you try to find a function that makes f(4) and f(1/4) have opposite signs. That means you are trying to make two very different distributions have the smallest distance possible (canceling each other out), which contradicts what we expected. We expected the distance to be large.
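(A quick numeric check, with toy numbers of my own rather than anything from the video: the logs of 4 and 1/4 do cancel if you average them unweighted, but the actual KL formula weights each log by p(x), and that weighted average is never negative and is zero only when the two distributions match, by Gibbs' inequality, so very different distributions can't cancel out.)

```python
# Toy sketch: unweighted logs cancel, but the p(x)-weighted sum (KL) does not.
import numpy as np

p = np.array([0.8, 0.2])  # "true" distribution P
q = np.array([0.2, 0.8])  # a very different Q; the ratios p/q are 4 and 1/4

ratios = p / q
print(np.mean(np.log(ratios)))     # 0.0   -> naive average of logs says "no difference"
print(np.sum(p * np.log(ratios)))  # ~0.83 -> KL weights by p(x) and stays positive
```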
Omg, that was such a brilliant way of delivering information! Thank you so much for this video.
Thanks... KL divergence always seemed like a hurdle in understanding diffusion models. Thanks a lot for the clarification.
I have never seen complex math explained this well. Thank you very much!
What an excellent teacher you are! Thank you for all your videos.
Wow!!! This approach to explaining was mind-"opening". I got it! Thanks so much!
Blew my mind. I wanted to understand what KL divergence is in order to follow the recent Gen AI papers and couldn't. This video helped me a lot.
You are way better than my school's professor. Thank you.
It was the easiest explanation I’ve ever seen.
Signed in just to like and comment. Thanks man, you're a lifesaver.
I think your channel and teaching style are brilliant. I wish I had known about this channel when I was doing my undergrad.
Despite me not knowing some of the fundamentals, your explanation made a lot of sense, and I felt I understood the concept well. I'm going to watch your videos more often.
Your videos are great; just keep going. I've watched you for a few years already.
Wahh... I am studying for a computer science master's degree. Your video really helps me a lot! Please keep on doing such great work for us!
You've really made my day with your explanation. Thank you so much :D
Thanks a lot for sharing the underlying motivation behind the K-L divergence! I really needed such deep insights! JAZAKALLAH...
You're so welcome!
Outstanding. Really helping me through this info retrieval course!
The thing you said in the first minute is something I've been saying for a while now. As students, we aren't told what problem drove scientists or engineers to construct new formulas or ways of thinking.
This is how math should be taught
amazing explanation. not many can do this. well done.
This explanation is gold ! 🏅
Great explanation. This is the first time I'm learning about KL divergence, and it was very easy to grasp because of the way you taught it.
Great for understanding the steps clearly! Good work!
Thanks!
Thank you for a good explanation of a seemingly weird-looking formula. It would be hard to forget this formula now that I got it from here.
Thanks!
Math should all be taught this way, and to go one step further, we should teach people how to make sense of math themselves in the long run.
Thanks for the explanation of KL divergence, though ;)
Amazing. The pace of the explanation, the approach... everything is just top-notch.
I am a master's student in data science and machine learning, and I have to tell you that this is the best explanation one can get for concepts like this... Hope you make more videos on these types of concepts.
Wow, thanks!
That was one of the best explanations I have ever heard! Great job and many thanks!
Thanks!!
Amazing teaching. It helps a lot with my covariate shift detection project. Thanks!
Glad it was helpful!
Taking the MITx Stats class, but I find that you explain the concepts so much better!
Glad to hear!
Mad respect for Ritvik from Ritwik for acing the subtle art of intuitive explanation :)) If only professors could master the same art.
Universities should fire their math professors and get you to teach their classes. Well done!
dude, the explanation is so good, you rock!
Glad it helped!
Why do we want to prioritize popular x values in our current distribution? (See 11:46)
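One way to see it (toy numbers of my own, not from the video): weighting by p(x) makes KL measure how badly Q explains the outcomes P actually produces, so the same relative mismatch costs a lot at a popular x and almost nothing at a rare one.

```python
# Sketch: the same factor-1.5 underestimate is penalized very differently
# depending on whether it hits a popular or a rare outcome under P.
import numpy as np

def kl(p, q):
    # KL(P || Q) = sum_x p(x) * log(p(x) / q(x))
    return np.sum(p * np.log(p / q))

p = np.array([0.9, 0.1])                # x=0 is popular, x=1 is rare under P
q_off_common = np.array([0.6, 0.4])     # underestimates the popular x by a factor of 1.5
q_off_rare   = np.array([14/15, 1/15])  # underestimates the rare x by the same factor

print(kl(p, q_off_common))  # ~0.226 -> mismatch on the popular x dominates
print(kl(p, q_off_rare))    # ~0.008 -> same relative mismatch on the rare x barely registers
```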
Thank you! This is the best explanation of KL divergence that I've seen.
Glad it was helpful!
How can this guy only have 8,000 views on such a good video... Very nice way of explaining!
Wow, thank you!
OMG. Thanks for the intuitive explanation, really appreciate that 👏
Thanks for the lecture, your work is always so intuitive.
You are very welcome
Great stuff... Learning a way to teach maths to my kid... A constructivist method... While learning about stats... I really appreciate your work.
Glad it was helpful!
Excellent way to explain it. Makes maths sound logical and approachable 🎉
Great work! I've been a fan of your material for some time, and in this video you have truly mastered your craft.
Wow, thank you!
Love this homie! Better than university.
😊
Best video I've seen in a while!
Thanks!
This was awesome. Really helpful to think through it backwards and “redevelop” our own function
Thank you for this, the best explanation of KL divergence that I have seen. Love how you approach it building gradually, really inspiring for how to learn math.
17:15: I was just thinking to myself: can't KLD, in that case, be considered a measure of likelihood (of Q1 and Q2, given the observed distribution)?
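That intuition checks out: for data drawn i.i.d. from P, the average log-likelihood of a candidate model Q converges to -H(P) - KL(P||Q), so ranking Q1 and Q2 by KL divergence from the observed distribution is the same as ranking them by expected log-likelihood. A quick simulation with a toy setup of my own:

```python
# Sketch: the Q with the smaller KL(P||Q) also assigns the higher average
# log-likelihood to data sampled from P.
import numpy as np

def kl(p, q):
    return np.sum(p * np.log(p / q))

rng = np.random.default_rng(0)
p  = np.array([0.5, 0.3, 0.2])
q1 = np.array([0.4, 0.4, 0.2])  # closer to P
q2 = np.array([0.2, 0.2, 0.6])  # farther from P

x = rng.choice(3, size=100_000, p=p)        # i.i.d. samples from P
for q in (q1, q2):
    print(kl(p, q), np.mean(np.log(q[x])))  # smaller KL <-> higher mean log-likelihood
```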
Thanks, exactly the explanation I have been looking for!
This is the perfect math video. Love it. Shared with all my readers.
Excellent.
Great video. In my opinion, two pieces are missing for it to be perfect. First, if you could actually calculate the plain sum of log(p(x)/q(x)) and show us what's wrong with that number, exactly as you did with the simple p(x)/q(x) and why it isn't a good solution. Second, in the last slide, if you could give concrete numbers. You talk about quantifying something that is visually clear, but leaving out the numbers is kind of a missed opportunity to show how it works :)
Again, thank you a lot for the explanation. Great work.
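For anyone curious, the first experiment is easy to run (toy numbers of my own): the unweighted sum of log(p(x)/q(x)) can come out zero or even negative for clearly different distributions, which is exactly why the p(x) weighting is needed.

```python
# Sketch: an unweighted sum of log-ratios gives a negative "distance" here,
# while the p(x)-weighted version (KL) stays positive.
import numpy as np

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.3, 0.3, 0.4])

print(np.sum(np.log(p / q)))      # ~-0.944 -> negative, useless as a distance
print(np.sum(p * np.log(p / q)))  # ~0.373  -> KL reports a sensible positive gap
```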
Truly. Bravo, this was awesome
Great vid man, god damn your pedagogy is incredible
Excellent intuitive explanation!
This was incredibly illustrative!
Absolutely beautiful explanation! Thank you
Glad it was helpful!
@ritvikmath One question: suppose one were to use the average of the absolute value of the difference between the two distributions; why would this not be a good metric?
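For what it's worth, the average absolute difference is a legitimate distance (up to a constant factor it's the total variation distance), but it only sees absolute gaps, not ratios, so it can't tell a harmless shuffle of mass between common outcomes apart from nearly zeroing out an outcome that P actually produces. A sketch with made-up numbers:

```python
# Sketch: two Qs at the same total absolute distance from P, but one of them
# nearly kills an outcome P produces 10% of the time; KL notices, the
# absolute-difference measure doesn't.
import numpy as np

def kl(p, q):
    return np.sum(p * np.log(p / q))

p  = np.array([0.5, 0.4, 0.1])
q1 = np.array([0.4, 0.5, 0.1])      # shifts mass between the two common outcomes
q2 = np.array([0.6, 0.399, 0.001])  # almost kills the 10%-probable outcome

for q in (q1, q2):
    print(np.sum(np.abs(p - q)), kl(p, q))
# total abs. difference is 0.2 for both, but KL is ~0.022 vs ~0.370
```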
Thank you so much for this explanation; I also got a new insight about the log :)
Happy to help!
great explanation!
Glad you think so!
Amazing video, love the format!
It would be interesting to have a video on how you study to understand a topic, what resources you use, and the materials you look for.
Thank you as always for sharing your brilliant teachings, Ritvik. Could you please do a video on the Gram-Schmidt process and how orthonormal basis matrices are relevant to data science?
Great job!!! Love this explanation
This is an amazing explanation, thanks!
Glad it was helpful!
Wow. Just wow! This is brilliant🤩
Thanks!
Thank you for the best explanation on this topic.
Another amazing video! Please keep them coming!