A Short Introduction to Entropy, Cross-Entropy and KL-Divergence

  • Published: 4 Oct 2024
  • Entropy, Cross-Entropy and KL-Divergence are often used in Machine Learning, in particular for training classifiers. In this short video, you will understand where they come from and why we use them in ML.
    Paper:
    "A mathematical theory of communication", Claude E. Shannon, 1948, pubman.mpdl.mpg...
    Errata:
    At 5:05, the sign is reversed on the second line, it should read: "Entropy = -0.35 log2(0.35) - ... - 0.01 log2(0.01) = 2.23 bits"
    At 8:43, the sum of predicted probabilities should always add up to 100%. Just pretend that I wrote, say, 23% instead of 30% for the Dog probability and everything's fine.
    The painting on the first slide is by Annie Clavel, a great French artist currently living in Los Angeles. The painting is reproduced with her kind authorization. Please visit her website: www.annieclavel....
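
    To make the three definitions concrete, here is a minimal Python sketch of entropy, cross-entropy and KL-divergence (an illustration with made-up distributions, not the exact numbers from the video):

        import math

        def entropy(p):
            # H(p) = -sum_i p_i * log2(p_i): average information per event, in bits
            return -sum(pi * math.log2(pi) for pi in p if pi > 0)

        def cross_entropy(p, q):
            # H(p, q) = -sum_i p_i * log2(q_i): average message length when
            # events follow p but the code is optimized for q
            return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

        def kl_divergence(p, q):
            # D_KL(p || q) = H(p, q) - H(p): the extra bits paid for using q
            return cross_entropy(p, q) - entropy(p)

        p = [0.5, 0.25, 0.125, 0.125]   # true distribution
        q = [0.25, 0.25, 0.25, 0.25]    # predicted distribution
        print(entropy(p))               # 1.75 bits
        print(cross_entropy(p, q))      # 2.0 bits
        print(kl_divergence(p, q))      # 0.25 bits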

Comments • 469

  • @revimfadli4666
    @revimfadli4666 4 years ago +454

    This feels like a 1.5-hour course conveyed in just 11 minutes, i wonder how much entropy it has :)

    • @grjesus9979
      @grjesus9979 3 years ago +2

      hahaha

    • @anuraggorkar5595
      @anuraggorkar5595 3 years ago +1

      Underrated Comment

    • @klam77
      @klam77 3 years ago +4

      ahhh....too clever. the comment has distracted my entropy from the video. Negative marks for you!

    • @Darkev77
      @Darkev77 3 years ago

      @@klam77 Could you elaborate on his joke please?

    • @ashrafg4668
      @ashrafg4668 3 years ago +5

      @@Darkev77 The idea here is that most other resources (videos, blogs) take a very long time (and more importantly say a lot of things) to convey the ideas that this video did in a short time (and with just the essential ideas). This video, thus, has low entropy (vs most other resources that have much higher entropy).

  • @jennyread9464
    @jennyread9464 6 years ago +562

    Fantastic video, incredibly clear. Definitely going to subscribe!
    I do have one suggestion. I think some people might struggle a little bit around 2m22s where you introduce the idea that if P(sun)=0.75 and P(rain)=0.25, then a forecast of rain reduces your uncertainty by a factor of 4. I think it's a little hard to see why at first. Sure, initially P(rain)=0.25 while after the forecast P(rain)=1, so it sounds reasonable that that would be a factor of 4. But your viewers might wonder why you can’t equally compute this as, initially P(sun)=0.75 while after the forecast P(sun)=0. That would give a factor of 0!
    You could talk people through this a little more, e.g. say imagine the day is divided into 4 equally likely outcomes, 3 sunny and 1 rainy. Before, you were uncertain about which of the 4 options would happen but after a forecast of rain you know for sure it is the 1 rainy option - that’s a reduction by a factor of 4. However after a forecast of sun, you only know it is one of the 3 sunny options, so your uncertainty has gone down from 4 options to 3 - that’s a reduction by 4/3.

    • @AurelienGeron
      @AurelienGeron  6 years ago +59

      Thanks Jenny! You're right, I went a bit too fast on this point, and I really like the way you explain it. :)

    • @god-son-love
      @god-son-love 6 years ago +1

      Shouldn't one use information gain to check the extent of the reduction? IG = (-(3/4)log2(3/4) - (1/4)log2(1/4)) - (-1·log2(1) - 0·log2(0)) ≈ 0.811 - 0 = 0.811 bits

    • @dlisetteb
      @dlisetteb 6 years ago +4

      thank youuuuuuuuuuuuuuuuu

    • @rameshmaddali6208
      @rameshmaddali6208 6 years ago +18

      Actually I understood the concept better from your comment than from the video itself :) thanks a lot

    • @maheshwaranumapathy
      @maheshwaranumapathy 6 years ago +8

      awesome, great insight. I did struggle to get it at first; checked out the comments and bam! Thanks :)
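
    The 4 vs 4/3 arithmetic in Jenny's comment above can be checked in a few lines of Python (a minimal sketch, not from the video):

        import math

        # Forecast of rain: 4 equally likely options reduce to 1 -> factor 4
        rain_bits = math.log2(4)       # 2.0 bits of information
        # Forecast of sun: 4 options reduce to 3 -> factor 4/3
        sun_bits = math.log2(4 / 3)    # ~0.415 bits
        # The weighted average is the entropy of the forecast distribution:
        print(0.25 * rain_bits + 0.75 * sun_bits)   # ~0.811 bits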

  • @ArxivInsights
    @ArxivInsights 6 years ago +258

    As a Machine Learning practitioner & YouTube vlogger, I find these videos incredibly valuable! If you want to freshen up on those so-often-needed theoretical concepts, your videos are much more efficient and clear than reading through several blogposts/papers. Thank you very much!!

    • @AurelienGeron
      @AurelienGeron  6 years ago +19

      Thanks! I just checked out your channel and subscribed. :)

    • @pyeleon5036
      @pyeleon5036 6 years ago +2

      I like your video too! Especially the VAE one

    • @fiddlepants5947
      @fiddlepants5947 5 years ago +5

      Arxiv, it was actually your video on VAE's that encouraged me to check out this video for KL-Divergence. Keep up the good work, both of you.

    • @grjesus9979
      @grjesus9979 4 years ago

      thank you, at first i messed up trying to understand it but now, reading your comment, i understand it. Thank you! 😊

  • @yb801
    @yb801 1 year ago +2

    Thank you, I have always been confused about these three concepts; you made them really clear for me.

  • @agarwaengrc
    @agarwaengrc 1 year ago +2

    Haven't seen a better, clearer explanation of entropy and KL-Divergence, ever, and I've studied information theory before, in 2 courses and 3 books. Phenomenal, this should be made the standard intro for these concepts, in all university courses.

  • @colletteloueva13
    @colletteloueva13 1 year ago +2

    One of the most beautiful videos I've watched and understood a concept :')

  • @xintongbian
    @xintongbian 6 years ago +42

    I've been googling KL Divergence for some time now without understanding anything... your video conveys that concept effortlessly. beautiful explanation

  • @Rafayak
    @Rafayak 5 years ago +24

    Finally, someone who understands, and doesn't just regurgitate the Wikipedia page :) Thanks a lot!

  • @aa-xn5hc
    @aa-xn5hc 6 years ago +41

    you are a genius at creating clarity

  • @AladinxGonca
    @AladinxGonca 5 months ago +1

    You are the most talented tutor I've ever seen

  • @陈亮宇-m1s
    @陈亮宇-m1s 6 years ago +8

    I came to find Entropy, but I received Entropy, Cross-Entropy and KL-divergence. You are so generous!

  • @metaprog46and2
    @metaprog46and2 4 years ago +2

    Phenomenal job turning a seemingly esoteric concept into one that's simple & easy to understand. Great choice of examples too. Very information-dense yet super accessible for most people (I'd imagine).

  • @JakeMiller2020
    @JakeMiller2020 4 years ago

    I always seem to come back to watch this video every 3-6 months, when I forget what KL Divergence is conceptually. It's a great video.

  • @chenranxu6941
    @chenranxu6941 3 years ago +1

    Wow! It's just incredible to convey so much information while still keeping everything simple & well-explained, and within 10 min.

  • @summary7428
    @summary7428 3 years ago

    this is by far the best and most concise explanation of the fundamental concepts of information theory we need for machine learning...

  • @paulmendoza9736
    @paulmendoza9736 1 year ago

    I want to like this video 1000 times. To the point, no BS, clear, understandable.

  • @SagarYadavIndia
    @SagarYadavIndia 1 year ago

    Beautiful short video, explaining in about 10 minutes a concept that usually takes a 2-hour lecture.

  • @011azr
    @011azr 6 years ago +4

    Sir, you have a talent for explaining stuff in a crystal clear manner. You take something that is usually explained by a huge pile of math equations and make it this simple. Great job, please continue making more YouTube videos!

  • @michaelzumpano7318
    @michaelzumpano7318 1 year ago +3

    Wow! This was the perfect mix of motivated examples and math utility. I watched this video twice. The second time I wrote it all out. 3 full pages! It’s amazing that you could present all these examples and the core information in ten minutes without it feeling rushed. You’re a great teacher. I’d love to see you do a series on Taleb’s books - Fat Tails and Anti-Fragility.

  • @jdm89s13
    @jdm89s13 5 years ago +1

    This 11-ish minute presentation so clearly and concisely explained what I had a hard time understanding from a one hour lecture in school. Excellent video!

  • @s.r8081
    @s.r8081 3 years ago +1

    Fantastic! This short video really explains the concepts of entropy, cross-entropy, and KL-Divergence clearly, even if you knew nothing about them before.
    Thank you for the clear explanation!

  • @glockenspiel_
    @glockenspiel_ 3 years ago +2

    Thank you, very well explained! I decided to get into machine learning during this hard quarantine period, without many expectations. Thanks to the clear and friendly explanations in your book I am learning, improving and, not least, enjoying it a lot. So thank you so much!

  • @khaledelsayed762
    @khaledelsayed762 2 years ago

    Very elegant; it shows how deeply the presenter understands the material.

  • @sushilkhadka8069
    @sushilkhadka8069 2 months ago

    Wow, best explanation ever. I found this while I was in college, and I come back once a year just to refresh my intuition.

  • @DailyHomerClips
    @DailyHomerClips 5 years ago

    this is by far the best description of those 3 terms, can't be thankful enough

  • @billmo6824
    @billmo6824 2 years ago

    Really, I definitely cannot come up with an alternative way to explain this concept more concisely.

  • @hassanmatout741
    @hassanmatout741 6 years ago +2

    This channel will skyrocket, no doubt. Thank you so much! Clear, visualized and well explained at a perfect pace! Everything is high quality! Keep it up sir!

  • @GreenCowsGames
    @GreenCowsGames 1 year ago

    I am new to information theory and computer science in general, and this is the best explanation I could find about these topics by far!

  • @Dr.Roxirock
    @Dr.Roxirock 1 year ago +1

    I really enjoyed the way you explained it. It's so inspiring to watch and learn difficult concepts from the author of such an incredible book in the ML realm. I wish you could teach other concepts via video as well.
    Cheers,
    Roxi

  • @frankcastle3288
    @frankcastle3288 3 years ago

    I have been using cross-entropy for classification for years and I just understood it. Thanks Aurélien!

  • @Dinunzilicious
    @Dinunzilicious 3 years ago

    Incredible video, easily one of the top three I've ever stumbled across in terms of concise educational value. Also love the book, great for anyone wanting this level of clarity on a wide range of ML topics.
    Not sure if this will help anyone else, but I was having trouble understanding why we choose 1/p as the "uncertainty reduction factor," and not, say, 1-p or some other metric. What helped me gain an intuition for this was realizing 1/p is the number of equally likely outcomes in a uniform distribution where every event has probability p. So the information, -log(p), is how many bits that event would be "worth" were it part of a uniform distribution. This uniform distribution is also the maximum entropy distribution that event could possibly come from given its probability...though you can't reference entropy without first explaining information.
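
    A quick numeric check of that 1/p intuition (a minimal sketch):

        import math

        p = 1 / 8
        # An event of probability p is one of 1/p equally likely outcomes,
        # so encoding it takes log2(1/p) bits -- exactly the information -log2(p):
        print(math.log2(1 / p))   # 3.0
        print(-math.log2(p))      # 3.0, same thing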

  • @voraciousdownloader
    @voraciousdownloader 4 years ago +1

    Really the best explanation of KL divergence I have seen so far!! Thank you.

  • @bingeltube
    @bingeltube 6 years ago +2

    Highly recommended! Finally, I found someone who could explain these concepts of entropy and cross-entropy in a very intuitive way

  • @CowboyRocksteady
    @CowboyRocksteady 1 year ago

    i'm loving the slides and explanation. I noticed the name in the corner and thought, oh nice, i know that name. then suddenly... It's the author of that huge book i love!

  • @thegamersschool9978
    @thegamersschool9978 2 years ago

    I am reading your book, and oh man, what a book!!! At first I wondered how the book and the video had exactly the same examples, until I saw your book later in the video and realized it's you. It's so great to listen to you after reading you!!

  • @jackfan1008
    @jackfan1008 6 years ago +1

    This explanation is absolutely fantastic. Clear, concise and comprehensive. Thank you for the video.

  • @maryamzarabian4617
    @maryamzarabian4617 2 years ago

    thank you for the useful video, and also many thanks for your book. You make very difficult machine learning concepts feel like a piece of cake.

  • @sagnikbhattacharya1202
    @sagnikbhattacharya1202 6 years ago +3

    You make the toughest concepts seem super easy! I love your videos!!!

  • @fahdciwan8709
    @fahdciwan8709 4 years ago +1

    phew!! As a newbie to Machine Learning without a background in maths, this video saved me; otherwise I never expected to grasp the concept of entropy

  • @matthewwilson2688
    @matthewwilson2688 6 years ago +2

    This is the best explanation of entropy and KL I have found. Thanks

  • @misnik1986
    @misnik1986 3 years ago

    Thank you so much, Monsieur Géron, for this simple and clear explanation

  • @ykkim77
    @ykkim77 4 years ago +1

    This is the best explanation of the topics that I have ever seen. Thanks!

  • @jamesjenkins9480
    @jamesjenkins9480 2 years ago +1

    I've learned about this before, but this is the best explanation I've come across. And it was a helpful review, since it's been a while since I used this. Well done.

  • @swapanjain892
    @swapanjain892 6 years ago +1

    You have no idea how much this video has helped me. Thanks for making such quality content, and keep creating more.

  • @ramensusho
    @ramensusho 3 months ago

    The no. of bits I received is way higher than I expected!!
    Nice video

  • @fberron
    @fberron 3 years ago

    Finally I understood Shannon's theory of information. Thank you
    Aurélien

  •  1 year ago +1

    the best video on cross entropy on youtube so far

  • @-long-
    @-long- 5 years ago

    Guys, this is the best explanation of Entropy, Cross-Entropy and KL-Divergence.

  • @GuilhermeKodama
    @GuilhermeKodama 5 years ago +1

    the best explanation of the topic I've ever had. It was really insightful.

  • @LC-lj5kd
    @LC-lj5kd 6 years ago +2

    your tutorials are always unbeatable, quite explicit with great examples. Thanks for your work

  • @achillesarmstrong9639
    @achillesarmstrong9639 5 years ago +2

    This is the 3rd time I've watched this video: in April, September, and December 2018. The first time I watched it, I thought I understood the topic, but I knew nothing back then.

  • @vaishanavshukla5199
    @vaishanavshukla5199 4 years ago +2

    great understanding
    and very good mentor

  • @gowthamramesh2443
    @gowthamramesh2443 5 years ago +11

    Kinda feels like 3Blue1Brown's version of Machine learning Fundamentals. Simply Amazing

    • @AurelienGeron
      @AurelienGeron  5 years ago +5

      Thanks a lot, I'm a huge fan of 3Blue1Brown! 😊

  • @areejabdu3125
    @areejabdu3125 5 years ago

    this explanation really helps the learner understand such abstract scientific concepts, thanks for the clear explanation!!

  • @romanmarakulin7448
    @romanmarakulin7448 5 years ago +1

    Thank you so much! Not only did it help me understand KL-Divergence, it also made the formula easy to remember. From now on I will put the signs in the right places. Keep it up!

  • @mohamadnachabe1
    @mohamadnachabe1 5 years ago +1

    This was the best intuitive explanation of entropy and cross entropy I've seen. Thanks!

  • @shuodata
    @shuodata 4 years ago

    Best Entropy and Cross-Entropy explanation I have ever seen

  • @sagarsaxena7202
    @sagarsaxena7202 5 years ago +1

    Great work on the explanation. I had been pretty confused about this concept and the implications of information theory for ML. This video does the trick, clarifying the concepts while linking information theory to its use in ML. Thanks much for the video.

  • @akshiwakoti7851
    @akshiwakoti7851 4 years ago

    Hats off! One of the best teachers ever! This definitely helped me better understand it both mathematically and intuitively just in a single watch. Thanks for reducing my 'learning entropy'. My KL divergence on this topic is near zero now. ;)

  • @alirezamarahemi2352
    @alirezamarahemi2352 2 years ago

    Not only is this video fantastic at explaining the concepts, but "Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems" (O'Reilly Media, 2019) is also the best book I've studied on machine learning, by the same author (Aurélien Géron).

  • @ashutoshnirala5965
    @ashutoshnirala5965 4 years ago

    Thank you for such a wonderful and to-the-point video. Now I know: Entropy, Cross-Entropy, KL Divergence, and also why cross-entropy is such a good choice as a loss function.

  • @salman3112
    @salman3112 6 years ago +2

    Your channel has become one of my favorite channels. Your explanation of CapsNet and now this is just amazing. I am going to get your book too. Thanks a lot. :)

  • @unleasedflow8532
    @unleasedflow8532 3 years ago

    Nicely conveyed what is to be learned about the topic. I think I absorbed it all. Best tutorial, keep dropping videos like this.

  • @sanjaykrish8719
    @sanjaykrish8719 4 years ago +3

    Aurelien has a knack for making things simpler. Check out his Deep Learning using TensorFlow course on Udacity. It's amazing.

  • @YYchen713
    @YYchen713 2 years ago

    Fantastic video! Now all the dots are connected! I have used loss functions for NN machine learning without knowing the math behind them! This is so enlightening!

  • @ilanaizelman3993
    @ilanaizelman3993 5 years ago

    Thanks! For people who are looking for the ML connection: the cross-entropy loss is -log(q), where q is the predicted probability of the true class (e.g. -log(0.25) when the model assigns 25% to the correct class).
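
    Spelled out as a loss function, that looks roughly like this (a minimal sketch with hypothetical numbers):

        import math

        def cross_entropy_loss(predicted_probs, true_class):
            # Per-example loss: -log of the probability assigned to the true class.
            # Natural log, as in most ML libraries; divide by math.log(2) for bits.
            return -math.log(predicted_probs[true_class])

        # Hypothetical 3-class prediction where the true class is index 0:
        print(cross_entropy_loss([0.25, 0.45, 0.30], 0))   # ~1.386 nats (2.0 bits)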

  • @ramonolivier57
    @ramonolivier57 4 years ago +1

    Excellent explanation and discussion. Thank you very much!!

  • @anonymous.youtuber
    @anonymous.youtuber 1 year ago +1

    Magnificent explanation! 👍

  • @ekbastu
    @ekbastu 5 years ago +2

    I came here to learn how to correctly pronounce his name :).
    The content is simply great. Thanks a lot.

  • @samzhao1827
    @samzhao1827 4 years ago

    Very few people can explain like you, to be honest! I've read so many decision tree tutorials that all talk about the same thing (information gain), but after reading their articles I still had zero understanding. Big thanks for this video!

  • @elvisng1977
    @elvisng1977 2 years ago

    This video is so clear and so well explained, just like his book!

  • @aiyifei4732
    @aiyifei4732 3 years ago

    Thanks, the explanation is clear. I found it clean and easy to understand compared with my lecture notes. I don't even think they mentioned the history and derivation/origin

  • @juliannajia6871
    @juliannajia6871 3 years ago

    The best video describing Cross-Entropy

  • @julioreyram
    @julioreyram 3 years ago

    I'm amazed by this video, you are a gifted teacher.

  • @meerkatj9363
    @meerkatj9363 6 years ago +1

    I've seen all your videos now. You've taught me a lot of things and there were some good moments. Can't wait for more. Thanks so much

  • @deteodskopje
    @deteodskopje 4 years ago

    Very nice. Really short yet clearly grasping the point of these concepts. Subscribed.
    I was really excited when I found this channel. I mean, the book Hands-On Machine Learning is maybe the best book you can find these days

  • @helelemamayku6302
    @helelemamayku6302 5 years ago +1

    Normally when I like a video, I just click the like button. Since this is sooooo helpful, I will also leave a comment to thank you for making this.

  • @andrewtwigg
    @andrewtwigg 3 years ago

    Thanks for the explanation, very clear and complements your excellent book

  • @leastactionlab2819
    @leastactionlab2819 4 years ago +1

    Great video to learn interpretations of the concept of cross-entropy.

  • @chinmaym92
    @chinmaym92 6 years ago +2

    I rarely comment on videos, but this video is so good. I just couldn't resist. Thank you so much for the video. :)

  • @MrFurano
    @MrFurano 6 years ago

    To-the-point and intuitive explanation and examples! Thank you very much! Salute to you!

  • @PerisMartin
    @PerisMartin 6 years ago

    Your explanations are so much better than other "famous" ML vloggers (... looking at you Siraj Raval!). You truly know what you are talking about, even my grandma could understand this!! Subscribed, liked and belled. More, please!

    • @AurelienGeron
      @AurelienGeron  6 years ago

      Thanks Martin, I'm glad you enjoyed this presentation! My agenda is packed, but I'll do my best to upload more videos asap. :)

  • @srikumarsastry7473
    @srikumarsastry7473 6 years ago +1

    Such a clear explanation! Need more of them!

  • @se123acabaron
    @se123acabaron 5 years ago

    Fantastic video! It made me understand and tie together many "loose" concepts. Thank you very much for this contribution!

  • @JoeVaughnFarsight
    @JoeVaughnFarsight 1 year ago

    Thank you Aurélien Géron, that was a very fine presentation!

  • @danyalkhaliq915
    @danyalkhaliq915 4 years ago

    super clear... never have I heard this explanation of Entropy and Cross-Entropy!

  • @AbhishekSingh-og7kf
    @AbhishekSingh-og7kf 3 years ago

    Every concept is very clear... Thanks a lot!!

  • @michaelding5970
    @michaelding5970 4 years ago

    The best explanation I've seen on this topic.

  • @MrMijertone
    @MrMijertone 6 years ago +1

    I had to find a word for how well you explain. Perspicuous. Thank you.

    • @AurelienGeron
      @AurelienGeron  6 years ago

      I just learned a new word, thanks James! :)

  • @sametcansonmez6955
    @sametcansonmez6955 4 years ago +1

    It is really clear and easy to understand

  • @thomaswatts6517
    @thomaswatts6517 3 years ago

    Incredible, a frictionless explanation

  • @davidbeauchemin3046
    @davidbeauchemin3046 6 years ago

    Awesome video, you made the concept of entropy so much clearer.

  • @DEEPAKKUMAR-sk5sq
    @DEEPAKKUMAR-sk5sq 5 years ago +1

    Please do a video on 'PAC learning'. It seems very complex. Your way of explaining could make it easy!!

  • @zoeye720
    @zoeye720 5 years ago +2

    This video explains the concepts so well! Thank you!

  • @djin5395
    @djin5395 2 years ago

    Thank you so much for your video! Watching on 18 July 2022, and it helps me.

  • @olegovcharenko8684
    @olegovcharenko8684 4 years ago +1

    Brilliant explanation!

  • @mankaransingh2599
    @mankaransingh2599 5 years ago +3

    lol, I was just reading your book when I searched for 'cross entropy' and boom, I never knew you had a YouTube channel too!

    • @AurelienGeron
      @AurelienGeron  5 years ago +3

      Haha, I hope you enjoy it! :) I haven't posted a video in months, because I've been busy moving to Singapore and writing the 2nd edition of my book, but as soon as I finish the book I'll get back to posting videos!

    • @mankaransingh2599
      @mankaransingh2599 5 years ago +2

      @@AurelienGeron Good luck with that! You are a great teacher.

  • @deepfakevasmoy3477
    @deepfakevasmoy3477 4 years ago +1

    Thanks for the clear explanation!

  • @laura_uzcategui
    @laura_uzcategui 4 years ago

    Really good explanation, the visuals were also great for understanding! Thanks Aurelien.

  • @BDEvans
    @BDEvans 4 years ago +1

    Brilliant video, thanks so much