Entropy is the limit of compression (Huffman Coding)

  • Published: 4 Jan 2014
  • What is the limit of compression? Huffman codes are introduced in order to demonstrate how entropy is the ultimate limit of compression.
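
    The claim in the description can be checked numerically. Below is a minimal Python sketch (not from the video; the four-symbol distribution is an invented example) that builds Huffman code lengths with a heap and compares the average codeword length to the Shannon entropy:

    ```python
    import heapq
    from math import log2

    def huffman_code_lengths(freqs):
        """Build a Huffman tree and return the code length of each symbol.

        `freqs` maps symbol -> probability (or raw count)."""
        # Each heap entry: (weight, tiebreak id, {symbol: code_length so far})
        heap = [(w, i, {s: 0}) for i, (s, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        next_id = len(heap)
        while len(heap) > 1:
            # Repeatedly merge the two lightest subtrees; every symbol in a
            # merged subtree sits one level deeper, so its code grows by 1 bit.
            w1, _, a = heapq.heappop(heap)
            w2, _, b = heapq.heappop(heap)
            merged = {s: depth + 1 for s, depth in {**a, **b}.items()}
            heapq.heappush(heap, (w1 + w2, next_id, merged))
            next_id += 1
        return heap[0][2]

    # Invented dyadic distribution for illustration:
    p = {'A': 0.5, 'B': 0.25, 'C': 0.125, 'D': 0.125}
    lengths = huffman_code_lengths(p)
    entropy = -sum(q * log2(q) for q in p.values())
    avg_len = sum(p[s] * lengths[s] for s in p)
    print(entropy, avg_len)  # 1.75 1.75 — Huffman meets the entropy bound
    ```

    Because every probability here is a power of two, the Huffman code hits the entropy exactly; for other distributions the average length lands within one bit above it.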

Comments • 21

  • @ArtOfTheProblem
    @ArtOfTheProblem  3 years ago

    Link to series playlist: ruclips.net/p/PLbg3ZX2pWlgKDVFNwn9B63UhYJVIerzHL

  • @NiceyNiceGuyMike
    @NiceyNiceGuyMike 10 years ago +5

    I cannot believe how well that was explained. I've not done much searching for explanations for Huffman, but this really makes it easy to understand.

  • @vivek73
    @vivek73 9 years ago +6

    Wonderful video. I was trying to learn Huffman codes from a book but could not get my mind around it; this explanation is superb.

  • @baseballsbetter
    @baseballsbetter 10 years ago +4

    Great video. Awesome to see a new upload from you

  • @martonbalogh6021
    @martonbalogh6021 2 years ago +1

    Love the music
    It's also impressive and informative.
    Thank you.

  • @xTh3N00b
    @xTh3N00b 10 years ago +2

    awesome videos, love 'em

  • @aMammoth
    @aMammoth 10 years ago +2

    YES YES YES YES! These are soo good!

  • @venkatraman7982
    @venkatraman7982 4 years ago

    How do we know the probabilities in practice? In the case of mobile communication, the information changes each time.

  • @kirdook
    @kirdook 9 years ago +1

    In this given problem we didn't have that information, but in practice, could you get better compression by doing it with letter sequences (instead of individual letters), similar to what the other guy was doing for his English-speaking machines?

    • @frisosmit8920
      @frisosmit8920 7 years ago +1

      I think so. Let's say you have a very predictable source that only ever produces 4 different sentences; you can encode a whole sentence in far fewer bits than if you encoded each letter on its own.
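
      The point in this reply can be made concrete. A hedged sketch (the four sentences are invented for illustration): if a source only ever emits four equally likely sentences, coding whole sentences costs log2(4) = 2 bits each, while coding the letters as independent symbols costs far more:

      ```python
      from collections import Counter
      from math import log2

      # Invented example source: it only ever emits one of four sentences,
      # each equally likely.
      sentences = ["the cat sat", "the dog ran", "a cat ran", "a dog sat"]
      p = 1 / len(sentences)

      # Treating whole sentences as symbols: entropy = log2(4) = 2 bits/sentence.
      bits_per_sentence = -sum(p * log2(p) for _ in sentences)

      # Treating letters as independent symbols ignores the structure:
      text = "".join(sentences)
      n = len(text)
      letter_entropy = -sum(c / n * log2(c / n) for c in Counter(text).values())
      avg_letters = sum(len(s) for s in sentences) / len(sentences)
      letter_cost = letter_entropy * avg_letters  # bits/sentence, letter by letter

      print(bits_per_sentence, letter_cost)  # 2.0 vs. tens of bits
      ```

      The gap is exactly what block coding (and the Markov-chain machines from the earlier videos in the series) exploits: correlations between symbols lower the entropy per message.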

  • @seriousmax
    @seriousmax 9 years ago

    Hi. Could you please list the music you used as soundtrack for your videos? I like dissonance :).

    • @britcruise2629
      @britcruise2629 9 years ago +1

      Cam Murray makes all the music. He's posted a collection here: cameronmichaelmurray.bandcamp.com/album/art-of-the-problem-volume-1-gambling-with-secrets. You can also bug him there about making a second one for this series.

    • @seriousmax
      @seriousmax 9 years ago

      Ah, great! Thanks.

  • @MattSpaul
    @MattSpaul 10 years ago +3

    What is the simulation shown at 3:22? Is it online?

    • @ArtOfTheProblem
      @ArtOfTheProblem  10 years ago +2

      Yes I have it here: www.khanacademy.org/math/applied-math/informationtheory/moderninfotheory/p/information-entropy-exploration

    • @MattSpaul
      @MattSpaul 10 years ago +2

      Art of the Problem Thanks :)

  • @johnsherfey3675
    @johnsherfey3675 8 years ago

    Could you do this?
    A: 1
    B: 01
    C: 10
    D: 00

    • @dongiea
      @dongiea 8 years ago +4

      John Sherfey No, you can't. Huffman coding creates a prefix-free code: the codeword for one symbol can't be a prefix of another symbol's codeword. Having A: 1 and C: 10 violates this, since 1 is a prefix of 10.

    • @johnsherfey3675
      @johnsherfey3675 8 years ago

      dongiea How?

    • @johnsherfey3675
      @johnsherfey3675 8 years ago

      dongiea D is 00, not 0, so with C as 10, wouldn't 100 = AD?

    • @johnsherfey3675
      @johnsherfey3675 8 years ago +2

      dongiea Also, to correct you and myself: 1001 would be ambiguous. Is it ADA or CB?
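
      The ambiguity raised in this thread is easy to verify by brute force. Below is a small Python sketch (the decoder and the second, prefix-free table are illustrative, not from the video) that enumerates every valid parse of a bit string under a given code:

      ```python
      def decodings(bits, code):
          """Return every way to parse `bits` using the symbol -> codeword table."""
          if not bits:
              return [""]
          results = []
          for sym, word in code.items():
              if bits.startswith(word):
                  results += [sym + rest for rest in decodings(bits[len(word):], code)]
          return results

      # The code proposed in the thread above; 1 is a prefix of 10, so it is
      # not prefix-free and some strings decode two ways:
      ambiguous = {"A": "1", "B": "01", "C": "10", "D": "00"}
      print(decodings("1001", ambiguous))  # ['ADA', 'CB'] — two valid parses

      # A prefix-free code over the same four symbols decodes uniquely:
      prefix_free = {"A": "1", "B": "01", "C": "001", "D": "000"}
      print(decodings("1001001", prefix_free))  # ['ACC'] — exactly one parse
      ```

      Note that no prefix-free code with lengths 1, 2, 2, 2 can exist at all: the Kraft inequality gives 1/2 + 1/4 + 1/4 + 1/4 > 1, which is why one codeword had to grow to 3 bits.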