What is Entropy? and its relation to Compression

  • Published: 16 Sep 2024
  • Explains Entropy in information theory and gives an example that shows its relationship to compression (a short entropy sketch follows the list below).
    Related videos: (see: iaincollings.com)
    • What are Channel Capacity and Code Rate?
    • What is Water Filling for Communications Channels?
    • What is a Gaussian Codebook?
    • What is Fisher Information?
    • How are Throughput, Bandwidth, and Data Rate Related?
    Full categorised list of videos and PDF Summary Sheets: iaincollings.com
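
    A minimal sketch of the entropy calculation (assuming a four-symbol source with probabilities 1/2, 1/4, 1/8, 1/8, an assumption consistent with the figures discussed in the comments below; the symbol names are illustrative):

        import math

        def entropy(probs):
            # Shannon entropy in bits/symbol: H = -sum(p * log2(p));
            # zero-probability symbols contribute nothing.
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # Assumed source: alpha, beta, gamma, delta with p = 1/2, 1/4, 1/8, 1/8
        print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol

    No lossless code can average fewer bits per symbol than this entropy, and the prefix code {alpha: 0, beta: 10, gamma: 110, delta: 111} achieves exactly 1.75 bits/symbol for this source.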

Comments • 54

  • @safiyajd
    @safiyajd 2 years ago +3

    I can’t express how amazing and helpful the work you are doing is! Keep going man. May God lighten your way, satisfy you, and reward you immensely!

    • @iain_explains
      @iain_explains  2 years ago +1

      Thanks so much. I'm glad the videos have been helpful.

    • @_SeaH0rse
      @_SeaH0rse 2 years ago

      This dude is such a blessing lol, wish I'd found these years ago.

  • @rationalthinker9612
    @rationalthinker9612 1 year ago +2

    Whoa, I am reading my textbook for Communications Engineering, an EE course, and the book does a terrible job at explaining these concepts. You knocked it out of the park with that explanation, thanks!!!

    • @iain_explains
      @iain_explains  1 year ago

      I'm glad you liked it. Have you seen my web page? It's got a categorised listing of all the videos on the channel. I've got lots more that I'm sure will help with a range of other concepts you're reading about. One of my main motivations in making these videos is to help explain concepts that can be confusing in textbooks. iaincollings.com

  • @moebius2217
    @moebius2217 3 years ago +1

    OMG! How did people miss this fabulous lecture?

  • @arvindp551
    @arvindp551 2 years ago +1

    Thank you Professor, you are really good at saving paper and keeping us hungry for in-depth knowledge.

  • @JoAna-cg7gx
    @JoAna-cg7gx 4 months ago

    Incredibly helpful!

  • @bbanahh
    @bbanahh 3 years ago +2

    Excellent work!

  • @speedsystem4582
    @speedsystem4582 1 year ago

    I love your lectures. They're really helpful. My professor hardly bothers to explain; in our class, he taught us the method for Huffman coding without any context!
    By the way, I noticed a slight inconsistency towards the end, when you compared the improved coding efficiency of stringing 'αβ' together: the 'β' in the previous case is half as likely as 'α', so if 'α' is always followed by 'β', both would have to be equally likely in the previous example.

    • @iain_explains
      @iain_explains  1 year ago

      You're right, but the two examples are not the same (and I didn't intend them to be the same), which is why the latter example can be compressed more (which is the point I was making).
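
      A rough numeric illustration of the point in this thread (a sketch under assumed probabilities, not necessarily the video's exact numbers): when one symbol is always followed by another, coding the pair jointly removes the redundant bits.

          import math

          def entropy(probs):
              # Shannon entropy in bits: H = -sum(p * log2(p))
              return -sum(p * math.log2(p) for p in probs if p > 0)

          # Assumed toy source: the pairs 'ab' and 'cd' each occur with probability 1/2,
          # i.e. 'a' is always followed by 'b' (full dependence between adjacent symbols).
          per_symbol_memoryless = entropy([0.25, 0.25, 0.25, 0.25])  # 2.0 bits/symbol, ignoring the dependence
          per_symbol_joint = entropy([0.5, 0.5]) / 2                 # 0.5 bits/symbol (1 bit per 2-symbol pair)
          print(per_symbol_memoryless, per_symbol_joint)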

  • @khanhnguyenviet3674
    @khanhnguyenviet3674 9 months ago

    Thank you Professor. Your videos are incredibly helpful.

  • @gary1951
    @gary1951 2 years ago

    Thank you so much for this. You basically explained what my professor tried to... but did it so much better.

  • @2262sandeep
    @2262sandeep 1 year ago

    Wow, this is actually teaching!!
    Love you Sir, pls record all your knowledge 🙏 ❤

  • @kassiemarq3928
    @kassiemarq3928 1 year ago

    Thank you. You teach better than my professor.

  • @HaliFins
    @HaliFins 7 days ago

    Thanks from Norway!!

  • @CuongPhamQ
    @CuongPhamQ 2 years ago

    Complex concepts are explained by an amazing Professor

  • @zainabiraq9921
    @zainabiraq9921 1 year ago +1

    Thanx a lot

  • @gwagsawogbami8938
    @gwagsawogbami8938 5 months ago +1

    What I needed.

  • @chadx8269
    @chadx8269 1 year ago

    You are gold.

  • @TheRn35
    @TheRn35 1 year ago

    Thank you so much for the intuitive understanding!

  • @nikhilsachan_cse7144
    @nikhilsachan_cse7144 2 years ago

    Best explanation, and a well-chosen example.

  • @niividaaaw
    @niividaaaw 3 years ago

    Sir, I'm in love with your teaching! One doubt (not related to this video):
    What role do poles and zeros play in making a system odd or even? Today I read that a zero at the origin makes the signal odd. Can you explain, sir? It was given as:
    s/((s+1)(s-1)) => odd; (s+1)(s-1)/s => neither odd nor even (but the 2nd one violates the conditions X(s) = X(-s) => even, X(s) = -X(-s) => odd).

  • @jerrythomas2046
    @jerrythomas2046 9 months ago

    Thank you

  • @HavaN5rus
    @HavaN5rus 2 years ago

    Good job 👍 Man, you are an amazing teacher!

  • @CuongPhamQ
    @CuongPhamQ 2 years ago

    Amazing lecturer! Thanks a lot :)

  • @AS-nx9fu
    @AS-nx9fu 2 years ago

    Why is information an exponential function?
    Also, how does a system map symbols to bits?

    • @iain_explains
      @iain_explains  2 years ago

      Good question. It is possible to define other "information measures"; it doesn't have to be an exponential/logarithmic function, but that function is good because it fits the requirements. The "amount" of information in a message is related to the probability of that message (as discussed in the video), and there are natural boundary conditions: it must equal zero when a message always happens (probability = 1, i.e. the message doesn't tell you anything you didn't already know, because you were already expecting it), and it must grow towards infinity when a message almost never happens (probability → 0, i.e. such messages are extremely unexpected and surprising, so they are extremely informative when they actually happen). The function must be continuous between those boundary conditions, and the negative log function fits the bill (a small numeric check follows this thread).
      As for your question about mapping bits, this video should help: "What is a Constellation Diagram?" ruclips.net/video/kfJeL4LQ43s/видео.html You might also be interested in this video: "What is Fisher Information?" ruclips.net/video/82molmnRCg0/видео.html

    • @AS-nx9fu
      @AS-nx9fu 2 years ago +1

      @@iain_explains thanks for the response... appreciate it!!!
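
    A tiny numeric check of the boundary conditions described in the reply above (a sketch; the probabilities below are arbitrary examples):

        import math

        def self_information(p):
            # I(p) = -log2(p): the "surprise" of a message with probability p, in bits.
            return -math.log2(p)

        print(self_information(1.0))    # 0.0   -- a certain message tells you nothing new
        print(self_information(0.5))    # 1.0   -- a fair coin flip carries exactly 1 bit
        print(self_information(0.001))  # ~9.97 -- rare messages are highly informative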

  • @dimitrisv.1729
    @dimitrisv.1729 3 years ago

    Great work. Keep going!!
    Could this method of producing the codebook based on the entropy value be applied in wireless communications? Since the codewords don't all have the same number of bits, the codewords in a sequence of symbols must not overlap ambiguously, as in your examples.
    How can that be avoided in the presence of noise?

    • @iain_explains
      @iain_explains  3 years ago +2

      Yes, definitely. The codeword length is not the same thing as the frame length, or packet length. The compression encoder takes a sequence of letters/symbols/bits/words (whatever), and converts it into a sequence of 1's and 0's (made up of codewords). That sequence is then divided up into packets according to whatever communication protocol is being used. I plan to make a video soon on source and channel coding, so keep a look out for that.
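
      A minimal sketch of this packetisation point (the codewords and packet size are assumptions for illustration): variable-length codewords are concatenated into one bitstream, and the protocol then slices that stream into packets with no regard for codeword boundaries.

          # Hypothetical encoder output using the prefix code {0, 10, 110, 111}
          codewords = ['0', '10', '110', '111', '0', '10']
          bitstream = ''.join(codewords)  # '010110111010'
          packet_size = 4                 # assumed frame length, for illustration only
          packets = [bitstream[i:i + packet_size]
                     for i in range(0, len(bitstream), packet_size)]
          print(packets)  # ['0101', '1011', '1010'] -- packet boundaries cut across codewords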

  • @user-pb8yw8cw3s
    @user-pb8yw8cw3s 2 years ago

    Amazing, it's helpful!

  • @soxtrae2643
    @soxtrae2643 2 years ago

    Great video!

  • @wesleycoleman3781
    @wesleycoleman3781 2 years ago

    Why do gamma and delta have to have 3 bits? Couldn't you get away with alpha: 0, beta: 1, gamma: 10, delta: 11? The average there would be 1.25 bits/symbol, wouldn't it?

    • @iain_explains
      @iain_explains  2 years ago +1

      It needs to be uniquely decodable. With your suggested mapping, if you received a 1, you wouldn't have any way of knowing if it was from a Beta having been sent, or if you needed to wait for the next bit because either Gamma or Delta had been sent.
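
      A small decoding sketch of the point in this reply (assuming the video's code is {alpha: 0, beta: 10, gamma: 110, delta: 111}, consistent with the 3-bit gamma and delta mentioned above): the prefix code decodes greedily and unambiguously, whereas under the suggested mapping the bits '10' could be beta-then-alpha or a single gamma.

          prefix_code = {'0': 'alpha', '10': 'beta', '110': 'gamma', '111': 'delta'}

          def decode(bits, code):
              # Greedy prefix decoding: extend the buffer until it matches a codeword.
              out, buf = [], ''
              for b in bits:
                  buf += b
                  if buf in code:
                      out.append(code[buf])
                      buf = ''
              return out

          print(decode('0101101110', prefix_code))
          # ['alpha', 'beta', 'gamma', 'delta', 'alpha'] -- uniquely decodable
          # With {alpha: '0', beta: '1', gamma: '10', delta: '11'} no such decoder exists:
          # '10' could mean beta followed by alpha, or gamma on its own.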

  • @sadeqebrahimi2925
    @sadeqebrahimi2925 3 years ago

    This is the place where you can learn complex concepts on just one sheet of paper.