*Welcome to Compressor Head*
#everybitcounts #compressorhead #developers
Introducing a three part series focused on compression (goo.gl/ojiISz) with your host Colt McAnlis. Understanding compression algorithms means understanding how humans view and use data.
In episode 1, Colt walks through the creation of Information Theory, and how it’s spawned the concept of Variable Length Codes, which since the early 1950s has been at the heart of data compression algorithms. Find the entire playlist at g.co/compressorhead.
This comes up in intro computer networking books too: you start to see the need for error correction due to possible electrical interference and mistakes in the channel. Thanks Google!
Leslie Sox Great insight! But that's a different show ;)
Adrián de la Rosa Martín You can see all 3 episodes here g.co/compressorhead
Enjoyable. Thank you.
At 19:10 we see the symbol table for the Huffman coding example, and 'A' is clearly a prefix of 'B'. For anyone else who was confused by this: the symbol table should be generated by tracing from the root to the leaves, not from leaf to root as demonstrated here.
Great video. But shouldn't the Huffman tree be read the other way round to fulfil the prefix condition? You had
A - 0
B - 01
C - 11
But it should be
A - 0
B - 10
C - 11
Absolutely.
Yep, I think so too.
Yup! We missed that one. I'll add an annotation! Thanks!
Agreeee
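For anyone who wants to check the fix discussed above, here's a quick sketch (plain Python; the symbol tables are taken from the video's example at 19:10) showing that the corrected codes are prefix-free while the ones shown on screen are not:

```python
from itertools import permutations

def is_prefix_free(codes):
    """True if no codeword is a prefix of another codeword."""
    for a, b in permutations(codes.values(), 2):
        if b.startswith(a):
            return False
    return True

broken = {"A": "0", "B": "01", "C": "11"}  # as shown at 19:10
fixed  = {"A": "0", "B": "10", "C": "11"}  # read root-to-leaf

print(is_prefix_free(broken))  # False: "0" (A) is a prefix of "01" (B)
print(is_prefix_free(fixed))   # True: decoding is unambiguous
```

With the fixed table, a decoder can emit a symbol the moment it matches a codeword, which is exactly the property the thread is pointing out.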
Isn't the VLC wrong at the end of the video? It should be A -> 0, B -> 10 (not 01) and C -> 11.
this has got to be one of the best attempts at explaining and teaching a super dry and boring subject. THANK YOU :D
You got the Huffman codes backwards. The codes you created don't have the prefix property.
Great video, I actually learned a lot and enjoyed every minute of it.
I'm a bit confused about the generated Huffman codes at 19:10. Wouldn't it be better for A to be 0, B to be 10 and C to be 11 (going from the top to the bottom)? A=0, B=01 and C=11 seems to violate the prefix property (B starts with 0).
Yup! this was a mistake, we'll add an annotation to fix it.
I was confused too... but guys, thanks a lot for such a great video series.
***** Thanks! Glad you liked it!
you're welcome !
Great question! I was thinking the same thing.
Really easy! I was trying to understand Huffman for a long time before I found this video =) Thanks so much!
I came here from the Coffee with a Googler series.
Great show. Well put together. Entertaining. I loved the "will code for ..." running gag. And best of all, I understood it better than when I tried reading about it, hahah.
Going to check out the rest of episodes.
Can't wait to watch the next one... Thank you Colt McAnlis
You can find all three videos at g.co/compressorhead
Great! Thank you Louis Gray..
Just started watching your video. I must admit that these videos are very, very useful and easy to understand! Thanks Google!
Awesome video! People will get inspired to learn computer science topics with more videos like this. Congrats Colt.
Thanks Colt McAnlis for wonderfully explaining this! :-)
It was great. I started reading the book 'Understanding Compression' and thought the content was kind of similar and going in a similar direction. Googled a bit and, yep, the bald guy is one of the authors.
This is brilliantly taught, good job!
OMG.... Dude, my mind is rushing me to like and subscribe in the middle of this video. Amazing video.
This is amazing content. Well produced and informative. Compressor Head indeed.
I love this series!! Kindly make more such videos on other fundamentals as well.
Colt is great! Please do more videos like this on varying topics!
There should be Oscars for teachers - you would win them all!
Great series, but just to clarify: you can have variable-length codes that do not obey the prefix property. Here is an example: 11, 110.
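A small sketch (plain Python; the symbols "X" and "Y" are made up for illustration) of why the example above still counts as a VLC but can't be decoded greedily: with codewords 11 and 110, a decoder can't commit after reading "11" without peeking ahead.

```python
def greedy_decode(bits, codes):
    """Greedy left-to-right decode; only reliable for prefix-free codes."""
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in codes:
            out.append(codes[buf])
            buf = ""
    return out, buf  # a non-empty buf means stranded/ambiguous bits

codes = {"11": "X", "110": "Y"}   # "11" is a prefix of "110"
print(greedy_decode("110", codes))  # (['X'], '0') -- grabbed X early, stranded a bit
```

A prefix-free table never hits this problem, which is why Huffman trees (read root-to-leaf) are so convenient.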
Very nice lecture series. Justice for my PhD in HEVC Video compression!!
Very good class :-). Thank you for taking the time to share this theory with us in a simple way.
this is awesome.
very well explained!
Great explanation, thanks!
That's an amazing video. I know I'm a year late (how come I missed this!), but it's never too late to say thanks :))
Thanks a lot for this video, you did a wonderful job on this.
Hmmm!! Interesting!!
Fantastic, it helped me a lot. Thanks!
love the show, make it a regular thing
More Like This Please
Super awesome ..!!!
Very well explained, thank you. I'll be waiting for the next episode. :)
You can catch all 3 episodes at g.co/compressorhead
Colt McAnlis so thanks!!
Huffman encoding does not obey the prefix rule here. How is it considered a VLC method?
Very understandable. Thanks!
Nice show 😀 ... really enjoyed it.
But shouldn't the code for B in the Huffman tree be 10 instead of 01, otherwise the VLC won't have the prefix property?
Maximum Awesome.
Excellent!
Thanks
Great video!
So awesome, now I can explain this.
Great job, I like your show.
02:42 is "eee" same as "s"?
At 6:45 I think you meant to say P1 and P2, not P0 and P1
There is an article from 3 Apr 2015 comparing 10 popular apps: www.androidauthority.com/voice-call-data-comparison-598541. It shows that many voice and video call apps consume less data than Hangouts. So I want to ask: do these apps use newer compression algorithms than Hangouts? Why isn't Hangouts better at compression and data consumption?
that is great...
awesome explanation :)
Will quantum computing allow for new compression algorithms?
With qubits it is certainly possible; we may only need one character per word, as they work in varying degrees of polarization.
@@piguy5450 Short answer: no. Longer answer: if they do, it would be by developing algorithms that make newer compression schemes feasible, something akin to Shor's algorithm for finding better patterns within a dataset. While qubits are nice, you don't save them to a hard drive, as they lose their quantum state quite quickly. At a certain point, improving compression becomes about finding novel patterns in seemingly random data, and there might be quantum algorithms that could do that with better time complexity, but generally we can already do all of that in classical computing, so the answer should be no.
However, quantum computing is great at one thing (while it actually sucks at most things), and that is mapping out the probabilities within quantum phenomena. And the probabilities of sequences are how you build compression tables, so in the very limited case of compressing data that is fundamentally about quantum phenomena, you would want to know those probabilities, and a quantum computer could actually provide them. But that's an edge case of a best-case scenario, constructed explicitly to find something it might do better. So the longer answer is also no.
Excellent video! At 18:07, how can the probability be 4? I think what he's explaining is frequency rather than probability.
Let's not forget variable length quantities, which work I suppose under the assumption the most occurring numbers are small.
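For anyone curious what the comment above means, here's a minimal sketch (plain Python, unsigned LEB128-style encoding: little-endian groups of 7 bits, with the high bit as a "more bytes follow" flag; formats like MIDI order the groups the other way):

```python
def vlq_encode(n):
    """Encode a non-negative int, 7 bits per byte, MSB = continuation flag."""
    out = []
    while True:
        byte = n & 0x7F
        n >>= 7
        out.append(byte | 0x80 if n else byte)
        if not n:
            return bytes(out)

def vlq_decode(data):
    """Reassemble the 7-bit groups until a byte with a clear high bit."""
    n = shift = 0
    for byte in data:
        n |= (byte & 0x7F) << shift
        shift += 7
        if not byte & 0x80:
            return n

print(vlq_encode(127))              # b'\x7f' -- one byte
print(vlq_encode(300))              # b'\xac\x02' -- two bytes
print(vlq_decode(vlq_encode(300)))  # 300
```

Small numbers take one byte and large ones grow as needed, which is exactly the "most numbers are small" assumption the comment describes.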
This is an amazing video! It would be very useful in my country (Brazil) if you subtitled it for us :)
Awesome video, but you need to create the Huffman codes top-down from the tree. In your example B ("01") starts with A ("0") ;-)
Awesome, but you forgot to reverse codes after obtaining them from the tree, so that 01 becomes 10
In the Huffman tree, shouldn't the code for B be "1-0" and not "0-1"? [19:00]
At 19:04 the codes are in reverse order.
Isn't the formula Σ P(i) · log2(1/P(i))?
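That's Shannon's entropy formula, H = Σ P(i) · log2(1/P(i)), which gives the average bits per symbol a code can hope to reach. A quick sketch (plain Python; the three-symbol probabilities are a made-up skewed source in the spirit of the video's example):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: sum of p * log2(1/p) over nonzero probabilities."""
    return sum(p * log2(1 / p) for p in probs if p > 0)

# A skewed three-symbol source: one common symbol, two rarer ones.
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits per symbol
```

For that source, a Huffman code like {0, 10, 11} averages exactly 1.5 bits per symbol, matching the entropy.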
I think that it is better to watch The Art of The Problems videos on information theory and Markov chains.
They are much clearer.
360p rly?
ok 1080p available :D
Maximilian Beck Sorry about the initial 360p. We promise it was not a compression issue.
Now I want to work at Google :)
Claude Shannon's first name is not pronounced like the word "cloud".
ASCII is 7 bit, not 8 bit.
Hahahha "pay attention"
Grats!! Great job.. I was gonna go write some VLC's, until ya got to F@#$%&n Markov Chain.... ~(8^(!)~
Colt McAnlis You would have done well with Leo Laporte some 10 years at #ZDTV =)
Thanks!
mad science?
here in 2024