Look up the Sloot Digital Coding System; it's a super neat idea and would be better than any known file compression method today... Supposedly it was 100 million times better than the compression methods of its day, and some think the inventor was killed. It was said that around 16 full-length movies could fit in 8 kilobytes of space, all running at once, back in the 1990s. No one knows how it was done, but supposedly the compressed files had unique patterns, or key tags, and that was all it took to represent the needed information in processing, which made it very autonomous and fast. ruclips.net/video/33Ij_IOQlx4/видео.html
MarioDragon Indeed, Google Fiber... PLEASE... save us from the dumbass ISPs that think 12Mbps is sufficient (crap AT&T U-verse, that is). 60Mbps down and 11 up here, and even I don't think that is fast enough.
They are intentionally making huge games and software so you buy more memory, more storage, GPUs with larger memory, pay more for bandwidth, etc. They are also obsessively pushing high resolution, whether it's needed or not, so that you upgrade your screen and the whole pipeline, including processing and rendering capabilities, to keep up.
I have an idea: encode the binary of a file using MD5 (you can "convert" terabytes to kilobytes). Then, with super-fast computers, test many binaries to see if one matches. But it takes years 😂
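A toy sketch of what that brute-force search would look like, using a hypothetical two-byte "file". Even two bytes means up to 65,536 candidates, and since MD5 maps many inputs to the same 128-bit digest, a match would not even guarantee you recovered the original data.

```python
import hashlib
from itertools import product

# Pretend our "file" is just two bytes, and all we kept is its MD5 digest.
target = hashlib.md5(b"hi").hexdigest()

# Brute-force every possible 2-byte input until one hashes to the target.
found = None
for combo in product(range(256), repeat=2):
    candidate = bytes(combo)
    if hashlib.md5(candidate).hexdigest() == target:
        found = candidate
        break

print(found)  # b'hi'
```

Scale that up: a 100MB file has 2^800000000 possible contents, which is why hashes work as fingerprints but not as compression.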
Bought Call of Duty: Modern Warfare yesterday... 200GB+ is the reason I'm here. Wow, just wow. Still waiting for it to download. 24 hours and counting, wtf!!!
Hearing the difference now isn't the reason to encode to FLAC. FLAC uses lossless compression, while MP3 is "lossy". What this means is that for each year the MP3 sits on your hard drive, it will lose roughly 12kbps, assuming you have SATA; about 15kbps on IDE, but only 7kbps on SCSI, due to rotational velocidensity. You don't want to know how much worse it is on CD-ROM or other optical media. I started collecting MP3s in about 2001, and if I try to play any of the tracks I downloaded back then, even the stuff I grabbed at 320kbps, they just sound like crap. The bass is terrible, the midrange is... well, don't get me started. Some of those albums have degraded down to 32 or even 16kbps. FLAC rips from the same period still sound great, even if they weren't stored correctly, in a cool, dry place. Seriously, stick to FLAC; you may not be able to hear the difference now, but in a year or two, you'll be glad you did.
Richard Johnson Hahaha, who started the whole "MP3s have an expiration date" thing? If that were the case, not only audio files would degrade, but other files would too. In other words, Photoshop wouldn't launch a year after being installed. And my Photoshop works fine after several years, so, yeah.
Appreciate the explanations, as I turn my family on to them when the need arises. Can't wait to see deduplication explained. Seems to be the next logical step.
I'd like to know about the Windows 10 (and 7 & 11, if it exists there) data compression setting: how much and how well it compresses, which resources are utilized more after enabling that option, and whether it's worth it.
I was curious when it came to sending data from point A to point B, why do we only send data in binary? For example, using fiber optics, if we divided the light spectrum into 256 colors and sent colors as information instead of on/off, every "bit" of data transfer would have a byte of information packed in it. Of course you would need high-tech senders/receivers, and perhaps that is the bottleneck, but curious to get others thoughts on this.
Believe it or not, wifi and digital TV work more like you suggest: sending multiple bits simultaneously on separate neighbouring frequencies (OFDM), multiple amplitudes (QAM), or using 4 phases to encode 2 bits at once (QPSK) rather than 2 phases to code 1 bit at a time (BPSK). However, using 256 different frequencies is a little extreme and uses a lot of bandwidth; in fact, most telecoms fibre is a specific thickness for an infra-red frequency, and a different frequency of light wouldn't travel so well down it. Oh, and you need to be able to split it back up at the other end; with just mixing 3 colours I can end up with a full rainbow at the other end.
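A minimal sketch of the "more than one bit per symbol" idea the reply describes, using QPSK as the example: each pair of bits picks one of four phases, so every transmitted symbol carries two bits. The particular Gray-coded constellation below is illustrative, not tied to any specific standard.

```python
import cmath
import math

# Gray-coded QPSK constellation: each 2-bit pair selects one of four phases.
QPSK = {
    (0, 0): cmath.exp(1j * math.pi / 4),
    (0, 1): cmath.exp(1j * 3 * math.pi / 4),
    (1, 1): cmath.exp(1j * 5 * math.pi / 4),
    (1, 0): cmath.exp(1j * 7 * math.pi / 4),
}

def modulate(bits):
    """Pack bits two at a time into complex symbols: 2 bits per symbol."""
    assert len(bits) % 2 == 0
    return [QPSK[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

symbols = modulate([1, 0, 0, 1, 1, 1, 0, 0])
print(len(symbols))  # 4 symbols carry 8 bits, double the rate of BPSK
```

256-QAM pushes the same trick to 8 bits per symbol, at the cost of needing a much cleaner signal to tell the levels apart.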
That lossless compression algorithm is how zip bombs like 42.zip were made: by repeating a '0' millions of millions of times, a 42KB file becomes petabytes in size when unzipped.
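A quick sketch of why that works: DEFLATE-style lossless compression collapses long runs of identical bytes, so the compressed form is tiny relative to what it expands back into (the sizes in the comments are approximate, not exact).

```python
import zlib

data = b"0" * 1_000_000            # a million identical bytes
packed = zlib.compress(data, 9)    # maximum compression level

print(len(packed))                 # on the order of 1 KB for 1 MB of input
assert zlib.decompress(packed) == data  # lossless: expands back exactly
```

42.zip pushes this further by nesting archives inside archives, multiplying the expansion at each layer.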
Way cool brother, many many thanks for the intelligent & entertaining, informative explanation. You're educating older dudes like me on what all this is about. Thanks again, Brock (from Down Under, Australia)
Yeah, but how can you compress binary that way? Binary is already the simplest way to express something, so 11100111 just becomes "3 00 3", i.e. 110011, not much of an improvement either. And consider 11110011: it would have "100" in it, so how is the algorithm gonna know the difference between a run count and an actual 0?
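That ambiguity is exactly why real run-length encoders don't emit bare digits: each run is stored as a (symbol, count) pair, so a count can never be mistaken for a literal 0. A minimal sketch:

```python
def rle_encode(bits):
    """Run-length encode a bit string as (symbol, run length) pairs."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1                 # extend the current run
        runs.append((bits[i], j - i))
        i = j
    return runs

def rle_decode(runs):
    return "".join(symbol * count for symbol, count in runs)

print(rle_encode("11110011"))  # [('1', 4), ('0', 2), ('1', 2)]
assert rle_decode(rle_encode("11110011")) == "11110011"
```

The commenter is also right that RLE buys little on short, run-poor data; it only pays off when runs are long, which is why it's a building block rather than a complete compressor.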
I would have liked a more technical video on data compression, even if I already know the principle... But thanks anyway, that was fun and instructive for a noobie.
Techquickie Just wanted to inform you that you did not link to the video compression video. There's no link in the description, or a clickable annotation.
I thought you were going to advise software to use. I know what it all is, but I was hoping for tips on the best software, as my podcast is too large for the site I use, and I've tried Audacity & Wondershare with no success so far. Any tips on software that won't break the bank and will bring a podcast audio file down from about 920MB to 250MB?
Or how about backing up that server you lost? If you could, say, compress all that to fit on a cheap 1TB drive, making backups would be a lot easier. And with compression, network speeds can effectively be faster, like sending 1TB over a slow 1Mb/s network. We're about maxed out on our frequencies, so data compression is the key to faster network speeds.
How about an episode on instancing, motion blur, and framerate/refresh rate? It's a very underrepresented topic, but it's actually really easy to understand, and might help clear up confusion about why shader-based motion blur is a thing, and why high refresh rates matter for more reasons than "they just like feel better and stuff mang".
You can compress better if you assign a symbol to an entire row of programming-language words, then at the destination use a dictionary to reverse the compression.
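That's essentially the dictionary-coder idea. A toy sketch where each word is replaced by its index into a shared wordlist; as the comment notes, both sides must hold the same dictionary for the reversal to work:

```python
def compress(text):
    """Replace each word with its index into a shared wordlist."""
    words = text.split()
    wordlist = sorted(set(words))              # the shared "dictionary"
    index = {w: i for i, w in enumerate(wordlist)}
    return wordlist, [index[w] for w in words]

def decompress(wordlist, codes):
    return " ".join(wordlist[c] for c in codes)

wordlist, codes = compress("to be or not to be")
print(codes)  # [3, 0, 2, 1, 3, 0]
assert decompress(wordlist, codes) == "to be or not to be"
```

The saving comes from repeated words costing only a small index instead of the full spelling each time.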
Wrong statement. It is not always true that smaller compressed files are of lower quality. For example, if 2 different algorithms are used on the same raw (uncompressed) data, the one with the smaller compressed size might actually be HIGHER quality than the other if it uses a much better (or even slightly better) algorithm. There is no direct correlation between file size and quality. Downvoted for inaccuracy. Also, for the XXX00XXX example, why not encode it as X, 0, 231, which is only 3 bytes instead of what appears to be 8 bytes?
If you had a hash of a file along with the file size, you could rebuild it. With a modern computer it would take years, but with a quantum computer it might be possible to crack it fast enough. If a quantum computer could decode a hash for a 100MB file in less time than it takes to send 100MB over the internet, it could be used to increase internet speed. I don't know the speeds of a quantum computer; maybe you could decode a 1TB file from a 1KB hash and get internet speeds of 1TB/s over any connection. You would probably get more than one file matching the hash, but if all the files were zip files, only one of the zips would work. It's a lot of stuff to do, and expecting it to work almost instantly, quantum or not, might be too much to ask.
Each year millions of data bits are lost to lossy compression. It doesn't have to be this way. If you have any information on lost data, please contact crime stoppers.
+Thecawesomeone Spread the word, brotha. This has to end!
Just send them to Google
Bruh
a few MB doesn't sound that bad
Probably closer to trillions tbh
I remember back in the day, I used compression method which I made on TI-83 calculator where I converted text and numerals into basic code which I converted into pixels on picture. When the message was encrypted and compressed, it looked like a random pixels on screen. Little did my teacher knew, I had all the answers on it lol
That's amazing; the creativity of it, I mean
Teach me the ways
Woah, I wish I had that calculator
Is it me, or are the ad segments getting longer and longer..
**Slow clap**
**Fast clap**
**Lossy clap**
**Dick slap**
**Lossless clap**
2:19
So data compression looks like a 12 year old's gamer tag?
More like a chemistry geek
They watched Vin Diesel's xXx movies too much. That's why they think "xXx" sounds cool!!!
@@sinnyjohns4362 c'mon guys, you were the same!
How do you use the time stamp thing?
@@Error-xl3ty You just type the minutes and seconds. 4:20 for example and it will automatically do it for you.
It's a shame a lot of game developers and publishers seem to have forgotten about data compression. Titanfall is like 50GB, and 35GB of that is just audio files... I ain't joking. There are other games as well. Compress your damn games.
Just because 4TB hard drives are getting a lot cheaper doesn't mean devs can give us 50GB games that take ages to download, never mind store (we don't all have crazy download speeds)...
It's harder to pirate that way ;)
So there you go, one more reason out of the many.
XJOEMANN13 That is what I'm getting at: they should compress their games. I couldn't believe Titanfall had 36GB of audio files.
I bet they are compressing their files. It is not as easy as "I'm gonna compress my game so people will need to download 20GB less". They will surely not compress the audio lossily, and compressing audio losslessly will only give about a 20% reduction in size. That's still about 30GB of audio files to download IF they are not compressed already. Also, compression has its drawbacks: decompression takes relatively a lot of work.
Don't act like people don't know what they are doing when you obviously don't know much about what THEY are doing.
(Sorry for my English, it's late, I'm not a native speaker and I will not spellcheck this :P)
Carlos W. Why would it be harder to pirate? Most releases are packed anyway, so if there is something that can be compressed it will be.
Next quickie should be about noise/sound dampening, since you are already in the process of treating your new office with fancy echo-reducing foam stuff?
That would be nice. I would love to sound-dampen my game room.
woopygoman Call it Techquickie as fast as possible.
Just noticed Linus has a unibrow lol
Almost but not quite.
Most likely just a shadow. The lighting may need to be adjusted.
He must not be using dollar shave club
Internet explorer?
around 1:00
Why are LinusTechTips and Techquicky separate channels?
They are pretty different content formats
Touché. It just confused me at first, as it confuses me when Linus appears on NCIXcom. But it does make sense (at least the Techquickie/LinusTechTips split).
My mom learns a lot from techquickies, but she doesn't care about the detailed reviews etc. of hardware.
Ryan McClure ask Vsauce
+Ryan McClure Mooaahh moohhneeeyyy
Watching a 4K video on a 1080p monitor. I don't flow with the crowd, I do things my way!
+ThatGuyWithHisHat Watching 4K video on my shitty 1366x768 laptop monitor!
Mind you, that is impossible; the lowest resolution that can upscale to 4K is 1080p.
my next techquickie suggestion:
give a list of some important tech topics that we may not know about but might find interesting. give a 1 sentence summary to maybe pique our interest? then we vote on which one we want to see the most
Pied Piper
Richard :)
lol, just watched an episode, came here from Shannon coding!
loll just watched an episode too
Sad :( it ended
@@rickie_bo so sad it ended shame
Oh my god no way this was the first freshbooks sponsor!! I see this jawn weekly
ikr
0:45 He catches us standing on our bag trying to .ZIP it up ...
*Applause*
@@spikegorman1650 That was two years ago...
He was waiting for that applause
Your videos help me so much. Thank you.
I try to compress data on my floppy disk, but when I see someone attractive my hard drive shifts from rotary to solid state, making it even "harder to compress"
same thing with my penis :(
haha sad as this may sound I actually thought that was brilliant, made ma day haha lol
if i were a guy id think thats funny lol
Oh god
I've watched an entire Pulseway ad with Linus in it thinking it was the video
Nice ad
I'd like to see one about lossy compression, i.e. how mp3 or aac file compression works to save space, and their differences.
I didn't understand pretty much any of powers in like 10 hours of math, but I understand it now. Thx Linus :)
This video doesn't apply to me in any way, shape, or form, but they're all so fun to watch. It's basically better than cable.
Like others have said: game makers, please compress your damn files. As for what was shared in this video, it hurt my head at times, but thanks Linus for trying to teach me something.
This was super funny. You’ve got great comedy skills.
that new sponsor is like a breath of fresh air
This is not a satisfactory explanation of what data compression is. The example given starts off as simple run-length encoding, then quickly takes a deep dive into a weird imaginary realm. This "rudimentary example" is more convoluted than what actually happens in reality and does not further understanding of the subject.
Please, consult with engineers or developers before writing about subjects you are not familiar with. They'll give you much better examples that use real-world analogies that are both accurate and easier for a layman to understand.
**Edited for politeness: I was too reactionary and said it was the "worst explanation of data compression on youtube", which might have been too abrasive. Sorry Linus and Linus fans.
Well, if you're so knowledgeable, how about you explain this topic in a more "accurate" and "easier" to understand way?
TheMohawkNinja Well explaining things on youtube is not my job... (plus I get the feeling that you're being facetious and don't really care) but it's saturday and I'm feeling generous so why don't I give it a shot (plus some might find it educational):
First of all, I wouldn't even bother with run-length encoding (the algorithm Linus was describing until 2m30s mark). Today's popular lossless compression algorithms (zip, gz, 7z, etc etc etc) are all dictionary-based.
Suppose you want to compress a book. Instead of spelling out each word, simply write the dictionary entry number. (Example: page 32, word 8; see how every word in the book can now be represented with just 2 numbers.) As long as you and I have the same dictionary, we can easily pass around a much smaller book and decompress it at will.
And that's it! This is literally what happens when you "zip" something. Each algorithm has its own unique ways to build, store, and look up entries in the dictionary (and some "massage" the data so it can be more easily compressed), but those are relatively minor variations.
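For anyone curious, here's a toy sketch of that dictionary idea in Python (roughly LZ78-flavoured; real zip formats are more elaborate, but the principle is the same):

```python
def dict_compress(text: str):
    """Toy dictionary coder: replace each word with its index in a
    dictionary built on the fly from the text itself."""
    dictionary = {}
    tokens = []
    for word in text.split():
        if word not in dictionary:
            dictionary[word] = len(dictionary)  # first sighting: new entry
        tokens.append(dictionary[word])         # every word becomes one number
    return dictionary, tokens

def dict_decompress(dictionary, tokens):
    """Reverse the mapping: numbers back to words."""
    reverse = {i: w for w, i in dictionary.items()}
    return " ".join(reverse[t] for t in tokens)

d, t = dict_compress("to be or not to be")
print(t)                      # [0, 1, 2, 3, 0, 1] — repeated words cost one number each
print(dict_decompress(d, t))  # "to be or not to be"
```

The win comes from repetition: the second "to be" is just two small integers instead of five characters. Real algorithms also encode the dictionary itself compactly, which this sketch glosses over.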
Steve Jacobs But whining and bitching to Linus about how he did a shitty job explaining it... IS? (facepalm) It's always easier to nitpick or criticize others' attempts at something than to take responsibility for TRYING to do the same thing ourselves, to REALLY see if we could have done a better job or not, eh? Move on, kid. No one is here to be your therapist.
Chronofusion What you're saying is akin to "you can't be a film critic unless you know how to make a movie." It's a silly and illogical argument.
Regardless, this channel is about explaining and educating people. Therefore it's important to have factually correct content.
Steve Jacobs Thank you for the explanation. =D
This man never changes his voice
You're awesome, TechQuickie!
Hey, there's never such a thing as "Bad Tandoori Chicken"
Shiva Rampersaud these weaklings can't handle the spices 😆
Call of Duty needs to write down some notes
Too bad devs think 50GB and 100GB games are OK to do.
I have top tier FIOS and 50gb takes like 30min.
fucking ridiculous.
It's only gonna get worse, complain to ISPs not the game developers
MarioDragon Indeed. Google Fiber... PLEASE save us from the dumbass ISPs that think 12Mbps is sufficient (crap AT&T U-verse, that is). 60Mbps down and 11 up here, and even I don't think that's fast enough.
Chronofusion I have 150/150 FIOS that's not the issue.
the 30ish min is retarded.
PC Master Race
I have 152 down 12 up. Homeworld remastered was 13GB (For essentially four games) and I downloaded that in 15 minutes.
Chronofusion *Ahem* Your argument is invalid. 12MB/s is amazing compared to my connection. THIS is NOT okay. i.imgur.com/AZW83yX.png
You really love those stock photos
Your "video compression" video at 3:37 does not have an annotation
'Reduce storage requirements'.
Meanwhile games are hitting nearly 100gb.
Either it's compressed badly, or devs are just lazy.
Or your games are being rendered with 4k textures and massive amounts of content.
They are intentionally making huge games and software so you buy more memory, more storage, GPUs with larger memory, and pay more for bandwidth, etc. They are also obsessively pushing high resolution, whether it's needed or not, so that you upgrade your screen and the whole pipeline, including processing and rendering capabilities, to keep up.
J *sigh*
Richard Hendrix’s compression algorithm
What kind of compression was Linus compressed with?
Linus can make a video assembly language as fast as possible? Very important.
I have an idea: encode the binary of a file using MD5 (you can convert terabytes to kilobytes). Then, with super-fast computers, test many binaries to see if one matches. But it takes years 😂
Bought Call of Duty: Modern Warfare yesterday... 200GB+ is the reason I'm here. Wow, just wow. Still waiting for it to download. 24 hours and counting, wtf!!!
Hearing the difference now isn't the reason to encode to FLAC. FLAC uses lossless compression, while MP3 is "lossy". What this means is that for each year the MP3 sits on your hard drive, it will lose roughly 12kbps, assuming you have SATA about 15kbps on IDE, but only 7kbps on SCSI, due to rotational velocidensity. You don't want to know how much worse it is on CD-ROM or other optical media.
I started collecting MP3s in about 2001, and if I try to play any of the tracks I downloaded back then, even the stuff I grabbed at 320kbps, they just sound like crap. The bass is terrible, the midrange is well don't get me started. Some of those albums have degraded down to 32 or even 16kbps. FLAC rips from the same period still sound great, even if they weren't stored correctly, in a cool, dry place. Seriously, stick to FLAC, you may not be able to hear the difference now, but in a year or two, you'll be glad you did.
Low quality b8 m8. 8/80 b8 hrder h8tr.
Thanks For the tip Ed!
1st, this is at least 4 years old. 2nd, this is complete and utter bullshit. Files do not degrade over time, smartass.
There is no loss, or else the file would become corrupted.
Richard Johnson Hahaha, who started the whole "MP3s have an expiration date" thing? If that were the case, not only audio files would degrade, but other files would too. In other words, Photoshop wouldn't launch a year after being installed. And my Photoshop works fine after several years, so, yeah.
WOW! THE FIRST FRESHBOOKS BUMPER EVER!
I sure learn a lot from Techquickie.
Appreciate the explanations, as I turn my family on to them when the need arises. Cant wait to see deduplication explained. Seems to be the next logical step.
Video starts at 5:10 after funny jokes
😂😂
great video
I'd like to know about the Windows 10 (and 7 & 11, if it exists there) data compression setting: how much and how well it compresses, which resources are utilized more after enabling that option, and whether it's worth it.
what's the improvement of data compression in 2020?
I was curious when it came to sending data from point A to point B, why do we only send data in binary? For example, using fiber optics, if we divided the light spectrum into 256 colors and sent colors as information instead of on/off, every "bit" of data transfer would have a byte of information packed in it. Of course you would need high-tech senders/receivers, and perhaps that is the bottleneck, but curious to get others thoughts on this.
It's seems like a good idea, but remember that it would probably be expensive and only be used by companies.
Believe it or not, Wi-Fi and digital TV work more like you suggest, setting multiple bits simultaneously on separate neighbouring frequencies (OFDM), multiple amplitudes (QAM), or using 4 phases to encode 2 bits at once (QPSK) rather than 2 phases to encode 1 bit at a time (BPSK). However, using 256 different frequencies is a little extreme and uses a lot of bandwidth; in fact, most telecoms fibre is a specific thickness for an infra-red frequency, and a different frequency of light wouldn't travel so well down it. Oh, and you need to be able to split it back up at the other end; with just mixing 3 colours, I can end up with a full rainbow at the other end.
HELP I'M BEING COMPRESSED!
That lossless compression algorithm is how zip bombs like 42.zip were made: by repeating a '0' millions of millions of times, a 42KB file becomes petabytes in size when unzipped.
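You can see the principle with Python's `zlib` (42.zip itself uses nested DEFLATE archives to get past DEFLATE's ~1032:1 per-layer limit, but the idea is the same: repetitive data compresses absurdly well):

```python
import zlib

# 10 MB of one repeated byte: the most compressible data possible.
data = b"0" * 10_000_000
packed = zlib.compress(data, 9)   # DEFLATE at max compression level

print(len(packed))                # around 10 KB, roughly a 1000:1 ratio
assert zlib.decompress(packed) == data   # and it round-trips losslessly
```

Stack a few layers of that inside each other and a tiny archive can expand to something no disk can hold, which is exactly the zip-bomb trick.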
I found it! The origins of fresh books on techquickie!
way cool brother , many many thanks for the intelligent & entertaining informative explanation , your educating older dudes like me on what all this is about , thanks again , Brock (from downunder australia )
I get to watch this video for school, yay.
I wanted to know about this and here i find Linus helping me out as always :D
Yeah, but how can you compress binary that way? Binary is already the simplest way to express something, so 11100111 just becomes "3 00 3", i.e. 110011 — not much of an improvement either.
And consider 11110011: it would contain "100", so how is the algorithm going to know the difference between a run count and an actual 0?
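Good question — the usual answer is that real run-length encoders don't mix counts into the bitstream like the video's simplified example. One common scheme stores explicit (count, symbol) pairs, so a count can never be mistaken for data. A tiny sketch:

```python
def rle_encode(bits: str):
    """Run-length encode a bit string as (count, symbol) pairs.
    Counts live in their own field, so '4 ones' can never be
    confused with a literal '0' or '1' in the data."""
    out = []
    i = 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1                  # extend the current run
        out.append((j - i, bits[i]))
        i = j
    return out

print(rle_encode("11110011"))  # [(4, '1'), (2, '0'), (2, '1')] — unambiguous
```

You're right that for short, non-repetitive binary this can even be *larger* than the input; RLE only wins on data with long runs, which is why general-purpose compressors use dictionary and entropy coding instead.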
Freshbooks. We meet again.
Isn't this Linus? Channel name changed?
It's a sister channel. He's got many.
Why does the algorithm need to replicate the data in order to note how many times it's repeated? Can't it do that by itself?
Jesus. I didn’t come here to hear about luggage, stupid jokes or random garbage. Data compression is the title! Next!
I would have liked to hear words like "redundancy" and "entropy", since those are the fundamental principles of data compression
The Shutterstock pics are real in these videos haha I do love the editing humor tho
Would have liked more depth but entertaining and interesting - thanks!
I just checked the upload date when he said FreshBooks was the new sponsor
I would have liked a more technical video on data compression, even if I already know the principle... But thanks anyway, that was fun and instructive for a noobie
Techquickie Just wanted to inform you that you did not link to the video compression video. There's no link in the description, or a click-able annotation.
I’m watching this on a 4K tv and I can see Linus drooling lmfao
dude it took my cs teacher 40 min to explain the compression thingy and you did it in 5 min
Linus, how many channels do you have?!
LOL, I ended up finding the first FreshBooks ads they did.
PLEASE do a video about safely removing/ejecting USB drives and when it is or is not important.
Not important at all
Who else here knows Richard Hendriks?
So now we need a Fast As Possible about audio compression and bitrates; we already have video and general compression
I thought you were going to advise software to use. I know what it all is, but I was hoping for tips on the best software, as my podcast is too large for the site I use and I've tried Audacity & Wondershare with no success so far. Any tips on software that won't break the bank that will bring a podcast audio file down from about 920MB to 250MB?
Very nice! You reached 500k subs! A big thank you for these great videos!
Greetings from Germany!
Is this channel about computer science? If yes, then I would like to subscribe :)
Someone please send this video to Infinity Ward.
great vid Linus as always
Or how about backing up that server you lost? If you can, say, compress all that to fit on a cheap 1TB drive, making backups will be a lot easier. And with a physical form, effective network speeds can be faster, like sending 1TB over a slow 1Mb/s network. We're about maxed out on our frequencies, so data compression is the key to faster network speeds.
In your next video can you explain jan sloot's data compression technique
1st time quick books sponsored??
Please make a Video on comparison of compression techniques like 7z Zip Rar etc
funny and to the point. Thanks for those hilarious explanations! Helped so much!
1:56 yeah for free images! Thats a popular background!
Nice explanation, Linus. I hope you wrote the script.
Can you do a tech quickie on Arrays and Multi Dimensional Arrays
How about an episode on instancing, motion blur, and framerate/refresh rate? It's a very underrepresented topic, but it's actually really easy to understand, and might help clear up confusion about why shader-based motion blur is a thing, and why high refresh rates matter for more reasons than "they just like feel better and stuff mang".
You can compress better if you assign a symbol to an entire row of programming-language words, then at the destination use a dictionary to reverse the compression
Wow, I thought video ads were a no-no. Maybe Techquickie is special?
I think the in-house lower thirds for sponsors look better than the supplied ones. Probably because they fit the general LTT theme.
Wrong statement: it is not always true that smaller compressed files are of lower quality. For example, if 2 different algorithms are used on the same raw (uncompressed) data, the one with the smaller compressed size might actually be HIGHER quality than the other if it is a much better (or even slightly better) algorithm. There is no direct correlation between file size and quality. Downvoted for inaccuracy. Also, for the XXX00XXX example, why not encode it as X, 0, 231, which is only 3 bytes instead of what appears to be 8 bytes?
well linus u helped and save the day because I was studying lossless in my computer subject and I was confused
LINUS, answer this please: WHAT IS A DNS SERVER? Does it really make your internet speed faster?
i want linus to answer this and make a video about it.
google it or post on the forum
Bloody hell.... Another moron without that "option" what's it called again?? GOOGLE?
en.wikipedia.org/wiki/Domain_Name_System
Hey guys,
Whoever read my comment
Which is better zip or rar?
Rar
@Sur1u0nd_1
Thanks, man
Finally someone answered.
Data compression, like the 7-Zip or WinRAR archive programs; in them you can choose how much to shrink all the files.
Should I compress my C: drive? Because I want to know if it does something to my system and ruins it.
Data compression is for the overhead compartments more than the size of the suitcases (though those still matter)
*That drool on Linus' chin though...*
for the next techquickie, can you tell us more about haswell broadwell skylake or what those well well and lake lake means?
Anyone here in 2022?
ME
Me
im still here
If you had a hash of a file along with the file size, you could rebuild it. With a modern computer it would take years, but with a quantum computer it might be possible to crack it fast enough. If a quantum computer could decode the hash for a 100MB file in less time than it takes to send 100MB over the internet, it could be used to increase internet speed. I don't know the speeds of a quantum computer; maybe you could decode a 1TB file from a 1KB hash file and get internet speeds of 1TB/s over any connection. You would probably get more than 1 file matching the hash, but if all the files were zip files, only one of the zips would work. It's a lot of stuff to do, and expecting it to work almost instantly, quantum or not, might be too much to ask.
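Fun idea, but there's a counting problem that no amount of computing power (quantum or not) can fix: a hash has far fewer bits than the file, so enormously many different files share every hash value. A back-of-the-envelope sketch:

```python
# Pigeonhole sketch: an MD5 digest is 128 bits, but even a tiny
# 1 KB file is 8192 bits. So on average each digest value is
# shared by an astronomical number of distinct 1 KB files.
files_1kb = 2 ** 8192   # number of distinct 1 KB files
digests   = 2 ** 128    # number of distinct MD5 values

per_digest = files_1kb // digests
print(per_digest == 2 ** 8064)  # True: ~2^8064 candidate files per digest
```

So even a machine that could search instantly couldn't tell you *which* of those ~2^8064 candidates was the original; the hash simply doesn't contain enough information. That's really the same pigeonhole argument for why no lossless compressor can shrink every file.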
Nice, how did I miss this?
So which is the best software to ultra compress games?
Using this for GCSE revision