GZIP is not enough!

  • Published: 12 Sep 2024

Comments • 71

  • @7thAttempt • 8 years ago +4

    Seems I'm a couple of years late watching this. Nevertheless, very interesting stuff! :) Thanks for posting!

  • @bcbigb • 8 years ago +16

    My vote is always for middle-out. I can't believe you Hooli guys can't get that figured out :-\

  • @SalmanRavoof • 4 years ago +1

    It's been years since I graduated as an engineer, but watching this made me realize I should've taken mathematics even more seriously.

  • @etmax1 • 8 years ago

    Very interesting, good to see I'm not the only person interested in reducing overheads.

  • @acolombo • 11 years ago +1

    I totally agree. I don't watch most of the videos Google Developers uploads because there are too many; I only watch the ones I'm interested in.

  • @DustinRodriguez1_0 • 7 years ago

    There's a problem with the delta approach: latency. Sure, you might save a few kilobytes, but each additional request incurs latency. Whatever raw transfer-speed improvements you get (which are not simply linear due to packet sizes; saving 1 byte inside a packet matters far less than saving the 1 byte that would overflow into a new packet) would almost certainly be eaten up by the latency of those extra requests, especially given how consumer connections are throttled on the upstream. (Not that these requests would be large enough to run into throttling themselves, but the way the throttling is done matters, especially when the connection is shared, such as in a family home.)
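
The break-even this comment describes is easy to sketch with a toy cost model. All figures below (RTT, bandwidth, file and patch sizes) are invented for illustration, not numbers from the talk:

```python
def transfer_time_ms(size_bytes, rtt_ms, bandwidth_bps):
    """Crude model: one network round trip plus serialization time."""
    return rtt_ms + (size_bytes * 8 / bandwidth_bps) * 1000

# Hypothetical figures: 100 ms round trip, 5 Mbit/s downstream.
RTT, BW = 100, 5_000_000

full_resource = transfer_time_ms(20_000, RTT, BW)   # re-send the whole 20 kB file
delta_patch = transfer_time_ms(5_000, RTT, BW)      # send only a 5 kB patch...
extra_request = transfer_time_ms(0, RTT, BW)        # ...at the cost of one more round trip

# With these numbers the extra round trip outweighs the bytes saved.
print(f"full: {full_resource:.0f} ms, delta: {delta_patch + extra_request:.0f} ms")
```

With a fatter pipe or a bigger resource the balance tips the other way, which is the commenter's point: the answer depends on RTT, not just bytes.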

  • @RahulAhire • 3 years ago

    This video is really a gem

  • @FranciscoLopez0 • 11 years ago

    Really great video; learned a lot and am excited to try out some of these methods.

  • @gfetco • 11 years ago

    Love it! Is there more? Can you please make a playlist for these educational videos?

  • @Ferdii256 • 8 years ago +1

    Great explanation! Thank you!

  • @DavidTanGQ • 8 years ago +1

    Could you please explain why the kiwis in the PNG with the extra two columns of pixels were not compressed? I would've expected the second kiwi to have fit in the 32k window, and perhaps part of the third kiwi as well.

  • @georgigeorgiev2219 • 9 years ago

    Thank you very much.
    GOD bless you.

  • @RonJohn63 • 9 years ago

    "Minification" reminds me of the code obfuscation companies used back in the day to distribute source code (back when the world was much more than Windows, OSX and Linux) while making it difficult for the user to read, and, even before that, the tokenization performed by the MS BASIC interpreter.

  • @fashnek • 11 years ago

    There is a mistake, in that the sorted list does not have the same cardinality as the source set. The correct result would be
    [0,1,1,0,1,0,1,1,1,1,1,1]. He wanted an ideal list [0,1,1,1,1,1,1,1,1,1] to demonstrate the compression. In a real-world example it would be fine to have multiple entries that are the same; those would lead to a value of zero in the delta output, as I showed.
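
The scheme being discussed (sort the values, then store successive differences, so duplicate values become zeros) can be sketched in a few lines; the input list here is made up, not the one from the slide:

```python
def delta_encode(values):
    """Sort, then keep the first value plus each successive difference.
    Duplicate values become zeros, which compress extremely well."""
    s = sorted(values)
    return [s[0]] + [b - a for a, b in zip(s, s[1:])]

def delta_decode(deltas):
    """A running sum restores the sorted values (original order is lost)."""
    out, total = [], 0
    for d in deltas:
        total += d
        out.append(total)
    return out

nums = [3, 1, 4, 3, 2, 5, 4]
enc = delta_encode(nums)
print(enc)                          # [1, 1, 1, 0, 1, 0, 1] - zeros mark duplicates
assert delta_decode(enc) == sorted(nums)
```

Because the original order is lost, this only works when the data is a set (as in the talk's example), not an ordered sequence.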

  • @danfr • 8 years ago +1

    So I guess this history lesson is the reason Chrome is being cautious and hiding brotli behind a flag in Canary even though brotli was also invented at Google, while Firefox is going ahead and releasing it into the wild in the very update I'm downloading now.
    That said, assuming the proxy issues haven't disappeared (they may be less of a problem in the years since), it does tell me that those bugs won't be as big an issue for bzip2 or brotli in HTTPS traffic, which is a growing share of traffic.

  • @hexanet • 10 years ago

    The encode times are a much larger part of latency than I would have thought.
    The 50 ms of encode time per 100 kB of data is non-negligible for files that rarely change, like CSS or JS. Serving them as pre-compressed files makes a lot of sense; for CMS frameworks we just need the tools to do it for us.
    Is the given Amazon example based on the default gzip DeflateCompressionLevel (6)? And is this measured on an average server, or a local machine?
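
Pre-compressing static assets at build time, as this comment suggests, is a one-liner with Python's stdlib. The asset and the choice of level 9 below are my own illustration, not figures from the talk:

```python
import gzip

css = b"body { margin: 0; padding: 0; }\n" * 200  # stand-in for a rarely-changing asset

# Paid once at build/deploy time rather than on every request:
precompressed = gzip.compress(css, compresslevel=9)

# A server can then send `precompressed` as-is with `Content-Encoding: gzip`,
# skipping the per-request encode cost entirely.
assert gzip.decompress(precompressed) == css
print(len(css), "->", len(precompressed))
```

Servers commonly support this directly; nginx, for example, has a `gzip_static` module that serves a pre-built `.gz` file sitting next to the original.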

  • @fashnek • 11 years ago

    The section around 27:00 is incorrectly stated and misleading -- "GZIP is inflating the smaller file" is /wrong/. Those red numbers are not an indictment of GZIP at all and are not indicating that GZIP is harmful or "scary". They're an indictment of the genetic-algorithm-based minifier tools, which make the data inherently /less compressible/. In other words, they make GZIP a little bit less helpful, NOT harmful. GZIP is no less of a "silver bullet" with this argument. GA minification is.
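
The compressibility point here is easy to demonstrate: gzip's win depends entirely on redundancy in the input, so any transform that strips redundancy leaves gzip less to do. Both inputs below are made up for the demonstration:

```python
import gzip
import random

# Redundant input: the kind of repetition gzip's LZ77 stage feeds on.
redundant = b"function handleClick(event) { return event.target; }\n" * 50

# Low-redundancy input: pseudo-random bytes leave gzip almost nothing to find.
random.seed(0)
dense = bytes(random.randrange(256) for _ in range(len(redundant)))

for label, data in (("redundant", redundant), ("low-redundancy", dense)):
    print(label, len(data), "->", len(gzip.compress(data)))
```

The redundant blob shrinks dramatically while the dense one barely moves (or even grows slightly from header overhead), which is exactly the effect the red numbers in the slide were showing.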

  • @hay534 • 11 years ago

    I believe this is because people like the idea of being an expert in one thing, but only a few are eager and interested in discovering more about what is said to be important.

  • @Jirayu.Kaewprateep • 1 year ago

    📺💬 I have been working with Unreal Engine for 3 years, and the problem in games is that objects are at game sizes; when we move pixels, that is a lot of data.
    🥺💬 I understand; in games too, GZIP is involved in compressing pixels, since some functions can work with data while it is in compressed format.
    📺💬 Huffman codes assign bit codes by how often words appear in the context: more frequently occurring data is represented with fewer bits, or prioritized in order, for symbol or word encoding.
    📺💬 You should correct the comment first. 🥺💬 It is true that with Huffman encoding common words like a, an, the get the priority bits, but longest-match searching finds how many times sequences appear in the context. Huffman is good for encoding words because there are not many repeated words once the table is built, but what about logging, number locations, and the bits that represent them⁉
    🥺💬 I also read that WinZip compresses some text formats by over 70 percent because it supports both longest-match search and Huffman coding.
    🧸💬 An advantage of GZIP is that you can still work with data while it is in compressed format.
    📺💬 Delta compression is where we send patches to update the data on the client while the client continues working on the original file.
    🧸💬 There are applications beyond updating a text file of JSON parameter values: visualization of objects, screen-transform data, mouse pointers, keyboard input and rules.
    🐑💬 It is a good application but a security concern; it was invented many years back, but an attacker can replay this transmission by reading the encrypted packets. Given today's real-time constraints, long encryption algorithms help with this method.
    📺💬 Horizontal delta compression 🧸💬 By transferring data grouped by field format and priority you can reduce the size of communication packages, because the data required to process a reply arrives at the same time.
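
The Huffman description in this thread is hard to follow; the core idea is simply to count symbol frequencies, repeatedly merge the two rarest subtrees, and read codes off the tree, so frequent symbols get the shortest codes. A minimal sketch (the input string is arbitrary):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Map each symbol to a bitstring; frequent symbols get shorter codes."""
    # (frequency, tiebreaker, {symbol: code-so-far}) triples on a min-heap.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # pop the two rarest subtrees...
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))  # ...and merge them
        tiebreak += 1
    return heap[0][2]

codes = huffman_codes("aaaabbc")
print(codes)  # 'a' (most frequent) ends up with the shortest code
```

This is the same Huffman stage that runs inside GZIP's DEFLATE, after the LZ77 longest-match pass the thread mentions.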

  • @JensVanHerck • 11 years ago

    It's an official Google channel; I guess a lot of people subscribed over the years but not many watch every vid?

  • @ramprasath9086 • 5 years ago

    Very great explanation, sir!

  • @mikedotexe • 8 years ago

    This video is amazing.

  • @Oswee • 5 years ago

    Great talk!

  • @snetsjs • 8 years ago +1

    Doesn't LZMA offer the best web solution, because it has the smallest storage and network footprint and the fastest decoding? I ask because encoding time seems nowhere near as important, since it is done once before deployment.

    • @JohnJones1987 • 8 years ago

      +snetsjs gzip is about 2x faster at decompression than any LZMA, although LZMA compresses significantly better, getting the content to you (for decompression) quicker, so there's an awkward tradeoff. Basically, the faster the network, the more attractive gzip is.

    • @JohnJones1987 • 8 years ago +1

      +John Jones Oh, I just saw the bit where he provides the numbers, which is why you thought LZMA is faster than gzip. Those stats are garbage: LZMA is way better at compression than gzip, but takes much longer to compress and a little longer to decompress. The LPAQ and bzip stats look reasonable, so it's really only LZMA that looks weird to me. It's all about the data you're trying to compress at the end of the day, but these stats are weird.
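
The gzip-vs-LZMA tradeoff this thread debates can be measured directly with Python's stdlib. The payload below is a made-up redundant blob, so the exact ratios and timings will differ from the talk's slides and from real traffic:

```python
import gzip
import lzma
import time

payload = b'{"user": "example", "items": [1, 2, 3]}\n' * 2000

for name, mod in (("gzip", gzip), ("lzma", lzma)):
    t0 = time.perf_counter()
    blob = mod.compress(payload)
    encode_s = time.perf_counter() - t0
    assert mod.decompress(blob) == payload  # round-trip sanity check
    print(f"{name}: {len(payload)} -> {len(blob)} bytes, {encode_s * 1000:.1f} ms encode")
```

On typical text-like data LZMA produces a smaller blob at a noticeably higher encode cost, which is the tradeoff both replies describe.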

  • @troooooper100 • 8 years ago +2

    The idea of making your API results complete hell to deal with to save 100 bytes, yeah, seems like a bad route.

  • @michaelmoore5438 • 7 years ago

    Wow, how about a 90% compression system that is math-based and not Huffman-based? What effect would that have on the net... video streaming, etc.

  • @LaurentPerche • 11 years ago

    From 25:00, how do you sort the combination of digits, since 3 and 2 are both listed twice?

  • @sk8rz21 • 7 years ago +1

    24:50 there are two 2's and two 3's

  • @ah64Dcoming4U • 7 years ago

    Can we get this in 1080p or higher?

  • @Mr69barracuda • 9 years ago

    OWWWWwwwww MY HEAD~ My head.... I need ibuprofen & compression.

  • @AdamRicheimer • 11 years ago

    I was hoping to hear about SDCH.

  • @dipi71 • 7 years ago

    About this newfangled WebP format: yeah, not too great; it sounded too good to be true. Wikipedia cites researchers describing much blurrier results than with JPEG, and no significant reduction in size in memory or storage.

  • @JoshuaJamison • 11 years ago

    Drinking game. Take a drink every time he says "Fantastic".
    Good luck.

  • @mieszkogulinski168 • 7 years ago

    5:37 - as of 2017, Firefox, Safari and Edge still don't support WebP

    • @mieszkogulinski168 • 7 years ago

      11:50 - large size also plagues native JavaScript. I was working on a small web app - a shop with custom T-shirts - and it got to about 1.1 MB of JavaScript bundle (Browserify), 700 kB after minification. Most of the size was taken up by the Firebase and React libraries.

  • @rudde7251 • 7 years ago

    Is this solved in HTTP/2?

  • @ah64Dcoming4U • 7 years ago

    playing 0:10 over and over to understand how he got such speed

  • @JackQuark • 11 years ago

    I use Chrome every day, and here is the developer.

  • @delphibit • 8 years ago +5

    Me ( aka super sexy bald guy) trolololol

  • @primodernious • 6 years ago

    What about converting the whole data file into a gigantic expandable numerical array, then summing all the values into a gigantic floating-point number and storing that number in a new file? Or would that not work?
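
To answer the question posed here: no, it can't work, because a sum is not invertible; many different files collapse to the same total, so the original can never be reconstructed from the single number. Two lines show a collision:

```python
# Summing throws away information: distinct inputs share the same total,
# so the original values cannot be recovered from the sum alone.
a = [1, 2, 3]
b = [0, 2, 4]
assert sum(a) == sum(b) == 6  # two different "files", one identical sum
```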

  • @SurvivalSquirrel • 8 years ago +3

    There is not that much of a win with anything other than gzip. But so much blah blah.

  • @myselfe-mail6078 • 8 years ago +1

    THE BALD GUY FROM BRAZZERS, AHAHAHAHA! COOL, I MADE A JOKE :)

  • @Xeratas • 11 years ago

    I wonder: you've got 400k subscribers but only around 100-1000 views per video. Why is that?

  • @FernandoBasso • 9 years ago +1

    Six down votes? Why?

    • @saeidyazdani • 9 years ago +2

      +Fernando Basso Until now, 12 people came to watch cats...but got bombarded with knowledge

  • @MohamedHomaid • 11 years ago

    Thanks GoogleDevelopers.

  • @navin27in • 11 years ago

    maybe because of the sheer volume of video uploaded... people selectively view videos...

  •  6 years ago

    I think it's good stuff.

  • @portblock • 7 years ago

    Sounds like it's the middle boxes' problem, not yours. If the browser and server determine what they can use, then all is good; if there is a middle box in the way, then let them get the complaints.
    Also, if I may add, a lot of the page loading times I see are not due to a 2-5% saving in content size, but rather to poor developers using all these frameworks, causing so much overhead and including third-party stuff (java, css, etc.) from third-party sites. It's horrible.

  • @GoatTheGoat • 8 years ago +2

    Great info here. But I cringe every time I hear gif mispronounced. 'Choosy developers choose gif.'

    • @briankelley4568 • 8 years ago +1

      +Ryan Patterson Jraphics interchange format?

    • @scotthannan8669 • 8 years ago +4

      +Ryan Patterson Choosy developers choose more than GIF and they also don't pronounce "Graphics" with a "J".... JIF is like a bad dream

    • @butterfury • 8 years ago +2

      +Ryan Patterson It will always be Jif to me.

    • @bonbonpony • 8 years ago

      Yeah, and "jizzip" :P most obviously from JiNU ZIP :P
      If all Google developers are that competent, that would explain the constant degradation of YouTube and their other websites :q

    • @robl4836 • 7 years ago +1

      That's how the guy who created the format pronounces it... "J"if. So that is the correct way I guess.

  • @agausmann • 9 years ago +14

    Thank you for not pronouncing "gif" like "jif."

  • @shawn576 • 7 years ago

    If it were a human, it would be able to drink and drive. lol
    18:10

  • @Vidmich • 11 years ago

    Remove bzip support because of broken proxy software? And you call this idea "smart"? Are you kidding??

  • @Alexey112 • 11 years ago

    399 thousand people wanna be programmers

  • @user-of1zz5vw9l • 6 years ago

    A terrible old geezer. I knew it 😠