But it's still worthwhile to look at this. Today, I stumbled across the Squash Compression Benchmark site, which shows nice graphs of a variety of compression algorithms (gzip among them) measured on different computers. You might assume a slow connection makes compression less attractive, but the opposite is the case: the slower the client's internet connection or route to the server, the more advantage you get out of gzip compression, or compression in general. Decompression is also about three times as fast as compression. And most people, most of the time, are not able to completely saturate their high-bandwidth internet connection with a single download.
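To make the bandwidth argument concrete, here is a back-of-the-envelope calculation. The link speed, compression ratio, and decompression throughput below are illustrative numbers, not taken from the benchmark:

```python
def transfer_time(size_bytes, bandwidth_bits_per_s, decompress_bytes_per_s=None):
    """Time to receive a file: wire time, plus decompression if compressed."""
    t = size_bytes * 8 / bandwidth_bits_per_s
    if decompress_bytes_per_s:
        t += size_bytes / decompress_bytes_per_s
    return t

# 50 kB file on a 1 Mbit/s link, assuming a 2:1 gzip ratio and
# 50 MB/s decompression throughput (illustrative figures):
uncompressed = transfer_time(50_000, 1_000_000)
compressed = transfer_time(25_000, 1_000_000, decompress_bytes_per_s=50_000_000)
# The slower the link, the more the wire-time saving dominates the
# tiny, nearly constant decompression cost.
```

On these numbers the uncompressed transfer takes 0.4 s while the compressed one takes about 0.2 s; halving the link speed doubles the saving while the decompression cost stays fixed.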
Note how the speed of your computer has a huge influence on what's true and what's a lie in terms of "compression is worth it". What's true for your desktop gaming rig may not be true for your low-cost mobile phone.
To elaborate: all screenshots but the very last are for the "E-Desktop Intel Core i3" machine, a typical "nothing special, not awesome" desktop computer, whereas the very last is for the "Raspberry 2", a not-quite-terrible but still low-power ARM mini-computer.
What is the speed impact of gzip on files for HTTP transfer?

Asked 10 years, 2 months ago. Viewed 16k times. Asked by Robert Martin.

On Ubuntu, it is probably located at:

You are likely to find it at:

Next, restart Apache. Configure this module in the default configuration file, which should be located at:
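The module in question is presumably Apache's mod_deflate (the file paths above were lost in extraction). A minimal configuration sketch, assuming mod_deflate is enabled; the MIME-type list is illustrative:

```apache
# Compress common text-based responses with mod_deflate
AddOutputFilterByType DEFLATE text/html text/plain text/css
AddOutputFilterByType DEFLATE application/javascript application/json
```

After editing, restart Apache (e.g. `sudo systemctl restart apache2` on Ubuntu) for the change to take effect.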
Some 21 percent, or about one in five active websites, use Nginx, according to the Netcraft survey. Nginx has a helpful guide for enabling its gzip module.
The steps described below are based on this guide. To begin, navigate to the nginx configuration file. For Ubuntu, as an example, this is probably located at:

Locate the gzip section.

I tried the system version of gzcat (Apple's build of gzip). For what it's worth, I also tested pigz 2. This is more like what I expected, which makes sense as I did most of my development on my laptop.
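The gzip section referred to above typically contains directives like these (a sketch using the standard ngx_http_gzip_module directives; the specific values are illustrative, not from the guide):

```nginx
gzip on;
gzip_comp_level 5;    # moderate CPU cost for most of the size win
gzip_min_length 256;  # skip tiny responses that won't benefit
gzip_types text/plain text/css application/json application/javascript;
gzip_vary on;         # emit "Vary: Accept-Encoding" for caches
```

Reload nginx (e.g. `nginx -s reload`) after editing for the settings to take effect.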
I still don't know why GNU gzip is so much slower on that Debian machine. I am far from the first to point out that it's faster to use zcat than Python's gzip library. The xopen module, for example, can use the command-line pigz or gzip programs as a subprocess to decompress a file, then read from the program's stdout via a pipe.
This approach provides a basic form of parallelization as the decompression is in a different process than the parser for the file contents. It's not hard to roll your own using the subprocess module, but there are a few annoying details to get right. For example, what if the gzip process fails because the file isn't found, or because the file isn't in gzip format?
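A minimal sketch of rolling your own with the subprocess module (this is not xopen's actual implementation, and the error handling is deliberately simplified to show where the failure surfaces):

```python
import subprocess

def read_gzip_via_subprocess(path):
    """Decompress `path` with the external gzip program and return the bytes.

    A real reader would consume proc.stdout incrementally; reading it all
    at once keeps the sketch short.
    """
    proc = subprocess.Popen(
        ["gzip", "-dc", path],
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    data = proc.stdout.read()   # succeeds even when gzip is about to fail
    stderr = proc.stderr.read()
    if proc.wait() != 0:        # the error only surfaces at exit time
        raise OSError(f"gzip failed: {stderr.decode().strip()}")
    return data
```

Note that a missing or corrupt file does not raise at `Popen()` time: gzip starts fine, writes a message to stderr, and exits nonzero, so the error is only visible once you `wait()` on the process and check its return code.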
The process will start successfully, then quickly exit. So, when is that error reported? An exciting new feature was added within the last month! If I read the benchmark correctly, on the test system the overall performance doubled.
Ruben Vorderman, one of the xopen authors, then did the next step of submitting a patch to Python to add a pipesize parameter to subprocess. By default chemfp uses my gzio wrapper to libz. It can be configured to use Python's gzip library, or to use a subprocess.
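For illustration, here is how the new parameter can be used. `pipesize` requires Python 3.10+ and a platform that supports `F_SETPIPE_SZ` (Linux); the 1 MiB value is an arbitrary choice for this sketch, not anything chemfp or xopen uses:

```python
import gzip
import subprocess
import sys
import tempfile

# Create a small gzip file to decompress (illustrative data).
with tempfile.NamedTemporaryFile(suffix=".gz", delete=False) as f:
    f.write(gzip.compress(b"x" * 100_000))
    path = f.name

kwargs = {}
if sys.version_info >= (3, 10) and sys.platform == "linux":
    # Request a 1 MiB pipe buffer; the kernel may clamp this value.
    kwargs["pipesize"] = 1024 * 1024

proc = subprocess.Popen(["gzip", "-dc", path],
                        stdout=subprocess.PIPE, **kwargs)
data = proc.stdout.read()
proc.wait()
```

A larger pipe buffer means fewer read/write round trips between the decompressor and the consumer, which is where the reported speedup comes from.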