Does minifying and concatenating JS/CSS files, and using sprites for images still provide performance benefits when using HTTP/2?

With the new HTTP/2 protocol, the overhead created by repeated HTTP requests to the same server has been greatly reduced.

With this in mind, are there still any significant performance advantages to minifying and concatenating JavaScript/CSS files, and combining images into sprites? Or are these practices no longer useful when HTTP/2 is being used?

Bismuthous asked 20/2, 2015 at 13:27 Comment(9)
Minifying isn't concatenating: it reduces the overall size, just as sprites can usually mean less repetition in palettes and better overall compression. – Radiotelephony
@dystroy Ehm, OK, but that is not related to the question. I asked whether these three technologies still have any effect when the HTTP/2 protocol is in place. – Bismuthous
Since minifying and "spriting" reduce the total size of the files, yes, they still have an effect. What is unclear here? – Radiotelephony
I think it is related to the question. You ask "So is it then still useful to minify and concatenate JavaScript and CSS files", to which the answer is "yes" because <insert what @dystroy just touched on>. – Riyal
@dystroy 'Spriting' doesn't necessarily reduce the total size (a PNG header is just a few bytes, so that doesn't have much effect, and the image itself could be larger, because it always needs to be a rectangle). The purpose of spriting is mainly to reduce the number of requests (one for each image). – Bismuthous
I really don't see how this is opinion-based. It's asking about the performance benefits of minifying / concatenating / spriteifying assets when HTTP/2 is used. That seems anything but opinion-based to me. – Vidovic
I've reworded the question to sound more objective, but come on, people! Even before my edit it seemed pretty clear to me that the OP was asking about the performance benefits of these practices, not something squishy like "Is this a good idea or not?". – Vidovic
Related: docs.google.com/presentation/d/… – Vidovic
Related: idlewords.com/talks/website_obesity.htm – Owlet

Answer (50 votes)

They're still useful. HTTP/2 reduces the impact of some of these practices, but it doesn't eliminate their impact.

Minification remains as useful as ever. Although HTTP/2 introduces new compression for message headers, that has nothing to do with minification (which is about message bodies). The compression algorithms for message bodies are the same, so minification saves just as much bandwidth as it did before.
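
If you want to verify this for your own assets, here is a minimal sketch (TypeScript on Node, using the built-in zlib module) that gzips an original and a minified build and compares the sizes; the file names app.js and app.min.js are hypothetical placeholders:

```typescript
// Minimal sketch: compare gzipped sizes of an original and a minified bundle.
// app.js and app.min.js are hypothetical placeholders for your own files.
import { readFileSync } from "fs";
import { gzipSync } from "zlib";

const original = readFileSync("app.js");
const minified = readFileSync("app.min.js");

const gzOriginal = gzipSync(original).length;
const gzMinified = gzipSync(minified).length;

console.log(`original: ${original.length} B raw, ${gzOriginal} B gzipped`);
console.log(`minified: ${minified.length} B raw, ${gzMinified} B gzipped`);
console.log(`bytes still saved on the wire: ${gzOriginal - gzMinified}`);
```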

Concatenation and sprites will have less of an impact than before, but they will still have some impact. The biggest issue with downloading multiple files instead of a single file with HTTP/1 isn't actually an HTTP-side problem, per se: there is some bandwidth-based overhead in requesting each file individually, but it's dwarfed by the time-based overhead of tearing down the TCP/IP session when you're done with one file, then starting up a new one for the next, and repeating this for every file you want to download.

The biggest focus of HTTP/2 is eliminating that time-based overhead: HTTP/1.1 tried to do this with pipelining, but it didn't catch on in the browser (Presto is the only engine that got it completely right, and Presto is dead). HTTP/2 is another attempt, which improves on HTTP/1.1's methods while also making this kind of thing non-optional, and it stands to be more successful. It also eliminates some of the bandwidth-based overhead in making multiple requests, by compressing headers, but it cannot eliminate that overhead completely, and when downloading multiple files, those requests still have to be made (as part of a single TCP/IP session, so there is less overhead, but not zero). So while the impact of concatenating and spriting is proportionally smaller, there is still some impact, especially if you use many files.
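
To make the "many requests over a single session" point concrete, here is a minimal sketch that uses Node's built-in http2 client to multiplex several requests over one connection; the origin and paths are hypothetical placeholders:

```typescript
// Sketch: several requests multiplexed over one HTTP/2 connection, using
// Node's built-in http2 client. The origin and paths are hypothetical.
import { connect } from "http2";

const session = connect("https://example.com"); // hypothetical origin
session.on("error", (err) => console.error(err));

const paths = ["/css/site.css", "/js/app.js", "/img/logo.png"]; // hypothetical
let pending = paths.length;

for (const path of paths) {
  // Each request is its own stream, but every stream shares the one TCP+TLS
  // session, so there is no per-file connection setup or teardown.
  const stream = session.request({ ":path": path });
  let bytes = 0;
  stream.on("data", (chunk) => { bytes += chunk.length; });
  stream.on("end", () => {
    console.log(`${path}: ${bytes} bytes`);
    if (--pending === 0) session.close();
  });
  stream.end(); // GET with no request body
}
```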

Another thing to consider, when it comes to concatenation and spriting, is compression. Concatenated files of similar types tend to compress better than the individual files do, because the compression algorithm can exploit similarities between the concatenated pieces of data. A similar principle applies to sprites: putting similar images in different regions of the same file usually results in a smaller file, because the image's compression can exploit similarities in the different regions.
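
A quick way to observe that effect is to gzip a set of files individually and compare the total with gzipping their concatenation; a small sketch along those lines, with hypothetical file names standing in for your own stylesheets:

```typescript
// Sketch: sum of individually gzipped files vs. gzip of the concatenation.
// The file names are hypothetical placeholders for your own assets.
import { readFileSync } from "fs";
import { gzipSync } from "zlib";

const files = ["reset.css", "layout.css", "theme.css"]; // hypothetical
const buffers = files.map((name) => readFileSync(name));

const separate = buffers.reduce((sum, buf) => sum + gzipSync(buf).length, 0);
const combined = gzipSync(Buffer.concat(buffers)).length;

console.log(`gzipped separately:   ${separate} B`);
console.log(`gzipped concatenated: ${combined} B`);
// The concatenated form is usually smaller, because the compressor can reuse
// back-references across file boundaries (shared selectors, property names, etc.).
```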

Anh answered 20/2, 2015 at 14:36 Comment(4)
When you mention "HTTP/1.1 tried this", are you referring to the Keep-Alive header? – Franctireur
@Katana314: No; Keep-Alive was introduced earlier (though it wasn't standardized until HTTP/1.1). Pipelining does use Keep-Alive, but it tried to go beyond what Keep-Alive typically did. – Anh
May Presto rest in peace. I would still use Opera, but most websites are becoming unusable in it. You said nothing more than I did, but you were more accurate. +1 for that! – Cup
It's probably worthwhile to point out that Presto had crazy heuristics for pipelining; it didn't blindly use pipelining. – Soane

Answer (4 votes)

So far, all the answers tacitly assume that you'll want to download ALL the .CSS and .JS files for every page. A benefit of using HTTP/2 and keeping .CSS and .JS files separate is that you can bring down only the ones you need, and not downloading something is always faster than efficiently downloading it.
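
A minimal sketch of that idea, using a dynamic import so that a (hypothetical) ./chart module is requested only on pages that actually contain a chart:

```typescript
// Sketch: fetch code only on the pages that need it.
// "./chart" is a hypothetical module; pages without a #chart element
// never request it at all.
async function initPage(): Promise<void> {
  const chartHost = document.querySelector("#chart");
  if (chartHost !== null) {
    // Loaded on demand: one extra request, which HTTP/2 makes cheap,
    // and zero bytes on every page that has no chart.
    const { renderChart } = await import("./chart");
    renderChart(chartHost);
  }
}

initPage();
```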

Annals answered 30/8, 2017 at 3:21 Comment(0)

Answer (1 vote)

Yes, it is still useful.

Alongside gzip compression, your page will weigh less.

Imagine you are using a very slow GPRS (56Kbps, 500ms ping) network.

You have 50 tiny images, 30 JavaScript files and 20 CSS files.

This means that, with 2 parallel connections, you must wait over 100 * 500ms just for the requests.
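
As a rough back-of-envelope model of that waiting time, and of what HTTP/2 multiplexing changes, here is a small sketch; the assumption that each HTTP/1.1 request pays one round trip to connect and one for the request itself is part of the illustration, not a measurement:

```typescript
// Back-of-envelope sketch of request latency only (transfer time excluded).
// Assumptions of this illustration: 500 ms round-trip time, 100 resources,
// and for HTTP/1.1 a fresh connection per request (1 RTT to connect plus
// 1 RTT for the request itself) spread over 2 parallel connections.
const rttMs = 500;
const resources = 50 + 30 + 20; // images + scripts + stylesheets = 100

// HTTP/1.1, no keep-alive, 2 parallel connections:
const http11WaitMs = (resources / 2) * 2 * rttMs; // = 100 × 500 ms = 50,000 ms

// HTTP/2: one connection, all requests multiplexed at once:
const http2WaitMs = rttMs /* connect */ + rttMs /* all requests in flight */;

console.log(`HTTP/1.1 round-trip waiting: ~${http11WaitMs / 1000} s`);
console.log(`HTTP/2   round-trip waiting: ~${http2WaitMs / 1000} s`);
```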

Now, each image is about 3–4 KB, which at 56 Kbps takes roughly half a second to transfer.

Now, the CSS and JavaScript files range from 20 KB to 600 KB.

This will kill your website with a huge transfer time.

Reducing the time it takes to transfer the files will increase the 'speed' at which the website loads.

So, YES, it is still useful!

Cup answered 20/2, 2015 at 14:04 Comment(3)
With HTTP/2.0's multiplexing there are not 100 requests (as in your example, if I understand it correctly), but just 1. – Bismuthous
@lxer: There's only one connection, but there are still 100 requests. That's much, much better than 100 connections and 100 requests, which is the status quo (not counting pipelining). But it still doesn't beat 1 connection and 1 request. – Anh
I think the mistake here is that you are assuming requests happen serially in HTTP/2. That's not the case. You can have as many requests as you want happening over the same HTTP/2 connection in parallel. So there aren't 2 "parallel connections" in your example; there are 100 "parallel connections": a single 500 ms wait time. – Vidovic

Answer (1 vote)

Minifying JS can still reduce the size of many symbols; inflatedJargonSymbolizerTokenManager will become _a. One example I found showed that jQuery gzipped was still twice the size of jQuery.min gzipped.
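
As a tiny illustration of that renaming (the long identifier is, of course, made up), here is roughly what a minifier does to local names and whitespace, something gzip alone cannot do:

```typescript
// Before minification (illustrative only; the long local name is made up):
function createTokenManager(options: { verbose: boolean }) {
  const inflatedJargonSymbolizerTokenManager = {
    tokens: [] as string[],
    verbose: options.verbose,
  };
  return inflatedJargonSymbolizerTokenManager;
}

// Roughly what a minifier emits: the local identifier becomes _a and the
// whitespace disappears:
//   function createTokenManager(o){const _a={tokens:[],verbose:o.verbose};return _a}
```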

I also want to note that while you didn't imply otherwise, dystroy's comment is correct, and in fact contradicts the badly-written Wikipedia explanation; "Concatenating" JavaScript files might be less useful now. Minifying them still has its benefits. Just wanted to mention that, in case you happened to get some information there. In fact I'd edit the page myself if I wasn't worried about getting into an edit battle.

CSS likely has fewer opportunities for symbol reduction. Theoretically, all it would get is whitespace and comment removal.

Franctireur answered 20/2, 2015 at 14:37 Comment(0)

Answer (1 vote)

This may be a little late, but I want to raise a few additional points that should be covered too.

The first is that minification normally employs some sort of uglification for JavaScript, which has benefits beyond bandwidth: it prevents people from easily analyzing the code, which deters normal users from using its verbose methods and ideas for malicious actions; even well-built sites can have problems with this. Of course, this is no substitute for security, and advanced users can always decipher uglified code.

The other is that not all browsers or connections are going to be using HTTP/2, at least not immediately; so if the performance difference these practices make on HTTP/2 clients is barely noticeable, why not keep them to benefit those still connecting over HTTP/1.1?

Lastly, at the end of the day, the best way to determine how anything impacts the speed of your server is to benchmark it.
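
As a very crude starting point, here is a sketch that times repeated fetches of a page on Node 18+ (which ships a global fetch); the URL is a hypothetical placeholder, and a browser-level tool (DevTools, WebPageTest) will tell you far more about full page loads:

```typescript
// Sketch: crude repeated-fetch timing, e.g. to compare a concatenated build
// against separate assets. The URL is a hypothetical placeholder, and this
// measures a single resource, not a full page load with subresources.
const url = "https://example.com/page"; // hypothetical
const runs = 10;

async function timeFetch(target: string): Promise<number> {
  const start = performance.now();
  const res = await fetch(target);
  await res.arrayBuffer(); // ensure the body is fully downloaded
  return performance.now() - start;
}

async function main(): Promise<void> {
  const samples: number[] = [];
  for (let i = 0; i < runs; i++) {
    samples.push(await timeFetch(url));
  }
  const avg = samples.reduce((a, b) => a + b, 0) / samples.length;
  console.log(`average over ${runs} runs: ${avg.toFixed(1)} ms`);
}

main();
```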

Madera answered 20/7, 2017 at 14:56 Comment(0)
