Why bundle optimizations are no longer a concern in HTTP/2
I read in the bundling section of the SystemJS documentation that bundling optimizations are no longer needed in HTTP/2:

Over HTTP/2 this approach may be preferable as it allows files to be individually cached in the browser meaning bundle optimizations are no longer a concern.

My questions:

  1. Does this mean we don't need to think about bundling scripts or other resources when using HTTP/2?
  2. What is it in HTTP/2 that enables this?
Hellenic answered 16/6, 2015 at 7:42 Comment(2)
Even if this were true, it will be a while before you can assume that everyone is using HTTP/2.Lovato
I know adoption may take a while, but it seems very strange to me: what change to the HTTP protocol could enable this feature?Hellenic
HTTP/2 supports "server push", which obsoletes bundling of resources. So, yes, if you are using HTTP/2, bundling would actually be an anti-pattern.

For more info check this: https://www.igvita.com/2013/06/12/innovating-with-http-2.0-server-push/

Parmer answered 16/6, 2015 at 8:6 Comment(6)
I really need to be convinced by a benchmark showing that pushing hundreds of scripts to the client has the same effect as pushing one bundle only.Roundly
@GuillaumeD. Good point. Technically, downloading one file will (in some cases) still finish faster than the same content split over thousands of files, because of the overhead of the web server accessing each file, and also because of the overhead of the client creating a cache entry for each file. Only the connection-limit benefit becomes obsolete; the other optimizations are not automatically contra-indicated by HTTP/2. Bundling could still yield a performance benefit under HTTP/2 in certain scenarios.Moorhead
I don't think this is a one-size-fits-all solution though; with more and more functionality built from small modules, I think some bundling will still be required. For example, React may appear as a single big JS file, but in fact it's probably hundreds of ES modules.Primaveras
@Moorhead Apart from the time required to send the JS file over the network, we should also consider the time taken to parse and compile the JavaScript bundle on the server. As the size of the bundle increases, the time to parse and compile the JavaScript on the server side increases.Inhumanity
@Inhumanity If the server is set up well, with server-side caching in effect, and you warm up the server, then no, not really. Users will only ever experience the speed of downloading already-compiled files.Moorhead
So @Inhumanity you are right, it must be considered, but it can be handled by an appropriate server setup which pre-processes and caches the bundles before any user accesses them.Moorhead

The bundling optimization was introduced as a "best practice" when using HTTP/1.1 because browsers could only open a limited number of connections to a particular domain.

A typical web page has 30+ resources to download in order to be rendered. With HTTP/1.1, a browser opens 6 connections to the server, requests 6 resources in parallel, waits for those to be downloaded, then requests another 6 resources, and so forth (of course some resources will download faster than others, and those connections may be reused sooner for another request). The point is that with HTTP/1.1 you can have at most 6 outstanding requests.

To download 30 resources you would need 5 roundtrips, which adds a lot of latency to the page rendering.

In order to make the page render faster, with HTTP/1.1 the application developer had to reduce the number of requests for a single page. This led to "best practices" such as domain sharding, resource inlining, image spriting, resource bundling, etc., but these are in fact just clever hacks to work around HTTP/1.1 protocol limitations.

With HTTP/2 things are different because HTTP/2 is multiplexed. Even without HTTP/2 Push, the multiplexing feature of HTTP/2 renders all those hacks useless, because now you can request hundreds of resources in parallel using a single TCP connection.

With HTTP/2, the same 30 resources require just 1 roundtrip to be downloaded, giving you a straight 5x performance increase in that operation (which typically dominates the page rendering time).
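The arithmetic above can be sketched with a toy model (a deliberate simplification: every batch of requests costs exactly one round trip, ignoring bandwidth, TCP slow start, connection setup and HTTP/2 Push):

```javascript
// Toy model: resources are fetched in batches limited by the number of
// concurrent requests; each batch costs one round trip.
function roundTrips(resourceCount, maxConcurrentRequests) {
  return Math.ceil(resourceCount / maxConcurrentRequests);
}

const RESOURCES = 30;

// HTTP/1.1: 6 connections per domain -> at most 6 outstanding requests.
console.log(roundTrips(RESOURCES, 6));   // 5

// HTTP/2: hundreds of multiplexed streams over one TCP connection.
console.log(roundTrips(RESOURCES, 100)); // 1
```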

Given that the trend is for web content to be richer, pages will have more and more resources; and the more resources a page has, the better HTTP/2 performs with respect to HTTP/1.1.

On top of HTTP/2 multiplexing, you have HTTP/2 Push.

Without HTTP/2 Push, the browser has to request the primary resource (the *.html page), download it, parse it, and then arrange to download the 30 resources referenced by the primary resource.

HTTP/2 Push allows you to get the 30 resources while you are requesting the primary resource that references them, saving one more roundtrip, again thanks to the HTTP/2 multiplexing.

It is really the multiplexing feature of HTTP/2 that allows you to forget about resource bundling.

You can look at the slides of the HTTP/2 session that I gave at various conferences.

Icaria answered 16/6, 2015 at 9:53 Comment(4)
It's currently not that simple, really: you still need to parse your sources and determine which resources need to be pushed, including nested resources; otherwise you are still paying the latency cost for each nested resource. Simply scanning the <link> and <script> tags of the HTML won't get you all the resources. 30 resources may in fact take 2/3/4/... round trips when using ES6 modules. You also have to track what you have already pushed.Primaveras
It is that simple. The server does not need to parse any resource. Jetty and other servers can push resources, nested resources and dynamically loaded scripts without problems.Icaria
I agree, but still, the server needs to read each file and the client generally needs to cache/keep each one somewhere. In some special cases, where file I/O is taxed, bundling could be a benefit even with HTTP/2. Maybe when dealing with XBRL taxonomies or other situations where tens of thousands of files are needed, bundling can eliminate a costly part of the file-system overhead of accessing many files.Moorhead
I believe the bandwidth you save by only asking for the specific things you need outweighs the parsing bottleneck.Sulphonamide

Bundling does a lot in a modern JavaScript build. HTTP/2 only addresses the optimization of minimising the number of requests between client and server, by making the cost of additional requests much cheaper than with HTTP/1.1.

But bundling today is not only about minimising the number of requests between the client and the server. Two other relevant aspects are:

  • Tree Shaking: Modern bundlers like WebPack and Rollup can eliminate unused code (even from 3rd party libraries).
  • Compression: Bigger JavaScript bundles can be better compressed (gzip, zopfli ...)

Also, HTTP/2 server push can waste bandwidth by pushing resources that the browser does not need because it still has them in its cache.

Two good posts about the topic are:

Both those posts come to the conclusion that "build processes are here to stay".

Minister answered 26/8, 2017 at 14:12 Comment(1)
Thanks, I assume this answer is still correct as of 2024? Or do JS import maps and/or HTTP/3 change anything? Doesn't seem to be the case...Coltson

Bundling is still useful if your website is

  1. Served over plain HTTP (in practice, browsers only support HTTP/2 over HTTPS)
  2. Hosted by a server that does not support ALPN and HTTP/2
  3. Required to support old browsers (sensitive and legacy systems)
  4. Required to support both HTTP/1.1 and HTTP/2 (graceful degradation)

There are two HTTP/2 features that make bundling obsolete:

  1. HTTP/2 multiplexing and concurrency (allows multiple resources to be requested over a single TCP connection)
  2. HTTP/2 server push (allows the server to preemptively push the responses it thinks the client will need into the client's cache)

PS: Bundling is not the only optimization technique made obsolete by the arrival of HTTP/2 features. Techniques like image spriting, domain sharding and resource inlining (image embedding through data URIs) will also be affected.

How HTTP 2.0 affects existing web optimization techniques

Isaiasisak answered 26/6, 2017 at 4:36 Comment(3)
In theory, HTTP/2 is allowed over plain HTTP. In practice, most browsers only support it over HTTPS. Source: http2.github.io/faq en.wikipedia.org/wiki/HTTP/2#EncryptionPlantaineater
That is why I said it requires HTTPS: support coverage won't be good with plain HTTP only.Isaiasisak
Correct, this was just to provide more context.Plantaineater

© 2022 - 2024 — McMap. All rights reserved.