How to optimally serve and load JavaScript files?

I'm hoping someone with more experience with global-scale web applications could clarify some questions, assumptions and possible misunderstandings I have.

Let's take a hypothetical site (heavy amount of client-side / dynamic components) which has hundreds of thousands of users globally and the sources are being served from one location (let's say central Europe).

  1. If the application depends on popular JavaScript libraries, would it be better to compile them into one single minified JS file (along with all application-specific JavaScript), or to load them separately from the Google CDN?
  2. Assetic vs. headjs: does it make more sense to load one single JS file, or to load all the scripts in parallel (executing them in dependency order)?

My assumptions (please correct me):

Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc., and loading all of these via headjs in parallel seems optimal, but I'm not sure. Server-side compiling of third-party JS and application-specific JS into one file seems to almost defeat the purpose of using the CDN, since the library is probably cached somewhere along the line for the user anyway.
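
Roughly, what I have in mind is something like the sketch below (the file names, jQuery version and start-up call are just placeholders; it assumes headjs's head.js() call, which downloads scripts in parallel but executes them in order):

    // library from Google's CDN plus our own compiled bundle, fetched in parallel;
    // headjs preserves execution order, so app.min.js can safely depend on jQuery
    head.js(
        "https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js",
        "/js/app.min.js",
        function () {
            // everything has loaded and executed at this point
            App.init(); // hypothetical application entry point
        }
    );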

Besides caching, it's probably faster to download a third-party library from Google's CDN than from the central server hosting the application anyway.

If a new version of a popular JS library is released with a big performance boost, is tested with the application and then implemented:

  • If all JS is compiled into one file then every user will have to re-download this file even though the application code hasn't changed.
  • If third-party scripts are loaded from CDNs, then the user only has to download the new version from the CDN (or from cache somewhere).

Are any of the following legitimate worries in a situation like the one described?

  • Some users (or browsers) can only have a certain number of connections to one hostname at once, so retrieving some scripts from a third-party CDN would result in overall faster loading times.
  • Some users may be using the application in a restricted environment, so the application's domain may be white-listed but not the CDNs' domains. (If this is a realistic concern, is it possible to try to load from the CDN and fall back to the central server on failure?)
Padishah answered 1/8, 2012 at 16:18 Comment(5)
It's good to be concerned about performance, but in my experience the overhead of loading JavaScript is insignificant in comparison to the performance of the application itself, both client-side code and server (database) transactional code. – Dagmardagna
@Pointy: 80% of the end-user response time is spent on the front-end. To OP: read that link carefully, I trust their advice. – Misdoing
In my experience, Google's CDN performance is so good it greatly outweighs any other concerns. – Concoff
@TomaszNurkiewicz I'm thinking about web applications specifically, because that's my background and that's what I perceived the topic of this question to be. What's more, I find that Yahoo! statement to be sufficiently vague as to be worthless: it's not supported by any sort of explanation of what "end-user response time" means, for example. – Dagmardagna
@TomaszNurkiewicz Now, that said, I have indeed seen sites - mostly those with lots of 3rd-party content - that spend an outrageous amount of time downloading a crazy number of scripts, images, small CSS files, etc. I'm giving the OP the benefit of the doubt that if he's worried about performance he won't be making such obvious errors :-) – Dagmardagna

Compiling all application-specific/local JS code into one file

Since some of our key goals are to reduce the number of HTTP requests and minimize request overhead, this is a very widely adopted best practice.

The main case where we might consider not doing this is in situations where there is a high chance of frequent cache invalidation, i.e. when we make changes to our code. There will always be tradeoffs here: serving a single file is very likely to increase the rate of cache invalidation, while serving many separate files will probably cause a slower start for users with an empty cache.

For this reason, inlining the occasional bit of page-specific JavaScript isn't as evil as some say. In general though, concatenating and minifying your JS into one file is a great first step.
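
As an illustration only, a bare-bones concatenation step could look like the sketch below (the file names are made up, and in practice you would feed the output through a minifier such as UglifyJS or the Closure Compiler):

    // build.js – run with "node build.js"; joins the app's scripts into one file
    var fs = require("fs");

    var sources = [              // hypothetical source files, in dependency order
        "js/app/core.js",
        "js/app/ui.js",
        "js/app/api.js"
    ];

    var bundle = sources.map(function (file) {
        return fs.readFileSync(file, "utf8");
    }).join(";\n");              // the ";" guards against files missing a trailing semicolon

    fs.writeFileSync("js/app.js", bundle); // minify this output as a separate step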

using CDNs like Google's for popular libraries, etc.

If we're talking about libraries where the code we're using is fairly immutable, i.e. unlikely to be subject to cache invalidation, I might be slightly more in favour of saving HTTP requests by wrapping them into your monolithic local JS file. This would be particularly true for a large code base heavily based on, for example, a particular jQuery version. In cases like this bumping the library version is almost certain to involve significant changes to your client app code too, negating the advantage of keeping them separate.

Still, mixing request domains is an important win, since we don't want to be throttled excessively by the maximum connections per domain cap. Of course, a subdomain can serve just as well for this, but Google's domain has the advantage of being cookieless, and is probably already in the client's DNS cache.
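
In practice that tends to look something like this (the jQuery version and the bundle name are only examples):

    <!-- popular library from Google's cookieless CDN domain -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
    <!-- application-specific bundle from our own host -->
    <script src="/js/app.min.js"></script>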

but loading all of these via headjs in parallel seems optimal

While there are advantages to the emerging host of JavaScript "loaders", we should keep in mind that using them does negatively impact page start, since the browser needs to go and fetch our loader before the loader can request the rest of our assets. Put another way, for a user with an empty cache a full round-trip to the server is required before any real loading can begin. Again, a "compile" step can come to the rescue - see require.js for a great hybrid implementation.
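
As a rough sketch of that hybrid approach (module names and paths are invented for the example):

    <!-- index.html: one small loader script, which then pulls in js/main.js -->
    <script data-main="js/main" src="js/require.js"></script>

    // js/main.js
    requirejs.config({
        baseUrl: "js",
        paths: {
            jquery: "lib/jquery-1.7.2.min"   // could equally point at a CDN copy
        }
    });

    // dependencies are declared explicitly, so load order is worked out for us
    require(["jquery", "app"], function ($, app) {
        app.init(); // hypothetical application entry point
    });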

The best way of ensuring that your scripts do not block UI painting remains to place them at the end of your HTML. If you'd rather place them elsewhere, the async or defer attributes now offer you that flexibility. All modern browsers request assets in parallel, so unless you need to support particular flavours of legacy client this shouldn't be a major consideration. The Browserscope network table is a great reference for this kind of thing. IE8 is predictably the main offender, still blocking image and iFrame requests until scripts are loaded. Even back at 3.6 Firefox was fully parallelising everything but iFrames.
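
For example (a sketch; attribute support varies with browser vintage):

    <!-- executes in document order once the HTML has been parsed -->
    <script src="/js/app.min.js" defer></script>
    <!-- executes as soon as it has downloaded, in no guaranteed order -->
    <script src="/js/analytics.js" async></script>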

Some users may be using the application in a restricted environment, so the application's domain may be white-listed but not the CDNs' domains. (If this is a realistic concern, is it possible to try to load from the CDN and fall back to the central server on failure?)

Working out if the client machine can access a remote host is always going to incur serious performance penalties, since we have to wait for it to fail to connect before we can load our reserve copy. I would be much more inclined to host these assets locally.
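
For completeness, the usual shape of such a fallback is the snippet below (jQuery and the file paths are just placeholders; note that the blocked CDN request still has to fail or time out before the document.write fallback fires, which is exactly the penalty described above):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.7.2/jquery.min.js"></script>
    <script>
        // if the CDN copy didn't load, window.jQuery is undefined – pull our own copy instead
        window.jQuery || document.write('<script src="/js/jquery-1.7.2.min.js"><\/script>');
    </script>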

China answered 1/8, 2012 at 17:32 Comment(5)
"All modern browsers request assets in parallel, so unless you need to support particular flavours of legacy client this shouldn't be a major consideration." Could you give more information on how modern browsers request assets and executed them? Which browsers are you including in "all modern browsers"? Even if the browser requests the assets in parallel, maybe headjs would still be useful to allow us to explicitly state the order of loading & execution based on dependencies?Padishah
For example: You could use headjs to load all application-specific JS, and third party JS, in parallel. Once that's complete you may want to load compiled JS templates (in the background asynchronously so that when an action happens which requires the dialog template, it's already there and can be rendered instantly. See TwigJS) in order of probability (that it's required next) then dependency (template can inherit from each other)Padishah
In times gone by, scripts would block all requests from starting, including those for images and CSS, until the contents had been downloaded, parsed and executed. Few browsers in current circulation do this - I've added a little detail to the answer, including a link to the Browserscope network table.China
My tool of choice for dependancy management and client-side script optimization is require.js, which can handle the example use case in your second comment. You are likely to want a compact/minify step to generate the initial payload, with app code within that file to handle preemptive loading of additional scripts.China
Re: "inlining the occasional bit of page-specific JavaScript isn't as evil as some say." The problem with a "bit" of inline script is that in a team environment, future script will often be added to the original inline script, turning a bit of script into quite a bit of script over time. Inline script may then become part of the dynamic content that browsers do not cache. External script files (<script src="...) can be cached by browsers, which helps website performance by reducing downloads with the browser caching of static content defined in external script files.Topsail
  1. Many small JS files are better than a few large ones for many reasons, including changes, dependencies and requirements.
  2. JavaScript/CSS/HTML and any other static content is handled very efficiently by any of the current web servers (Apache/IIS and many others); most of the time a single web server is more than capable of serving hundreds or thousands of requests per second, and in any case this static content is likely to be cached somewhere between the client and your server(s).
  3. Using any external (not controlled by you) repositories for code that you want to use in a production environment is a NO-NO (for me and many others): you don't want a sudden, catastrophic and irrecoverable failure of your whole site's JavaScript functionality just because somebody somewhere pressed commit without thinking or checking.
Demisemiquaver answered 1/8, 2012 at 16:56 Comment(0)

Compiling all application-specific/local JS code into one file, using CDNs like Google's for popular libraries, etc. but loading all of these via headjs in parallel seems optimal...

I'd say this is basically right. Do not combine multiple external libraries into one file, since, as it seems you're aware, this would throw away the benefit in the common case where users' browsers have already cached the (individual) resources.

For your own application-specific JS code, one consideration you might want to make is how often this will be updated. For instance if there is a core of functionality that will change infrequently but some smaller components that might change regularly, it might make sense to only compile (by which I assume you mean minify/compress) the core into one file while continuing to serve the smaller parts piecemeal.
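
For instance, a hypothetical split might look like this (the version stamp in the core file's name lets you cache it aggressively and only invalidate it when the core actually changes):

    <!-- rarely-changing core, safe to cache with far-future headers -->
    <script src="/js/core-1.4.0.min.js"></script>
    <!-- smaller, frequently updated pieces served separately -->
    <script src="/js/dashboard.js"></script>
    <script src="/js/notifications.js"></script>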

Your decision should also account for the size of your JS assets. If—and this is unlikely, but possible—you are serving a very large amount of JavaScript, concatenating it all into one file could be counterproductive as some clients (such as mobile devices) have very tight restrictions on what they will cache. In which case you would be better off serving a handful of smaller assets.

These are just random tidbits for you to be aware of. The main point I wanted to make was that your first instinct (quoted above) is likely the right approach.

Platonic answered 1/8, 2012 at 17:9 Comment(1)
Thanks. I said compiling because there could also be some pre-processing on some assets. – Padishah
