
JavaScript Compression and Combining Issues

Let's start with a simple one: how JS compression can ruin our mood. And how to lift it back :)
UPD: we have launched a website acceleration contest. Among the prizes: a monitor, a webcam, a mouse. Everything is hyper-fast.
JavaScript compression
In general, it is worth saying right away that minification of JavaScript files gives only an extra 5-7% reduction in size relative to regular gzip, which can be enabled practically everywhere (no, really, everywhere: from the Apache configuration via .htaccess to static compression through mod_rewrite + mod_mime, and the nginx (or LightSpeed) configuration). But back to the topic: we want to minify JavaScript files, so what is the best way to do it?
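As a side note, the static pre-compression mentioned above can be done with a tiny build-time script; here is a minimal Node.js sketch (the file path is made up), after which the web server only has to pick up the ready-made .gz file:

```js
// Pre-compress a static asset once at build time with maximum compression,
// so the server can serve app.js.gz instead of gzipping on every request.
const fs = require('fs');
const zlib = require('zlib');

const src = 'public/app.js'; // hypothetical path
const compressed = zlib.gzipSync(fs.readFileSync(src), { level: 9 });
fs.writeFileSync(src + '.gz', compressed);
```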
A review of the available tools was done two years ago, and the situation has not changed much since then (except that the Google Closure Compiler has appeared). But let's take it in order.
- Let's start with the simplest one: JSMin (or its fork, JSMin+). It works fairly universally (on the finite-state-machine principle), and the minified file almost always still executes. The extra gain (here and below, relative to plain gzip) is up to 7% with the advanced version, i.e. not much. CPU usage is moderate (the advanced version, JSMin+, is heavier and needs noticeably more memory), but it does not analyze variable scope and therefore cannot shorten variable names. In principle it can be applied to almost any script, but there are occasional nuances. For example, conditional comments get removed (this is fixable), or certain constructs are recognized incorrectly: say, `+ +` gets converted to `++`, which breaks the logic (a short sketch of this follows after the list). This is also fixable, but harder.
- YUI Compressor. The best-known (and until recently also the most powerful) tool for compressing scripts. It is built on the Rhino engine (as far as we know, the engine's roots go back somewhere near the Dojo framework, i.e. a very, very long time ago). It compresses scripts excellently and works with scope, so it can shorten variable names. The gain is up to 8% over gzip, but it is seriously CPU-hungry (due to the Java virtual machine), so be careful when using it on the fly. Also, because of the shortened variable names, various problems are possible (potentially even more of them than with JSMin).
- The Google Closure Compiler appeared recently but has already earned public trust. It is built on the same Rhino engine (yes, there is nothing new under the sun), but uses more advanced algorithms for shrinking the source code (there is an excellent overview with all the details), giving up to 12% over gzip. Here you should be triply careful: a very substantial part of the logic can be cut out, especially with aggressive transformations (see the second sketch after the list). Nevertheless, jQuery already uses this tool. In CPU cost it is apparently even heavier than YUI Compressor (we have not verified this).
- And packer. This tool is already a thing of the past thanks to faster connections and lagging CPU performance: the packed code (compressed with an algorithm similar to gzip) is unpacked by the browser's JavaScript engine. This gives a very noticeable reduction in size (up to 55% without gzip), but costs an extra 500-1000 ms to unpack the archive on the client. Naturally, that stops being worthwhile when CPU power is limited (hello, IE) and connection speed is high (plus gzip is supported and used almost everywhere anyway). On top of that, this minification method is the most prone to bugs after processing.
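To make the `+ +` problem from the JSMin item concrete, here is a minimal sketch (the variable names are made up) of the kind of breakage an over-eager whitespace remover can cause:

```js
var a = 5, b = '3';
var total = a + +b; // unary plus coerces b to a number, so total === 8

// If a minifier collapses the space between the two pluses, it emits:
//   var total = a++b;
// which is parsed as a++ followed by b, a SyntaxError,
// so the whole minified file stops executing.
```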
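And here is a sketch of how aggressive Closure Compiler transformations (the ADVANCED_OPTIMIZATIONS level) can cut out live logic; the initGallery function and the markup that calls it are hypothetical:

```js
// Only referenced from markup, e.g. <a onclick="initGallery()">show</a>
function initGallery() {
  document.getElementById('gallery').className = 'visible';
}

// In aggressive mode the compiler sees no call to initGallery() inside this
// file, so it renames or removes the function, and the onclick handler then
// fails with "initGallery is not defined".

// The usual workaround is to export the symbol explicitly so the name survives:
window['initGallery'] = initGallery;
```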
The takeaway: test your JavaScript not only on the server where it is developed, but also after minification, ideally with the same unit tests. You will learn a lot about the tools described above :) And if minification is not critical for you, just use gzip (preferably static, with maximum compression) to serve JavaScript.
JavaScript combining issues
Now that we have figured out how to compress JavaScript files, it is worth touching on combining them. An average site has 5-10 JavaScript files plus several inline code snippets that may call the included libraries in one way or another. The result is 10-15 pieces of code that could be merged together, and the benefits are numerous: from faster loading on the user's side to better server resilience under DDoS, where every connection counts, even for static files.
But back to the subject. Let's talk about automating the merging of "third-party" scripts. If you have full access to them (and understand web development), fixing the problems (or excluding a few problematic scripts from the merge) is not a big deal. Otherwise (when the set of scripts simply refuses to merge without errors), the following approach is for you.
So, we have 10-15 pieces of code (some of it inline, some in the form of external libraries that we can also merge into the same file). We need to guarantee that they execute independently of each other. What does that involve?
If there is a JavaScript error in a file, the browser stops executing that file at the point of the error (some very old browsers even stop executing all JavaScript on the page in that case, but that is another story). Accordingly, if the very first library we merge into the combined file throws an error, then after merging our client-side logic will fall apart in every browser. Sad.
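A tiny illustration of why this matters once the files are merged (the function names are hypothetical):

```js
// merged.js: library A, then library B, pasted one after another

// --- library A ---
brokenWidget.init(); // throws ReferenceError: brokenWidget is not defined

// --- library B ---
function initMenu() { /* ... */ }
initMenu(); // never runs: execution of merged.js stopped at the error above,
            // whereas as a separate file library B would have worked fine
```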
It is also worth noting that inline code is rather hard to debug here. You can either exclude it from the merge (for example, by placing the call to the merged file before or after that code), or, if it cannot be excluded, give up on merging the files altogether.
Backward compatibility
What can we do about it? The simplest way is to exclude the problematic files from the merge logic (errors may appear only at the merge stage; on their own the files may work perfectly). To do that, you have to track down exactly where the error occurs and build a separate merge configuration for each such case.
But there is a slightly easier way. In JavaScript we can use the try-catch construct. Got the idea yet? Not yet? We can wrap the entire contents of each file we merge in try {}, and in catch(e) {} trigger loading of the original external file, roughly like this:

```js
try {
    // ...the contents of the JavaScript library...
} catch (e) {
    document.write('call to the original JavaScript file');
    // or
    console.log('this JavaScript file needs to be excluded from the merge');
}
```
With this approach the user downloads only one file if there are no problems. If there were errors, all the problematic external files are loaded in their original order. This gives us backward compatibility.
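For illustration, here is a minimal sketch of how that wrapping could be automated at build time; the file list and output path are hypothetical, and the backslash before /script keeps the generated string from being mistaken for a closing script tag:

```js
// build.js: wrap each library in try/catch and concatenate them, so that a
// failing library falls back to loading its original file.
const fs = require('fs');

const files = ['js/lib1.js', 'js/lib2.js', 'js/plugin.js']; // hypothetical list

const bundle = files.map(function (path) {
  const source = fs.readFileSync(path, 'utf8');
  return 'try {\n' + source + '\n} catch (e) {\n' +
         "  document.write('<script src=\"" + path + "\"><\\/script>');\n" +
         "  console.log('exclude " + path + " from the merge');\n" +
         '}\n';
}).join('\n');

fs.writeFileSync('js/merged.js', bundle);
```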
Performance issues
Obviously, this approach is not the "most correct" one. The most logical thing would be to find the JavaScript errors, fix them, and serve a single file to all users. But that is not always possible. Also keep in mind that the try-catch construct is expensive for browsers to execute (it adds 10-25% to initialization time), so be careful with it. Still, the described approach is great for debugging the merging itself: it lets you pinpoint exactly which files are failing (with several dozen files this is very, very useful).
A small summary
After minifying your JavaScript files, be sure to verify that they still work. And debugging the correctness of merged JavaScript files is easy to automate, or you can even set up backward compatibility for the specific scripts that cannot be debugged.