Static compression of CSS and JS files (automating the process)

    Due to the lack of mod_gzip on my hosting, I had to implement CSS and JS compression statically. Arguably that is even better ... But one issue arises immediately: although such an operation can be carried out manually, doing so is extremely inefficient, and it is desirable to automate it. One of the simplest options for such automation, implemented in PHP, is shown here.

    Let's start with the task. There is a local copy of the site (Apache, PHP) that is actively being changed and tuned. While working on it, we want compressed versions of the CSS and JS files to stay up to date, so that testing immediately shows the finished result.

    So we need to:
    1. Find all CSS and JS files (including those in subfolders)
    2. Create a compressed version of each (not every time, but only when the file has been modified)
    3. Make the server automatically return the current version of the file (compressed, if the browser supports it)

    Of course, it would not be difficult to add file concatenation here as well if necessary, but that is not the topic ...
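The three steps can be sketched as one-off shell commands before automating them in PHP (the paths and file contents here are purely illustrative):

```shell
#!/bin/sh
# Illustrative one-off run of the three steps for a tiny demo tree.
mkdir -p demo/css
printf 'body{color:red}' > demo/css/style.css

# 1. find all css/js files recursively, skipping already-compressed copies
find demo -type f \( -name '*.css' -o -name '*.js' \) ! -name '*.gzip.*'

# 2. create a compressed copy next to the original
gzip -9 -n -c demo/css/style.css > demo/css/style.gzip.css

# 3. check that the copy decompresses back to the original
gzip -dc demo/css/style.gzip.css
```

The automation below does exactly this, plus the "only if modified" check.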


    So, create a PHP page (let's call it, for example, ready.php) that will contain all the code we need. Opening this page in a browser starts the compression process. If the site is built dynamically in PHP, you can add include('ready.php') and compression will then run automatically whenever needed. Naturally, the path in include() must be the real one. There is one nuance here: to prevent this file from being called on the hosting (it may not actually be there, but you might forget to delete the line), you can test for some condition unique to the local server, for example:

    if (mb_eregi("local root folder address", $_SERVER['DOCUMENT_ROOT'])) {
      include('ready.php');
    }

    The root folder address on the hosting is unlikely to coincide with the local one, but you can come up with something else ...
    In ready.php we write such php code:

    function ready($dir) {
      $dir = $_SERVER['DOCUMENT_ROOT'] . $dir;
      $ext = array("js", "css");
      for ($i = 0; $i < count($ext); $i++) {
        search($dir, $ext[$i]);
      }
    }

    function search($dir, $ext) {
      $dirH = opendir($dir);
      while (($file = readdir($dirH)) !== false) {
        if ($file != "." && $file != ".." && !mb_eregi("\.gzip", $file)) {
          if (filetype($dir . $file) == "dir") {
            search($dir . $file . "/", $ext);
          } else {
            if (fnmatch("*." . $ext, $file)) {
              if (!mb_eregi("gzip", $file)) {
                // the next line will show all found files
                // print $dir . $file . "<br>";
                $adr = substr($dir . $file, 0, strrpos($dir . $file, "."));
                $timeF = filemtime($dir . $file);
                $timeG = 0; // forces compression if no compressed copy exists yet
                if (is_file($adr . ".gzip." . $ext)) {
                  $timeG = filemtime($adr . ".gzip." . $ext);
                }
                if ($timeF > $timeG) {
                  // the next line will show the files to be compressed
                  // print $dir . $file . " - GZIP<br>";
                  // minify (we need yuicompressor and its real path)
                  exec("java -jar yuicompressor.jar " . $adr . "." . $ext . " -o " . $adr . ".gzipY." . $ext);
                  // compress (fall back to the unminified file if minification failed)
                  if (is_file($adr . ".gzipY." . $ext)) {
                    shell_exec("gzip -9 -n -f -c " . $adr . ".gzipY." . $ext . " > " . $adr . ".gzip." . $ext);
                    unlink($adr . ".gzipY." . $ext);
                  } else {
                    shell_exec("gzip -9 -n -f -c " . $adr . "." . $ext . " > " . $adr . ".gzip." . $ext);
                  }
                }
              }
            }
          }
        }
      }
      closedir($dirH);
    }

    // Here we write the address where the files are
    ready("address");

    As a result, we get compressed copies of all JS and CSS files with names like name.gzip.js and name.gzip.css (if not, first check the path and the access rights).
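The "only if modified" check can also be tried by hand in the shell; this mirrors the filemtime comparison in ready.php (file names here are illustrative, and touch -d is the GNU coreutils form):

```shell
#!/bin/sh
# Recompress only when the source is newer than its .gzip copy,
# just like the filemtime comparison in ready.php.
printf 'a{}' > style.css
gzip -9 -n -c style.css > style.gzip.css

if [ style.css -nt style.gzip.css ]; then
  echo recompress
else
  echo up-to-date   # printed here: the copy was made after the source
fi

# touch the source into the future so it becomes newer than the copy
touch -d '2030-01-01' style.css
if [ style.css -nt style.gzip.css ]; then
  echo recompress   # printed here: the source is now newer
else
  echo up-to-date
fi
```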

    Next, we need to make sure the server returns the current version of the file (bypassing the cache). This is done by adding a filemtime-based label to the file name. In PHP this is implemented in a standard way, for example:

    <link href="style.v=<?php print filemtime($_SERVER['DOCUMENT_ROOT'] . "/style.css"); ?>.css" rel="stylesheet" type="text/css">

    The finished link should look like this:

    <link href="style.v=1299173838.css" rel="stylesheet" type="text/css">
    This technique has been discussed more than once and the details can be found by searching for, for example, "speed up your site, practical css/js" ...
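For reference, the same mtime label can be produced in the shell (the file name is illustrative; date -r file is the GNU coreutils form):

```shell
#!/bin/sh
# Build a versioned URL of the form name.v=<mtime>.css from the file's
# modification time. No file is renamed; only the link in the page changes.
printf 'a{}' > style.css
v=$(date -r style.css +%s)   # mtime as a unix timestamp (GNU date)
echo "style.v=$v.css"
```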

    Add rewrite rules to the .htaccess file (taking the presence of a compressed version into account right away):

    RewriteEngine on
    RewriteCond %{HTTP:Accept-Encoding} gzip
    RewriteRule ^(.*\.)v=[0-9.]+\.(js|css)$ /$1gzip.$2 [QSA,L]
    RewriteCond %{HTTP:Accept-Encoding} !gzip
    RewriteRule ^(.*\.)v=[0-9.]+\.(js|css)$ /$1$2 [QSA,L]
    <FilesMatch "\.gzip\.(js|css)$">
      Header set Content-Encoding gzip
      Header set Cache-control private
    </FilesMatch>

    That's all.
    As a result, we continue to work comfortably with the JS and CSS files while always having their compressed versions, which are the ones served to the browser.
