Leverage browser caching for 3rd party JS
I have set Expiry on my httpd.conf

ExpiresActive On
ExpiresDefault "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType text/css "access plus 1 month"
ExpiresByType text/javascript "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"

This helps with browser caching for images, fonts files, site own css and js files. But I also have external JS included in my website:

http://connect.facebook.net/en_US/sdk.js (20 minutes)
http://apis.google.com/js/client.js (30 minutes)
https://apis.google.com/js/rpc:shindig_random.js?onload=init (30 minutes)
https://platform.twitter.com/widgets.js (30 minutes)
https://www.google-analytics.com/analytics.js (2 hours)

Google PageSpeed Insights says for the files above: Setting an expiry date or a maximum age in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk rather than over the network.

How can I leverage browser caching for these external JS files? Any help?

Rapscallion answered 14/7, 2016 at 14:26 Comment(0)

An annoying issue indeed, and not one that is easily fixable, I'm afraid. But what you can do is use a cron job.

Firstly, keep in mind that Google is very unlikely to penalise you for their own tools (like Analytics). However, as mentioned, it can be worked around using a cron job, which basically means you serve the JavaScript locally and periodically pull the updated script from its source.

How to do this:

First of all, you need to download the script you're running. I will use Google Analytics as an example (it seems to be the script people complain about most, but you can replicate this for any external script).

Look in your code and find the name of the script; in our case it is google-analytics.com/ga.js. Open this URL in your web browser and it will show the source code. Make a copy of it and save it as ga.js.

Save this newly created JavaScript file onto your webserver, in my case:

- JS
  - ga.js

Next you will need to update the code on the pages that are calling your script, changing only the path from which the JavaScript file is loaded. Once again in our case, we will be changing this line:

ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';

to

ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.yoursite.com/js/ga.js';

At this point, your site will run the script from your website locally. However, this means the script will never update, unless you re-run this short process regularly. That is up to you, but I'm far too lazy for that.

This is where the CRON comes into play:

Just about every hosting service has an option to set up cron jobs. On Hostinger it is in your Hosting Panel; on GoDaddy you will find it under the Content option.

Put the following script into your cron; all you need to do is change the absolute path in the variable $localFile. What this script does is pull the updated ga.js file from Google. You can set how often you want it to run this process, from once an hour to once a month or beyond.

If you're doing this for external files other than Google Analytics, you will also need to change the variable $remoteFile: $remoteFile is the URL of the external JavaScript file, and $localFile is the path to your new locally stored file. Simple as that!

<?php
// Script to update the local copy of the Google Analytics script

// Remote file to download
$remoteFile = 'http://www.google-analytics.com/ga.js';
$localFile = 'ENTER YOUR ABSOLUTE PATH TO THE FILE HERE';
// For cPanel it will be /home/USERNAME/public_html/ga.js

// Connection time out
$connTimeout = 10;
$url = parse_url($remoteFile);
$host = $url['host'];
$path = isset($url['path']) ? $url['path'] : '/';

if (isset($url['query'])) {
  $path .= '?' . $url['query'];
}

$port = isset($url['port']) ? $url['port'] : 80;
$fp = @fsockopen($host, $port, $errno, $errstr, $connTimeout);
if(!$fp){
  // On connection failure return the cached file (if it exists)
  if(file_exists($localFile)){
    readfile($localFile);
  }
} else {
  // Send the request ("Connection: close" so the read loop below terminates)
  $header = "GET $path HTTP/1.0\r\n";
  $header .= "Host: $host\r\n";
  $header .= "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6\r\n";
  $header .= "Accept: */*\r\n";
  $header .= "Accept-Language: en-us,en;q=0.5\r\n";
  $header .= "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\n";
  $header .= "Connection: close\r\n";
  $header .= "Referer: http://$host\r\n\r\n";
  fputs($fp, $header);
  $response = '';

  // Read the response from the remote server
  while($line = fread($fp, 4096)){
    $response .= $line;
  }

  // Close the connection
  fclose($fp);

  // Strip the HTTP headers from the response
  $pos = strpos($response, "\r\n\r\n");
  $response = substr($response, $pos + 4);

  // Return the processed response
  echo $response;

  // Save the response to the local file
  if(!file_exists($localFile)){
    // Create the file if it doesn't exist yet
    touch($localFile);
  }

  if(is_writable($localFile)) {
    if($fp = fopen($localFile, 'w')){
      fwrite($fp, $response);
      fclose($fp);
    }
  }
}
?>
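Assuming you save the script above as, say, /home/USERNAME/cron/update-ga.php (a hypothetical path; use your own), a crontab entry that refreshes the local copy once a week might look like:

```
# minute hour day-of-month month day-of-week — run every Sunday at 03:00
0 3 * * 0 php /home/USERNAME/cron/update-ga.php > /dev/null 2>&1
```

Most hosting panels expose the same five time fields plus the command, so this translates directly to Hostinger's or GoDaddy's cron forms.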

That is it, and it should fix any "Leverage Browser Caching" issues you're having with third-party scripts.

Source: http://diywpblog.com/leverage-browser-cache-optimize-google-analytics/

NOTE:

In truth, these files don't tend to have a great effect on your actual page speed, but I can understand the worry about Google penalising you. That would only happen if you had a LARGE number of these external scripts running, and as I stated earlier, anything Google-related will not be held against you either.

Mims answered 14/7, 2016 at 15:8 Comment(3)
This can't be done with all 3rd party scripts. Many of them will need to be accessed from their original domain to enable cookie setting and other domain-specific behavior.Ariew
An alternate solution that preserves cookies and other domain specific behavior is to proxy those requests through your server, but instruct the proxy to add an expiry date (and don't forget to make sure the proxy only accepts URLs you expect). I've seen a WordPress plugin do this before.Jimmy
Does this require editing each plugin that loads an external script?Gooseherd

Not sure if this code snippet will help someone, but anyway, this is how I cache an external JS file:

<script>
  $.ajax({
    type: "GET",
    url: "https://www.google-analytics.com/analytics.js",
    success: function(){},
    dataType: "script",
    cache: true
  });
</script>
Effable answered 19/7, 2017 at 10:15 Comment(2)
I tested this method however the script did not cache.Gooseherd
you probably need to wrap the script in a jQuery function, otherwise your WordPress build will display an error.Effable

If you are on WordPress, you can use the "Cache External Scripts" plugin for this. With minimal tweaking of the plugin code, you can add support for other 3rd party JavaScript files in addition to the Google ones.

Mulford answered 25/10, 2018 at 2:18 Comment(1)
It's worth mentioning that this plugin requires PHP knowledge and the ability to read and understand what you're looking at. If you're not a PHP programmer, you will not be able to use this plugin.Dominance
