Wouldn't it be nice if we could cache across domains?
You probably already know that caching content in the browser can make accessing your site faster. Great! But... this only helps for repeat users. They still have to download all the files the first time.
This is obvious though, right? How could your browser possibly know what your files contain before it has ever accessed your site?
Well, it can. There has been a recent trend of using public CDNs, such as Google Hosted Libraries, Google Web Fonts and cdnjs. They allow you to link to an external address, such as http://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js. When requested, the server sends the appropriate caching headers, so that browsers will reuse the cached file on any website that requests it. This speeds up the web for your users and reduces the burden of data on the internet for everyone.
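For a concrete idea of what "appropriate caching headers" means, a long-lived CDN response typically looks something like this (the exact values are illustrative, not copied from any particular CDN):

```http
HTTP/1.1 200 OK
Content-Type: application/javascript
Cache-Control: public, max-age=31536000
```

`public` allows any cache to store the response, and a `max-age` of a year means the browser can keep serving it without re-contacting the server.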
However, this implies a lot of trust between you, the webmaster, and the CDN owners. Personally, I trust Google not to insert malicious code as it would severely affect its reputation, but you may not. Furthermore, some countries limit requests to certain domains, so you could inadvertently be limiting your audience. Finally, whether or not they are doing it - and I am not claiming they are - you are giving the CDNs the opportunity to occasionally track your users.
What am I proposing?
We should add a new attribute to <script> tags, called hash. The value should be a hex-encoded string representing the hash of the file you wish to use. If the browser has a file in its cache with the same hash, it should use that. Otherwise, it should request the file from the server and validate the hash.
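As a sketch, the markup might look like this (assuming the attribute is named hash and the algorithm is SHA-256; the digest below is made up for illustration, not jQuery's real digest):

```html
<script src="http://cdnjs.cloudflare.com/ajax/libs/jquery/1.8.1/jquery.min.js"
        hash="9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"></script>
```

Any page on any domain declaring the same hash would hit the same cache entry, regardless of which URL it points at.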
We still speed up the web by caching, but:
- We don't need to trust 3rd parties.
- We distribute the responsibility for serving these common files to everyone, rather than just relying on some very large players. This removes the single point of failure that is Google CDN.
- We reduce the number of requests, as the cache remains valid for as long as the hash matches. It's ETags on steroids... sort of.
- We may be improving the security of the caches, as browsers can validate the contents of the cache before running it, making it that much harder for malicious code to sabotage your site.
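The lookup-and-validate flow described above can be sketched as follows. This is a simplified illustration, not a real browser API: the function names and the dictionary standing in for the browser cache are my own inventions.

```python
import hashlib

# Hypothetical shared cache, keyed by content hash rather than by URL,
# so an entry is reusable across every domain that declares the same hash.
cache = {}

def matches(content: bytes, expected_hex: str) -> bool:
    """Check that the file's SHA-256 digest equals the declared hash."""
    return hashlib.sha256(content).hexdigest() == expected_hex

def fetch_script(url: str, expected_hex: str, download) -> bytes:
    """Return the script body, consulting the hash-keyed cache first.

    `download` is a callable standing in for a network fetch.
    """
    if expected_hex in cache:           # cache hit: no request at all
        return cache[expected_hex]
    content = download(url)             # cache miss: fetch from the server...
    if not matches(content, expected_hex):
        raise ValueError("hash mismatch: refusing to run " + url)
    cache[expected_hex] = content       # ...validate, then cache by hash
    return content
```

Note that validation happens before the file is cached or run, which is where the security benefit in the last bullet comes from.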