I’m optimizing this blog at the moment, and I’m learning quite a few new things about compression, gzipping and other optimization techniques. I’m using the Yahoo! YSlow extension for Firefox to check whether I’m heading in the right direction. In this article I’ll explain the three best techniques I used to speed up solutoire.com.

Setting up a CDN

The first thing I did was set up a CDN. Most browsers open only a few parallel connections to the same domain, so by serving static files (css/js/images) from a cdn subdomain, visitors can download content over twice the usual number of connections. I asked my hosting provider (and employer) Solware to create a new subdomain called “cdn” (cdn.solutoire.com) and moved all static files there. If you don’t tell YSlow about it, this has no impact on the score, but after adding the cdn domain to your YSlow configuration the score should go up, and YSlow rule 2 (“Use a Content Delivery Network”) should be graded ‘A’.

Note 1: the .htaccess file in the root of your domain doesn’t configure Apache for the subdomain, so create another .htaccess for your cdn.
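To give you an idea, here is a minimal sketch of what such a cdn .htaccess could contain. This is only an assumption of a sensible starting point; my complete configuration is listed at the end of this article.

# Sketch of a minimal .htaccess for the cdn subdomain (example only)
Header unset ETag
FileETag None
<FilesMatch "\.(ico|jpg|jpeg|png|gif|js|css|swf)$">
Header set Cache-Control "public"
Header unset Last-Modified
</FilesMatch>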

When you have a large site with lots of visitors and/or a huge number of static files, a nice alternative is moving your files to Amazon S3. Amazon S3 is a paid (but cheap) storage service. I’m a big fan of their initiative:

Amazon S3 provides a simple web services interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the web. It gives any developer access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of web sites. The service aims to maximize benefits of scale and to pass those benefits on to developers.

Note 2: having too many cdn domains results in more DNS lookups. According to Yahoo! a DNS lookup takes approximately 20-120 ms. DNS lookups are cached by most browsers, so there’s a delay only for first-time visitors.

Compressing *.js and *.css using the YUI Compressor

A week ago John Resig posted a very interesting article called “JavaScript Library Loading Speed” in which he wrote the following:

When distributing a piece of JavaScript code it’s traditional to think that the smallest (byte-size) code will download and load the fastest. This is not true – and is a fascinating result of this survey. Looking at the speed of loading jQuery in three forms: normal, minified (using Yahoo Min), and packed (using Packer). By order of file size, packed is the smallest, then minified, then normal. However, the packed version has an overhead: It must be uncompressed, on the client-side, using a JavaScript decompression algorithm. This unpacking has a tangible cost in load time. This means, in the end, that using a minified version of the code is much faster than the packed one – even though its file size is quite larger.

While I knew packed files incur a delay because of the client-side evaluation, I never thought it would take a significant amount of time. Anyway, I stopped packing my files and started minifying them using the YUI Compressor. A great thing about this tool is that it can compress both javascript and css files. The downside is that it’s written in Java, and there doesn’t seem to be a decent online YUI Compressor service. So I started looking around for Apache Ant build scripts and found an excellent explanation of compressing files with the YUI Compressor and Ant, written by Yahoo! engineer Julien Lecomte. I’m not going to repeat how to do this, because I would tell you the same as Julien. I can push you in the right direction by referring to a previous post on this blog called “Automate builds with Ant”, in which I explain how to set up the Aptana IDE to use Apache Ant.
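If you just want to try the YUI Compressor by hand before wiring it into an Ant build, it can also be run directly from the command line. The jar filename and the input/output filenames below are only an example; use the version you downloaded and your own files:

java -jar yuicompressor-x.y.z.jar -o script-min.js script.js
java -jar yuicompressor-x.y.z.jar -o style-min.css style.css

The compressor figures out whether it’s dealing with javascript or css from the file extension.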

Configuring the .htaccess file

The .htaccess file is the Apache configuration file for the folder it resides in (and its subfolders). Ideally you would put these directives in httpd.conf instead, but most of you won’t have access to that file because of shared hosting.

Removing ETags and Last-Modified headers

ETags, or Entity tags, are a mechanism used by servers and browsers to compare cached files with files from the server. Yahoo! explains the problem with ETags:

The problem with ETags is that they typically are constructed using attributes that make them unique to a specific server hosting a site. ETags won’t match when a browser gets the original component from one server and later tries to validate that component on a different server, a situation that is all too common on Web sites that use a cluster of servers to handle requests.

When the ETag and Last-Modified headers are removed, cached files stay in the browser cache until the Expires header runs out, and the browser performs no checks to see whether the file has changed on the server. That’s why I also remove the Last-Modified header in my .htaccess (see the example at the end of this article).

Gzipping content

Gzipping your content before it’s sent to the browser may reduce file sizes by up to 70%. Almost 90% of internet traffic goes through browsers that support gzipped content, so this speeds up your website quite a bit and saves bandwidth, especially when you serve a lot of javascript and css files. Don’t bother gzipping images: formats like JPEG, PNG and GIF are already compressed, so there’s little to gain.
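In the .htaccess at the end of this article I enable gzip per file extension with SetOutputFilter. If mod_deflate is available on your host, an alternative is to enable it per content type; a sketch, assuming mod_deflate is loaded:

# Gzip text-based responses by MIME type (assumes mod_deflate is available)
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html text/css application/x-javascript
</IfModule>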

Adding Expires headers

When Expires headers are set, the browser caches content from the server on the client side. First-time visitors still make all the requests, but on repeat visits the browser serves files from its cache. The Expires header tells the browser when to throw a cached file away and fetch a fresh copy. I use different expiry times for different file types.

Note: it should be obvious that far-future Expires headers only make sense for static content. If the browser caches dynamic content, it won’t be refreshed until the header expires, so if you do set Expires headers on dynamic content, keep the expiry period short.
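In the configuration below I set an explicit far-future Expires date by hand. If your host has mod_expires enabled, you can also let Apache calculate the expiry date relative to the time of access; a sketch, with the periods only as an example:

# Assumes mod_expires is available; the periods are only an example
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/png "access plus 1 year"
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
</IfModule>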

My .htaccess looks something like this; it’s assembled from examples in several articles at AskApache.com:

Header unset ETag
FileETag None

# Cache for quite some time
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
Header set Cache-Control "public"
Header set Expires "Thu, 15 Apr 2010 20:00:00 GMT"
Header unset Pragma
Header unset Last-Modified
</FilesMatch>

# Cache for 2 days
<FilesMatch "\.(xml|txt)$">
Header set Cache-Control "max-age=172800, public, must-revalidate"
Header unset Pragma
Header unset Last-Modified
</FilesMatch>

# Cache for 2 hours
<FilesMatch "\.(html|htm)$">
Header set Cache-Control "max-age=7200, must-revalidate"
Header unset Pragma
</FilesMatch>

# Gzip compression
<FilesMatch "\.(js|css)$">
SetOutputFilter DEFLATE
</FilesMatch>

YSlow gives me grade D

While I can notice the speed increase after these optimization steps, YSlow still gives me grade D (67). This is because of the ads I added to my site. I use Google AdSense to generate some income from this blog, but the javascript served from Google’s servers isn’t cached, and that causes the low grade from YSlow. I did everything within my power (see the comment by nicolash), but for now Google is the bottleneck. I can’t do anything about it, because changing the ad code from AdSense would make me violate the AdSense TOS. Bummer…