Web developers often spend considerable (and necessary) effort ensuring their architecture performs well: in n-tier architectures, the performance of the middle tier and the databases is given top priority.
Oftentimes, the performance of the client side (in particular perceived performance and download speed) is neglected or given lower priority.
People at Yahoo, Google and elsewhere have found that as much as 80% of the user’s time can be spent waiting for things other than the main HTML to download. Furthermore, browsers may not be caching as many files as you think.
(This is not to discourage performance tuning of your overall architecture, of course; it's no good if it takes a long time to receive the first bytes of your page in the first place!)
Techniques in the old days
In the “old days” of the Internet, dial-up was the primary way to access the Internet. “Web standards” were a distant dream.
All sorts of tricks were used to speed up web sites: smaller images, omitting quotes around HTML attributes (yes, people went that far!), using fixed-layout tables in IE 5 (the best browser at the time) via <table style="table-layout:fixed">, balancing server response buffering against building the whole page first and sending it in one response, etc.
But some of the techniques from back then are still relevant today, e.g.:
- Putting images on different domains (though now other resources are worth considering too)
- Compressing the server output using gzip (a bit hit and miss on earlier versions of IIS though)
- Cache control using the Expires header
- Using CSS (though in those days I used it mostly for font definitions, and browser sniffing was needed to load different CSS files!)
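Gzip's effect on text output is easy to demonstrate. Here is a minimal sketch using Python's standard gzip module (not tied to IIS or any particular server), compressing some repetitive markup of the kind table-heavy pages of the era produced:

```python
import gzip

# Repetitive markup, as table-based layouts of the era tended to be.
html = b"<table><tr><td>cell</td></tr></table>" * 200

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes ({ratio:.1%})")
```

Highly repetitive text like this compresses to a small fraction of its original size, which is exactly why gzip pays off so well on HTML, CSS, and JavaScript.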
Techniques now; some of the same
Fast forward to today: broadband is widespread (but don't forget, dial-up is still around!).
Web standards encourage CSS-based layouts, reducing overall HTML bloat (some pages my colleagues and I have converted are 75%, even 80%, smaller!). Table-based layout and the font tag have mostly been put to rest by people using such techniques (though it's easy to forget that even in 2007 most of the web is not this way, and even with modern techniques we sometimes see sites where "div-bloat" and "span-itis" replace table- and font-heavy HTML!).
Nate Koechley, a senior developer at Yahoo, gave a great presentation at @media07 called High Performance Web Sites, summarizing a number of recent findings and suggestions.
In recent months, others at Yahoo, as well as at Google and elsewhere, have been providing a lot of analysis. Perhaps the highest-level summary guidelines would be the following:
- Use GZip on all text-based output (which should be optimized anyway via web standards!)
- Ensure browsers can cache as many of your files as possible
- Disperse your content through things like content distribution and extra sub-domains (though be aware of extra DNS-lookups that might result)
Rather than dive into the details here (for now), the following may be useful starting points:
- Exceptional Performance from the Yahoo Developer Network, which includes a number of gems
- Optimizing Page Load Time from a Google engineer. (One of the many interesting tips is to ensure HTTP keep-alives are enabled. They usually are, but for a major client of ours last year, all was fine until the actual launch day of their new web site, when everything became painfully slow. For a couple of days the hardware engineers looked into it and even brought in a Cisco engineer. I decided to have a quick look and, using the excellent LiveHttpHeaders extension for Firefox, noticed that HTTP keep-alives were not set; some hardware problem or other had turned them off. Amazing how much difference this setting can make!)
- Reducing HTTP requests from fiftyfoureleven.com (which also makes a point about the impact of HTTP packet sizes worth reading)
Some useful tools
- YSlow is a plugin for Firebug that measures your page against the 13 tips Yahoo mentions (see articles above)
- Firebug extension for Firefox is indispensable for all manner of web development. It has a tab that graphically shows the content being requested, how long each request takes, and so on. Very useful for seeing how well your page components download
- LiveHttpHeaders extension for Firefox allows you to see HTTP traffic in real time
- iehttpheaders is a plugin for Internet Explorer, similar to LiveHttpHeaders
If you had only one choice, go with gzip. You can combine it with the other techniques too.
Some time ago I wondered whether minification might actually reduce the repeating patterns that make compression effective. A quick test suggested it was better to skip minification, but unfortunately I never saved the test files.