With Google now factoring page load speed into its search rankings, having a fast website has become even more important. Exactly how much weight speed carries in rankings is open to debate, although anecdotal evidence suggests it can move them. One of the first steps companies need to take in improving their website speed is a web performance audit. This article details what to expect from a web performance analysis.
A web performance audit is an independent analysis of the factors that contribute to the response time of a website. In addition to measuring the size and rendering speed of the various components that make up a web page, a web performance audit should offer recommendations to speed up the response time of initial and subsequent page loads. Depending on its depth, a web performance audit can also include optimized code examples to replace older, less efficient code (XHTML, CSS, JavaScript, and back-end PHP, Perl, Java, etc.).
Performance audits come in two flavors: front-end and back-end. Front-end audits are more common than back-end because about 80% to 90% of the time users spend waiting for pages to load is spent on the front end (Souders 2007). Back-end audits are still worthwhile, however, because back-end optimization can yield order-of-magnitude performance gains. A front-end audit typically covers the kinds of issues detailed in the examples below.
A back-end audit explores the factors that affect the delivery of the front-end content, typically reviewing server setup, CMS and module configuration, and SQL query efficiency.
A hybrid performance audit combines front-end and back-end analyses into one complete report, covering the two main contributors to web page speed: the server configuration that delivers the content to the user, and the content itself and how quickly it renders.
Automated performance tools like YSlow, Page Speed, and WebPageAnalyzer.com can give you some useful metrics and guidelines. Performance experts use these and other tools to analyze your site, review your code in detail, and triage and implement recommendations.
What can you expect to find when reviewing a site for web performance? Here are some common performance problems we've found when reviewing web sites.
Figure 1 shows an example of a long time to first byte (TTFB): over 4 seconds elapse before any data is sent, delaying the display of useful content. A long TTFB indicates a server performance issue, typically an overloaded shared or dedicated server.
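You can spot-check TTFB from within the browser using the Navigation Timing API, where supported (a minimal sketch; at this writing only the newest browsers expose window.performance):

    // Rough TTFB check via the Navigation Timing API.
    // Note: this figure includes DNS lookup and connection time.
    window.onload = function () {
      var t = window.performance && window.performance.timing;
      if (t) {
        var ttfb = t.responseStart - t.navigationStart; // ms until first byte
        alert("Time to first byte: " + ttfb + " ms");
      }
    };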
For secure transactions you often see a slow SSL connection (see Figure 2). Secure Sockets Layer handshakes are resource-intensive and can slow down HTTPS pages. One solution is to offload SSL connections to a specialized server.
When browsers load and execute JavaScript, they pause other rendering duties, including downloading other resources. Misplaced JavaScript (typically placed before CSS instead of after it) can block the downloading of subsequent files. Figure 3 shows an example where JavaScript files were placed between CSS files; note the misplaced CSS file after a JavaScript file.
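The fix is to reference all stylesheets before any scripts in the HEAD, or to move the scripts to the end of the BODY. A schematic head section (file names hypothetical):

    <head>
      <!-- stylesheets first, so their downloads are never blocked -->
      <link rel="stylesheet" type="text/css" href="site.css" />
      <link rel="stylesheet" type="text/css" href="print.css" media="print" />
      <!-- scripts last -->
      <script type="text/javascript" src="site.js"></script>
    </head>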
With the average web page now requiring more than 43 objects, object overhead dominates web page latency. Multiple objects (typically external JavaScript and CSS files) within the HEAD of XHTML documents are particularly harmful to web page performance because they delay the rendering of useful body content that users can interact with (see Figure 3 for an example). One solution is to combine JavaScript files into fewer files, and to do the same for CSS. Another solution is to "split the payload": load only the CSS and JavaScript the page requires before the onload event, and defer the rest until after the body content loads, by attaching a script node with JavaScript after the onload event fires (a minimal sketch follows).
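Here is one way to implement the deferred half of a split payload, per the technique just described (the deferred file name is hypothetical):

    // Load non-essential JavaScript only after the page has finished loading,
    // by attaching a script node once the onload event fires.
    function deferredLoad() {
      var s = document.createElement("script");
      s.type = "text/javascript";
      s.src = "deferred.js"; // hypothetical bundle of below-the-fold behavior
      document.getElementsByTagName("head")[0].appendChild(s);
    }
    if (window.addEventListener) {
      window.addEventListener("load", deferredLoad, false);
    } else if (window.attachEvent) { // IE8 and earlier
      window.attachEvent("onload", deferredLoad);
    }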
One trend that we're seeing lately with the advent of Web 2.0-enabled pages is the growth of JavaScript. Libraries like jQuery, Prototype, YUI, and Dojo are combined with other behavior (menus in particular) to create more interactive and responsive web interfaces. Unfortunately, all of this JavaScript has to be downloaded and parsed; we've seen total JavaScript payloads larger than 500K! A couple of solutions present themselves. Compress the JavaScript (with gzip or deflate), which typically shrinks these text files by 75% to 80%. Better yet, combine and minify the JavaScript files before compression. The best solution is to rethink the behavior of the page in question and substitute standards-based methods for JavaScript, such as the CSS drop-down menu sketched below.
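As an illustration of the standards-based approach, a drop-down menu can be driven entirely by CSS (a minimal sketch; note that IE6 honors :hover only on links, so very old browsers may still need a small script fallback):

    /* hide nested menus by default */
    ul.nav li ul { display: none; }
    /* reveal a submenu when its parent item is hovered */
    ul.nav li:hover ul { display: block; }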
One of the useful metrics that Page Speed reports is how much of a page's CSS is actually referenced. In recent analyses we've found a fair amount of CSS going unused within pages and sites. To combat CSS bloat, remove legacy CSS left over from past designs and adopt a more object-oriented approach à la Nicole Sullivan (see OOCSS.org for more information, and the sketch following the Page Speed output below).
A typical Page Speed finding reads:

    Remove unused CSS
    45% of CSS (estimated 31.7kB of 70.4kB) is not used by the current page.
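The object-oriented approach separates structure from skin, so one set of structural rules serves many page elements (a minimal sketch in the spirit of OOCSS; class names hypothetical):

    /* structure: one reusable "module" object */
    .module { margin: 0 0 10px; padding: 10px; }
    /* skins: small classes that vary only the presentation */
    .simple  { border: 1px solid #ccc; }
    .complex { border: 1px solid #666; background: #eee; }

Elements marked class="module simple" and class="module complex" then share the same structural rules, keeping the total CSS small.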
HTTP compression is a standards-based way to compress the textual content of your web pages. HTTP compression, or "content encoding," uses gzip or deflate to compress your XHTML, CSS, and JavaScript at the server; modern browsers decompress these files automatically, speeding up page downloads and saving bandwidth. Even though HTTP compression can shave 75% or more off textual files, only 66% to 89% of compressible text is actually compressed on the Web (Ramachandran 2010).
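The negotiation is visible in the HTTP headers: the browser advertises the encodings it can decompress, and the server labels the compressed response (an abbreviated exchange):

    GET /styles.css HTTP/1.1
    Accept-Encoding: gzip, deflate

    HTTP/1.1 200 OK
    Content-Type: text/css
    Content-Encoding: gzip

On Apache, mod_gzip (1.3) and mod_deflate (2.x) handle this encoding at the server.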
You can test your site for HTTP compression with the tools above, or with Web Page Analyzer or the tools at Port80Software.com.
Images make up 64.3% of the average web page's size, and more than 75% of all HTTP requests, so optimizing images is one of the first places to start when speeding up web pages. While lossy methods can significantly reduce web image sizes, not everybody wants to change the quality of their images. You can losslessly trim unnecessary bytes and blocks from your images using a tool like Smush.it from Yahoo! (see Figure 4).
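If you prefer to work locally, command-line tools such as jpegtran and pngcrush perform the same class of lossless optimizations (representative invocations; file names hypothetical):

    # strip metadata and optimize Huffman tables, losslessly
    jpegtran -copy none -optimize photo.jpg > photo-opt.jpg
    # remove ancillary chunks and brute-force the best PNG filters
    pngcrush -rem alla -brute logo.png logo-opt.png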
The XHTML provides the skeleton on which the objects in a web page hang. Making this skeleton lightweight is one key to fast web page display. Overuse of tables for layout and embedded styles and JavaScript are two common problems. One thing to avoid is saving HTML files from Word, which outputs a large amount of unnecessary code:
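For example, a single short paragraph saved as HTML from Word can come out looking like this (a representative fragment):

    <p class="MsoNormal" style="margin-bottom:.0001pt;mso-pagination:widow-orphan">
      <span style="font-size:12.0pt;font-family:&quot;Times New Roman&quot;">
        Hello, world.<o:p></o:p></span>
    </p>

All of that markup renders one sentence; hand-coded XHTML needs a single p element.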
By Website Optimization on 30 Sep 2010