
Serving 2012 Olympic Tickets: London2012 15, CoSport 0 – olympic website performance

The race is on to get tickets to the 2012 Summer Olympics in London, England. Millions of people (some 2.3 to 2.7 million have signed up to request tickets so far) are clamoring to attend events featuring the likes of Michael Phelps, Ian Thorpe, and Usain Bolt. To distribute the load in the initial ticket request phase, Olympic organizers are using a batch system that accepts requests over a six-week period. Even with this even-handed approach, the servers were under tremendous load in the early hours of this past Tuesday. We investigate and show how performance could be improved.

Read more

ETags Revisited – configure entity tags to improve cache performance on websites

An entity tag (ETag) is a unique identifier assigned to a specific version of a given resource on a web server. ETags are used as a cache control mechanism that allows client browsers to make conditional requests. This lets caches work more efficiently, reusing unchanged resources on the client and avoiding full server responses when the content has not changed. Efficient caching saves bandwidth and improves performance by delivering only what the client needs, not what it already has.
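
The conditional request cycle is easy to see with a short script. The following is a minimal sketch using only Python's standard library; the hostname and resource path are placeholders, not a real server:

    import http.client

    conn = http.client.HTTPConnection("example.com")

    # First request: the server returns the full resource plus its ETag.
    conn.request("GET", "/styles.css")
    resp = conn.getresponse()
    resp.read()
    etag = resp.getheader("ETag")
    print(resp.status, "ETag:", etag)

    # Second request: echo the ETag back in If-None-Match. An unchanged
    # resource earns a 304 Not Modified with an empty body, so the
    # browser (or any cache) can reuse its stored copy.
    if etag:
        conn.request("GET", "/styles.css", headers={"If-None-Match": etag})
        resp = conn.getresponse()
        resp.read()  # drain the empty body so the connection can be reused
        print("304 -> serve from cache" if resp.status == 304 else "changed -> use new body")
    conn.close()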

Read more

Automatically Speed Up Your Site with mod_pagespeed for Apache

mod_pagespeed is an open-source Apache 2.2+ module designed to automatically speed up your site by optimizing the various components of your web pages. The module does this by rewriting page resources using filters that adhere to web performance best practices. mod_pagespeed is the server-side counterpart of Page Speed, the client-side Firefox add-on that analyzes web pages against similar best practices.
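
A quick way to confirm the module is active is to look for the X-Mod-Pagespeed response header it adds to pages it has rewritten. Here is a minimal check, with the URL as a placeholder:

    import urllib.request

    # Fetch a page and look for the header mod_pagespeed stamps on
    # responses it has processed.
    with urllib.request.urlopen("http://example.com/") as resp:
        version = resp.headers.get("X-Mod-Pagespeed")

    if version:
        print("mod_pagespeed active, version", version)
    else:
        print("no X-Mod-Pagespeed header: module inactive or header stripped")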

Read more

Velocity 2010 Web Performance Conference

I just got back from the Velocity 2010 Web Performance and Operations Conference in Santa Clara, California. Now in its third year, the conference is a must-see if you are a web performance or network operations engineer. And engineers were plentiful this year: the conference was sold out, with more than 1,500 attendees.

Read more

HTTP Compression – content encoding can compress xhtml, css, and javascript for faster web page download times

HTTP compression, otherwise known as content encoding, is a publicly defined way to compress textual content transferred from web servers to browsers. HTTP compression uses public domain compression algorithms, like gzip and compress, to compress XHTML, JavaScript, CSS, and other text files at the server. This standards-based method of delivering compressed content is built into HTTP 1.1, and most modern browsers that support HTTP 1.1 support ZLIB inflation of deflated documents. In other words, they can decompress compressed files automatically, which saves time and bandwidth.
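
The savings are easy to demonstrate with Python's standard gzip module on a block of sample markup. The artificial repetition below compresses especially well, but real XHTML, CSS, and JavaScript are redundant enough that they often shrink by well over half:

    import gzip

    # Build a sample XHTML document; repeated markup stands in for the
    # redundancy found in real pages.
    html = (b"<!DOCTYPE html><html><head><title>Sample</title></head><body>"
            + b"<p>Lorem ipsum dolor sit amet.</p>" * 200
            + b"</body></html>")

    compressed = gzip.compress(html)
    print(f"original:   {len(html)} bytes")
    print(f"compressed: {len(compressed)} bytes")
    print(f"savings:    {100 * (1 - len(compressed) / len(html)):.1f}%")

On the wire, the same exchange is driven by headers: the browser advertises Accept-Encoding: gzip, and the server answers with Content-Encoding: gzip.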

Read more

Web Page Performance Thesis – web page response time measurement, modeling and monitoring

For his doctoral thesis at the University of Glasgow, Thiam Kian Chiew studied web page performance. As part of his research, Chiew explored the different factors that affect web page speed, testing and modeling the key components of web page download times. His findings are summarized below.

Read more

Offload Resource Intensive Scripts to Improve Server Performance

In a previous speed tweak we showed how upgrading our web server to a solid-state drive and faster processors improved response times by 35% to 50%. With the Web Page Analyzer on the same server, the average response time was 11.87 seconds, with typical response times after ramp-up ranging from 23 to 36 seconds. While this was an improvement over the old server, there were still response time and availability issues caused by hosting the analyzer script on the same server. This article shows how offloading a resource-intensive script can dramatically improve your server response times.
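
One simple way to offload a heavy script is to redirect its URL to a dedicated host, leaving the main server free to serve pages. The sketch below is illustrative only; the script path and offload hostname are hypothetical, not our actual configuration:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    OFFLOAD_HOST = "http://tools.example.com"  # hypothetical second server

    class OffloadHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path.startswith("/analyze"):  # hypothetical script path
                # Hand the resource-intensive work to the dedicated server.
                self.send_response(302)
                self.send_header("Location", OFFLOAD_HOST + self.path)
                self.end_headers()
            else:
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(b"main site content\n")

    if __name__ == "__main__":
        HTTPServer(("", 8080), OffloadHandler).serve_forever()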

Read more

Solid-State Drive Web Server Test – upgrade server speed with benchmarking tools

This article shows how response times improved after upgrading to a new dedicated web server with solid-state drives and faster processors. With the increased load due to our free Web Page Analyzer, we realized that a different server configuration might be necessary. We first benchmarked our current server to establish a baseline. Then we tested a faster server to see the improvement in response times. After the upgrade, the web server tested 35% faster. In a future article, we’ll look at moving the resource-hungry analyzer tool to a separate server to improve performance for the WebsiteOptimization.com website.
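
Establishing a baseline need not require specialized tools. A minimal timing harness along these lines (the URL is a placeholder) is enough to compare a server before and after a hardware change:

    import time
    import urllib.request

    URL = "http://example.com/"  # placeholder; point at the server under test
    SAMPLES = 20

    # Time repeated full-page fetches to smooth out network jitter.
    times = []
    for _ in range(SAMPLES):
        start = time.perf_counter()
        with urllib.request.urlopen(URL) as resp:
            resp.read()
        times.append(time.perf_counter() - start)

    times.sort()
    print(f"min/median/max: {times[0]:.3f}s / {times[SAMPLES // 2]:.3f}s / {times[-1]:.3f}s")
    print(f"mean: {sum(times) / SAMPLES:.3f}s")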

Read more

Optimize Parallel Downloads to Minimize Object Overhead

With the average web page made up of more than 50 objects (Krishnamurthy and Wills 2006), object overhead now dominates the latency of most web pages (Yuan 2005). Following the recommendation of the HTTP 1.1 specification, browsers typically default to two simultaneous connections per hostname. As the number of HTTP requests required by a web page increases from 3 to 23, the actual download time of objects as a percentage of total page download time drops from 50% to only 14% (see Figure 1).
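
The effect of the per-hostname connection limit can be measured directly. The sketch below times the same set of object fetches at different levels of parallelism; the object URLs are placeholders:

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical list of a page's objects (images, CSS, scripts).
    OBJECTS = [f"http://example.com/obj{i}.png" for i in range(24)]

    def fetch(url):
        with urllib.request.urlopen(url) as resp:
            resp.read()

    # Two workers mirrors the HTTP 1.1 default of two connections per
    # hostname; higher counts approximate domain sharding or raised limits.
    for workers in (2, 4, 8):
        start = time.perf_counter()
        with ThreadPoolExecutor(max_workers=workers) as pool:
            list(pool.map(fetch, OBJECTS))
        print(f"{workers} parallel connections: {time.perf_counter() - start:.2f}s")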

Read more