It’s not often, as a Web performance consultant and analyst, that I find a book that is useful to so many clients. It’s rarer still to discover a book that can help most Web sites improve their response times and consistency in fewer than 140 pages.
Steve Souders’ High Performance Web Sites (O’Reilly, 2007; companion site) captures the essence of one side of the Web performance problem succinctly and efficiently, delivering a strong message to a group he classifies as front-end engineers. It is written in a way that can be understood by marketing, line-of-business, and technical teams alike, and in a manner designed to provoke discussions within an organization, with the ultimate goal of improving Web performance.
Once these discussions have started, there may be some shock within these very organizations: not only at the ease with which these rules can be implemented, but at the realization that the fourteen rules in this book will only take you so far.
The 14 Rules
Web performance, in Souders’ world, can be greatly improved by applying his fourteen rules. For the record, they are:
Rule 1 – Make Fewer HTTP Requests
Rule 2 – Use a Content Delivery Network
Rule 3 – Add an Expires Header
Rule 4 – Gzip Components
Rule 5 – Put Stylesheets at the Top
Rule 6 – Put Scripts at the Bottom
Rule 7 – Avoid CSS Expressions
Rule 8 – Make JavaScript and CSS External
Rule 9 – Reduce DNS Lookups
Rule 10 – Minify JavaScript
Rule 11 – Avoid Redirects
Rule 12 – Remove Duplicate Scripts
Rule 13 – Configure ETags
Rule 14 – Make AJAX Cacheable
(The full list is also reproduced on the book’s companion site.)
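To make a couple of these rules concrete, here is a minimal sketch of Rules 3 and 4 in TypeScript on Node.js, using only the standard http and zlib modules. The one-year lifetime, the sample response body, and port 8080 are illustrative choices of mine, not the book’s own example:

```typescript
import * as http from "http";
import * as zlib from "zlib";

http.createServer((req, res) => {
  const body = "<html><body>Hello, world</body></html>";
  const oneYearMs = 365 * 24 * 60 * 60 * 1000;

  res.setHeader("Content-Type", "text/html");

  // Rule 3 – Add an Expires Header: a far-future date lets the browser
  // reuse its cached copy instead of re-requesting the component.
  res.setHeader("Expires", new Date(Date.now() + oneYearMs).toUTCString());

  // Rule 4 – Gzip Components: compress only when the client advertises
  // gzip support, and label the response accordingly.
  if (String(req.headers["accept-encoding"] ?? "").includes("gzip")) {
    res.setHeader("Content-Encoding", "gzip");
    res.end(zlib.gzipSync(body));
  } else {
    res.end(body);
  }
}).listen(8080);
```

In practice, both headers are more often set in the Web server configuration (Apache’s mod_expires and mod_deflate, for example) than in application code, which is exactly the kind of quick, low-cost change the book advocates.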
These rules seem simple enough. Most of them are easy to understand and, in an increasingly complex technical world, easy to implement. The most fascinating thing about the lessons in this book, for people who think about these things every day, is that they are pieces of basic knowledge, tribal wisdom, that have been passed down for as long as the Web has existed.
Conceptually, the rules can be broken down to:
- Ask for fewer things
- Move stuff closer
- Make things smaller
- Make things less confusing
These four ideas are easy to grasp because they emphasize simplicity over complexity.
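As a tiny illustration of “ask for fewer things” (Rule 1 – Make Fewer HTTP Requests), a build step can concatenate several stylesheets into a single file so the browser makes one request instead of three. This is a hypothetical sketch; the file names are invented:

```typescript
import { readFileSync, writeFileSync } from "fs";

// Hypothetical stylesheet names; in a real build these would come from
// the project's asset manifest.
const sheets = ["reset.css", "layout.css", "theme.css"];

// One combined file means one HTTP request where there used to be three.
writeFileSync(
  "combined.css",
  sheets.map((file) => readFileSync(file, "utf8")).join("\n")
);
```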
For Web site designers, these fourteen rules are critical to understanding how to drive better performance not only in existing Web sites, but in all of the sites developed in the future. They provide a vocabulary to those who are lost when discussions of Web performance occur. The fourteen rules show that Web performance can be improved, and that something can be done to make things better.
Beyond the 14 Rules
There is, however, a deeper, darker world beneath the fourteen rules. A world where complexity and interrelated components make change difficult to accomplish.
In a simple world, the fourteen rules will make a Web site faster; there is no doubt about that. They advocate reducing the size of text objects, moving content closer to the people requesting it (CDNs), and optimizing code to accelerate the parsing and display of Web content in the browser.
Deep inside a Web site lives the presentation and application code, the guts that keep a site running. These layers, down below the waterline, are responsible for the heavy lifting: the personalization of a bank account display, the retrieval of semantic search results, and the processing of complex, user-defined transactions. The data bounced around inside a Web application flows through a myriad of network devices (firewalls, routers, switches, application proxies, and the like) that can be as complex as, if not more complex than, the network involved in delivering the content to the client.
It is fair to say that a modern Web site is the proverbial duck in a strong current: serene on the surface, paddling furiously underneath.
The fourteen rules are lost down here, beneath the Web layer. In these murky depths, far from the flash and glamor, poorly written parsing functions, database tables without indices, and badly designed internal networks can all wreak havoc on a site that has taken all fourteen rules to heart.
When content that is not directly controlled and managed by the Web site is added to this boiling stew, another layer of complexity and performance challenge appears. Third parties, CDNs, advertisers, and helper applications all come from external sources, and each is relied upon not only to have taken the fourteen rules to heart, but also to have considered how its data is created, presented, and delivered to the visitors of the Web site that appears to contain it.
Remember the Complexity
High Performance Web Sites is a slim volume (a pamphlet, really) that delivers a simple message: there is something that can be done to improve the performance of a Web site. Souders’ fourteen rules capture the items that can be changed quickly and at low cost.
However, if you ask Steve Souders whether this is all you need to do to have a fast, efficient, and reliable Web site, he would likely say no. The fourteen rules are an excellent start, as they treat a great deal of the visible disease that infects so many Web sites.
But like the triathlete with an undiagnosed brain tumor, a site that looks healthy can hide serious problems below the surface, and those problems must be addressed in order to deliver Web performance improvements that everyone can see and to support rapid, scalable growth.
This is a book that must be read. Then deeper questions must be asked, to ensure that the performance of the 90% of a Web site’s design not seen by visitors matches that of the 10% that is.