I’d like to introduce you to this great book by Steve Souders. There have already been several reports about it on the Internet, for example on the Yahoo Developers Blog. There is also a video of Steve Souders talking about the book.
The book is structured into 14 rules, which, when applied properly, can vastly improve the speed of a web site or web application.
Alongside the book he also introduced YSlow, an extension for Firebug, which is itself a Firefox extension. YSlow shows developers how well their site complies with the rules Steve has set up.
I had the honour to do the technical review on this book, and I love it. Apart from some standard techniques (for example employing HTTP headers like Expires or Last-Modified/ETag), Steve certainly has some tricks up his sleeve.
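To give a rough idea of what those headers do, here is a minimal sketch of my own (Node.js with its built-in http module; not code from the book, and the served resource is a made-up stand-in). It sends a far-future Expires plus Last-Modified/ETag, and answers conditional GETs with 304 Not Modified:

```typescript
import * as http from "http";

const body = "body { margin: 0; }"; // stand-in for a real CSS file
const lastModified = new Date("2007-09-01").toUTCString();
const etag = '"v1-' + body.length + '"'; // any stable fingerprint works

http.createServer((req, res) => {
  // Conditional GET: if the client's cached copy is still valid,
  // answer 304 Not Modified and send no body at all.
  if (
    req.headers["if-none-match"] === etag ||
    req.headers["if-modified-since"] === lastModified
  ) {
    res.writeHead(304);
    res.end();
    return;
  }
  res.writeHead(200, {
    "Content-Type": "text/css",
    "Last-Modified": lastModified,
    "ETag": etag,
    // Far-future Expires: the browser won't ask again until then.
    "Expires": new Date(Date.now() + 365 * 24 * 3600 * 1000).toUTCString(),
  });
  res.end(body);
}).listen(8080);
```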
One such trick: he shows how it is possible to reduce the number of HTTP requests for first-time visitors (by inlining the script sources) while still priming their cache for their next page load (see page 59ff).
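The second half of that trick can be sketched like this (my own rough illustration, using the fetch API, which postdates the book; the file name and cookie are invented). The page was served with the script inlined, so no extra request was needed, and after onload the external copy is requested once purely to warm the cache:

```typescript
window.addEventListener("load", () => {
  // Wait a moment so we don't compete with anything the page is still loading.
  setTimeout(() => {
    // Request the external file only to prime the browser cache; it is
    // not executed here because the inlined copy has already run.
    fetch("/js/main.js");
    // Remember that the cache is primed; on the next page load the
    // server can emit a plain <script src="/js/main.js"> instead.
    document.cookie = "jscached=1; path=/";
  }, 1000);
});
```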
The one small downside of this book is that some rules need to be taken with care when applied to smaller environments; for example, it does not make sense (from a cost-benefit perspective) for everyone to employ a CDN. A book just can’t be perfect for all readers.
If you are interested in web site performance and have a developer background, then buy this book (or read it online). It is certainly something for you.
The book has been published by O’Reilly in September 2007, ISBN 9780596529307.
Some more links on the topic:
- Yahoo!’s Chief Performance Guru Talks about Writing his New Book
- Slides from his presentation at WebEx 2007
- Podcast on YSlow
You sound pretty bitter. Why?
Copy and paste might not be the most efficient way of combining JavaScript files, but it will do the job. The point is that reducing the number of HTTP requests is a pretty good way of speeding up a web page, and by combining the files you bring the number of requests from x down to 1. Usually a pretty good win.
Please remember that these methods are supposed to be the “cutting edge” of optimization. Usually you would do this on high-profile sites and have a deployment process that combines the files in a more efficient way.
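As a trivial sketch of such a deployment step (my own illustration, not from the book; the file names are placeholders), a small Node.js script could do the combining at build time:

```typescript
// Concatenate several script files into one bundle so the page needs a
// single HTTP request instead of one per file. The file names below are
// placeholders for whatever a real project would use.
import * as fs from "fs";

const parts = ["menu.js", "forms.js", "tracking.js"];
const bundle = parts
  .map((f) => `/* --- ${f} --- */\n` + fs.readFileSync(f, "utf8"))
  .join(";\n"); // the extra semicolon guards against files that end without one

fs.writeFileSync("combined.js", bundle);
console.log(`Wrote combined.js from ${parts.length} files`);
```

Real build tooling adds minification and dependency ordering on top, but the effect on the request count is the same.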
I do agree 100% that you can’t regularly deploy your app by copy and paste; nobody working professionally on this will do that. I understood Steve to be referring to a small “standard” web site that someone wants to improve.
In the end the question is rather whether the rules Steve puts out are meant for the average site owner. On that front, I think the bigger problems are improperly compressed (or entirely uncompressed) images.
All in all I agree that Steve does a not-so-good job of explaining to whom this book and his practices actually matter. Bringing a page down from 200 HTTP requests to 100 makes sense. If you only have 20 requests, leave the page alone. If you run Yahoo, you might want to bring it down to 2 requests.
It is the long-standing battle between maintainability and optimization. If you run a huge web site that addresses a huge audience, it might make sense to delve into how to speed the page up to an unreasonable degree :)
The root of the problem does not really lie with Steve. It’s rather the general perception of optimization: there are people who want to optimize the heck out of everything, and those people tend to be very vocal. Still, I think a book like Steve’s is a better way of satisfying them than web sites giving wrong suggestions.
Steve’s book and YSlow are aimed at a small audience, and at those who want to tune everything.