Today I ran across a blog entry by Marcelo Calbucci called "Web Developers: Speed up your pages!".
It’s a typical example of a good idea with bad execution. Most of the points he mentions are really bad practice.
He suggests reducing traffic (and therefore loading time) by removing whitespace from the source code, writing all code in lower case (for better compression?!?), shaving off bytes by writing invalid XHTML, and keeping JavaScript function and variable names short. This is nit-picking, and it results in a maintenance nightmare.
For big sites, e.g. Google, the whitespace reduction tricks make sense, because they serve enormous numbers of page impressions. For smaller sites, saving 200 bytes at most by stripping whitespace is nearly worthless and not worth the trouble. Additionally, I bet that Google does not maintain that page by hand, but has created some kind of conversion script.
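Just to illustrate what I mean by a conversion script, here is a minimal Python sketch of a build step that strips whitespace from the published copy while the readable, indented source stays under version control. The file names and the exact regular expressions are my own assumptions for illustration, not anything from the article:

    # Minimal sketch of a build-time "conversion script": edit the readable
    # source, publish only the whitespace-stripped copy.
    # File names are hypothetical.
    import re

    def strip_whitespace(html):
        # Collapse whitespace between tags and runs of spaces/tabs.
        # A real script would have to leave <pre> and <textarea> blocks alone;
        # this sketch ignores that complication.
        html = re.sub(r">\s+<", "><", html)
        html = re.sub(r"[ \t]{2,}", " ", html)
        return html.strip()

    if __name__ == "__main__":
        with open("page.src.html", encoding="utf-8") as src:
            original = src.read()
        with open("page.html", "w", encoding="utf-8") as out:
            out.write(strip_whitespace(original))

The point is that nobody should hand-minify markup; if the few hundred bytes really matter, a step like this runs automatically at publish time and the maintained source stays readable.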
Other thoughts are quite nice, but commonplace. Most of the comments posted under that article (e.g. by Sarah) reflect my opinion quite well and deal with each point in more detail.
For most dynamic pages the bottleneck in responding to a client request is the script loading (or running) time. I suggest the author read up on server-side caching (my thesis also deals with that topic) and optimization.
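To make the caching point concrete, here is a rough Python sketch of the idea: cache the rendered output for a short time so repeated requests skip the expensive script entirely. The render_page function, the TTL value, and the request handling are hypothetical placeholders, not a description of any particular framework:

    # Rough sketch of server-side output caching for a dynamic page.
    import time

    _cache = {}          # maps path -> (expiry_timestamp, rendered_html)
    CACHE_TTL = 60       # seconds to keep serving the cached copy

    def render_page(path):
        # Stand-in for the expensive dynamic rendering (DB queries, templates, ...).
        return "<html>...rendered output for %s...</html>" % path

    def handle_request(path):
        now = time.time()
        entry = _cache.get(path)
        if entry and entry[0] > now:
            return entry[1]                      # cache hit: no script work at all
        html = render_page(path)                 # cache miss: do the real work once
        _cache[path] = (now + CACHE_TTL, html)
        return html

Saving the whole rendering step for most requests easily outweighs shaving a couple of hundred bytes off the response.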
The latency between client and server can also be responsible for considerable delays. Since the client has to parse the HTML file before it knows which files to load next, these round trips can add up.
All in all, it’s a good idea to care about the loading time of a page. But you have to look in the right place.