Title: Page 116 – Alex Kirk

---

 * 
   ## [Announcing Wizlite: Collaborative Page Highlighting](https://alex.kirk.at/2006/01/16/announcing-wizlite-collaborative-page-highlighting/)
   
 * January 16, 2006
 * So I’m [not the first](http://ajaxian.com/archives/announcing-wizlite-collaborative-page-highlighting)
   to write about my project? Well. Nice ;)
 * [Wizlite](http://wizlite.com/) takes the good old highlighting marker from paper
   to the web. People get different colors and can mark important sections on any web page.
 * Users can create groups and wizlite away on a certain topic (either private or
   public).
 * You’d have to use it to experience how fun it is ;) So: [http://wizlite.com/](http://wizlite.com/)
 * wizlite, collaborative, highlight, groups
 * [Web](https://alex.kirk.at/category/web/), [wizlite](https://alex.kirk.at/category/projects/wizlite/)
 * 
   ## [Speed up your page, but how?](https://alex.kirk.at/2006/01/03/49/)
   
 * January 3, 2006
 * Today I ran across the blog entry by Marcelo Calbucci, called "[Web Developers: Speed up your pages!](http://bravenewword.typepad.com/brave_new_word/2005/11/web_developers_.html)".
 * It’s a typical example of a good idea with bad execution. Most of the points he
   mentions are really bad practice.
 * He suggests reducing traffic (and therefore loading time) by removing whitespace
   from the source code, writing all code in lower case (for better compression?!?),
   shrinking the markup by writing invalid XHTML, and keeping JavaScript function
   and variable names short. This is nit-picking, and it results in a maintenance nightmare.
 * For big sites, e.g. Google, the whitespace reduction tricks make sense, because
   they serve enormous numbers of page impressions. For smaller sites, saving 200
   bytes at most by stripping whitespace is nearly worthless, and not worth the
   trouble. Besides, I bet that Google does not maintain that page in its stripped
   form, but has created some kind of conversion script.
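A quick sketch of why whitespace stripping buys so little once the server compresses its responses (the sample markup and the stripping rule are made up for illustration):

```python
import gzip

# A small, made-up HTML snippet with indentation and blank lines.
pretty = b"""
<html>
  <head>
    <title>Example</title>
  </head>
  <body>
    <p>Hello, world!</p>
    <p>Another paragraph.</p>
  </body>
</html>
"""

# The same markup with all leading/trailing whitespace and newlines removed.
minified = b"".join(line.strip() for line in pretty.splitlines())

raw_saving = len(pretty) - len(minified)
gz_saving = len(gzip.compress(pretty)) - len(gzip.compress(minified))

# gzip encodes the repetitive whitespace very cheaply, so the gap
# between the two compressed versions is much smaller than the raw gap.
print(raw_saving, gz_saving)
```

The few dozen raw bytes saved shrink further after compression, while the readable version stays maintainable.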
 * Other thoughts are quite nice but commonplace. Most of the comments posted on
   that article (e.g. by Sarah) reflect my opinion quite well and deal with each
   point in more detail.
 * For most dynamic pages, the bottleneck in responding to a client request is the
   script loading (or execution) time. I suggest the author read some articles
   about server-side caching ([my thesis](https://alex.kirk.at/papers/caching-strategies/diploma_thesis.html)
   also deals with that topic) and optimization.
 * The latency between client and server is often also responsible for considerable
   delays: since the client has to parse the HTML file before it knows which files
   to load next, the round trips add up.
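The summing-up of round trips can be illustrated with made-up numbers: each file that is only discovered by parsing a previously downloaded one adds a full round trip that cannot start any earlier.

```python
# Assumed round-trip time in seconds (made-up figure).
rtt = 0.08

# page.html references style.css, which in turn references bg.png:
# each level of the chain is discovered only after parsing the
# previous download, so its round trip serializes behind it.
discovery_chain = ["page.html", "style.css", "bg.png"]

sequential = rtt * len(discovery_chain)  # chained discoveries add up
parallel = rtt                           # if everything were known upfront

print(f"{sequential:.2f}s vs {parallel:.2f}s")
```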
 * All in all, it’s a good idea to care about the loading time of a page. But you
   have to look in the right place.
 * web, speed, xhtml, script, caching
 * [Code](https://alex.kirk.at/category/code/)

 [Previous Page](https://alex.kirk.at/page/115/?output_format=md&term_id=1122) [Next Page](https://alex.kirk.at/page/117/?output_format=md&term_id=1122)