So I’m incredibly biased in this discussion (I work on Dojo), but good tools help a lot.

Firstly, a “compressor” (like the one we build Dojo with) reduces the size of the code significantly. Running the variant of Prototype included in the scriptaculous 1.5 beta through the Dojo compressor took 33% off the file size without changing the file’s semantics at all.
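For the curious, the effect is roughly this (an illustrative sketch, not actual Dojo compressor output): comments and whitespace disappear and local variable names get shortened, but the public API and behavior stay identical.

    // before: readable source
    function calculateTotal(priceList) {
        // sum up every price in the list
        var runningTotal = 0;
        for (var index = 0; index < priceList.length; index++) {
            runningTotal += priceList[index];
        }
        return runningTotal;
    }

    // after: same semantics, fewer bytes
    function calculateTotal(_a){var _b=0;for(var _c=0;_c<_a.length;_c++){_b+=_a[_c];}return _b;}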

Next, a good build system will put all your scripts into a single file. Say what you will about file size, but the limiting factor in perceived performance today is network latency combined with the synchronous nature of browser rendering engines, not network throughput. The less you have to reach-out-and-touch the network (even for a 304), the better off you are. On top of that, something like Dojo or JSAN allows you to grab just what you need, reducing the overall size of a library based on your usage of it. A capable package system even lets you do this without changing the way you develop and without cutting you off from the rest of the available APIs, should you need them.
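In Dojo's case, development code looks something like the following (module names here are illustrative, from memory); a build then rolls just these modules and their dependencies into a single file, and anything you never asked for simply isn't shipped:

    // development: pull in only the packages you actually use
    dojo.require("dojo.event.*");  // event system
    dojo.require("dojo.io.*");     // async I/O / XMLHTTP transport

    // the rest of the tree is still available if you need it later,
    // but the deployment build only includes what you required.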

Lastly, there’s much to be said for gzip encoding on the wire and a good HTTP server configuration. The best kind of data is the kind you don’t have to (re)send, and the next best thing is the gzipped kind.
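On Apache 2 with mod_deflate and mod_expires loaded, the relevant bits of configuration look roughly like this (a sketch; adjust MIME types and cache lifetimes to taste):

    # compress text responses on the way out
    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript

    # let clients cache scripts so they don't have to ask again (not even a 304 round-trip)
    ExpiresActive On
    ExpiresByType application/x-javascript "access plus 1 month"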

So yes, large libraries are a problem, but developers need some of the capabilities they provide. The best libraries, though, should let you pay only for what you use. Hopefully Dojo and JSAN will make this the de facto way of doing things.

Regards