When Ajax began to rise, there was quite a movement towards the Prototype JavaScript library, pushed along by the great Ruby on Rails. Then came the visual effects of script.aculo.us. They look great, they really do. But at what price? Many kilobytes of code.
alex@www:~/scriptaculous/$ du *.js -ch --apparent-size
23K controls.js
18K dragdrop.js
21K effects.js
28K prototype.js
899 scriptaculous.js
12K unittest.js
8.7K util.js
109K total
This is unacceptable for just a library. Most of these kilobytes have to be downloaded yet provide no functionality per se. Broadband is not an argument here: whether or not to load 100 KB is still relevant.
I therefore really like SACK of AJAX. It weighs in at only about 4 KB:
alex@www:~/sack/$ du *.js -ch --apparent-size
3.9K tw-sack.js
3.9K total
Now this does not give us all the script.aculo.us goodies. For that case I suggest reusing just the relevant parts of it: let the user download only what you really use. Maybe one day we will see a reduced script.aculo.us, or an alternative built on SACK.
UPDATE: I now recommend using prototype.js again, in a reduced version just for AJAX.
I agree that broadband and HTTP compression allow larger amounts of data to be transferred to the user in less time. But size does not become irrelevant. I just wanted to show how quickly a hundred kilobytes can accumulate without any functionality added.
Additionally, I'm not so sure how much memory the JavaScript consumes once it is parsed. This still needs some testing.
I'm just trying to alert developers that they should rethink the way they structure their applications. Don't load everything at once; spread the loading over time, and present the user with the initial view as quickly as possible. A minimal sketch of on-demand loading follows below.
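As an illustration (a minimal sketch of my own, not code from any of the libraries above; effects.js is a hypothetical file name), a page can inject a script tag on demand so the initial load stays small:

// Minimal sketch: load a script on demand instead of up front.
// 'effects.js' is a hypothetical file that only some views need.
function loadScript(url, callback) {
  var done = false;
  var script = document.createElement('script');
  script.type = 'text/javascript';
  script.src = url;
  // Fire the callback once, whichever event the browser supports.
  script.onload = script.onreadystatechange = function () {
    if (!done && (!this.readyState ||
        this.readyState == 'loaded' || this.readyState == 'complete')) {
      done = true;
      callback();
    }
  };
  document.getElementsByTagName('head')[0].appendChild(script);
}

// Fetch the visual effects only when the user opens the fancy view.
loadScript('effects.js', function () {
  // effects code is available from here on
});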
I think library size *is* an issue to some degree. Of course it varies with the situation.
True, bandwidth is increasing and so on. But what is also increasing is the amount of stuff we pump through these bandwidth pipes. I think *perceived* speed stays roughly constant.
A good analogy is the hard disk. There was a time when 20 MB was big, and we managed to fill them. Now a few hundred GB is big (a factor of about 10^4), and we are finding ways to fill those too. Perceived storage space stays roughly constant.
One of the ideas AJAX tries to bring us is not to waste so much of the user's time on page requests. This is why I think it is important not to build bloated AJAX apps.
Once the JS library is loaded, it doesn't need to be loaded again until the page is refreshed. But bloat is not only about download time; it's also about memory consumption and responsiveness.
I hope we don't see too much AJAX being used simply for eye candy (animations and so on) just for the sake of it. That happened with Flash.
So I’m incredibly biased in this discussion (I work on Dojo), but good tools help a lot.
Firstly, a “compressor” (like the one we build Dojo with) reduces the size of the code significantly. Running the variant of Prototype included in the scriptaculous 1.5 beta through the Dojo compressor took 33% off the file size without changing the file’s semantics at all.
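To illustrate what such a compressor does (a toy example of my own, not Dojo's actual output): it strips comments and whitespace and shortens local names while leaving the semantics alone:

// Before compression: readable source.
function distance(pointA, pointB) {
  // Euclidean distance between two points
  var deltaX = pointB.x - pointA.x;
  var deltaY = pointB.y - pointA.y;
  return Math.sqrt(deltaX * deltaX + deltaY * deltaY);
}

// After compression: same semantics, far fewer bytes.
function distance(a,b){var x=b.x-a.x,y=b.y-a.y;return Math.sqrt(x*x+y*y);}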
Next, a good build system will put all your scripts into a single file. Say what you will about file size, but the limiting factor in perceived performance today is network latency combined with the synchronous nature of browser rendering engines, not network throughput. The less you have to reach out and touch the network (even for a 304), the better off you are. On top of that, something like Dojo or JSAN lets you grab just what you need, reducing the overall size of a library based on your usage of it. A capable package system even lets you do this without changing the way you develop and without cutting you off from the rest of the available APIs, should you need them.
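For example (a sketch assuming Dojo 0.x; the module names are illustrative and may differ between versions), the package system lets a page declare just what it uses:

<script type="text/javascript" src="dojo.js"></script>
<script type="text/javascript">
  // Declare only the packages this page actually uses; a build can
  // later bake exactly these into one compressed file.
  dojo.require("dojo.io.*");      // AJAX transport
  dojo.require("dojo.event.*");   // event system
</script>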
Lastly, there's much to be said for gzip encoding on the wire and a good HTTP server configuration. The best kind of data is the kind you don't have to (re)send, and the next best thing is the gzipped kind.
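For instance (assuming an Apache 2 server with mod_deflate enabled; adjust for your own setup), compressing JavaScript on the wire can be a one-line change:

# Apache 2 / mod_deflate: gzip text content, including JavaScript
AddOutputFilterByType DEFLATE text/html text/css application/x-javascript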
So yes, large libraries are a problem, but developers need some of the capabilities they provide. The best libraries, though, should make you pay only for what you use. Hopefully Dojo and JSAN will make this the de facto way of doing things.
Regards
I second everything Alex just said. I just wanted to add that the download time (and parsing) is often insignificant compared to the time spent manipulating the DOM. As a reminder I point you to Peter-Paul Koch's “Benchmark – W3C DOM vs. innerHTML” article at http://www.quirksmode.org/dom/innerhtml.html (and that only tests very basic DOM manipulations).
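To make the difference concrete (a minimal sketch of my own, not code from the article), here are the two styles side by side for inserting a thousand list items:

// W3C DOM style: create and append one node at a time.
var list = document.createElement('ul');
for (var i = 0; i < 1000; i++) {
  var item = document.createElement('li');
  item.appendChild(document.createTextNode('item ' + i));
  list.appendChild(item);
}
document.body.appendChild(list);

// innerHTML style: build one string and let the parser do the work.
var html = '<ul>';
for (var j = 0; j < 1000; j++) {
  html += '<li>item ' + j + '</li>';
}
html += '</ul>';
document.body.innerHTML += html;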
Hey, glad you like SACK. Yes, one advantage of it is that it is small, but as discussed here that is sometimes not an issue. It certainly isn't in really complex applications, where the library is only a small fraction of the overall code. However, as mentioned above, really large libraries can slow a browser down.
The main reason I wrote SACK, however, was that I wanted something that allowed me (and others) to use AJAX quickly, simply, and without a lot of code that isn't needed for simple communication. Basically: simplicity.
As said, it's not quite the same as something like Prototype (or Dojo!), but for many people who want to create AJAX apps it is nice, neat, and simple enough to satisfy their needs.
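For comparison (a generic sketch of the raw API circa 2005, not SACK's own code), this is roughly the XMLHttpRequest boilerplate that a wrapper like SACK hides:

// The raw XMLHttpRequest dance that AJAX wrappers take care of.
function getXmlHttp() {
  if (window.XMLHttpRequest) {
    return new XMLHttpRequest();                  // most browsers
  }
  return new ActiveXObject('Microsoft.XMLHTTP');  // older IE
}

var xhr = getXmlHttp();
xhr.onreadystatechange = function () {
  if (xhr.readyState == 4 && xhr.status == 200) {
    alert(xhr.responseText);  // do something with the response
  }
};
xhr.open('GET', 'data.php', true);  // 'data.php' is a hypothetical URL
xhr.send(null);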