Thanks to Jesse, I decided to read up on Google's Ajax Libraries API, a "content distribution network and loading architecture for the most popular open source JavaScript libraries" (their words). In simpler terms, Google will host some of the most common JavaScript libraries for you, such as jQuery, Prototype, script.aculo.us, and MooTools (my words).

The immediate benefit is not having to host it myself. That's cool. I don't use a lot of JavaScript, but as you probably know, I'm a fan of offloading things to other people when it saves me the hassle of doing them myself - FeedBurner (now another Google property) is one of the most famous examples of this. But I'm not sure the same logic applies to JavaScript. First and foremost, because I use so little of it, I'm not sure it really matters - only 30K or so on my individual archive pages (such as this one). Even if every visitor to the site downloaded the scripts fresh instead of caching them, and every page on the site included them - a virtual impossibility, since not every page uses them - we'd only be talking about 1GB per month or so. That's hardly worth the effort for me, though it may be for you. But what about the user experience? Maybe that is worth it.
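That 1GB figure is easy to sanity-check. Here's a quick sketch - the 30K script size is from above, but the monthly page-view count is a made-up round number purely for illustration:

```javascript
// Back-of-the-envelope bandwidth for self-hosting ~30K of JavaScript.
// The view count is hypothetical; the 30K figure is from the post.
function monthlyBandwidthMB(scriptKB, uncachedViewsPerMonth) {
  return (scriptKB * uncachedViewsPerMonth) / 1024; // KB -> MB
}

// Around 35,000 uncached views of a 30K script works out to roughly
// 1GB per month - hardly worth optimizing away on a small site.
console.log(monthlyBandwidthMB(30, 35000).toFixed(1) + ' MB');
```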

According to Ajaxian, there are two ways to invoke the files. You can either call a file directly through a standard SCRIPT tag, as most people do, or load it via the google.load function. I'll take a look at both. First, the standard call, using MooTools (since that's what I use here). For the compressed version, it should look something like this:

<script src="http://ajax.googleapis.com/ajax/libs/mootools/1.11/mootools-yui-compressed.js">
</script>

Unfortunately, though it seemed to load correctly, I couldn't get it to work with my installed version of Slimbox. I couldn't get the uncompressed version to work either - both kept throwing errors. So I decided to try the google.load method instead. To use it, you first call the Google jsapi library, then the google.load function with the appropriate parameters. For MooTools, it looks like this:

<script src="http://www.google.com/jsapi"></script>
<script>
 // Load MooTools
 google.load('mootools', '1.11');
</script>
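One caveat worth noting here - this is my own sketch based on the jsapi loader's documented behavior, not something from the original post: the library isn't necessarily available the instant google.load returns, so code that uses it right away should be deferred with the loader's google.setOnLoadCallback. Wrapping the calls in a small helper also makes the dependency explicit and testable:

```javascript
// Defer library-dependent code until the loader says it's ready.
// 'g' is the global object provided by http://www.google.com/jsapi.
function loadMooTools(g, onReady) {
  g.load('mootools', '1.11');   // queue the library for loading
  g.setOnLoadCallback(onReady); // fires once it has actually loaded
}

// In the page: loadMooTools(google, function () { /* use MooTools here */ });
```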

The advantage to this method is twofold: first, it worked, and second, if you decide to change libraries, all you have to do is change the parameters - you don't have to go hunting down a new URL.

As to load times, you're actually loading two scripts - the jsapi script and the MooTools script (the compressed version from above) - so that's two connections rather than one. In my testing, the load time was actually 2-3 times slower than loading from my own shared server, even though it was coming from Google and my copy of the compressed file is a couple of K larger than Google's. This held even after the file had loaded and (presumably) been cached. So for me, since the bandwidth doesn't matter much anyway, this isn't going to work.
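For anyone who wants to reproduce the comparison in-page rather than through a browser tool, here's a rough sketch. It's an assumption-laden approach (the script.onload handler doesn't fire in older IE, which needs onreadystatechange instead), but it's enough for a sanity check:

```javascript
// Crude in-page timing of a single script download.
// Less reliable than Firebug's Net panel, but good for a rough check.
function timeScriptLoad(url, report) {
  var start = new Date().getTime();
  var script = document.createElement('script');
  script.src = url;
  script.onload = function () {
    report(new Date().getTime() - start); // elapsed milliseconds
  };
  document.getElementsByTagName('head')[0].appendChild(script);
}

// timeScriptLoad(
//   'http://ajax.googleapis.com/ajax/libs/mootools/1.11/mootools-yui-compressed.js',
//   function (ms) { alert(ms + 'ms'); });
```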

One other feature that would be nice is the ability to pull the latest stable version of a library. Obviously this could cause some problems when the latest version breaks compatibility - but there are people who like to live on the edge. For them, it would be nice to pass a parameter like 'stable' into the google.load function and get whatever the latest version is, so that you don't have to keep a particular version number in mind.

If you need to find a version number or a URL, Google provides a helpful cross-reference of the various options on offer.

As for the compatibility problem when loading directly? I have no idea. The compressed file seemed to load fine when called directly, yet the same file worked when called via google.load - so I'm not sure. My first guess would have been a missing module, but I didn't care enough to chase it down, so I just went back to my own local file, and it seems to be running just fine.

Comments (10)

2-3 seconds slower. That's a lot for the user. I would think the extra DNS lookup also adds to that. The jsapi file is pretty large considering it just loads the js files.

How did you test load times? I imagine the main advantage of using Google's version of the libs is for people who are a distance (as in hops) away from your server.

Hi Mike -

Simply by loading the page a few times and watching the load times through the Firebug plugin, on a few different connections (different as in not all on my own home network). Not terribly scientific, but it does return some information - better than nothing.

Nice article. As for your suggestion of being able to pull the latest version of an API: you should be able to omit the revision number when you load it. In a version string like "2.x", '2' is the version number and 'x' is the revision number. If you leave out the revision number, Google will return the latest stable revision of that version.

ex. google.load("maps", "2");    // returns the latest revision
ex. google.load("maps", "2.14"); // returns revision 2.14

http://code.google.com/apis/ajax/documentation/#Versioning

New versions (not to be confused with revisions) are often not backwards compatible... so you wouldn't want your script to always load the newest version. For example, the new version 3 of the Maps API is not guaranteed to be backwards compatible. But all revisions should be backwards compatible. So by using the method above (leaving out the revision number), those of us on the edge can sleep soundly knowing we are using the latest code supported by our applications.

And as far as load times go... any content delivery network (such as Google's) should be much faster than loading files from a single server. With the CDN, chances are the entire request (including DNS lookup) didn't even make it out of your city. CDNs usually house their servers in close proximity to local ISPs, and ISPs usually cache DNS info. So the request leaves your house and reaches your ISP. If the DNS is cached, then the request gets passed on to a server that is *most likely* right up the street. The CDN server then delivers a cached file back down the same path. I have a really hard time believing that the CDN was slower than your server... but I am always game for being proven wrong. Check out this article first:

http://www.stevesouders.com/blog/2008/05/27/google-ajax-libraries-api/trackback/

Hi Ryan -

Thanks for the extra information. I haven't updated this (or anything) for a while, and it's certainly possible that Google's CDN has been built out and changed this.

It's also possible that my custom mootools build was a part of the issue, as well as my PC. But extra information never hurts, so thanks again!

Great article - one small point, though. In the second paragraph, you said:

"I'm not sure if it really matters. ... we'd only be talking about 1GB per month or so. That's hardly worth the effort for me - though it may be for you."

The way I see it, the object of this CDN (or I suppose any CDN for that matter) is not just about the convenience to the web developer -- it's for the sake of the users, too.

For example, if a user were to visit this site, then follow a link to my site (or vice versa) and we're both using, say, jQuery v.1.3.2 served via Google's CDN, then the user will already have this file cached, saving him/her a 30kb download. This, in my opinion, is the real benefit of using this service.

When you think about it, it's really quite silly for every website to have to have an individual copy of jQuery on their server, don't you think??!!

It's this concept that gave rise to the popularity of the web application - the "one to many" approach, as opposed to the "many to many" approach of desktop applications, where virtually every user requires his/her own version of the app.

Hi Chad,

Thanks for the useful info. We like to keep our websites fast-loading, so based on your experience I think we'll keep including jQuery on our own site rather than loading it through the Google Ajax Libraries.

Peter

Hi Jordan -

That's a good point, thanks for making it. Sometimes I need to be reminded that I'm not alone in the world. :)

This is EXACTLY what we need. Force users all over the world to give additional data to Google through the use of these downloaded APIs, and then Google can use it to profile users and sell everyone's private data to businesses. Genius.

@Mark - there is no "force" involved here. This is just a suggestion for making the web a faster and better place... which is what Google is all about. You don't HAVE to use it. Besides, any information Google would gather from their API services would be mostly anonymous and statistical. You are worse off using Facebook or LinkedIn (and I am sure you have an account with both) than you are using Google's APIs.

I host my personal and company email with Google, as well as the email for all of my clients. I use Google Checkout... meaning my credit card information is stored with Google. I use FeedBurner (owned by Google), Calendar, Docs, and many more... and Facebook is still the one that pisses me off day-to-day with ads using my personal information. Never have I seen an ad in Google that was like "Hey, you're single and 27 living in Nashville, CLICK HERE".

So chill out ;) Google's not out to get you.
