Ajax and Rich Internet Application FAQ

Shall I use JSON or XML as AJAX transport format?

There is actually no reason to use XML instead of JSON if you don’t plan to use XSLT. JSON is part of JavaScript, which has been around since the dark ages of the web. Also, there are very fast server-side JSON implementations, and JSON is the more compact format, thus saving bandwidth.
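To make the compactness point concrete, here is the same record in both formats (a small sketch; JSON.parse is the ECMAScript 5 built-in, older browsers need a shim like json2.js):

```javascript
// The same record as JSON and as XML.
var json = '{"user":{"id":42,"name":"Ada"}}';
var xml  = '<user><id>42</id><name>Ada</name></user>';

// JSON parses straight into a JavaScript object, no DOM traversal needed.
var data = JSON.parse(json);
data.user.name;           // "Ada"

// And it is the more compact wire format.
json.length < xml.length; // true
```

With XML you would first have to parse the string into a DOM and then walk it with getElementsByTagName() or XPath to get at the same value.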

Isn’t XSLT much faster than to use JSON and modify the DOM with JavaScript?

First, it is highly unlikely that more browsers understand XSLT than understand {a: ‘b’}. In fact, with JSON you are on the safe side. And for a client-side templating engine like EJS, performance is not an issue.

Processed EJS templates simply concatenate strings. They do not use eval() or an extra processing step at render time, which is what typically makes templates run slow.
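To illustrate the string-concatenation point, here is roughly what an EJS-style engine compiles a template like `<p>Hello, <%= name %>!</p>` into (a hand-written sketch, not EJS’s actual output):

```javascript
// Compiled form of "<p>Hello, <%= name %>!</p>":
// rendering only pushes strings into a buffer and joins them.
function renderGreeting(data) {
  var buf = [];
  buf.push('<p>Hello, ');
  buf.push(data.name);
  buf.push('!</p>');
  return buf.join('');
}

renderGreeting({ name: 'Ada' }); // "<p>Hello, Ada!</p>"
```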

Also, XSLT is loaded from a separate file, which costs the client an extra request, while EJS templates can be packaged with the JavaScript payload.

Is it better to render the HTML on the client or on the server side?

There is a general decision you need to be aware of: are you building a Rich Internet Application, or do you want to beautify some HTML output with additional (JavaScript) effects? Is the site required to be usable without JS? If you can rely on JS and it’s a RIA, then I would suggest using a client-side MVC framework like JavaScriptMVC, which comes with client-side controllers, a template engine and model classes. Your server will then mainly provide JSON API functions and the basic page grid, and not deliver complete HTML pages.

Lots of logic (and therefore code) will move to the browser/JavaScript: in effect, you will need more JavaScript developers and fewer server-side developers for your projects. On the other hand, you will see a gain in consistency and also performance, as you have far less client/server communication once the JS code is loaded (which can be pretty fast, thanks to compression). Many popular Web applications use this approach.
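As a minimal sketch of that division of labor (the URL and function names are made up): the server answers a JSON API call, and a client-side view function turns the data into markup.

```javascript
// The server only delivers data, e.g.
//   GET /tasks.json  ->  [{"id": 1, "title": "Write FAQ"}]
// A client-side view function then renders it into the page grid:
function renderTasks(tasks) {
  var html = '<ul>';
  for (var i = 0; i < tasks.length; i++) {
    html += '<li id="task_' + tasks[i].id + '">' + tasks[i].title + '</li>';
  }
  return html + '</ul>';
}
```

In a real application, the returned string would be inserted with innerHTML once the Ajax response arrives; in a framework like JavaScriptMVC, a template engine plays the role of renderTasks().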

Which JavaScript library shall I use (Dojo, jQuery,…)?

The choice of JavaScript DOM/Ajax library is IMO not so important, as long as it doesn’t leak memory, has a small code size and offers fast DOM queries. See

http://www.domassistant.com/slickspeed/

The real problem you will have is that normal event binding causes memory leaks (circular references that the garbage collectors of most browsers can’t handle), and that you need to loop through all elements and rebind everything after changing the DOM (for example after AJAX requests that modify the HTML). So, at the end of the day, you will want to use event delegation, which comes for free with the controllers of JavaScriptMVC.
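The idea behind event delegation is simple: bind one handler to a container and decide per event which descendant matched, instead of binding a handler to every element. A minimal hand-rolled sketch of what JavaScriptMVC’s controllers do for you (IE 6–8 would need attachEvent and window.event instead of addEventListener, omitted here for brevity):

```javascript
// One listener on the container instead of one per row; rows added
// later via innerHTML are covered automatically, and no per-element
// handler references are left around to leak.
function delegate(container, className, handler) {
  container.addEventListener('click', function (e) {
    var el = e.target;
    // Walk up from the event target to the container.
    while (el && el !== container) {
      if (hasClass(el, className)) {
        handler.call(el, e);
        return;
      }
      el = el.parentNode;
    }
  }, false);
}

function hasClass(el, name) {
  return (' ' + el.className + ' ').indexOf(' ' + name + ' ') !== -1;
}
```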

I know there is JavaScript “integration” in my favorite Java/PHP/Python/Perl development framework. Am I ready for building a Rich Internet Application now?

Server-side generated JS, like what you will get from any framework that offers “JavaScript support” (I’m not talking about JSON/REST here), can be a pain: almost certainly you will have a very limited feature set, and possibly problems with deployment and testing, because the common server-side frameworks can’t really test JavaScript code, nor is there a smart way to optimize or structure the generated code. Most of the approaches I have seen just generate some inline JavaScript, which is good for enhancing forms a bit with new input elements/validators, or for some nice visual effects, but that’s it. If something like this needs to be customized a lot and grows big, you might be in trouble. Again, the question is “JavaScript-enhanced Web site” or “Rich Internet Application”? Generating JavaScript with a server-side language/framework has its limits and very often becomes inconsistent if you use hand-written JavaScript code at the same time.

What features should a JavaScript framework for Rich Internet Applications offer?

I would argue that a good JavaScript framework allows an average developer to produce effective and structured code. jQuery and the like, however, are just browser abstraction layers (like Zend_Db is an abstraction layer for databases). Raw jQuery will almost certainly produce the same mess as raw PHP or any other server-side language that does not imply a clean application structure (MVC). You end up reinventing the wheel, and you will have different code doing similar things. Or it will just be plain slow, because without dispatching events (event delegation) you end up looping over elements to attach events and effects, which simply gets slow for many elements.

Also, you will lose the overview, and inline JavaScript in particular can’t be compressed, which means you transfer the same source code with every request. That said, I vote for separating server-side code, JavaScript and HTML, with clearly defined interfaces. A sound architecture is a great help for all developers as soon as there are more lines of code than fit on one screen. You don’t want to start guessing whether the code you are looking for is hidden in a JavaScript file, a template, a view helper or some other file.

Which browsers are commonly supported for Rich Internet Applications?

Compatibility is not a big issue anymore if you are fine with supporting IE 6+, Safari 3+ (incl. iPhone), Firefox 2+ and Opera 9.5+. IE < 6 is very hard to support (I would simply refuse to do so). IE 6 is slow at JavaScript, but the real pain is its lack of good CSS support.

I want to use W3C standards only (W3C DOM, XML and semantic XHTML) for my Rich Internet Application. JSON, innerHTML and DIV elements are evil, right?

No. If you take a closer look, you will notice that semantic HTML (for example, using the table element to render a table instead of DIVs) or avoiding innerHTML sometimes comes with very bad performance for DOM manipulations, which can render the complete application useless. innerHTML very often delivers the best performance for inserting content into the DOM. So don’t commit to certain implementation details upfront, especially if you plan to support IE 6, which simply is an old browser and needs some special treatment. JavaScript forces you to be very pragmatic sometimes.

See http://www.quirksmode.org/dom/innerhtml.html
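A common pattern that exploits this: build the markup as one string and assign it with a single innerHTML write, instead of creating and appending nodes one by one. A sketch with a made-up helper name:

```javascript
// Concatenate the whole fragment first...
function buildRows(items) {
  var html = [];
  for (var i = 0; i < items.length; i++) {
    html.push('<tr><td>' + items[i] + '</td></tr>');
  }
  return html.join('');
}

// ...then touch the DOM exactly once (browser-only code):
// document.getElementById('list').innerHTML =
//     '<table><tbody>' + buildRows(data) + '</tbody></table>';
```

One innerHTML assignment lets the browser parse and lay out the fragment in a single pass, instead of re-rendering after every createElement()/appendChild() call.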

JSON is a standard, because its syntax is a subset of ECMAScript. See JSON vs. XML.

Using parallel requests to get data from the server seems to be a good idea to speed up performance. Is there any downside?

Depends. A major issue with modern Web applications is that the initial load causes many parallel requests: you only deliver the page grid, and the client-side JavaScript then renders all the other elements from client-side templates and loads the corresponding data (if needed) via the mentioned JSON APIs. Comet adds a constant overhead on top. Parallel requests frequently cause problems, because some server-side session handler implementations lock the session (other requests then have to wait until the session is accessible again). The many requests may also consume lots of memory and server processes.
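One client-side mitigation, sketched here with made-up names, is to funnel the non-critical requests through a small queue, so that only one of them hits a session-locking backend at a time:

```javascript
// Minimal request queue: runs one job at a time. Each job receives
// a done() callback and must call it when its request has finished.
function Queue() {
  this.jobs = [];
  this.busy = false;
}

Queue.prototype.add = function (job) {
  this.jobs.push(job);
  this.next();
};

Queue.prototype.next = function () {
  if (this.busy || this.jobs.length === 0) return;
  this.busy = true;
  var self = this;
  this.jobs.shift()(function done() {
    self.busy = false;
    self.next(); // start the next queued request, if any
  });
};
```

An Ajax call would be wrapped as `queue.add(function (done) { /* fire request, call done() in the response handler */ });`, trading some parallelism for predictable server load.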

I want to thank Justin Meyer (maintainer of JavaScriptMVC) for all the interesting discussions and his input on the JSON vs. XSLT debate.