A few rules to keep in mind when using Ajax


Sun 19 February 2006 By nuxeo

In this entry, I try to sum up a few rules that should be kept in mind when
adding AJAX features to a CMS.


keep accessibility




A CMS is used by a wide variety of people, and the WAI guidelines should be
kept in mind: all functions and screens should remain usable even when
JavaScript is not enabled.



This means that AJAX features have to be implemented with techniques like
graceful degradation, to avoid making the CMS dependent on JavaScript
features.
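
As a minimal sketch of graceful degradation (the element ids and markup are
hypothetical), a plain link keeps working when JavaScript is off, while a
small script upgrades it to an asynchronous call:

    // Without JavaScript the link below simply navigates to its href;
    // with JavaScript we intercept the click and load the same URL
    // asynchronously into the page. Ids are hypothetical.
    window.onload = function () {
        var link = document.getElementById('rename-link');
        if (!link || !window.XMLHttpRequest) {
            return;                          // no upgrade: fall back to the plain link
        }
        link.onclick = function () {
            var request = new XMLHttpRequest();
            request.open('GET', link.href, true);
            request.onreadystatechange = function () {
                if (request.readyState === 4 && request.status === 200) {
                    document.getElementById('content').innerHTML = request.responseText;
                }
            };
            request.send(null);
            return false;                    // cancel the normal navigation
        };
    };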



Most of the time it is quite easy to implement a JavaScript-enhanced
version on top of a regular version, but sometimes two versions have to be
developed in parallel, as Gmail does for example.



There are a few exceptions to this rule:


  • it's OK to develop management or back-office screens in
    JavaScript only;

  • the same goes for some products like webmails or calendars: people who
    really need accessibility for these very specific products will use
    specialized pieces of software;

  • the product is very specific and Ajax is used, for example, to reduce
    network bandwidth (example: MacRumorsLive).



one state, one url




This is probably one of the most important points. When developing Ajax
interfaces, it's very tempting to build advanced screens that let the user
play around without making direct calls to the server.



The JavaScript retrieves data asynchronously from the server and refreshes
the interface, the user makes changes, and the JavaScript displays these
changes directly.



A common approach is to use a "model-driven" infrastructure that lets the
different interface elements stay synchronized with a local pool of data,
using XML transformations or, as in some recent developments, things like
CTAL. In other words, if the user changes the data through one interface
element, such as a text input, a button, or anything else, the local data
pool is updated and the other parts of the screen are refreshed
accordingly.
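
Here is a minimal sketch of such a local data pool (all names are
hypothetical): interface elements subscribe to it and are refreshed whenever
the data changes, whichever widget triggered the change.

    // A tiny "model-driven" data pool: views subscribe, and every change
    // notifies them so the whole screen stays consistent.
    var dataPool = {
        values: {},
        listeners: [],
        subscribe: function (listener) {
            this.listeners.push(listener);
        },
        set: function (key, value) {
            this.values[key] = value;
            for (var i = 0; i < this.listeners.length; i++) {
                this.listeners[i](key, value);   // refresh every registered view
            }
        }
    };

    // Two different parts of the screen stay in sync with the same data.
    dataPool.subscribe(function (key, value) {
        if (key === 'title') {
            document.getElementById('title-header').innerHTML = value;
        }
    });
    dataPool.subscribe(function (key, value) {
        if (key === 'title') {
            document.getElementById('title-input').value = value;
        }
    });

    // Changing the data through any widget updates the pool, which
    // refreshes the other views.
    dataPool.set('title', 'My document');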



It's a great approach as long as the rendered screen looks exactly as it
would if another user loaded the same URL in her browser.
This means that the server state should always be kept in sync with what the
user is doing.



So having a local pool of data (what people call "model-driven") is best
seen as an optimization that minimizes exchanges by acting as a local cache,
and it should be done with this rule in mind.
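
A minimal write-through sketch of that idea (the URL and parameter names are
hypothetical): reads may be served from the local cache, but every change is
pushed to the server right away, so the server state never drifts from what
the user sees.

    var cache = {};

    // Read: serve from the local pool when possible, otherwise ask the server.
    function readField(name, callback) {
        if (name in cache) {
            callback(cache[name]);
            return;
        }
        var request = new XMLHttpRequest();
        request.open('GET', '/data?field=' + encodeURIComponent(name), true);
        request.onreadystatechange = function () {
            if (request.readyState === 4 && request.status === 200) {
                cache[name] = request.responseText;
                callback(cache[name]);
            }
        };
        request.send(null);
    }

    // Write: update the local pool and the server at the same time.
    function writeField(name, value) {
        cache[name] = value;
        var request = new XMLHttpRequest();
        request.open('POST', '/data', true);
        request.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
        request.send('field=' + encodeURIComponent(name) +
                     '&value=' + encodeURIComponent(value));
    }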



Of course, another benefit of this approach is that the client side can
compute the interface itself. In this case the JavaScript becomes a template
engine and reduces the server to a single role: data provider. But this has
to be done with the first rule in mind: a JavaScript template engine won't
work under Lynx, and having two mechanisms that do the same thing can be hard
to keep evolving together.
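
As an illustration, here is a minimal sketch of the "JavaScript as template
engine" approach, where the server only returns raw data (a JSON document
list at a hypothetical URL) and the client builds the markup:

    // The server is a pure data provider; the client renders the HTML.
    function renderDocumentList(documents) {
        var html = '<ul>';
        for (var i = 0; i < documents.length; i++) {
            html += '<li>' + documents[i].title + '</li>';
        }
        html += '</ul>';
        document.getElementById('document-list').innerHTML = html;
    }

    var request = new XMLHttpRequest();
    request.open('GET', '/documents.json', true);
    request.onreadystatechange = function () {
        if (request.readyState === 4 && request.status === 200) {
            // JSON.parse is used here for simplicity; older browsers needed
            // a parsing library instead.
            renderDocumentList(JSON.parse(request.responseText));
        }
    };
    request.send(null);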



Back to the one state, one url rule.



If this rule is not respected, third-party software, like search engines or
XML-RPC callers, won't be able to use the web application, because they rely
on URLs as state identifiers. Moreover, users usually expect a given URL to
give them the same result (that's why the "favorite" button exists).
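
One minimal way to respect the rule, sketched below with hypothetical URLs
and function names, is to reflect every significant state change in the URL
fragment and to read the fragment back on load, so that a bookmarked or
shared URL restores the same screen:

    // Load a document into the page asynchronously (hypothetical URL scheme).
    function loadDocumentIntoPage(id) {
        var request = new XMLHttpRequest();
        request.open('GET', '/documents/' + id + '/view', true);
        request.onreadystatechange = function () {
            if (request.readyState === 4 && request.status === 200) {
                document.getElementById('content').innerHTML = request.responseText;
            }
        };
        request.send(null);
    }

    // Every state change is mirrored in the URL, so it can be bookmarked.
    function showDocument(id) {
        loadDocumentIntoPage(id);
        window.location.hash = '#document-' + id;
    }

    // On load, restore the state encoded in the URL.
    window.onload = function () {
        var match = window.location.hash.match(/^#document-(\d+)$/);
        if (match) {
            loadDocumentIntoPage(match[1]);
        }
    };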


use a test-driven approach and automate tests




Most of the time, a piece of JavaScript is not well tested. Developers
should work in a test-driven development (TDD) way, just as they do in their
main language.



Every current JavaScript toolkit now provides everything needed to set up
test automation and lets developers benefit from TDD.
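
As a minimal sketch of what such a test looks like, written here with nothing
but a small assert helper (any JavaScript test toolkit provides the
equivalent; the function under test is hypothetical):

    // Function under test: builds the URL used to display a document.
    function buildDocumentUrl(id) {
        return '/documents/' + encodeURIComponent(id) + '/view';
    }

    function assertEqual(expected, actual, message) {
        if (expected !== actual) {
            throw new Error(message + ': expected "' + expected +
                            '", got "' + actual + '"');
        }
    }

    // The tests are written first, then the function is made to pass them.
    function testBuildDocumentUrl() {
        assertEqual('/documents/42/view', buildDocumentUrl(42), 'simple id');
        assertEqual('/documents/a%2Fb/view', buildDocumentUrl('a/b'),
                    'id that needs escaping');
    }

    testBuildDocumentUrl();   // run at load time or from a test-runner page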



Moreover, JavaScript code should be tested in all possible execution
contexts. A tool like Selenium can be used to automate functional tests
across all kinds of browsers.
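
For illustration, here is a minimal functional-test sketch using the current
selenium-webdriver JavaScript bindings (a more recent API than the Selenium
of this post); the URL and element ids are hypothetical, and the same script
can be run against other browsers by changing the Builder argument.

    const { Builder, By, until } = require('selenium-webdriver');

    (async function renameDocumentTest() {
        // Drive a real browser through the same clicks a user would make.
        const driver = await new Builder().forBrowser('firefox').build();
        try {
            await driver.get('http://localhost:8080/documents/42/view');
            await driver.findElement(By.id('rename-link')).click();
            await driver.wait(until.elementLocated(By.id('rename-form')), 5000);
        } finally {
            await driver.quit();
        }
    })();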




(Post originally written by Tarek Ziadé on the old Nuxeo blogs.)


Category: Product & Development