Summary
Following on the heels of Jack Herrington's Ajax anti-patterns article, Gojko Adzic describes a few additional Ajax bad practices.
Building Ajax and rich-client applications is a relatively new experience for many developers. As a result, such applications often exhibit more than their fair share of poor coding and design practices, according to Gojko Adzic's recent article, Breaking the Web. Adzic notes that:
The paradigm shift from server-side to client-side workflow created a void in best practices for Web development. Like any new cool and funky technology, Web 2.0 has many nice new features, but comes with a set of new problems, at least new in the area of client-side browser development. Plainly ignoring these issues may cause big problems from support to serious security exploits—but there is no need to re-invent the wheel...
He then lists several commonly observed poor Ajax design and development practices, or anti-patterns, including:
Making links hard to use
Taking the Ajax approach too far – so that it becomes impossible to use links for navigating through the application. Hyperlinks are one of the main reasons why the web became so popular – they are incredibly flexible, can be bookmarked, stored, exported and interlinked, yet all those fine features are quickly disregarded in favour of a bit of background processing...
Google Maps offers a good example: its ‘link to this page’ button gives the visitor the current navigation state serialised into GET parameters.
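In the spirit of that button, one common technique is to serialise the interesting client-side state into the URL fragment, so that the page can be bookmarked and shared without a reload. The following is a minimal sketch assuming a map-like application; the state fields and function names are hypothetical:

```javascript
// Minimal sketch of keeping Ajax navigation linkable via the URL
// fragment. The state fields (lat, lng, zoom) are hypothetical.
function saveStateToUrl(state) {
  // Serialise the current view into the fragment so the page can be
  // bookmarked and shared without a reload.
  window.location.hash = 'lat=' + encodeURIComponent(state.lat) +
                         '&lng=' + encodeURIComponent(state.lng) +
                         '&zoom=' + encodeURIComponent(state.zoom);
}

function restoreStateFromUrl() {
  // Rebuild the view state when the page is opened from a bookmark
  // or a shared link.
  var state = {};
  var pairs = window.location.hash.replace(/^#/, '').split('&');
  for (var i = 0; i < pairs.length; i++) {
    var parts = pairs[i].split('=');
    if (parts.length === 2) {
      state[decodeURIComponent(parts[0])] = decodeURIComponent(parts[1]);
    }
  }
  return state;
}
```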
Not paying attention to form onSubmit() return codes
A common way to handle forms in the background is to intercept the submission with an onSubmit handler. After processing, handlers typically return false, to stop the form from actually being submitted. If the onSubmit handler throws an exception, the form will get submitted and the page will reload.
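A defensive way to write such a handler is to wrap the background processing in a try/catch so that false is returned unconditionally. This is a minimal sketch; sendFormInBackground and reportError are hypothetical helpers:

```javascript
// Whatever happens during the Ajax processing, the handler must
// return false, or the browser falls back to a normal submission
// and reloads the page.
function handleSubmit(form) {
  try {
    sendFormInBackground(form); // hypothetical Ajax call
  } catch (e) {
    reportError(e);             // hypothetical error reporter
  }
  return false;                 // always cancel the default submission
}
```

Note that the form must be wired up as <form onsubmit="return handleSubmit(this);">: without the return keyword in the attribute, the handler's return value is ignored and the form submits anyway.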
Not notifying a user of a pending form submission
With normal form submission, users get instant feedback that their request is being processed; that does not happen automatically with Ajax forms. Nor does the browser stop the user from re-submitting the same form, or from editing a field once the form has already been submitted.
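One remedy is to reproduce that feedback by hand: disable the submit button and show a status message while the request is in flight. A sketch, with hypothetical showStatus and sendFormInBackground helpers:

```javascript
// Give the user the feedback the browser no longer provides, and
// prevent double submission by disabling the button while the Ajax
// request is pending.
function submitWithFeedback(form) {
  var button = form.elements['save']; // hypothetical button name
  button.disabled = true;
  showStatus('Saving...');

  sendFormInBackground(form, function (success) {
    button.disabled = false;
    showStatus(success ? 'Saved.' : 'Save failed, please try again.');
  });
  return false; // the form is handled in the background
}
```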
Being defeated by the browser's caching of GET requests
Browsers can (and typically will) cache GET requests. So, when only GET is used, there is typically also some form of cache prevention, either on the server (HTTP headers) or on the client (a request timestamp added to the URL to make it unique). The browser cache is a good thing if used properly, and there is no reason to turn it off completely... [Instead] use GET for documents which are not user-specific, and POST to execute procedures and other user-specific requests.
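The client-side variant mentioned above is a one-line cache-buster: append a unique timestamp parameter to each GET URL. The helper below is a sketch:

```javascript
// Make a GET URL unique so the browser cannot serve a stale cached
// response; use this only for requests that genuinely must not be
// cached, since it defeats the cache entirely.
function uncachedUrl(url) {
  var separator = (url.indexOf('?') === -1) ? '?' : '&';
  return url + separator + '_ts=' + new Date().getTime();
}

// e.g. xhr.open('GET', uncachedUrl('/inbox/unread-count'), true);
```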
Too many concurrent connections
Browsers limit the number of concurrent requests to the same server, so too much Ajax might kill the user experience, especially if the requests start timing out.
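A common mitigation is a small client-side queue that caps the number of requests in flight. The sketch below assumes each request is wrapped in a function that accepts a completion callback; the limit of two connections per host matches the HTTP/1.1 guideline that browsers of the time followed:

```javascript
var MAX_IN_FLIGHT = 2; // per-host limit observed by contemporary browsers
var inFlight = 0;
var queue = [];

// startRequest is a function that fires an Ajax call and invokes the
// callback it is given once the call completes (success or failure).
function enqueueRequest(startRequest) {
  queue.push(startRequest);
  pump();
}

function pump() {
  // Start queued requests until the in-flight cap is reached.
  while (inFlight < MAX_IN_FLIGHT && queue.length > 0) {
    var startRequest = queue.shift();
    inFlight++;
    startRequest(function () {
      inFlight--;
      pump(); // a slot freed up; start the next queued request
    });
  }
}
```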
Exposing too much application API
Ajax makes it really easy to call back-end APIs, so developers may be tempted to expose far more than they should, assuming it will go unnoticed because the URLs are never displayed in the address bar.
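The obscurity is illusory: any page script, or a curious user with a JavaScript console, can call an Ajax endpoint directly, so every exposed URL needs the same server-side authorisation checks as a regular page. The endpoint below is hypothetical:

```javascript
// Nothing prevents this call besides server-side checks: an endpoint
// wired up "just for Ajax" is as public as any other URL.
var xhr = new XMLHttpRequest();
xhr.open('POST', '/internal/deleteUser', true); // hypothetical endpoint
xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
xhr.send('id=42');
```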
To what extent do you think current Ajax toolkits shield developers from, or expose them to, such poor practices?