I don't often agree with The Register, but this column by Bill Thompson makes an awful lot of sense. In discussing "web 2.0" and asynchronous XMLHttpRequest, people elide the difficulties of distributed development:
Ajax is touted as the answer for developers who want to offer users a richer client experience without having to go the trouble of writing a real application, but if the long term goal is to turn the network from a series of tubes connecting clients and servers into a distributed computing environment then we cannot rely on Javascript and XML since they do not offer the stability, scalability or effective resource discovery that we need.
I first ran across this issue back in 1995, when PPD introduced VisualWave. Wave was a cool product - you used the normal GUI builder to paint an interface, and then the system would "automagically" html-ify it for you. Marketing touted this as "instant web access" for our customers who wanted to push their apps out to the net.
Well, not so fast. Most applications written for the desktop had a number of baked-in limitations - all too common were things like:
- The assumption of only one user at a time
- One database connection, with a single username/password
- Caching schemes that assumed a single user
And so on. Getting a UI on the web was (relatively) simple; getting the application to actually function there wasn't. The intervening decade hasn't really changed that much. Whenever you deal with network resources, you have to be ready to handle failure gracefully - and I get the distinct impression that most developers tossing around the "web 2.0 mojo" aren't thinking about that. It's going to come back to bite them.
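To make the point concrete, here's a minimal sketch (in TypeScript) of the kind of retry-and-degrade logic a networked client needs but a desktop app could skip. The `makeFlakyService` function is an invented stand-in for a real XMLHttpRequest - the shape of the failure handling is the point, not the API:

```typescript
// A request to a remote resource can fail at any time; the caller
// must plan for that instead of assuming the desktop-era "it's
// local, it always works" model.

type Result<T> = { ok: true; value: T } | { ok: false; error: string };

// Hypothetical flaky service: fails the first `failures` calls,
// then succeeds -- a stand-in for a real network request.
function makeFlakyService(failures: number): () => Result<string> {
  let calls = 0;
  return () => {
    calls++;
    if (calls <= failures) {
      return { ok: false, error: `network error on call ${calls}` };
    }
    return { ok: true, value: "user-profile-data" };
  };
}

// Retry a few times, then degrade gracefully to a fallback value
// instead of crashing or hanging the UI.
function withRetry<T>(
  attempt: () => Result<T>,
  maxTries: number,
  fallback: T
): T {
  for (let i = 0; i < maxTries; i++) {
    const r = attempt();
    if (r.ok) return r.value;
  }
  return fallback; // graceful degradation, not an unhandled failure
}

const service = makeFlakyService(2); // fails twice, then succeeds
const data = withRetry(service, 3, "cached-or-placeholder");
console.log(data); // → "user-profile-data" (succeeds on the third try)
```

The desktop-era assumptions in the list above show up exactly here: code that never expected a request to fail has no `fallback` path at all, and the first dropped connection takes the whole app down with it.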
Technorati Tags:
Ajax, web2.0