This post originated from an RSS feed registered with Agile Buzz
by Laurent Bossavit.
Original Post: Software sphexishness
Feed Title: Incipient(thoughts)
Feed URL: http://bossavit.com/thoughts/index.rdf
Feed Description: You're in a maze of twisty little decisions, all alike. You're in a maze of twisty little decisions, all different.
"The next paradigm in programming should be modeled after how humans deal with procedure and exceptions, and learn from it. HUMANS are incredibly flexible and adaptable, and can respond in real-time to change."
Definitely a profound idea. I'm far from sure that we're anywhere near ready for that - we're encumbered by ideas about cognition and computing that make it difficult to even think of stating the problem that way.
It's not just humans who are flexible - so are many other systems geared to survival, or at least stability. Stable systems show equifinality - they get to the same end state irrespective of starting conditions (or perturbations) within some range of tolerance. Species in an ecosystem are intricately interdependent, but most ecosystems can afford to lose one species.
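Equifinality is easy to see in a toy model. The sketch below is purely illustrative (the function name, damping constant, and equilibrium are my own assumptions, not drawn from any real system): a damped update pulls the state toward a fixed equilibrium, so wildly different starting conditions arrive at the same end state.

```python
# Toy illustration of equifinality: a damped system settles to the same
# equilibrium regardless of its starting condition (within tolerance).
# All names and constants here are illustrative assumptions.

def settle(x, equilibrium=0.0, damping=0.5, steps=50):
    """Pull x a fraction of the way toward the equilibrium, repeatedly."""
    for _ in range(steps):
        x += damping * (equilibrium - x)
    return x

# Very different starting conditions, same end state:
print(abs(settle(100.0)) < 1e-6)  # converged
print(abs(settle(-3.7)) < 1e-6)   # converged
```

A species dropping out of an ecosystem is like perturbing `x` partway through the loop: the dynamics absorb the disturbance and the system still finds its way back.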
The software systems we're able to design also show interdependence, but when one module fails (even a small one) the whole thing typically fails with it. The famous Ariane 5 crash is a good example of this fragility: the bug that was the ultimate cause of the crash produced a fault in a routine that, at the time it occurred, was entirely irrelevant to keeping the rocket in the air. A more proximate cause of the crash was that the main guidance computer interpreted the resulting error output as if it had been a command to the boosters.
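The failure mode can be sketched in a few lines. This is a loose illustration in Python, not the actual flight code (which was written in Ada), and the function names and numbers are made up: a wide floating-point value is narrowed to 16 bits without a range check, and the resulting error is then consumed downstream as if it were ordinary data.

```python
# Rough sketch of the Ariane-style failure: an unchecked narrowing
# conversion raises an exception, and the exception's diagnostic output
# ends up where flight data was expected. Names and values are invented.

def to_int16_unchecked(value: float) -> int:
    """Narrow a float to a signed 16-bit integer, raising on overflow
    instead of saturating or flagging the value as suspect."""
    n = int(value)
    if not -32768 <= n <= 32767:
        raise OverflowError(f"{value} does not fit in 16 bits")
    return n

# The old trajectories kept this value small; the new rocket's did not.
horizontal_bias = 64000.0  # illustrative figure, not actual telemetry
try:
    to_int16_unchecked(horizontal_bias)
except OverflowError:
    # In the real incident, a diagnostic bit pattern was emitted on the
    # bus, and the guidance computer read it as if it were flight data.
    print("conversion failed: diagnostic data now on the flight bus")
```

The point of the sketch is the mismatch of assumptions: the conversion "knows" it failed, but nothing downstream is prepared to treat its output as anything other than a meaningful value.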
Think about it. That's a bit like a car driver hearing someone sneeze... But rather than disregard the "utterance" as irrelevant, the driver is compelled to assign meaning to it. Arbitrarily, the driver decides it means "turn right all the way". As errors go, it's totally bizarre.
And yet so many systems "designed" under the prevailing assumptions as to what constitutes "design" exhibit this kind of behaviour, this mix of awesome intelligence and utter stupidity. (I'm reminded of the quality Doug Hofstadter calls "sphexishness".) The lesson? It's time to confront the hypothesis that there's something wrong with prevailing assumptions as to what constitutes "design". For instance, the idea that emergent behaviour makes systems inherently unsafe, whereas "designed" behaviour is inherently safer.