Summary
The classic essay on "worse is better" is either misunderstood or wrong. And in citing it in our design arguments, we shortchange both our customers and our craft. I revisit this essay, and reflect...
Recently, I was interviewed by a reporter who was doing a story on the
difference between east coast engineers and west coast engineers (yes,
that old chestnut is again being revisited). This in turn got me thinking
about Dick Gabriel's classic note "Worse is Better", which is often
credited with first articulating the distinction between the MIT (or east
coast; after all, what else is there on the east coast) approach to
engineering (do the right thing, no matter how complex it makes the code)
and the Berkeley (or west coast) approach to engineering (make the code
simple, even if it makes the user do more work).
The notion that worse is better has become something of a truism in the
programming industry. The usual examples are the C language (worse than Lisp, but it dominated anyway), Unix (or, more recently, Windows) as
opposed to Multics or VMS, and (in a completely different arena) VHS tape
over Beta. Each of the dominant technologies, it is pointed out, was worse
than the alternative, but the worse technology became the standard
anyway. The moral of the story, or the reason that people bring the
principle up in argument, is to convince whoever is on the other side of
the argument that we should set our sights on the quick and dirty, less
elegant solution to a problem, because "worse is better."
Of course, this received wisdom is just so much crap. The arguments
simply don't hold up. But the damage that this principle has done to the
technology industry is real, and we should start pushing back before we
do any more harm than we already have.
First, the arguments. Gabriel's original writing (which can still be
found, like everything else, either through Google or by going here) makes an
interesting read, and a number of good points. Reading it again not only
reminds one of how well it is written, but debunks a lot of the usual
common knowledge about the article. For example, the piece never
contrasts the east coast and west coast engineering styles; the two groups
it talks about are the MIT/Stanford style (one group) and the New
Jersey style of design. That is not nearly as interesting these days as the
notion that the contrast in styles is between the west coast and the east coast.
The rest of the paper is an excellent analysis of why Lisp lost out to
C as a programming language, even though Lisp was a superior
language. Or at least superior on the grounds that Dick found most
important. But this doesn't necessarily show that Lisp was in fact
superior to C; it can just as easily be taken to show that the metrics
that were cited in the article were not the ones that were taken to be
most important by those choosing a programming language. C produced faster code, was easier to master, was easier to use in groups, and ran well on less expensive hardware; these were not considerations that Gabriel found important, but others did find them important. On those metrics, the dominance of C as a programming language was an example of better is better, not worse is better.
The old chestnut of Beta and VHS is open to the same sort of alternate interpretation. On the "worse is better" interpretation, the superior quality Beta was beaten out by the clearly inferior VHS tape format because of some inexplicable perversity of human nature (or the machinations of clever marketing people, or the hubris of Sony, which owned the Beta brand). But VHS tapes could record longer programs on a single cassette, could be played on cheaper recorders, and had a number of different suppliers. Thus there was a set of metrics on which VHS was the superior technology, and these metrics seemed to be the ones that most in the market found to be important. VHS beat out Beta not because worse is better, but because better in some dimensions (cost, time to record) beat out better in other dimensions (picture quality).
Even the case of Unix vs. Multics misses an important point. While
Multics may have been a better operating system, Unix had the twin
advantages of actually existing, and running on a wide variety of fairly
cheap hardware. Windows (and DOS) ran on even cheaper hardware, and
though it was easy to argue on any technical grounds you wanted that Unix was a better OS than DOS, the property of existing on really cheap hardware turned out to be the metric of goodness that appealed to the customer. The emergence of Linux as a real choice is beginning to give us more evidence on this particular question; perhaps a better OS is something that people will choose when other things are equal. But when they chose DOS over Unix, things weren't equal.
In all of these cases, there is an alternate interpretation of the
choices that were made, one that leads us to the conclusion that worse is not
better. Instead, what we see is that better is a complicated notion, and
can depend on a variety of different metrics. It may be disappointing to
find out that what we geeks think of as better may not be what our
customers think is better. But finding this out shouldn't surprise us
too much.
Of course, worse is better is a much catchier slogan than
better depends on your goodness metric or some other, equivalent
phrase that would actually reflect what was going on. And there is wisdom
and insight in the original article that can be used by designers, even
under the catchier (but less historically accurate) slogan.
My problem with the slogan is that it has become a catch phrase for
those who either haven't read the original article, or have read it and
either have forgotten what it really talked about or never understood it
in the first place. As a catch phrase, it is often used to justify
shoddy design, or following the crowd rather than doing what is right,
or as short-hand for the real claim that our customers are too stupid to
either appreciate or deserve high-quality products. Why spend the time
doing things right, this line of reasoning goes, when we all know that
worse is better? You are far better off giving the customer
something that you know is less than what you could produce, because
they (those simple customers) will think that it is better.
The end result of this thinking is sloppy products that don't work, are
hard to use, or are unreliable (or all of the above). We try to convince
our customers that this is the way software has to be, and then turn
around and convince ourselves that they won't pay for anything
better. But we short-change our customers, and we cheapen our craft, when
we put up with this sort of thinking. When the original article was
produced, I don't think this was what the author had in mind; even
if it was, it is time for us to reject the simple-minded interpretation
of the slogan, and start putting out software that really is better (on
the dimension of goodness that our customers have, not necessarily our
own).
Actually, the VHS vs. Betamax comparison is usually used as an example of why open standards (VHS) are superior to proprietary ones (Beta); even if the latter are technically superior, the former will gain huge advantages due to network effects.
I like your interpretation of the Worse is Better article because it really does matter how you define better.
Most technical arguments fail to get past the stance of "I like it, therefore it must be better." The original article stood out because it highlighted all of the ways that Lisp was technically much better than C, even though, as you point out, the criteria that were selected were not the criteria that most people would use.
The old way of saying this would be "Horses for Courses", or "One size does not fit all", but to date it seems as if the software development community does not realize this. We are all so busy chasing the latest fad that we forget to make sure that the tools we are using are appropriate for the task.
I read Gabriel's article for the first time about a year ago, and I didn't come away with your interpretation of it. My impression of what Gabriel was saying was that it's better not to spend the time getting things perfect because a) you may never release a product, b) your competition will already have released a product that fills the same need, and c) after that, your better product will never be able to grab enough mindshare.
This is true, and it's why the software industry has migrated toward more iterative (RAD/agile/TDD) development in recent years. You build a working design, get it to your customers, and then modify it after you see what they want. This has nothing to do with releasing poor software; it has to do with being responsive to users' needs.
I think this has always been the philosophy behind Lisp, although it stemmed from the needs of research, not professional application development. But it's an excellent philosophy. The problem was that Lisp the language is awkward, hard to debug and (back then) very slow. No wonder C won. When you put the same features in a pleasing, usable, C-like package (such as Ruby or Python), the public is much more accepting.
I just wish the Lisp people would get over it. Their excellent design principles have survived. Their language didn't. So what?
Thanks Jim, for a timely 'call to arms'. There's no particular threat; there never is. We just need shaking up from time to time. We need reminding that striving for better quality software is never a bad thing and never a waste of time. Thanks again, Jim! :-)
There are several aspects to 'worse is better'. For new technology, such as video recorders, 'worse is better' often means technology that is affordable and practical is preferable. LISP had some great features but it crawled on a 286.
For replacement technology, where users have the choice of sticking with what they have, 'worse is better' reflects the tendency for new technology not to offer enough benefits to overcome the pain of migration. Current technology may be inferior, but it's the devil you know.
The Worse Is Better argument is that most of the solutions which prove most successful over the long term are those which grow organically, rather than being well-engineered up front. (Let's remember that Gabriel examined both this theory and its converse.)
Gabriel's point was that it didn't matter all that much how good the original seed idea was. If an idea can be published in a relatively raw form, it will be knocked about and changed and may morph over time into something very good indeed. Linux is the poster child for this approach.
If you apply the Worse Is Better argument to Lisp versus C, it explains that it was the fact that Lisp was pretty well designed early on that prevented its being changed enough to become generally useful. Nobody had such inhibitions about C.
> For example, the piece never contrasts the east coast and west coast engineering styles; the two groups it talks about are the MIT/Stanford style (one group) and the New Jersey style of design.
It is, in fact, comparing the canonical East Coast vs. West Coast styles. The paper is comparing the MIT/Stanford approach, which, because MIT is the East Pole (from which all points are west), is the East Coast, with Berkeley, which is the corresponding West Pole. The mention of New Jersey is a joke: he's not talking about anyone who does work in New Jersey. Rather, New Jersey is simply the butt of many jokes on the east coast (the "armpit of America"), and so he's naming it as the center of shoddy workmanship. It would be like someone at Stanford comparing the Stanford/MIT school with the Fresno school.
> It is, in fact, comparing the canonical East Coast vs.
> West Coast styles. The paper is comparing the MIT/Stanford
> approach, which, because MIT is the East Pole (from which
> all points are west), is the East Coast, with Berkeley,
> which is the corresponding West Pole. The mention of New
> Jersey is a joke: he's not talking about anyone who does
> work in New Jersey. Rather, New Jersey is simply the butt
> of many jokes on the east coast (the "armpit of America"),
> and so he's naming it as the center of shoddy workmanship.
> It would be like someone at Stanford comparing the
> Stanford/MIT school with the Fresno school.
Is this a troll?? UNIX originated at Bell Labs in Murray Hill, New Jersey, and the "PC-loser" example discussed in the article was present in the original versions of UNIX developed entirely in New Jersey, before Berkeley ever saw it. Anyone who knows UNIX history would read the article this way.
> I think this has always been the philosophy behind Lisp,
> although it stemmed from the needs of research, not
> professional application development. But it's an
> excellent philosophy. The problem was that Lisp the
> language is awkward, hard to debug and (back then) very
> slow. No wonder C won. When you put the same features in a
> pleasing, usable, C-like package (such as Ruby or Python),
> the public is much more accepting.
I would disagree with your assertion that Lisp is "awkward, hard to debug and ... slow". It's true that some people are put off by the parentheses, but once you get used to it, it's fairly intuitive. The power of sexprs lies in the fact that you're actually writing your code in an AST. The semantics are simple and unambiguous. And, because your code is already in an AST, it's easy to write programs that write/manipulate programs.
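To make the "you're actually writing your code in an AST" point concrete, here is a rough sketch of the code-is-data idea, written in Python rather than Lisp, with made-up names (evaluate, double); in Lisp the expression (+ 1 2) literally is the list (+ 1 2), so this kind of program-manipulating-program needs no separate parser or AST library at all.

    # Illustrative sketch (not from the article): the Lisp "code is data" idea
    # rendered with plain Python lists. A program is held as ordinary data, so
    # another program can build, inspect, and rewrite it with list operations.

    def evaluate(expr):
        """Evaluate an expression held as data: numbers evaluate to themselves,
        lists have the form [operator, operand, operand, ...]."""
        if isinstance(expr, (int, float)):
            return expr
        op, *args = expr
        values = [evaluate(a) for a in args]
        if op == "+":
            return sum(values)
        if op == "*":
            result = 1
            for v in values:
                result *= v
            return result
        raise ValueError("unknown operator: %r" % op)

    def double(expr):
        """A program that manipulates a program: wrap any expression in (* 2 ...)."""
        return ["*", 2, expr]

    program = ["+", 1, 2]              # the "source code", held as plain data
    print(evaluate(program))           # 3
    print(evaluate(double(program)))   # 6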
As far as debugging goes, I've had much more luck with the lisp debuggers than with using gdb to debug C. You'll never have your code fandango on core because of a char *foo = "Hello World/0"; do_something_stringish(foo); in Lisp.
Insofar as testing goes (and the power of macros and code introspection), here's a link to Peter Seibel's book-in-progress Practical Common Lisp, where he writes a unit test suite similar to JUnit in 26 lines of code: (http://www.gigamonkeys.com/book/practical-building-a-unit-test-framework.html) One of the particularly nice points is that when an assertion fails, rather than simply turning your GUI bar red, it drops you into the debugger, where you can examine and manipulate the stack to actually fix the problem. I've yet to see something like that in C.
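To illustrate the "drop into the debugger on a failed assertion" idea, here is a minimal sketch in Python rather than Lisp (this is not Seibel's code; the check helper and the test are made up), using the standard pdb module:

    # Sketch: an assert-like helper that, instead of just reporting a failure,
    # starts the interactive debugger at the point of failure so you can
    # inspect live state, walk up the stack, and then continue.
    import pdb

    def check(condition, message=""):
        """On failure, print the message and enter the debugger."""
        if not condition:
            print("FAILED:", message)
            pdb.set_trace()   # type 'up' to reach the caller, 'c' to continue

    def test_addition():
        check(1 + 1 == 2, "1 + 1 == 2")
        check(2 + 2 == 5, "2 + 2 == 5")   # this one drops into the debugger

    if __name__ == "__main__":
        test_addition()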
As far as speed goes -- you're right that speed is less of an issue than it used to be. To emphasize the point, I'll mention that the folks at Naughty Dog are writing games for the Playstation2 in a dialect of Lisp: (http://lemonodor.com/archives/000610.html)
> I just wish the Lisp people would get over it. Their
> excellent design principles have survived. Their language
> didn't. So what?
It seems that Lisp is in a bit of a renaissance. There's some interesting Lisp and Scheme hacking going on right now. Greenspun's 10th Rule of Programming is still in force. DISCLAIMER: I'm not one of the "lisp people". I'm a Java programmer by profession. I've started to delve into Lisp primarily because of all of the shortcomings I've found in languages like Java, C, C++, etc.
Hehe... I had not heard the "Worse is Better" argument before. It's like saying, "Evil will always triumph, because good is dumb." It makes absolutely no sense.
> If an idea can be published in a relatively raw form, it
> will be knocked about and changed and may morph over time
> into something very good indeed. Linux is the poster child
> for this approach.
Is it? Doesn't Linux have a desktop share of under 1%, compared with Microsoft's over 95%? For all its morphing, it has yet to morph into an operating system of choice for the general public and remains a niche product for political and technical specialists.
> Doesn't Linux have a desktop share of under 1%, compared
> with Microsoft's over 95%? For all its morphing, it has
> yet to morph into an operating system of choice for the
> general public and remains a niche product for political
> and technical specialists.
Even assuming your statistics, let's grade on a curve.
Until IBM jumped on the bandwagon, Linux had no marketing machine, no massive cash flow, no lucrative stock options to motivate programmers. What it *did* have was an ever-growing band of people who wanted their very own UNIX system on second-hand hardware, and before them the "political and technical specialists" who put their coding skills where their mouths were.
That Linux can even challenge Microsoft, already a juggernaut before Linus Torvalds cut the first line of code, demonstrates either the fragility of Microsoft's hegemony (as Mr. Gates reportedly believes) or the power of a sound design and motivated volunteers to unsettle a corporate behemoth.
Oh, and let's only count dominance on a desktop PC ... not, say, deployment as a platform for webservers, fileservers, and other infrastructure of the Internet.
> I would disagree with your assertion that Lisp is
> "awkward, hard to debug and ... slow". It's true that some
> people are put off by the parentheses, but once you get
> used to it, it's fairly intuitive. The power of sexprs
> lies in the fact that you're actually writing your code in
> an AST. The semantics are simple and unambiguous. And,
> because your code is already in an AST, it's easy to write
> programs that write/manipulate programs.
> [snip rest of "why Lisp is great"]
These are standard arguments for Lisp to try to convince people to use the language. But my point is that C and C-like languages need no such arguments. The trends of the marketplace make all other arguments irrelevant.
Really, it's as irrelevant as someone saying, "Don't buy a DVD player -- laserdiscs are better." Add as many arguments of technical superiority here as you want ... what format is on the shelves when you go to the store?
My thinking is that rather than support Lisp itself (which I deem a lost cause), a better approach is to find a language that most closely resembles Lisp and that is thriving. For me, that language is Python.