Summary
A lighthearted comparison of how successful fiction authors and language designers can accidentally paint themselves into a corner in their first work.
I love reading the Harry Potter books, and recently came up with an interesting analogy between writing a series of fiction books and designing a programming language.
I'm sure that when J.K. Rowling wrote the first Harry Potter book (planning it as the first of a series of seven) she had a fairly good idea of what kinds of things might eventually happen in the series, but she didn't have the complete plot lines for the remaining books worked out, nor had she decided every detail of how magic works in her world.
I'm also assuming that as she wrote successive volumes, she occasionally went back to earlier books, picked out a detail that at the time was given just to add color (or should I say colour :-) to the story, and gave it new significance. For example, I doubt that when she invented the name Voldemort she had already decided that "I am Lord Voldemort" would be an anagram of "Tom Marvolo Riddle" -- Tom Riddle just isn't a very convincing name for a character to develop into the most evil person in the world, but apparently she couldn't come up with a good anagram of Voldemort by itself. (And the idea of an anagram is itself a bit lame, just as are the alliterating names of Hogwarts' founders and some of the other students.) I also don't think that she had thought of Scabbers, Ron's pathetic pet rat, as an animagus in disguise (AFAICT the fact that he's missing a toe doesn't come up in the first two books at all). Maybe she regrets the emphasis on Quidditch (I know I do :-). And so on -- you get the idea.
In a similar vein, I had never thought of iterators or generators when I came up with Python's for-loop, nor of using % as a string formatting operator, and as a matter of fact, using 'def' for defining both methods and functions was not part of the initial plan either (although I like it!).
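For the non-Pythonistas, a quick sketch (my own illustration, not from any design notes) of what those afterthoughts look like today:

    def countdown(n):
        # A generator -- something the original for-loop design
        # never anticipated, yet the loop consumes it unchanged.
        while n > 0:
            yield n
            n -= 1

    for i in countdown(3):      # same syntax as: for i in [3, 2, 1]
        print("%d..." % i)      # % as a string formatting operator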
Just like the successive Harry Potter books are required to have "continuity" (we can't have Dumbledore's taste in sweets change drastically in book 3), successive versions of Python are constrained by pretty serious backwards compatibility requirements.
Sometimes it's easy to go back and generalize a feature; for example, the transformation of built-in conversion functions like int(), str() and list() into built-in classes feels like a stroke of genius (if I may say so myself :-). On the other hand, the recent discussion on python-dev of the exception hierarchy, summarized in PEP 348, shows that early choices sometimes are less than ideal, even if they're not fatally flawed. The mismatch of naming conventions and API quality in the standard library is another example.
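To show what that generalization bought (the Celsius class below is just a made-up example): once int() became a real class, it could be subclassed and tested with isinstance() like any other type:

    class Celsius(int):
        # Subclassing a built-in conversion "function" -- impossible
        # while int() was a plain function, natural now that it's a class.
        def __str__(self):
            return "%d degrees C" % int(self)

    t = Celsius(21)
    print(t)                    # 21 degrees C
    print(t + 4)                # 25 -- integer arithmetic is inherited
    print(isinstance(t, int))   # True -- int() doubles as the type itself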
Aha! You have hit a very good point, right on the head!
Do you support, or strive for, or "just utterly" break backwards compatibility? Too many language makers (or their corporate founders) say, oh, let's trash it. No one is still using VB 3.0 anyway. [Yes, VB has been quite good at staying compatible with its MS BASIC roots. However, VB.NET has quite overtly changed that now.]
Personally, I admire Python for staying true to its old standards. I'd hate to see the functional commands vanish; that would break backwards compatibility. If "no one" were using them it would be all right, but the few programs that do use them would be broken.
I don't know why compiler people presume that keeping the name while totally rewriting the grammar is OK. In my opinion, it isn't.
If you want to turn Python on its head, fine. Just call it something else, or a new variant. But, I would suggest, don't call it Python v4.0, remove 30% of the previous language, and then add a totally new standard practice of operation.
Python has saved my ass, many a time. Thank you Guido.
I don't think Lucas has ever been shy about saying he changed things. In fact, I think he got caught in a non-yagni moment and may have tried to force things in there that really didn't need to be there, like the Jabba scene in episode IV. And every change in Empire was just, well, not necessary.
I am so glad I still have the originals on VHS.
Not that this has anything to do with Python and Harry Potter.
Although I'm wondering how far one could go with this and start drawing wacky parallels between the book vs. movie version and then start talking about evolutions in languages and so on and so forth.
Alas, I would try for the sheer delight of it, but I'm on vacation and I'll (thankfully) be away from a computer and the phone and tv for a week. So my brain is shut down. Yaaayyy......
All this seems to come up because people try to extend the language rather than extend the libraries. The cool thing would be to write a language where extension through libraries has all the syntactic benefits of language extension. Seems like there have been a lot of halfway attempts at this.
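Python's operator overloading is one of those halfway attempts: a pure library can hook into existing syntax, even though it can't add new syntax. A made-up path-building class shows the flavor:

    class Path:
        # A library type that borrows the / operator, so client
        # code reads almost like a purpose-built syntactic form.
        def __init__(self, value):
            self.value = value
        def __truediv__(self, other):
            return Path(self.value.rstrip("/") + "/" + str(other))
        def __repr__(self):
            return "Path(%r)" % self.value

    print(Path("/usr") / "local" / "bin")   # Path('/usr/local/bin')

The trick only goes so far, though: a library can reuse existing operators, but it can't introduce genuinely new syntax, which is why these attempts stay halfway.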
I would be curious to know if you have ever thought that some change was just too drastic, that it really called for another language. One cannot, I presume, just keep adding features to a language without it eventually presenting a facade to the programmer that is just too difficult to maintain.
Can a language be functional, declarative, procedural, OO, statically typed, late binding, Uncle Tom Cobley and all, and remain easily accessible, or shall we admit that there is not one ring to bind them all?
> Can a language be functional, declarative, procedural, OO,
> statically typed, late binding, Uncle Tom Cobley and all,
> and remain easily accessible, or shall we admit that there
> is not one ring to bind them all?
Part of the problem is that we're not really talking about 'language' per se, but about instruction sets for controlling machines. All the important stuff relating to these instruction sets was pretty much determined in the days of Fortran. Sure there's been progress, but it's all been 'icing' and sophistication. Even OO programming is nothing more than a way of packaging the same old instructions.
Languages convey ideas between conscious beings. Instruction sets simply command machines to carry out actions. If that's all we're trying to achieve, then I think the bulk of the work has been done, and the efforts going into modifying Java, Python, Ruby, Perl, .NET, et al. are of commercial and technical interest only -- to give them a perceived marketing edge.
The interesting academic problem domain of creating a language that a computer can (in any sense) understand has barely been touched. I think it is progress in AI research that will be the driving force behind genuinely new developments in programming languages.
> Do you support, or strive for, or "just utterly" break
> backwards compatibility? Too many language makers (or
> their corporate founders) say, oh, let's trash it.

<snip long rant>
I just don't understand this kind of whining. If you want to use VB 1.0, then why did you upgrade to VB.NET? If you liked Python 1.5.2, then keep that version on your machine.
On a related note, I wonder if more time is lost trying to maintain backward compatibility, cementing in bugs for fear of people depending on the buggy behavior.
I like this analogy. I'm reading Xenocide now, the third part of the Ender's Game series by Orson Scott Card. I was wondering about some similar cases in it where the author may have planted some seed in the first book, or maybe just as likely, subsequently sifted through it and embellished things that weren't originally intended to have much meaning. There are also a few cases where it seems like the author may be thinking "Damn! Why did I say this particular thing in the previous book? Now I have to explain why it makes sense or how it fits in with the current thread."
Of course, with movies and books and such, we expect each sequel to be weaker, more inconsistent and nonsensical than its predecessor (e.g. The Matrix), whereas with software, we expect the opposite (whether that's a reasonable expectation or not is another question).
> I'm reading Xenocide now,
> the third part of the Ender's Game series by Orson
> Scott Card. I was wondering about some similar cases in
> it where the author may have planted some seed in the
> first book...
If I'm not mistaken, Card came up with the second book, Speaker for the Dead, but needed to come up with and write the first book, Ender's Game, in order to provide background. Perhaps he already had Xenocide in mind as well.
> If I'm not mistaken, Card came up with the second book,
> Speaker for the Dead, but needed to come up with
> and write the first book, Ender's Game, in order to
> provide background. Perhaps he already had Xenocide
> in mind as well.
If I recall correctly, Ender's Game started as a short story. I always preferred Speaker for the Dead; the people and situations felt more real.
As Card himself explains, he wrote Ender's Game first, then Speaker for the Dead, and then went back and *expanded* (rewrote) Ender's Game to give better context for Speaker. Having read all of the series, I think Speaker is the best, but we digress...
Part of the problem stems from the "Worse is better" school of programming language development. People start off with a language but don't spend enough effort to boil things down to as few concepts as possible. Seems to me that Smalltalk and Ruby are still the best, but not perfect, examples of language design.
I think it is also a mistake to regard a programming language as a language for a specific purpose. It may have started from the need to solve a particular problem, but a language will always spread far beyond its original domain. Your design must stand up to that. It is impossible to cover all potentialities, but laying a good foundation makes it much easier.
A basic rule: be as restrictive as possible in the beginning; permit as little as possible. It is much easier to relax the rules later than to tighten them. This avoids a lot of backward compatibility problems.
Example: Java interfaces imply that all methods are public abstract, and many coding standards advise against supplying the superfluous keywords. This is actually inconsistent, since elsewhere in Java the absence of an access specifier implies package visibility. It also means it is difficult to extend the interface concept later to include more fine-grained access rights a la Eiffel.
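Back to the general rule, a minimal Python sketch of restrict-first (the function and its behavior are invented for illustration):

    def word_count(lines):
        # Version 1: deliberately accepts only a list of strings --
        # the most restrictive input type that serves the need.
        if not isinstance(lines, list):
            raise TypeError("word_count() requires a list of strings")
        return sum(len(line.split()) for line in lines)

    # Version 2 could simply drop the isinstance() check and accept
    # any iterable (files, generators, tuples): every caller that
    # passed a list keeps working. The reverse change -- starting
    # permissive and tightening to lists later -- would break code.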