Re: Optimization: Me versus Guy Steele
Posted: Sep 29, 2005 7:44 AM
> "The mistake many language designers make is in failing to study history, failing to look at what worked and didn't work in the past," Steele says.
What do you think he is referring to here? Static optimization? Has it really failed? I see no evidence of that. What about template metaprogramming and other generative programming techniques? As far as I can tell they have been very successful: ATLAS (Automatically Tuned Linear Algebra Software), Lex, Flex, Yacc, and Bison, Blitz++, YARD (Yet Another Recursive Descent Parser), FFTW (Fastest Fourier Transform in the West), etc. History as I know it says very clearly that generative programming holds an enormous amount of promise when performance is an issue.
History AFAIK has very little to say so far about dynamic profiling and rewriting, as it is a relatively immature technology. The fact that it has worked well for Java so far is completely inconsequential, because Java is inefficient to begin with.
> "For example, we're trying to use the dynamic compilation ideas from the Java HotSpot compiler to provide a productivity boost. Essentially, programmers shouldn't have to worry too much about optimizing while they're writing programs. Instead, that optimization can be done by compilers, either ahead of time or on the fly."
>
> By measuring how the program is behaving, he explains, information can then be fed back into a compiler so it can reorganize the program while it's running.

Guy is on target here.
The problem with this approach is that it doesn't factor in the cost of measurement and reorganization. An already-optimized program will suffer greatly, because CPU cycles will get wasted on measurement and calculation. On-the-fly optimization would simply not work for finely tuned libraries.