The Artima Developer Community

News & Ideas Forum (Closed for new topic posts)
Changing the Zen of Programming

6 replies on 1 page. Most recent reply: Sep 19, 2002 1:52 PM by Bill Venners

Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Changing the Zen of Programming Posted: Aug 28, 2002 8:11 AM
"Dynamic languages, network-centric platforms and reliable (but evolving) systems all require that the programmer change his or her view about what he or she is doing. Rather than controlling all of the aspects of what we do, we need to give up a measure of control and accept that we cannot have full knowledge of the systems we are building," says Jim Waldo in this OSOpinion article:

http://www.osopinion.com/perl/story/19163.html


Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Re: Changing the Zen of Programming Posted: Sep 15, 2002 9:30 PM
> Dynamic languages, network-centric platforms and
> reliable (but evolving) systems all require that the
> programmer change his or her view about what he or
> she is doing. Rather than controlling all of the
> aspects of what we do, we need to give up a measure
> of control and accept that we cannot have full
> knowledge of the systems we are building.
>
> What we will know is a minimum set of
> behaviors with which to interact, not the details of
> (or even the identities of) the components that
> make up our systems. Even if there is a point
> where we do have such knowledge, over time
> the system will change in ways we could not
> have foreseen.

One difficulty in building systems that take in new implementations of agreed-upon interfaces is getting everyone to understand and agree on all aspects and details of those interfaces. When people disagree on the semantics of the interfaces, their parts may not work together. Because the systems are dynamic, you can't fully test them before you ship them out the door; you don't yet know all the pieces they will have over their lifetimes.

How do you achieve a reliable system when you can't test it? I suspect part of the answer will be testing the parts against their interfaces. But I think this will be a big challenge even if programmers change their way of thinking and accept the idea of programming with a knowledge only of the interfaces.
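To make "testing the parts against their interfaces" concrete, here is a minimal sketch of what such a test might look like in Java. The MessageQueue interface, its contract, and the little check harness are all invented for illustration; a real project would presumably use a test framework and an interface specification agreed on by the interested parties.

import java.util.LinkedList;

/** Hypothetical interface whose contract both parties have agreed on. */
interface MessageQueue {
    /** Adds a message to the end of the queue; never blocks. */
    void put(String message);
    /** Removes and returns the oldest message, or returns null if the queue is empty. */
    String take();
}

/** A contract test written purely against the interface, so the same test
    can be pointed at any implementation of MessageQueue, including one
    written by someone we have never met. */
public class MessageQueueContractTest {
    static void check(boolean ok, String rule) {
        if (!ok) throw new AssertionError("contract violated: " + rule);
    }

    public static void run(MessageQueue q) {
        check(q.take() == null, "take() on an empty queue returns null");
        q.put("a");
        q.put("b");
        check("a".equals(q.take()), "messages come out in FIFO order");
        check("b".equals(q.take()), "messages come out in FIFO order");
        check(q.take() == null, "the queue is empty again after draining");
        System.out.println("implementation satisfies the MessageQueue contract");
    }

    public static void main(String[] args) {
        // One possible implementation, supplied here only so the sketch runs.
        run(new MessageQueue() {
            private final LinkedList<String> list = new LinkedList<String>();
            public void put(String message) { list.addLast(message); }
            public String take() { return list.isEmpty() ? null : list.removeFirst(); }
        });
    }
}

The important property is that the test mentions only the interface, so the same test could be handed to anyone who claims to implement it.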

Jim Waldo mentions C# in his article. One thing Microsoft did in .NET was to use something they call strong names during linking. In .NET you can say you want to link only to a particular version of a DLL from a particular vendor. This lets you ensure that the DLL you use is exactly the one you tested with. When I was at Microsoft a few months back, I was basically told that programming to interfaces is a great idea in theory, but in practice it doesn't work. In practice, there are bugs in implementations, and the specifications of DLL semantics are vague enough to be interpreted differently by different people. There is a lot of truth to that. I think it will be a challenge to figure out how to get the theory to work in practice in dynamic systems composed of parts made by many different people.

Jim Waldo

Posts: 34
Nickname: waldo
Registered: Sep, 2002

Re: Changing the Zen of Programming Posted: Sep 17, 2002 9:44 AM
While I agree with much of what you have said concerning the history, I disagree on the remedy...

One solution is the one you cite, which allows everyone to control the libraries to which they will link (which, btw, we also provide in Java/Jini with the introduction of preferred classes). But there are other solutions as well.

One is to ensure that the interfaces can be understood. This often means defining simple interfaces, which is harder than defining complex ones that are specified in vague or incomplete ways. But going to the trouble to really define the interface is worth the time it takes, and can go a long way toward making sure that everyone agrees on what the interface means.
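As a sketch of what "really defining the interface" can mean, here is a hypothetical Java interface whose documentation tries to pin down the edge cases that independent implementers and clients would otherwise guess at differently. The names and the particular rules are invented for illustration.

/**
 * A hypothetical lookup interface, specified tightly enough that independent
 * implementers and clients should not have to guess at the edge cases.
 */
interface ServiceRegistry {

    /**
     * Registers a service under the given name.
     *
     * @param name      the service name; must be non-null and non-empty
     * @param endpoint  where the service can be reached, e.g. "host:port";
     *                  must be non-null
     * @throws IllegalArgumentException if name or endpoint violates the rules above
     * @throws IllegalStateException    if name is already registered; replacing a
     *                                  registration requires an explicit
     *                                  unregister(name) followed by register
     */
    void register(String name, String endpoint);

    /**
     * Removes the registration for name. Removing a name that is not
     * registered is not an error and has no effect.
     */
    void unregister(String name);

    /**
     * Returns the endpoint registered under name, or null if there is none.
     * Looking up an unknown name is never an exception; absence is an
     * ordinary, expected answer.
     */
    String lookup(String name);
}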

Another is to really restrict yourself to programming to the interface, and not try to cheat. DLL hell is a well-known phenomenon, but it was generally caused by the implementations of the DLLs making changes to the global environment in ways that were not part of the interface and not documented. The problem was one of using non-interface-defined mechanisms to allow communication between components in a way that was either more efficient or hidden from various members of the competition.

Reliable systems that are made reliable by testing the whole of the system are reliable systems that can't be changed. Systems that can be changed but include implementations that rely on side-effects won't ever be reliable. We can build systems that are both reliable and open to change, but only if we really practice the discipline of programming to well-specified interfaces. The fact that we have failed in this discipline in the past doesn't mean that it can't be done; merely that we haven't.

Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Re: Changing the Zen of Programming Posted: Sep 17, 2002 2:22 PM
> One conclusion is that that you cite, which allows
> everyone to control the libraries to which they will
> link (which, btw, we also provide in Java/Jini with
> the introduction of preferred classes). But there are
> other solutions as well.
>
Yes, if you link exclusively to known implementations of interfaces with which you've tested your app or system, you may get reliability but it's not a dynamic system.

> One is to insure that the interfaces can be
> understood. This often means defining simple
> interfaces, which is harder than defining complex
> ones that are specified in vague or incomplete ways.
> But going to the trouble to really define the
> interface is worth the time that it takes, and can go
> a long way in making sure that everyone agrees on
> what the interface means.
>
The challenge here is getting people to do that. People of varying talent will be designing APIs, and they will be designing them under time pressure. It takes talent and time to come up with a simple API. In practice, therefore, complex, hard-to-understand, vaguely documented APIs will happen.

> Another is to really restrict yourself to programming
> to the interface, and not try to cheat. DLL hell is a
> well-known phenomenon, but it was generally caused by
> the implementations of the DLLs making changes to the
> global environment in ways that were not part of the
> interface and not documented. The problem was one of
> using non-interface-defined mechanisms to allow
> communication between components in a way that was
> either more efficient or hidden from various members
> of the competition.
>
I hadn't thought of DLL hell in terms of changes to the global environment. I had always imagined vaguely defined interface semantics leading to implementations doing things that were surprising to clients. I suppose the DLL hell problem is eased by Java (and .NET) preventing a client from going around an interface at runtime, but implementations can still do things to the environment via static methods. So once again, it looks like it is up to all those programmers writing implementations of APIs to be good citizens. That's why I think it will be a challenge: for lack of experience, talent, taste, time, or clear API documentation, a lot of those programmers will do bad things in their implementations.
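Here is a small hypothetical Java example of that kind of bad citizenship: an implementation that honors the interface's signature but also reaches around it and mutates global state through a static method. The interface, the classes, and the system property name are all invented for illustration.

/** A hypothetical pluggable formatter interface. */
interface ReportFormatter {
    /** Returns a formatted copy of the report text. */
    String format(String report);
}

/** A "bad citizen": it satisfies the interface signature, but also mutates
    global state that the interface never mentions. Any other component that
    reads this (hypothetical) system property now behaves differently. */
class SneakyFormatter implements ReportFormatter {
    public String format(String report) {
        // Undocumented side effect reached through a static method, outside
        // the interface contract -- the modern cousin of DLL hell.
        System.setProperty("report.encoding.override", "Cp1252");
        return report.toUpperCase();
    }
}

/** A well-behaved implementation: everything it does is visible through
    the interface. */
class PlainFormatter implements ReportFormatter {
    public String format(String report) {
        return report.trim();
    }
}

Nothing in the ReportFormatter contract hints at the property change, so a client testing only against the interface would never catch it.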

> Reliable systems that are made reliable by testing
> the whole of the system are reliable systems that
> can't be changed. Systems that can be changed but
> include implementations that rely on side-effects
> won't ever be reliable. We can build systems that are
> both reliable and open to change, but only if we
> really practice the discipline of programming to
> well-specified interfaces. The fact that we have
> failed in this discipline in the past doesn't mean
> that it can't be done; merely that we haven't.
>
Well, it sounds like it is up to designers and implementers of APIs. If designers and programmers do a good job, then dynamic systems in which the parts are made by many different people will work. Perhaps the fact that a higher level of quality is required to make the systems work will force the builders of the system parts to reach that level of quality. The web is an example of a dynamic system. Lots of different manufacturers of web servers and clients take part in the web, but it still seems to work. I do think that testing components against their interfaces will be an important part of the process by which programmers will achieve the required level of quality in their implementations. Nevertheless, I think that given the generally low level of quality of code that I tend to see in most places, it will be a challenge to achieve reliability in dynamic systems in which parts are made by different people who don't know each other.

Jim Waldo

Posts: 34
Nickname: waldo
Registered: Sep, 2002

Re: Changing the Zen of Programming Posted: Sep 18, 2002 11:32 AM
Well, wait a second...

Any reliable system is going to require that programmers be competent. If you are going to assume incompetent programmers who are rushed, don't have time to think, and don't have time to design, then I will agree that the interface will probably be badly designed and implemented. But given that set of assumptions, you won't get a reliable system no matter what you do (the fact that most code is written in such an environment now is also an indication of why our systems lack reliability now).

Just testing the crummy code that incompetent programmers write under unreal deadlines isn't going to give reliable systems. I'm sure we can all think of examples of systems that have undergone lots of testing that still don't work very well.

If we are going to have reliable systems, we need to start with good designs and competent programmers (maybe I've just posited a round square; from a contradiction anything follows, so perhaps what follows is vacuously true, but I don't think so). Once we have that, we can add the needed evolvability by using dynamic code and strong interface/implementation distinctions.

Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Re: Changing the Zen of Programming Posted: Sep 18, 2002 10:28 PM
Yes, that's a good point. Testing poorly designed and implemented software just finds problems in it, which may not be so easy to fix. Testing doesn't make bad software robust.

But I think there is a significant difference between monolithic and dynamic systems that makes it harder for even competent developers to make dynamic systems that are reliable. (By dynamic systems, I mean in particular dynamic systems in which the parts are made by different people unknown to each other who define and program to publicly known interfaces.) The difference is that a dynamic system can't be tested with all its parts together. It can't be integration tested. Differences between various developers' interpretations of interface specifications--differences that can cause the system to fail--can't be resolved at integration test time. As a result, the software pieces will be released with those varying interpretations expressed in the code. And those differences are then free to cause the system to fail when the consumer is trying to use it.

For example, if I'm working at a company helping build a large monolithic system, I will likely need to program to interfaces. Designers will be defining interfaces between the various subsystems of the monolithic system. They will publish interface specifications internally. Implementers will create implementations of those interfaces. If I were to create an implementation, I might write unit tests that verify my piece works as predicted by the specification.

At some point all the parties will get together to do integration. We'll try to get the parts to work together. And often, we'll discover not only bugs, but mismatches in understanding of the semantics. We'll talk to each other, talk to the designers of the interface specs, resolve the misunderstandings, and clarify the vague areas of the specification. We'll go back to our desks and make changes, then test again. When the monolithic system achieves the required level of reliability, it can be shipped.

In a dynamic system (made by people who don't know each other), people don't get to resolve those misunderstandings or clarify vagueness in the specification before releasing their pieces of the system. That's the difference that I think makes building dynamic systems much more challenging even for competent programmers.

I mentioned DLL hell on Windows as one example of this. Windows plus applications is a dynamic system of the sort we are talking about. Another example that I've noticed recently is DVDs. It turns out quite a few manufactured DVD disks don't play in all DVD players. Why is that? There is a specification that DVD manufacturers and DVD player manufacturers all pay a rather large fee to get access to. But in practice, certain DVD disks don't work in certain DVD players.

I guess reliability is really a question of how much reliability you need. How many 9s in your 99.99...% reliability do you really need? In the case of DVDs, if I buy a DVD and it doesn't work in my player at home, I just take it back. So long as the cost of returned DVDs doesn't put the DVD manufacturers out of business, maybe that level of reliability is OK from their viewpoint.

The extra challenge in dynamic systems, beyond the already significant challenges of monolithic systems, is creating interfaces that are very well defined and understood by all parties. I don't think it's impossible to meet that challenge, just difficult.

Bill Venners

Posts: 2284
Nickname: bv
Registered: Jan, 2002

Re: Changing the Zen of Programming Posted: Sep 19, 2002 1:52 PM
One example of a dynamic system that has worked well is the Java Platform. Many different people are producing Java Platforms. Far more people are producing Java programs. Java Platform vendors have no way to test their Java Platform with all possible Java programs. This is similar to DVD player manufacturers not being able to test their player with all possible DVDs.

In the case of Java, there has been a lot of effort to define the specifications--the Java Language Specification, the JVM specification, and the javadoc "specs" of the Java API. But there is also a huge battery of tests that Java Platform implementations must pass before they get to call themselves "Java." Those conformance tests in effect test the Java Platform implementation against the specification.
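These are not the actual conformance (TCK) tests, but the flavor is roughly this: small checks that pin an implementation to what the written specification says, rather than to whatever one vendor's implementation happens to do. The harness below is only an illustrative sketch using a few documented java.lang behaviors.

/** An illustrative sketch only -- not real conformance tests. Each check
    asserts a behavior documented in the written Java specifications, so it
    should pass on any conforming Java Platform implementation, not just the
    one it was developed on. */
public class MiniConformanceSketch {
    static void check(boolean ok, String specClause) {
        if (!ok) throw new AssertionError("spec violated: " + specClause);
    }

    public static void main(String[] args) {
        // String.substring: a beginIndex equal to length() is legal and
        // yields the empty string.
        check("abc".substring(3).equals(""),
              "substring(length()) returns the empty string");

        // String.valueOf(Object): a null argument yields the string "null".
        Object nothing = null;
        check("null".equals(String.valueOf(nothing)),
              "valueOf of a null Object returns \"null\"");

        // Integer.parseInt parses in radix 10, so a leading zero is harmless.
        check(Integer.parseInt("08") == 8,
              "parseInt(\"08\") yields 8");

        System.out.println("all sketch checks passed on this Java Platform implementation");
    }
}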

I develop the servlets for Artima.com on Mac OS X on a Java Platform made by Apple. I deploy them on Linux on a Java Platform made by Sun, and they work. Testkit has been partly developed on Mac OS X, Solaris, and Windows, and it works on all those Java Platform implementations. It is really quite amazing.

I think the battery of conformance tests that Java Platforms must pass is a critical reason Java programs tend to be so highly portable. I think that no matter how simple your interfaces are and how good your specifications are, without tests, humans being humans, people will screw up and the resulting systems will not be highly reliable.
