When you write a piece of Java code, you know that code will run on a variety of platforms: Windows, Linux, Mac OS, the Palm, and so forth. Platform-to-platform Java portability works because the VM, the Java bytecodes, and the APIs your program uses adhere to strict specifications.
What if those specifications, and their implementations, were opened up and developed in an open-source manner? Would Java still preserve its remarkable platform-independence? Or would it be fragmented into a myriad of incompatible versions and implementations? How could you be sure that the servlet or MIDlet you just wrote will work when executed on a different VM and OS?
These were just some of the questions the Java Community Process (JCP) recently had to grapple with. The primary forum for Java's advancement, the JCP is, in theory, an open process: anyone paying a small membership fee can participate. In practice, it has been branded a politically charged club of large corporations, with Sun at the helm, all vying for a piece of the Java pie. Due to its restrictive licensing model, the open-source community has completely shunned the JCP.
In response to those charges, and for fear of missing out on the open-source action, the JCP adopted a new set of rules in November 2002. The most important of those rules explicitly allows JCP members to develop new Java standards in an open-source manner, while remaining under the JCP's umbrella and retaining its official blessing.
Frank Sommers asked Sun Microsystems fellow and chief engineer Rob Gingell, who also chairs the JCP, about the impact of those changes. In this interview, which will be published in two weekly installments, Gingell tells us what causes fragmentation in a marketplace and how Java can avoid that danger. He also talks about how competing companies can cooperate through the JCP, and how small companies and individuals can have their voices heard there.
Frank Sommers: For a long time, Sun has claimed that Java needed a high level of compatibility to preserve the "Write Once, Run Anywhere" promise, and that open-source licenses could not enforce that degree of compatibility. The JCP requires that specifications and reference implementations be accompanied by a Technology Compatibility Kit (TCK). All subsequent implementations of a JCP-originated Java standard must pass those TCKs if they claim to be compatible with that specification.
The latest changes to the Java Community Process—JCP 2.5, inaugurated in October 2002—give Java Specification Request (JSR) leads leeway in deciding licensing policy for their work. JSRs can now be developed in an open-source manner. The resulting work, including the reference implementation, can also be licensed under an open-source license such as the GNU General Public License (GPL). How does the need to ensure compatibility mesh with the new JCP policy?
Rob Gingell: Your question involves a confusion between the process to develop a specification and the manner in which the resulting work is licensed. The JCP is a process. It does not actually license anything directly. Project leaders under the JCP are the licensors. When Sun is the licensor, we use the Sun Community Source License (SCSL), a license that requires derivative uses of the work to maintain compatibility with the specification as part of the terms for having access to the work.
The JCP now requires that the work products of a JSR—specification, reference implementation, and conformance tests (TCKs)—are licensed separately from each other. The general principle is that those doing the work should be able to decide how they make their work available.
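To make those three work products a bit more concrete, here is a minimal sketch of what a single TCK-style conformance check might look like. The Counter interface, its specified behavior, and the check itself are all invented for this illustration; a real TCK is a far larger test suite, licensed separately from the specification and the reference implementation.

```java
// A minimal, hypothetical sketch of a TCK-style conformance check.
// The Counter interface, its specified behavior, and this check are
// invented for illustration; a real TCK is a far larger suite, licensed
// separately from the specification and the reference implementation.
public class TckSketch {

    /** An imaginary JSR-specified API that implementations must provide. */
    public interface Counter {
        void increment();   // spec: adds exactly one to the count
        void reset();       // spec: returns the count to zero
        int count();        // spec: current count, never negative
    }

    /** One conformance check: reset() must bring the count back to zero. */
    static boolean resetReturnsToZero(Counter candidate) {
        candidate.increment();
        candidate.increment();
        candidate.reset();
        return candidate.count() == 0;
    }

    public static void main(String[] args) {
        // A trivial stand-in for a vendor's implementation under test.
        Counter candidate = new Counter() {
            private int n;
            public void increment() { n++; }
            public void reset()     { n = 0; }
            public int count()      { return n; }
        };
        System.out.println("reset-to-zero check: "
                + (resetReturnsToZero(candidate) ? "PASS" : "FAIL"));
    }
}
```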
Frank Sommers: How does that licensing freedom still ensure that all JSR implementations are compatible?
To illustrate the question, suppose that I propose and develop a new Java-based API through the JCP—say, an API to interact with Java-enabled toaster ovens. After the work is complete, I release the reference implementation under an Apache-style license. Given that license's terms, anyone can now download my implementation and hack away at the code. At some point Big Bad Toasters, Inc., decides to fork my original Java toaster-API implementation such that the new code branch works only with their toaster ovens and breaks some of the code written against my original implementation. Can that occur under the JCP 2.5?
Rob Gingell: As specification lead of the Java toaster JSR, you have decided to make the reference implementation, and maybe the TCK, available under an open-source license. If Big Bad Toasters takes that reference implementation and creates an incompatible derivative work from it while still claiming to implement the specification, then they would be in violation of the specification license.
On the other hand, if they took your work and implemented a completely different specification from it, say, com.bigbadtoasters.Toaster, that would be a legitimate, though annoying, thing for them to do. The specification intellectual property protection says that you can't lie to Java programmers about what Java is. The JCP defines that truth, the materials produced in JSRs are used to validate it, and collectively the artifacts and the process work to maintain that assurance. In this case, however, they're not lying to anyone. They're not claiming it to be an implementation of the JSR, nor are they offering an artifact that would poach upon developers who were expecting it to be the JSR, since it lives in Big Bad Toasters' namespace.
If you make your reference implementation available without the requirement that others using it create implementations compatible to the original specification, a competitor, such as Big Bad Toasters, can create a different API and bootstrap their market effort with your work. If they then work in the marketplace to cause their API to become popular, that's permissible, unless they infringed on a patent or other intellectual property not licensed to them.
They can also operate much more communally to the same effect. For instance, they can subclass your JSR-specified API. They will then be compatible with all existing applications of the original API, but create some new functionality which has great appeal. If they're successful, it may come to pass that there is no application for toasters which ends up directly using your original API anymore, making it marketplace-irrelevant. Big Bad Toasters has done nothing harmful, but merely made its non-community-defined extensions more popular than the base defined by the JCP.
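As a rough sketch of that subclassing route, the example below has a vendor extend a hypothetical JSR-defined toaster API: code written against the specified type keeps working, while new code is drawn toward the vendor's extension. The class and method names are invented, loosely following the toaster example used throughout this interview.

```java
// Hypothetical sketch: a vendor extension layered on a JSR-defined API.
// The specified type and the vendor methods are invented for illustration;
// the package names appear only in comments so the sketch compiles as
// one file.

// package javax.toaster;  (the imaginary JSR-defined API)
class Toaster {
    public void setDarkness(int level) { /* specified behavior */ }
    public void toast()                { /* specified behavior */ }
}

// package com.bigbadtoasters;  (the vendor's non-JCP extension)
// Still a Toaster, so it works anywhere the specified type is expected,
// but it adds functionality the JSR never defined.
class BagelToaster extends Toaster {
    public void toastBagel() { /* vendor-only feature */ }
}

class Kitchen {
    // Existing application code, written only against the specified API,
    // keeps working when handed the vendor's subclass.
    static void makeBreakfast(Toaster t) {
        t.setDarkness(7);
        t.toast();
    }

    public static void main(String[] args) {
        makeBreakfast(new Toaster());       // portable across implementations
        makeBreakfast(new BagelToaster());  // still compatible ...
        new BagelToaster().toastBagel();    // ... but this call ties the
                                            // application to one vendor
    }
}
```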
There's nothing about the JCP that prevents people from succeeding with Java, even where that success overshadows the JCP—that's what competition is all about. Indeed, the changes adopted in the JCP serve to increase the competition for compatible implementations. The main thing the JCP strives for is to ensure that those who write Java applications are not lied to by those who make the implementations they build and deliver upon.
Frank Sommers: Does a JCP JSR represent an "official" Java standard? What would prevent developers from favoring a non-JCP-developed solution to a problem over a relevant JSR?
Rob Gingell: The JCP reserves the namespaces java.* and javax.*. Other than namespace use, there's nothing about the JCP that requires anyone to use the APIs created through it. If there were, say, a javax.toaster.* family of classes, there's nothing that stops anyone from working outside the JCP to create an org.othertoasters.* family.
The JCP would tend to resist having a competing javax.othertoasters.* activity under its roof, but couldn't do anything about someone setting up a completely different thing in competition.
One reason we don't see instances of many competing APIs is that there's a general appreciation that Java's value lies mostly in the over 3 million developers who see that an investment in a single set of skills gives them a wide market in which to work. Fragmentation would be inconsistent with the value proposition perceived by those developers, and thus counter-productive to the motivations that would lead one to want to make toaster-based APIs in the first place.
Ultimately any community is defined by what makes up its members' common self-interest, and while that self-interest might be codified into agreements and practices and process rules, what really makes it work is the shared set of values behind it. If you're building something you want developers to target, you're not well-served by fragmenting that developer pool.
Frank Sommers: You mentioned in an interview on www.sun.com that customers care mostly about binary compatibility, given that it's binary code that delivers the benefits of the software they use. When the JCP talks about compatibility, does it mean binary or source-level (API) compatibility? In general, could you explain the differences a bit, and tell us what the JCP's focus is in that regard?
Rob Gingell: We actually mean both source and binary level compatibility in the JCP.
We mean source so that the developers who have invested their energies in learning Java have a wide range of uses against which to apply that skill. Source compatibility is for programmers.
Binary compatibility is for products—and for the users of those products. Some of those users are also programmers who want to deploy anywhere, so they get a benefit from binary compatibility too, and that is the source of the notion of "write once, run anywhere".
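A small illustration of the distinction, assuming nothing beyond the standard platform: the source below compiles wherever a conforming Java compiler exists (source compatibility is what the programmer enjoys), and the .class file it produces runs unchanged on any conforming JVM, whatever the operating system or instruction set underneath (binary compatibility is what ships in the product).

```java
// Minimal illustration of the two kinds of compatibility; the program is
// ordinary Java, and the point lives mostly in the comments.
public class HelloPortability {
    public static void main(String[] args) {
        // Source compatibility: this file compiles unchanged wherever a
        // conforming Java compiler exists, because the language and the
        // APIs it uses (here, java.lang.System) are specified.
        //
        // Binary compatibility: the HelloPortability.class file produced by
        // the compiler is JVM bytecode rather than native machine code, so
        // the very same binary artifact runs on Windows, Linux, Solaris,
        // and so on, with no per-platform recompilation.
        System.out.println("Same .class file, any conforming JVM: "
                + System.getProperty("os.name") + " / "
                + System.getProperty("os.arch"));
    }
}
```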
The Unix industry was regarded as fragmented by customers because you couldn't interchange the binary artifacts between systems, either in whole or in part. Yet we had all those Unix standards, and everyone claimed to conform to them. So what went wrong?
Well, in some ways, nothing—only, we had an audience mismatch. Unix was very successful in having a technology standard—source code was relatively portable. Linux's rapid rise is in part due to a ready-made set of applications, and its historical evolution was partly driven by finding an application it couldn't run and then adding what was needed. The programmers actually did, and largely do, enjoy "open systems" in Unix.
End-customers, however, did not. For them, the Unix standards carry about as much import as steel standards carry for a person trying to buy a car: no one cares about them explicitly, nor makes a purchase decision based explicitly on them. The JCP manages Java in both spaces, providing both programmers and the end users of their work the benefits they're seeking.
Frank Sommers: Do you mean that for Java's continued success in the marketplace, binary compatibility is much more important than source code compatibility?
Rob Gingell: I think binary compatibility is much more market- and economically relevant than source compatibility, independent of the technology. Java's power stems in part from being partitioned into two pieces: 1) the Java virtual machine (JVM), the basis for a universal instruction set architecture, and 2) the means used to target the JVM, which is largely, but not exclusively, the Java language. I wouldn't be surprised to see additional things targeting the JVM, or to see some of what we know as "the Java language" diversify in coming years as we consider more areas of computing.
There are very few models of the industry that are both simple and accurate. One which seems to pass that test says that the industry can be modeled by looking at the positive feedback loop among developers: Developers write applications. That produces volume, which then attracts more developers, and so on. And that model is fundamentally a model that applies to binaries. It explains much that source level compatibility doesn't explain.
For instance, Solaris has essentially 100% coverage of the Unix applications market. Every Unix application that exists has a Solaris/SPARC instance for it. You could not, therefore, imagine a more trivial recompilation exercise than to make the same application available for Solaris/IA32 [Solaris, Intel x86 Edition]. So how come it didn't happen? According to the source code theory of the world, that should have happened instantly.
Or, consider Alpha. How come Digital had to essentially buy off people to make Alpha versions of applications? Aren't they all Unix applications? Isn't it just a recompile, or maybe a recompile with a little work? How come they had to be paid to do it?
Then, when Linux came around, which is really a Unix/IA32 system, how come all the applications showed up?
The answer in all those cases relates to anticipated volume of binaries. Having a shared space of binaries is much more vital and powerful than having a shared space of source. That's not to say that shared spaces of source are not valuable in their own right. It's just that the properties that attend to them are not the ones that have historically explained economic behavior in the industry.
Frank Sommers: You witnessed first-hand the fragmentation of the Unix market in the 1980s and early 1990s. During all that time, Unix was not open-source. On the other hand, Linux, which is open-source, has thus far exhibited remarkable coherence, numerous Linux distributions notwithstanding: Most programs compiled for Linux run on Red Hat, Mandrake, Caldera, SuSE, or any other Linux variety without modification. Given the Unix experience, if the JCP's goal is to preserve Java compatibility, would Java not be better served in a fully open-source environment?
Rob Gingell: Fragmentation in the marketplace of products relates to the existence of binary standards. The key point in your question was the phrase "programs compiled for Linux," yielding a binary. Binaries are largely interchangeable between Linux distributions, and it's that attribute—and not the terms under which the source is available—that prevents, or causes, fragmentation. No one considers the PC marketplace "fragmented"—because there's one binary standard even though there's no source availability at all.
The paradox of Linux's marketplace surge isn't that it's got a community of developers—Unix always had that too until around 1990—but rather the fact that the volume computer, the PC, has a de facto binary standard shared across a number of suppliers with sufficient volume to matter. Conversely, the reason other Unixes are criticized for "fragmentation" is that they never had a binary standard because, well, they all arrived on different instruction sets and thus had different binaries.
To be precise about it, we should be including an instruction set architecture reference when we talk about these binaries—Unix/IA32 is effectively the space defined by all the popular Linux distributions, and that's what you were referring to when you said "compiled for Linux." Indeed, several years ago, we at Sun stopped trying to capture binaries for Solaris/IA32, and simply told anyone who cared that we'd run the Unix/IA32 binaries from the far more voluminous Linux/IA32 community. We weren't going to beat them, and it wasn't an objective to do so. So we just joined them.
The market presence of that "installed base" is the most important factor in resisting fragmentation. If one of the Linux distributions starts making it necessary that most, or all, binaries for it be unique to that distribution, that will fragment the market, even if the source is common to other distributions, and even if the source is available on open-source terms. That new binary probably wouldn't be successful unless it came from someone with a very large percentage of the market, like Red Hat. They might do it either consciously, as a device to lock in their distribution's share of the market, or unconsciously ("gosh, this would sure be better!"), but in any case they have to have the wherewithal to make it happen.
If the source is available, then the means are there to cope with that fragmentation. That's one appeal of open source. But we shouldn't confuse having the ability to recompile things and do things ourselves, and the freedom we think that provides, with a circumstance where that becomes an obligation. I have Linux systems I use that are convenient because I can sling binaries around. It'd be possible to use them if I could only sling source, but far less convenient—so much so, in fact, that I'd hardly put up with it for long.
Java's proven to be successful because it, too, is a binary standard, and indeed the only existence proof of any such thing managed as such in the industry. Its binaries substitute universally for any instruction set architecture. Java applications are effectively Java/JVM binaries. There could be others and, indeed, there are Ada/JVM and FORTRAN/JVM binaries.
Java established that binary standard largely before there was a big population of applications. Linux, on the other hand, is a Unix system produced 30 years after the start of Unix. There was a pile of applications that only needed to be compiled for it and, once that was done, shared among all the distributions. The applications inertia came largely for free for Linux. That wasn't true for Java, which has had to withstand some determined attempts to force it to fragment at that all-important binary level.
Which leads us to the last part of the question: Would Java be better served in a fully open-source environment? My answer is that it'd be differently served. The notions of community—involving more intellects than any one organization has—and the ideas of shared knowledge have a lot of appeal. But with respect to fragmentation, to be open-sourced or not is an orthogonal issue—what matters is the presence or absence of a binary standard.
Come back Monday, January 20, for Part II of this conversation with Rob Gingell.
The Java Community Process home page:
http://www.jcp.org/en/home/index
Overview of the Java Community Process:
http://www.jcp.org/en/introduction/overview
Timeline through which a JSR passes as it makes its way through the Java Community Process:
http://www.jcp.org/en/introduction/timeline
JCP FAQ:
http://www.jcp.org/en/introduction/faq
A brief summary of the Java Community Process procedures:
http://www.jcp.org/en/procedures/overview
Overview of Java Specification Requests (JSRs):
http://www.jcp.org/en/jsr/overview
Frank Sommers is founder and CEO of Autospaces, a company focused on bringing Jini technology to the automotive software market.