Les Stroud
Posts: 7
Nickname: lstroud
Registered: Jul, 2003
Re: Software Engineering and the Art of Design
Posted: Jul 24, 2003 9:08 AM
Let me reply to a couple of these points. I will state up front, however, that I do not feel I actually have an answer to the question: is software art or science?
****** In response to the software / bridge analogy: I made the very same argument on my blog (www.mindmeld.ws) several months ago. Unfortunately, those posts were lost in a server crash. However, Darren Hobbs made a very good retort to my arguments in his article entitled <a href="http://www.darrenhobbs.com/archives/2003_05.html">Software, Buildings, Bridges & Testing</a>.
****** <i>I can imagine the electrical engineering was largely an art in the 50s and 60s</i> <b>You imagine wrong. The science was different, but nonetheless still science. For example, small signal equations related to triodes and pentodes, and later individual transistors. Printed circuit boards did not yet operate at frequencies that required us to consider them wave guides. And so on.</b> Here I disagree with you. When the components and models that you mention were new, applying them was far more akin to trial and error than it is in present-day EE. There are endless examples of theories and models that turned out to be wrong. While there were more concrete mathematical models to consider, forming them into a functioning radio (Tesla, say, or the Eiffel Tower's transmitter) required an inventor, not an engineer.
***** <i>Most businesses would prefer to get the software done sooner and bank on catching bugs during testing instead of taking the time to engineer a product</i> <b>Let's leap to your summation, which answers this with a resounding NO!</b> Well, I'm not sure what your argument is, but I have yet to work at a company that was willing to spend the money for formal verification. In fact, I think it is only prevalent in the defense industry, and mainly because they have to spend the money or they lose it. Interestingly, I have spoken with some friends in the defense industry who are migrating away from the engineering approach toward a more IT-like approach in order to save cash.
***** <i>whether or not computer science will ever stabilize long enough to develop a model for it</i> <b>There are already innumerable models (state, functions, higher order functions, lambda calculus, concurrency, modules, types, relational algebra, scoping, threads, eager and lazy evaluation, garbage collection, partial evaluation, ...). Is our science evolving? Why yes! But so is bridge building, electronic circuits and just about any branch of engineering - and just as rapidly too.</b> With bridge building, the concepts have remained largely stable. Generally speaking, what we all learned in statics and dynamics class is not changing. A moment of inertia is a moment of inertia. The tensile strength required to support 10,000 pounds for a given material is constant. In bridge building, the foundations of the science are extraordinarily stable (and have been since Newton); only the material properties have changed. Bridge builders / civil engineers have created a set of standards (plug points: variables in equations) where they can plug in the variations and evaluate the results against a known standard. So, while there are things changing in bridge building, they have isolated those changes inside the equations.
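To make the "plug points" idea concrete, here is a small worked check of the kind a bridge designer runs. The numbers and the allowable stress are invented for illustration; the point is that the form of the equation never changes, only the values plugged into it:
<pre>
% One fixed equation, two plug points (F, A), one known standard (sigma_allow).
\sigma = \frac{F}{A} = \frac{10\,000~\mathrm{lb}}{2~\mathrm{in}^2} = 5\,000~\mathrm{psi},
\qquad \text{accept the design if } \sigma \le \sigma_{\mathrm{allow}} \text{ for the chosen material.}
</pre>
Swap in a different material, load, or cross-section, and the same equation still decides the question.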
In software that has not, to this point, occurred. In fact, fairly recent and popular things like virtual machines and AOP have forced a reevaluation of the equations themselves. For instance, Smalltalk and Java made OO popular. That popularity made traditional metrics like function points and LOC useless (a sketch below shows why), and it forced a reevaluation of how you actually write code, with many people moving away from the big-bang processes (invented in the 60s) to agile processes and test-driven methodologies. This was caused by a foundation-level change. The "physics" of software changed, and the model that we used to develop it changed (independent of the underlying electronics ... or at least Sun would like you to believe it was independent :)).
My point is that every ten years or so there is a tectonic shift in software. This shift could be caused by tools (like VB), by a change in the languages used for development (like Java), or by the underlying platform (like the Internet). In any case, the science has to change with it. While the fundamentals of the underlying theory may have achieved some constancy (like state machines), the perspective they are approached from, the way they are used, and the way they are constructed fundamentally change. So, while you can model those concepts (create a scientific model), it is difficult to translate that to engineering, since engineering requires not only the model to be constant, but the application of the model to be constant (constant in the sense that an equation is constant).
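As a hedged illustration of the LOC point (this example is mine, not from any of the posts above): the two Java fragments below deliver identical functionality, yet their line and branch counts differ wildly, so a metric that counts either tells you very little once OO enters the picture.
<pre>
// Procedural style: a type code and explicit branching.
class AreaProcedural {
    static double area(String shape, double a, double b) {
        if (shape.equals("rectangle")) return a * b;
        if (shape.equals("triangle")) return 0.5 * a * b;
        throw new IllegalArgumentException("unknown shape: " + shape);
    }
}

// OO style: the same functionality, but the branching has dissolved
// into dynamic dispatch; each variant carries its own behavior.
interface Shape { double area(); }
class Rectangle implements Shape {
    private final double w, h;
    Rectangle(double w, double h) { this.w = w; this.h = h; }
    public double area() { return w * h; }
}
class Triangle implements Shape {
    private final double b, h;
    Triangle(double b, double h) { this.b = b; this.h = h; }
    public double area() { return 0.5 * b * h; }
}
</pre>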
******
Sorry this was so long. Frankly, I would love to find an engineering approach to software; I miss it. However, in recent years I have found, empirically, that the craftsman approach is more predictable and produces better software than any of my previous attempts at quantification.
******* One more response: This gets a little off topic, but I feel I should respond anyway.
<i><b>"Perhaps not, InfoSys (and Wipro) seem to consider quality a selling point: 1999 ISO9000 Recertification, Level 4 for Banking Business Unit, SEI CMM Level 5 assessed"</i></b> CMM and ISO9000 do not ensure software quality. The funny thing about CMM and ISO is that while there are agencies that set the standard, and agencies that are paid to certify (by the company that wants to be certified), there are no enforcers that ensure that the certifications match the standards. With that said, the standard does mean something. In my opinion, though they mean very little for software quality.
My little CMM rant:
Having actually implemented a CMM and ISO9000 program, I can tell you that in practice they have very little to do with software quality. In general, they are about organizational coherence and survivability. Essentially, they force an organization to create and document repeatable processes. This has many advantages that I won't go into here. However, improving software quality is not one of them. In fact, if the development process is bad, not only does certification fail to help you properly engineer the software, it makes the poor engineering practices that are ingrained in the documented, repeatable processes that much harder to change. Simply put, it makes bad processes repeatable just as often as it makes good practices repeatable.
I have actually had two experiences with projects being outsourced overseas to "CMM5" shops. In both cases, the results were horrid. The first case was a matter of getting exactly what you ask for. That particular CMM5 shop was very good at documentation. However, this meant that in order to get them to write the first line of code, we had to document the bejesus out of the requirements. In fact, I would say we spent more time and resources writing requirements than they did developing and testing. You would think it might not be a bad thing to make your requirements explicit. In this case, though, it was a three-month project (regurgitating some data out of an existing db) which overran its deadline by more than double. Additionally, the delivered software required a significant amount of rework to actually function, mostly due to misinterpretation of the language the requirements were written in (English).
Now, I thought that this was an isolated incident, until my second experience. They, too, were level 5 (or claimed to be). Admittedly, they didn't keep up the level 5 charade for long. However, I have to say I have never seen a more disorganized cluster #**! in my life. They put 100 inexperienced college grads on a project that should have taken 10 guys. These people not only got whole requirements wrong, but they also made simple mistakes, like treating database booleans as integers and then wondering why their non-zero values were always 1 :). To make matters worse, they would catch exceptions and simply drop them. It was truly very, very poor-quality work.
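For concreteness, here is a hedged Java/JDBC sketch of the two mistakes just described; the table and column names are invented for illustration:
<pre>
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

class OutsourcedBugs {
    static void readFlags(Connection conn) {
        try {
            Statement stmt = conn.createStatement();
            ResultSet rs = stmt.executeQuery("SELECT is_active FROM accounts");
            while (rs.next()) {
                // Mistake 1: reading a database boolean through the integer
                // API. Drivers typically normalize the column to 0 or 1, so
                // every "non-zero" value comes back as 1.
                int flag = rs.getInt("is_active");
                if (flag > 1) {
                    System.out.println("unexpected flag: " + flag); // unreachable
                }
                // The idiomatic read: boolean active = rs.getBoolean("is_active");
            }
        } catch (SQLException e) {
            // Mistake 2: catching the exception and dropping it on the
            // floor. The failure disappears instead of being logged or
            // rethrown, which makes the system look fine while it isn't.
        }
    }
}
</pre>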
With that said, I am sure there are some very good companies out there that are CMM level 5; my experience is clearly a small sample. However, none of that speaks to the point I was actually attempting to get across (and may not have made well). The fact that companies are willing to outsource overseas means that they no longer feel they need to be in direct control. Either they think the quality produced outside their company will be equal to or better than what could be produced inside, or they know there will be issues and might as well save money during implementation and then spend money on fixes, since they know they will have to fix it later either way. Simply put, they are accustomed to a lack of quality, so they are willing to consider lower-cost "economy" options and take on the accompanying risk.
This is similar to a person who buys a cheap steak knife. They know they will have to replace it in a couple of years. They know they could have bought the knife that lasts a lifetime and cuts tin cans. However, they would rather spend more money in the long run to save money in the short term, because they are used to things breaking and then replacing them. They accept that lower standard of quality as the price they pay for buying a less expensive product. In a way, it's a financing plan. ****