One of the great breakthroughs in software engineering was Gerald Weinberg's concept of "egoless programming": the idea that no matter how smart a programmer is, reviews will be beneficial. Weinberg's ideas were formalized by Michael Fagan into a well-defined review technique called Fagan inspections. The data in support of the quality, cost, and schedule impact of inspections is overwhelming. They are an indispensable part of engineering high-quality software. I proposed Fagan inspections as one of the 10 best influences.
Christof Ebert: Inspections are surely a key topic, and with the right instrumentation and training they are one of the most powerful techniques for defect detection. They are both effective and efficient, especially for upfront activities. In addition to large-scale applications, we are applying them to smaller applications, in extreme programming contexts, where daily builds also help.
Terry Bollinger: I do think inspections merit inclusion in this list. They work, they help foster broader understanding and learning, and for the most part they do lead to better code. They can also be abused, for instance in cases where people become indifferent to the skill set of the review team ("we reviewed it, so it must be right"), or when they don't bother with testing because they are so sure of their inspection process.
Robert Cochran: I would go more basic than this. Reviews of all types are a major positive influence. Yes, Fagan Inspection is one of the most useful members of this class, but I would put the class of inspections/reviews in the list rather than a specific example.
Information Hiding
David Parnas's 25-year-old concept of information hiding is one of the seminal ideas in software engineering: the idea that good design consists of identifying "design secrets" that a program's classes, modules, functions, or even variables and named constants should hide from other parts of the program. While other design approaches focus on notations for expressing design ideas, information hiding provides insight into how to come up with the good design ideas in the first place. Information hiding is at the foundation of both structured design and object oriented design. In an age when buzzword methodologies often occupy center stage, information hiding is a technique with real value.
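To make the idea concrete, here is a minimal sketch in present-day Python; the class and its particular "secret" are invented for illustration:

```python
# Minimal sketch of information hiding. The design secret is how
# readings are stored; callers see only record() and average(), so the
# plain list could become a file or a database without touching any caller.

class TemperatureLog:
    def __init__(self):
        self._readings = []    # the hidden secret: a plain list, for now

    def record(self, celsius: float) -> None:
        self._readings.append(celsius)

    def average(self) -> float:
        return sum(self._readings) / len(self._readings)

log = TemperatureLog()
log.record(21.5)
log.record(23.0)
print(log.average())           # 22.25
```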
Terry Bollinger: Hear, hear! To me this technique has done more for real, bottom-line software quality than almost anything else in software engineering. I find it worrisome that folks sometimes confuse it with "abstraction" and other concepts that don't enforce the rigor of real information hiding. I find it even more worrisome that some process-oriented methodologies focus so much on bug counting that they are oblivious to why a program with good information hiding is enormously easier to debug, and has fewer bugs to begin with, than one that lacks pervasive use of this concept.
Incremental Development
The software engineering literature of the 1970s was full of horror stories of software meltdowns during the integration phase. Components were brought together for the first time during "system integration." So many mistaken or misunderstood interface assumptions were exposed at the same time that debugging a nest of intertwined assumptions became all but impossible. Incremental development and integration approaches have virtually eliminated code-level integration problems on modern software projects. Of these incremental approaches, the daily build is the best example of a real-world approach that works. It minimizes integration risk, provides steady evidence of progress to project stakeholders, keeps quality levels high, and helps team morale because everyone can see that the software works.
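As a rough sketch of the mechanics (the build and smoke-test commands below are placeholders, not any particular project's):

```python
# Rough sketch of a daily build: run each step, stop at the first
# failure, and report. A broken build gets fixed before anything else.
import subprocess
import sys
from datetime import date

STEPS = [
    ["make", "clean"],
    ["make", "all"],
    ["make", "smoke-test"],  # a quick test suite that gates the build
]

def daily_build() -> bool:
    for step in STEPS:
        cmd = " ".join(step)
        if subprocess.run(step).returncode != 0:
            print(f"{date.today()}: build BROKEN at: {cmd}")
            return False
    print(f"{date.today()}: build good")
    return True

if __name__ == "__main__":
    sys.exit(0 if daily_build() else 1)
```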
Dave Card: I wouldn't include "daily builds" as one of the 10 best influences. I see that as a "brute force" method of integration. It's kind of like having a housing contractor confirm that each window is the right size by holding it up to a hole in the wall, and then chiseling or filling the hole as necessary to make the window fit.
Terry Bollinger: I disagree. Incremental integration, including daily builds, represents a retreat from the earlier, ultimately naive view that everything could be defined with mathematical precision "up front," and all the goodies would flow from that. Such philosophies overlooked the inherent limitations of the human brain at foreseeing all the implications of a complex set of requirements.
If you really want to see the daily build approach in ferocious action, take a look at the open source community. When things get busy there, releases may even be hours apart. Such fast releases also reduce the human "set-up time" for returning to the context of each problem.
Steve McConnell: At a level of abstraction higher than daily builds, Barry Boehm's spiral lifecycle model is an incremental approach that applies to the whole project. The model's focus on regular risk reduction is appealing, but the model is so complicated that it can only be used by experts. I'd give it the prize for most complicated and least understood diagram of the 20th century. Unlike daily builds, it seems like a great incremental development approach that has had little impact in practice.
Terry Bollinger: I disagree. The daily build you touted above really is closer to the spiral model than to "traditional" phased design. And where do you think interface prototyping goes? It surely is not in the original phased development models of the 70s and 80s. Barry Boehm noticed that reality was closer to the daily build model than to the rigidly defined phased models of that era.
Christof Ebert: I don't agree with Steve's conclusion. The spiral model diagram was too complex because it tried to put everything into one picture. But the combination of risk management and increments that the spiral model proposes is highly successful when the two are blended together.
Robert L. Glass: In my experience, the spiral model is the approach that most good software people have always used; they just didnโt call it that. Good programmers never did straight waterfall; they may have pretended they did to appease management, but in fact there was always a mix of design-code-test-a-little thrown in along the way.
User Involvement
We've seen tremendous developments in the past several years in techniques that bring users more into the software product design process. Techniques such as JAD sessions, user interface prototyping, and use cases engage users with product concepts in ways that paper specifications simply cannot. Requirements problems are usually listed as the #1 cause of software project failure; these techniques go a long way toward eliminating requirements problems.
Wolfgang Strigel: I think user interface prototyping is especially important since it addresses the fundamental engineering concept of building models. No architect would build a bridge without a model to see if those who pay for it will like it. It's really simple: if you plan on ignoring the customer, don't prototype.
Karl Wiegers: Use cases, sensibly applied, provide a powerful mechanism for understanding user requirements and building systems that satisfy them.
Steve Mellor: Whatever techniques are applied, we need to be clear about context. When we talk about use cases, for example, they can be very helpful in a context where there is little communication about the problem. Use cases put the software folks face-to-face with the clients and users, and put them in a position to learn the vocabulary of the problem. But use cases are appalling at helping to create an appropriate set of abstractions. Should you use use cases? It depends on your context.
Terry Bollinger: I think these techniques tie back to the value of incremental development. And this includes all forms of prototyping: communication protocols, subsystem interfaces, and even the method interfaces to an object class. There's a lot of merit to trying out different approaches to discover what you missed, misunderstood, or got out of proportion.
Automated Revision Control
Automated revision control takes care of mountains of housekeeping details associated with team programming projects. In The Mythical Man-Month in 1975, Fred Brooks' "surgical team" made use of a librarian. Today, that person's function is handled by software. The efficiencies achieved by today's programming teams would be inconceivable without automated revision control.
Terry Bollinger: The promise is high, and the fundamental concept is solid. But I don't think our associated methods for organizing large numbers of programmers efficiently are good enough yet to make the value of this technology really shine. For example, the really interesting code still tends to get written by relatively small teams, such as many of the open source efforts, for whom the benefits of automated revision control are much less conspicuous.
Perhaps the most important software engineering innovations of the next century will come not from more intensive methods for making people more organized, but rather from learning to automate the overall software process in ways that we do not currently understand very well, but which revolve around the overall ability of a computer to remember, communicate, and organize data far better than people do.
I think this kind of innovation will be closely related to methods such as automated revision control. Our current planning tools, which too often just try to automate paper methods, certainly don't seem to be there yet.
Internet Development
What we've seen with open source development is just the beginning of collaborative efforts made possible via the Internet. The potential this creates for effective, geographically distributed computing is truly mind-boggling.
Terry Bollinger: The Internet allows interesting sorts of interactions to happen, and not just with source code. I think this one is important simply because it isn't very easy to predict; it may lead, for example, to approaches that are neither proprietary nor open source, but which are effective in ways we don't fully foresee or understand yet.
Wolfgang Strigel: Yes, although it stands on the foundation of best practices in software design and development of the 20th century, it brings new aspects which will change the face of software development as we know it.
Programming Languages Hall of Fame: Fortran, Cobol, Turbo Pascal, Visual Basic
A few specific technologies have had significant influence on software development in the past 30 years. As the first widely used third-generation language, Fortran was influential, at least symbolically. Cobol was arguably at least as influential in practice. Cobol was originally developed by the U.S. Department of Defense for business use, though most practitioners have long forgotten its military origins. Cobol has recently come under attack as being responsible for Y2K problems, but let's get serious. Y2K is only a Cobol problem because so much code has been written in Cobol. Does anyone really think programmers would have used 4-digit years instead of 2-digit if they had been programming in Fortran instead of Cobol?
Christof Ebert: The real added value of Cobol was to make large IT systems feasible and maintainable, and many of those are still running. The "problems" with Cobol weren't really about Cobol per se. The problems were with the programmers who happened to be using Cobol. Today, many of these same programmers are using or abusing Java or C++, and the legacy-problems discussion will surely come back someday with regard to those languages.
Turbo Pascal, the first integrated editor, compiler, and debugger, forever changed computer programming. Prior to Turbo Pascal, compilation was done on disk, separate from editing. To check his work, a programmer had to exit his editing environment, compile the program, and check the results. Turbo Pascal put the editor, compiler, and debugger all in memory at the same time. By making compile-link-execute cycles almost instantaneous, it shortened the feedback loop between writing code and executing it. Programmers became able to experiment in code more effectively than they could in older, more batch-oriented environments. Quick turnaround cycles paved the way for effective code-focused development approaches, such as those in use at Microsoft and touted by Extreme Programming advocates.
Terry Bollinger: I think the real message here is not "Turbo Pascal" per se, but rather the inception of truly integrated programming environments. With that broadening (integrated environments, with Turbo Pascal as the first commercial example), I would agree that this merits inclusion in the top ten.
Christof Ebert: Aside from better coding, such programming environments also made feasible the incremental approaches we discussed earlier.
Academics and researchers talked about components and reuse for decades, and nothing happened. Within 18 months of Visual Basic's release, a thriving pre-built components market had sprung from nothing. The direct-manipulation, drag-and-drop, point-and-click programming interface was a revolutionary advance.
Christof Ebert: Even more relevant is the sheer speed of VB's penetration. It is by far one of the fastest-growing "languages" and still going strong. Only Java has recently surpassed it in speed of adoption.
Terry Bollinger: Visual Basic provided the benefits promised (but for the most part never really delivered) by object-oriented programming, mainly by keeping the "visual" part of the object paradigm and largely dispensing with the complex conversion of objects into linguistic constructions. Visual Basic is limited, but within those limits it has had a profound impact on software development and has made meaningful development possible for a much larger group of people than would have been possible without it.
Capability Maturity Model for Software (SW-CMM)
The Software Engineering Institute's SW-CMM is one of the few branded methodologies that has had any effect on typical software organizations. More than 1000 organizations and 5000 projects have undergone SW-CMM assessment, and dozens of organizations have produced mountains of compelling data on the effectiveness of process improvement programs based on the SW-CMM model.
Dave Card: I would consider the software CMM to be both a best influence and a dead end. Steve has explained the case for a best influence. However, I have also seen a lot of gaming and misuse of the CMM that leads to wasted and counterproductive activity. Even worse, it has spawned a growth industry in CMMs for all kinds of things, apparently based on the premise that if your model has five levels it must be right, regardless of whether you have any real knowledge of the subject matter. It reminds me of the novel The Hitch-Hiker's Guide to the Galaxy, where the answer was 42, except that now it's 5.
One of the unanticipated effects of the CMM has been to make the "effective" body of software engineering knowledge much shallower. My expectation was that an organization that was found to be weak in risk management, for example, would read one of the many good books on risk management or get help from an expert in the field. Instead, what many (if not most) organizations do is take the four pages in the CMM that describe risk management and repackage them as four to six pages of their local process documentation.
Overall, I believe the CMM has helped a lot. However, I think it does have some harmful side effects that can be mitigated if the community is willing to recognize them. Moreover, there are limits to how far it can take us into the 21st century.
Nancy Mead: I don't think the CMM per se is what is significant; it's the recognition of a need for standardized software processes, whatever they may be.
Robert Cochran: Standardization is key. Another example is recent efforts to put a good structure on RAD using the DSDM approach. DSDM is not well known in the US, but interest is rapidly growing on this side of the Atlantic.
And don't forget the many thousands who have used ISO 9000 as their SPI framework. I would say that SPI using any of the recognized frameworks (or indeed a mixture of them) is where benefit is gained. They give a structure and coherence to the SPI activity.
Terry Bollinger: I do think that some concept of "process improvement" probably belongs in this list, including both ISO 9000-3 and maybe the first three levels of the CMM. But I disagree that the CMM taken as a whole has been one of the most positive influences of the twentieth century.
Wolfgang Strigel: This one can really stir the emotions. Despite its simplicity, the main problem with the CMM is its rigid and ignorant interpretation. The concept itself is sort of a truism. I believe the biggest accomplishment of the CMM is the Software Engineering Institute's success in marketing the concept. It certainly has raised the awareness within thousands of companies about "good software engineering practices." Universities were woefully slow in recognizing software engineering as a topic, and the SEI jumped into the fray, educating not only new grads but old professionals as well. The CMM's problems are aggravated by people reading too much into it. The grading is questionable and not the most relevant part of the CMM. I disagree that ISO 9000-3 would merit a similar place in the hall of fame. After all, with ISO 9000-3 you can diligently do all the wrong things, and you will be compliant as long as you do them diligently.
It may be time to throw away the old CMM model and come up with an update that takes into account the experiences from the last 10 years. I don't think we need a new model or more models; just taking the cobwebs out of the current one will carry us for the next 10 years.
Christof Ebert: Despite the argument about ISO 9000, the CMM is definitely the standard that set the pace. At present, it is the framework that allows benchmarking and that contributes most to solving the software engineering crisis, just by defining standard terminology. This is my personal favorite "best influence."
Object Oriented Programming
Object oriented programming offered great improvements in "natural" design and programming. After the initial hype faded, practitioners were sometimes left with programming technologies that increased complexity, provided only marginal productivity gains, produced unmaintainable code, and could only be used by experts. In the final analysis, the real benefit of object oriented programming is probably not objects, per se, but the ability to aggregate programming concepts into larger chunks than subroutines or functions.
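A minimal, hypothetical Python illustration of that chunking: state and the operations on it travel together as one nameable unit, something loose subroutines alone don't provide:

```python
# Hypothetical illustration: data and the operations on it aggregated
# into one chunk, rather than scattered across free-standing
# subroutines that each take the same record as a parameter.

class BankAccount:
    def __init__(self, owner: str, balance: float = 0.0):
        self.owner = owner
        self._balance = balance

    def deposit(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balance += amount

    def withdraw(self, amount: float) -> None:
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount

    def balance(self) -> float:
        return self._balance

account = BankAccount("Ada", 100.0)
account.deposit(50.0)
account.withdraw(30.0)
print(account.balance())   # 120.0
```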
Terry Bollinger: Whatever real benefits people were getting from OO, they were mostly coming from the use of traditional Parnas-style information hiding, and not from all the other accoutrements of OO.
OO is a badly mixed metaphor. It tries to play on the visual abilities of the human brain to organize objects in a space, but then almost immediately converts that over into an awful set of syntax constructions that don't map well at all into the linguistic parts of our brains. The result, surprise, surprise, is an ugly mix that is usually neither particularly intuitive nor as powerful as it was originally touted. (Has anyone done any really deep inheritance hierarchies lately? And gotten them to actually work without ending up putting in more effort than it would have taken to simply do the whole thing over again?)
Christof Ebert: Agreed, because people haven't taken the time to understand object orientation completely before they design with it. In a short time, it has done more harm to systems than so-called "bad Cobol programming."
Takaya Ishida: I think it is questionable whether Cobol and Object Oriented Programming should be on the 10 best list or the 10 worst list. It seems to me that Cobol and other popular high-level procedural languages should be blamed for most of the current problems in programming. It is so easy to program in such languages that even novice programmers can write large programs. But the result is often not well structured. With object-oriented languages such as Smalltalk, programmers take pains to learn how to program, and the result is well-structured programs. Having programming approaches that can be used only by experts may be a good idea. Our past attitude of "easy-going" programming is a major cause of the present software crisis, including the Y2K issue.
Terry Bollinger: I really like Smalltalk-style OO, and I'm optimistic about Java. The real killer was C++, which mixed metaphors in a truly horrible fashion and positively encouraged old C programmers to create classes with a few hundred "methods" in them, to make a system "object oriented." Argh! So maybe the culprit here is more C++, rather than OO per se, and there's still hope for object oriented programming.
Component Based Development
Component-based development has held out much promise, but aside from a few limited successes, it seems to be shaping up to be another idea that works better in the laboratory than in the real world. Component-version incompatibilities have given rise to massive setup-program headaches, unpredictable interactions among programs, de-installation problems, and a need for utilities that restore all the components on a person's computer to the "last known good state." This might be one of the 10 best for the twenty-first century, probably not for the twentieth.
Robert L. Glass: The reuse (component-based) movement might have trouble getting off the ground, but some related developments have shown great promise. Design patterns are a powerful medium for capturing, expressing, and packaging design concepts and artifacts, "design components" if you will. The patterns movement looks to be the successful end-run around the reuse movement. Similarly, we've been very successful in domain generalization. Compiler building was perhaps the first such success, many decades ago.
Steve McConnell: Yes, 30 years ago a person could get a Ph.D. just for writing a compiler, but today domain generalization in compiler building has been such a success that compiler building is usually taken for granted.
Robert L. Glass: There have been too few successes other than compiler building. I would assert that the Enterprise Resource Planning (ERP) system, such as SAP's, is the most recent and most successful example of building general-purpose software to handle a whole domain, in this case, back-office business systems.
Terry Bollinger: The components area has indeed been a major headache, but it's sort of an "ongoing" problem, not one that has been fully recognized as a dead end. Visual Basic is an example of how component-based design can work if the environment is sufficiently validated ahead of time.
The fully generic "anything from anybody" model is much more problematic, and is almost certainly beyond the real state of the art, as opposed to the "marketing state of the art," which generally is a few years to a few centuries more advanced than the real state of the art.
Christof Ebert: Work on components helped point the way toward frameworks, reuse, and pattern languages. So it is a useful intermediate step. Thinking in components is still a good design principle and entirely in line with the point made earlier about information hiding.
Metrics and Measurement
Metrics and measurement have the potential to revolutionize software engineering. In the few instances in which they have been used effectively (NASA's Software Engineering Lab and a few other organizations), the insights that well-defined numbers can provide have been amazingly useful. Powerful as they can be, software process measurements aren't the end; they're the means to the end. The metrics community seems to forget this lesson again every year.
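As a tiny, made-up illustration of numbers as a means to an end, here is a defect-density calculation used to direct inspection effort rather than to serve as a scorecard (the module names and counts are invented):

```python
# Invented numbers, for illustration only: rank modules by defect
# density to decide where review and inspection effort should go next.

modules = {
    "parser":    {"defects": 14, "kloc": 3.5},
    "scheduler": {"defects": 2,  "kloc": 4.0},
    "ui":        {"defects": 9,  "kloc": 9.0},
}

for name, m in sorted(modules.items(),
                      key=lambda kv: kv[1]["defects"] / kv[1]["kloc"],
                      reverse=True):
    print(f"{name:10s} {m['defects'] / m['kloc']:4.1f} defects/KLOC")

# Output: parser 4.0, ui 1.0, scheduler 0.5 (inspect the parser first)
```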
Dave Card: I would include "metrics" as a dead end, but I also would include "measurement" in the list of top prospects for the twenty-first century. The word "metrics" doesn't appear in most dictionaries as a noun, and that is a good indication of how poorly measurement is understood and why it is consequently misused in our discipline. The ability to express things meaningfully in numbers is an essential component of any engineering discipline, so I don't see how we can have "software engineering" in the 21st century without "measurement."
Deependra Moitra: The problems are not with metrics but with the metrics community. We cannot blame an idea, concept, or approach if people misuse it. I can't imagine performing my job without metrics, especially in a geographically distributed development mode.
Terry Bollinger: Metrics are a problem. Our foundations are off badly in this area, because we keep trying to apply variations of the metrics that worked well for manufacturing to the highly creative design problems of software. It's a lot like trying to teach people how to write like Shakespeare by constantly checking their words for spelling errors.
Christof Ebert: Entirely true, but metrics as such are still of great value. We should see metrics as a tool (or means). Metrics usage has matured dramatically over the past ten years. A metrics discipline is as necessary in software engineering as theoretical foundations are in other engineering disciplines, if we are to avoid software development approaches built on nothing more than gut feeling.
Wolfgang Strigel: Regardless of terminology (metrics or measurements), I believe that this should become another top contender for the next century. How can we claim to be professionals if nobody can measure some basic facts about the development activity? How can we manage activities that cannot be measured? Measurement failed in this century not because it is a bad concept, but because our profession is still in its infancy. There is no other engineering discipline that would not measure product characteristics. And there is no other manufacturing activity that would not try to measure output over input (that is, productivity). It is not the concept of measurement that is at fault. The problem is consumers who have become too tolerant of faulty software, and a red-hot marketplace in which inefficiencies do not cause the failure of companies. If there were an oversupply of software developers, we would see a lot more measurement applied to optimize the process.