The Debugging Scandal and What to Do About It
Communications of the ACM
April 1997
Henry Lieberman, guest editor
Media Laboratory
Massachusetts Institute of Technology
Table of Contents
Introduction to the Special Issue on the Debugging Scandal
Henry Lieberman, MIT
Marc Eisenstadt, Knowledge Media Institute, The Open University, UK
Debugging and the Experience of Immediacy
David Ungar, Sun Microsystems Laboratories,
Henry Lieberman, MIT Media Lab,
Christopher Fry, PowerScout
Software Visualization for Debugging
Ron Baecker and Chris DiGiano, University of Toronto,
Aaron Marcus, Aaron Marcus and Associates
Programming on an Already Full Brain
Christopher Fry, PowerScout
Fostering Debugging Communities on the Web
John Domingue and Paul Mulholland, Knowledge Media Institute, The Open
University, UK
Thrown from Kansas into Oz: Collaborative Debugging when a Shared World Breaks
Randy Smith, Dave Ungar and Mario Wolczko, Sun Microsystems Laboratories
Introduction to the Special Issue
The Debugging Scandal and What to Do About It
April 1997
Henry Lieberman
Media Laboratory
Massachusetts Institute of Technology
Debugging is the dirty little secret of computer science. Despite all
the progress we have made in the last thirty years (faster computers, networking,
easy-to-use graphical interfaces, and everything else), we still face some
embarrassing facts. First, all too often, computer programs don't work
as they should. This makes software development costly. Too much buggy
software reaches end users, leading to needless expense and frustration.
That's unfortunate, but what is surprising is the fact that when something
does go wrong, the people who write these programs still have no good ways
of figuring out exactly what went wrong. Debugging is still, as it was
thirty years ago, largely a matter of trial and error.
What borders on scandal is the fact that the computer science community
as a whole has largely ignored the debugging problem. This is inexcusable,
considering the vast economic cost of debugging and the emotional toll buggy
software takes on users and programmers. Today's commercial programming
environments provide debugging tools that are little better than the tools
that came with programming environments thirty years ago. It is a sad commentary
on the state of the art that many programmers name "inserting print
statements" as their debugging technique of choice.
We can do much better. Programs are complex artifacts, and debugging
is often a struggle against complexity. Why should the programmer have
to face it alone? Let's make the computer take an active role in helping
the programmer deal with complexity.
Computers are now fast and have large memories and disks. Let's use
some of that speed and storage to process information that the programmer
needs to understand what's going on in the program. Computers now have
beautiful color screens and fast graphics. Let's use those graphics to
help the programmer visualize the behavior of the program. We now understand
user interfaces and the programming process much better than we used to.
Let's use some of that understanding to help programmers with the cognitive
task of relating the static description embodied in the code to the dynamic
behavior of the program.
Some people express despair at the possibility of significant improvements
to the debugging process. "Debugging is just plain hard", they
say. Many programmers display a macho attitude, saying, "real programmers
don't need debugging tools". We disagree. Good tools can make a seemingly
intransigent bug appear obvious. This can save significant effort, even
for the most experienced of programmers. Others put their faith in formal
techniques such as program verification, which they claim will virtually
eliminate the need for debugging. We don't think so. Not only is total
elimination of bugs unrealistic, but software is continually evolving.
Debugging tools are just as necessary for incremental evolution of software
as they are for finding errors.
In this special issue on The Debugging Scandal, we aim to show that
there is a wide spectrum of innovative ideas that have great potential
for radical improvements to the debugging process.
First, to get a better understanding of the problem, Marc Eisenstadt
presents us with some bittersweet tales from the front lines in "'My
Hairiest Bug' War Stories". Every veteran programmer can recognize
in these stories something of their own experience. Sometimes you don't
know whether to laugh or to cry at such stories, but they give us some
valuable insights into the detective work of debugging, and a hint as to
where and how better tools could be applied.
In "Debugging with the Experience of Immediacy", Dave Ungar,
Henry Lieberman and Christopher Fry talk about the subjective experience
of debugging. What makes a tool feel so natural that you cease thinking
about the tool itself and focus on the job to be done? Can we achieve such
transparency in a programming environment, so that debugging feels less
frustrating?
Some possible answers to these questions are illustrated with some aspects
of the Self system and the Lisp stepper ZStep 95. ZStep 95 keeps a complete
history of the computation so that the program can be run backwards for
debugging, animates the source code of the program as it executes, and
keeps a correspondence between expressions, their values, and graphic output.
You're always only one click away from answering questions like "What
just happened?", "Where did that value come from?" and "What
code drew that image?".
Ron Baecker, whose early film "Sorting Out Sorting" helped
found the Software Visualization field, reports with colleagues Chris DiGiano
and Aaron Marcus on "Software Visualization for Debugging". To
debug a program, you have to see what the program is doing. Just as the
field of Scientific Visualization has helped physicists understand physical
processes, Software Visualization brings the power of visual perception
to bear on the dynamic and emergent properties of software. When you watch
a visualization, bugs can literally become obvious.
Christopher Fry tackles the complexity problem head-on in "Programming
on an Already Full Brain". Why wait to even run a program before you
start to debug it? Often in programming, context can determine what choices
a programmer is likely to need next, so why can't the computer make these
choices available right when and where they are needed? This "just-in-time"
help system can drastically reduce typing and the possibility of error.
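As a hypothetical illustration of the flavor of such context-driven help (a toy of this introduction, not Fry's actual system), consider a completion routine that uses what the programmer has typed and the names currently in scope to rank the likeliest next choices:

    # Illustrative sketch only: rank completion candidates for a typed prefix,
    # preferring names the programmer has used most recently. A real
    # environment would supply much richer context (types, call signatures,
    # position in the code); here names_in_scope and recently_used are
    # assumed inputs.
    def suggest(prefix, names_in_scope, recently_used):
        """Return candidate completions for prefix, most plausible first."""
        candidates = [name for name in names_in_scope if name.startswith(prefix)]
        recency = {name: i for i, name in enumerate(recently_used)}
        # A higher recency index means more recently used; unseen names rank last.
        return sorted(candidates, key=lambda name: (-recency.get(name, -1), name))

    if __name__ == "__main__":
        scope = ["total_price", "total_tax", "to_string", "transaction"]
        recent = ["transaction", "total_price"]
        print(suggest("to", scope, recent))  # ['total_price', 'to_string', 'total_tax']

The ranking heuristic here is deliberately simple; the point is that the environment, not the programmer, does the work of narrowing the choices.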
Why should debugging always be a solo activity? The last two papers
treat the exciting notion of collaborative debugging. John Domingue and
Paul Mulholland show how we can use the Web to foster communities that
can help each other in debugging. Instead of a simple e-mail bug report,
programmers can create a transportable dynamic simulation that gives a correspondent
the feel of "being there" in someone else's code.
Randy Smith, Dave Ungar and Mario Wolczko, in "Thrown from Kansas
into Oz: Collaborative Debugging When a Shared World Breaks"
demonstrate how programmers can collaborate to debug a program more effectively
than any of them would be able to alone. They show how the metaphor of
virtual worlds facilitates the debugging process. They present facilities
that help a group talk about a broken shared world by creating another
world in which to do the debugging.
What's wrong with the computer industry? I hope that this issue will
convince you that one of the biggest "bugs" in the industry is
the lack of attention to improving the tools for debugging programs. I
hope you will also be convinced that there are plenty of ideas with great
potential for "debugging" this situation. So let's fix it!