Daniel Jackson

Faculty Research Guide Entry, 1996

I'm interested in many aspects of software engineering, but have recently focused on the most critical area (the development of requirements and specifications) and the one that consumes the most money (maintenance). What both of these areas need, I believe, are sound methods and techniques backed by powerful, automatic tools.

I have been working, with Michael Jackson, on the structuring of requirements. We are investigating novel decompositions that are based not on the structure of a purported solution (as invariably done in practice) but rather on the structure of the problem the software is intended to solve. This approach exposes a problem's inherent subtleties, as well as the aspects it shares with other problems, making reuse of problem descriptions and solutions possible.

I have developed, with Craig Damon, a notation (called NP) for specifying requirements in a very abstract and succinct form. We have developed a scheme for checking properties of NP specifications that exploits symmetry in the mathematical structure of the property being checked. Damon has built a tool called Nitpick using this scheme that works completely automatically, analyzing specifications that were previously thought to be amenable only to theorem proving.
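The core idea behind such symmetry exploitation — since a property is invariant under renaming of atoms, it suffices to check one representative per isomorphism class rather than every case — can be sketched in a few lines of Python. This is a toy illustration only, not Nitpick's actual algorithm: the scope (3 atoms), the enumeration of binary relations, and the sample property are all invented for the example.

```python
from itertools import permutations, product

ATOMS = range(3)  # a small "scope" of 3 interchangeable atoms

def relabel(rel, perm):
    """Apply a permutation of atoms to a binary relation."""
    return frozenset((perm[a], perm[b]) for a, b in rel)

def canonical(rel):
    """Canonical representative: lexicographically least relabeling."""
    return min(tuple(sorted(relabel(rel, p))) for p in permutations(ATOMS))

# Enumerate all binary relations on ATOMS, keeping one per isomorphism class.
all_pairs = list(product(ATOMS, repeat=2))
seen, reps = set(), []
for bits in product([0, 1], repeat=len(all_pairs)):
    rel = frozenset(p for p, b in zip(all_pairs, bits) if b)
    c = canonical(rel)
    if c not in seen:
        seen.add(c)
        reps.append(rel)

# Check an isomorphism-invariant property on representatives only, e.g.
# "a symmetric, transitive relation in which every atom is related to
# something must be reflexive".
def holds(rel):
    transitive = all((a, c) in rel
                     for a, b in rel for b2, c in rel if b == b2)
    symmetric = all((b, a) in rel for a, b in rel)
    total = all(any((a, b) in rel for b in ATOMS) for a in ATOMS)
    reflexive = all((a, a) in rel for a in ATOMS)
    return not (transitive and symmetric and total) or reflexive

counterexamples = [r for r in reps if not holds(r)]
print(len(reps), "representatives instead of", 2 ** len(all_pairs), "relations")
print("property holds:", not counterexamples)
```

Because the property cannot distinguish isomorphic relations, checking the 104 representatives is as conclusive as checking all 512 relations — a saving that grows rapidly with scope.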

Many different kinds of problem can be specified in NP, so Nitpick can be used to analyze not only requirements, but also specifications and abstract designs. Using Nitpick/NP, we found some interesting problems with the paragraph style mechanism of Microsoft Word. We have also analyzed an air-traffic control handoff protocol, a basic telephone switch, and, with Jeannette Wing and Dave Johnson, a mobile internet protocol.

We are now investigating new checking schemes that will scale to larger specifications. With Somesh Jha, we developed a scheme based on Bryant's ordered binary decision diagrams that outperforms our existing scheme on many examples. We are also developing visualization techniques for displaying complex counterexamples, and are beginning to consider embedding Nitpick in a more general simulation framework.
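Bryant's reduced ordered BDDs gain their power from two reduction rules — share identical subgraphs, and eliminate nodes whose two branches agree — which make the representation canonical, so that equivalent formulas collapse to the very same node. The sketch below shows the idea in miniature; it is an assumed illustration of OBDDs in general, not the checking scheme we built.

```python
from math import inf

class BDD:
    """Tiny reduced ordered BDD: terminals are 0 and 1; internal nodes are
    hash-consed (var, low, high) triples, with variables ordered by index."""

    def __init__(self):
        self.unique = {}  # one node object per (var, low, high) triple

    def node(self, var, low, high):
        if low == high:          # reduction: eliminate a redundant test
            return low
        return self.unique.setdefault((var, low, high), (var, low, high))

    def var(self, i):
        return self.node(i, 0, 1)

    def _v(self, u):
        return u[0] if isinstance(u, tuple) else inf

    def apply(self, op, u, v, memo=None):
        """Combine two BDDs with a boolean operator via Shannon expansion."""
        memo = {} if memo is None else memo
        key = (op, u, v)
        if key in memo:
            return memo[key]
        if not isinstance(u, tuple) and not isinstance(v, tuple):
            return {'and': u & v, 'or': u | v}[op]
        var = min(self._v(u), self._v(v))
        u0, u1 = (u[1], u[2]) if self._v(u) == var else (u, u)
        v0, v1 = (v[1], v[2]) if self._v(v) == var else (v, v)
        r = self.node(var, self.apply(op, u0, v0, memo),
                           self.apply(op, u1, v1, memo))
        memo[key] = r
        return r

b = BDD()
x, y = b.var(0), b.var(1)
not_y = b.node(1, 1, 0)
# (x and y) or (x and not y) reduces to just x
f = b.apply('or', b.apply('and', x, y), b.apply('and', x, not_y))
print(f == x)
```

Canonicity is what makes BDDs attractive for checking: deciding whether a formula is valid, or whether two specifications agree, reduces to a constant-time node comparison.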

At the other end of the lifecycle, I have been working on tools for reverse engineering: that is, extracting structural information from large programs. Gene Rollins and I developed a tool called Chopshop that calculates program slices for large C programs in a modular fashion, and can display the results not only as code highlighted in an editor buffer, but also as graphs showing the semantic relationships between procedures. Robert O'Callahan has developed Lackwit, a successor to Chopshop, that employs type inference in place of dependence analysis. Although there is some loss of information, the gains are significant: we can analyze much larger programs, and account for aliasing and function-valued variables without affecting the cost of the analysis. Recently, we worked with Reid Simmons to analyze a robotics application. Lackwit produced information about global use of data structures that was not easily obtainable by any other method, and exposed a variety of flaws (including a storage leak in a loop).
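A program slice is the subset of a program's statements that can affect the value at a chosen point. Once a dependence graph has been extracted, backward slicing is just reachability over that graph, as this minimal sketch shows (the five-line program and its hand-built dependence edges are invented for the example; Chopshop's actual analysis of C is far more involved):

```python
# Toy program:
#   1: x = read()
#   2: y = read()
#   3: z = x + 1
#   4: w = y * 2
#   5: print(z)
# Dependence edges: each line maps to the lines defining the values it uses.
deps = {5: [3], 3: [1], 4: [2]}

def backward_slice(deps, target):
    """All lines that the target transitively depends on, plus the target."""
    seen, stack = set(), [target]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(deps.get(n, []))
    return sorted(seen)

print(backward_slice(deps, 5))  # lines 2 and 4 are irrelevant to line 5
```

The slice of line 5 contains only lines 1, 3, and 5: lines 2 and 4 cannot influence the printed value, which is exactly the kind of irrelevance a maintainer wants exposed before changing code.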