The IBM Systems Science Institute

Rubric: Software Engineering : Factual Claims : Defect Cost Increase : Pressman Ratios

Context

Background: I have been researching the quantity and quality of empirical evidence underlying claims in software engineering. What do we know, and how well-established is that? See in particular https://leanpub.com/leprechauns, which concludes that the answer is in (too) many cases "not much, and poor".

This applies in particular to the "Defect Cost Increase" claim, which is poorly supported by evidence. The claim states that the longer a defect stays undiscovered after being introduced in a software system's artefacts, the more expensive it is to correct.

The "Pressman Ratios" are a specific quantitative assessment of this claimed effect:

"Assume that an error uncovered during design will cost 1.0 monetary unit to correct. Relative to this cost, the same error uncovered just before testing commences will cost 6.5 units; during testing, 15 units; and after release, between 60 and 100 units."

We can compress this to "1:6.5:15:60-100", to distinguish it more easily from other variants of the same claim that differ only in the precise ratios.
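To make the claimed effect concrete, here is a minimal sketch (Python; the phase labels and the 100-unit baseline are illustrative choices, not from Pressman, and the 60-100 post-release range is represented by its two endpoints) of what the ratios imply when applied to a baseline design-phase fix cost:

```python
# Pressman's claimed multipliers, relative to a design-phase fix (1.0).
# Phase labels and the 100-unit baseline below are illustrative only;
# the 60-100 post-release range is represented by its endpoints.
PHASE_MULTIPLIERS = {
    "design": 1.0,
    "before testing": 6.5,
    "during testing": 15.0,
    "after release (low)": 60.0,
    "after release (high)": 100.0,
}

def projected_fix_cost(design_cost: float, phase: str) -> float:
    """Scale a baseline design-phase fix cost by the claimed multiplier."""
    return design_cost * PHASE_MULTIPLIERS[phase]

# If a design-phase fix costs 100 monetary units, the claim puts the same
# fix after release at anywhere from 6,000 to 10,000 units.
for phase in PHASE_MULTIPLIERS:
    print(f"{phase:>20}: {projected_fix_cost(100, phase):>9,.1f} units")
```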

The reference cited by Pressman is initially formatted as follows:

[IBM81] "Implementing Software Inspections," course notes, IBM Systems Sciences Institute, IBM Corporation, 1981

Evidence for the existence of the Institute

Evidence that the Systems Sciences Institute even existed is scant and hard to come by. A search of the IBM corporate web site yields nothing, for instance.

Searching Google Scholar for the terms "IBM Systems Sciences Institute" (Scholar) yields a short list of people who seem to have been affiliated with the organization:

I am using middle names and middle initials where available, to allow specific Google searches and disambiguation; the usual citation style has only the last name in full, e.g. "A. O. Allen".

What was the IBM Systems Science Institute?

Quoting Claude W. Burrill:

The Systems Science Institute is an educational organization and is a unit of IBM. We offer a variety of advanced courses for management and for professionals in the computer field.

The Institute was, in essence, an internal training program for IBM employees.

Link to the IBM Systems Research Institute

Confusingly, several biographies of the people listed above associate them with the "IBM Systems Research Institute" (for instance, Burrill's obituary). A New York Times article describes it as follows:

(...) virtually all I.B.M. employees receive some kind of company-financed education or training each year beyond basic job training. The education might range from attendance at special lectures to full-fledged courses, inside or outside corporate facilities. Despite its other programs, I.B.M. believes it needs a graduate-level school for its own use. The Systems Research Institute, founded in 1960, is the closest thing to an academic center at I.B.M. (...) But the Systems Research Institute is not an academic place. ''It's all business,'' said its associate director, Joseph E. Flanagan.

The IBM Systems Research Institute was based in the state of New York, with offices in Manhattan and Thornwood. (Giant, Origins, Thornwood)

Location, history and details

The IBM Systems Science Institute was based in Los Angeles, at 3550 Wilshire Boulevard. (Address).

It started operations as early as 1967, according to an IBM ad referring to its post-1982 name (see next paragraph). (Started)

It ceased to operate under that name before 1982, according to a different ad (Rename), becoming the "Information Systems Management Institute".

Bearing on the evaluation of the Defect Cost Increase claim

Another article will deal in more detail with the credibility of the Pressman Ratios, and with how they are widely, but generally inappropriately, cited throughout the software engineering literature.

The main findings of the present entry on the "IBM Systems Science Institute" are as follows:

  • the Institute was a corporate training program, not a research body; as such, it is inappropriate to cite the source of the ratios as "an IBM study" or "a study by the IBM Systems Science Institute", in the total absence of any claim that the Institute was the primary source of the data;
  • the original project data, if any exist, date from no later than 1981, are probably older, and could be as old as 1967.

References

tyezdro commented Oct 9, 2025

I appreciate the work here to track down the origins of the 100x assertion and shed light on an idea that may not be founded on data as concrete or robust as many assume. It's indeed unfortunate that an internal IBM training course snowballed into a pinnacle of truth.

However, it's worth considering that there is still merit to the original idea: that fixing a problem earlier will usually be easier (and cheaper). Raising concerns about the data (or lack thereof) supporting the scale of that idea is great, but does not necessarily prove that it's altogether untrue.

  1. The world of software engineering in the time of Boehm (i.e., the '70s and '80s) was very different from today's.
  2. The explanation provided in Software Engineering Economics arrives at a 100x increase in cost in production, but it is presented as notional and not as a concrete measurement. Generally speaking, the reasoning is sound.
  3. There have been many attempts to measure this over a long period of time with very different results. This really just seems to imply that every situation is different, and there are too many variables to be able to arrive at a single, definitive multiplier that we can all use to predict effort for defect resolution during the different time periods of any given project.
  4. The article shared by @trdrake (thanks for that!) was even an attempt to show that there is no such thing as DIE and that defects cost the same at any stage. However, even that paper, clearly trying to show that this idea has no merit, fails to do so, IMHO. a) It explicitly does not account for resolution of defects in the same phase, which is presumably when they would take the least effort. b) Despite the claims of "statistical insignificance", the charts showing the resulting effort across phases all clearly have an upward trend; even if it's not 100x, it's there. I don't need to be a statistician to see it (see the sketch after this list).
  5. I don't need to study hundreds of teams to know that the time I would need to understand and resolve a bug in code I wrote 2 days ago would be far less than for code I wrote 6 months ago, or for a design created by someone else 4 PIs ago. And the amount of overhead involved in retesting fixes on large systems during final integration is obviously going to be far higher than for fixes happening in unit test or early regression.
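To make point 4b concrete, here is a minimal sketch (Python, using scipy; the per-phase effort figures are fabricated for illustration, not taken from the paper) of the kind of trend test at issue: with only a handful of phase-level data points, a visibly positive slope can still fall short of a conventional significance threshold.

```python
# Illustrative only: fabricated per-phase fix-effort figures, NOT data from
# the paper under discussion. The point is the method: a small sample can
# show a visibly positive slope yet fail a conventional significance test.
from scipy import stats

phases = [1, 2, 3, 4]  # design, pre-test, test, post-release, encoded ordinally
median_fix_effort_hours = [4.0, 6.5, 5.0, 9.0]  # hypothetical medians

result = stats.linregress(phases, median_fix_effort_hours)
print(f"slope = {result.slope:+.2f} hours/phase, p-value = {result.pvalue:.3f}")
# Prints a positive slope with p well above 0.05: a visible upward trend
# that a significance test, on so few points, cannot distinguish from chance.
```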

Again, I very much appreciate the concern with the scale of the original claim and how it's been poorly cited over the years. But I'm not sure it's fair to call it a "leprechaun" and imply that the whole idea is some sort of fantasy.
