Don't Believe Everything You Read!

Summary:
There are volumes of written material covering just about every aspect of software engineering. Books, articles, magazines, conference proceedings, Web sites, and other rich sources of information are readily available to those learning about our profession. However, based on personal experience and observation, Ed Weller is compelled to ask how much of this information is actually misinformation. Anytime you collect data, you must proceed with caution! In this article, we'll find out why Ed questions the validity and accuracy of what he reads and what you can do the next time you're faced with questionable material.

The Misstatements and Misinformation
Many years ago, IEEE Software magazine published "Lessons from Three Years of Inspection Data," in which I described the results of the inspection program at Bull HN Information Systems using four case studies that were closer to experience reports. I recently ran across a reference to this article in a highly respected information source (kept anonymous to protect the publication), which paraphrased the article with the following statements:

"6,000 inspections..." The actual number of inspections was 7,413.

"They estimate that code inspections before unit test found 80 percent of the defects." The article reported the results of two projects--one project found 80 percent of the product defects prior to system testing and another project found 98.7 percent of defects by inspection after a year of "unit test, integration test, and continued use of the product by other projects."

"Inspections after system test found 70 percent of the defects." An ambiguous statement that could mean inspections were conducted after system test (which is a rather absurd use of the inspection process), or that the inspection effectiveness would drop after system test was concluded. The article said the effectiveness would "undoubtedly drop after system test," but mentioned no numbers. In fact this statement was in the fourth case study, yet was applied to the paraphrase of the first case study.

"They have concluded inspections can replace unit testing." I stated that the findings supported suggestions by Frank Ackerman and his colleagues that inspections could replace unit testing, but specifically stated we ran the unit test cases in integration test "to ensure that inspections did not miss categories of defects that are difficult to detect by inspection."

There were four errors in three sentences that significantly changed the information conveyed in the original article. Since this was a paraphrase, and not a direct quotation, there is no way a casual reader would be aware of the rephrasing. The errors are obviously unintentional; I suspect they stem from the author interpreting the original article in the context of how his or her own organization executes inspections and testing. One of the difficulties in writing articles is the limited space allocated by editors, which rarely leaves room for all of the background a reader would need to accurately translate the original information into his or her own organization or into a secondary article.

In another case, data from the IEEE Software article was used to calculate the ROI of inspections in a white paper on a Web site. The end result was that inspections saved several times the total development budget of the organization. I never could determine how these calculations were done, but at my request they were deleted from the presentation and removed from the Web site.
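For readers who have not seen one, the most common inspection ROI argument is a cost-avoidance calculation: the effort spent finding defects in inspections is compared with the effort those same defects would have cost if found later. The sketch below shows that general form; it is not the calculation in that white paper (which I was never able to reconstruct), and every input is a hypothetical assumption.

```python
# A common cost-avoidance style of inspection ROI calculation.
# Every input is a hypothetical assumption for illustration only;
# none of these numbers come from the IEEE Software article or the white paper.
defects_found_by_inspection = 500
hours_per_defect_in_inspection = 1.5   # assumed cost to find and fix at inspection
hours_per_defect_found_later = 20.0    # assumed cost if the same defect escaped

inspection_cost = defects_found_by_inspection * hours_per_defect_in_inspection
avoided_cost = defects_found_by_inspection * hours_per_defect_found_later

net_savings = avoided_cost - inspection_cost
roi = net_savings / inspection_cost

print(f"Inspection effort: {inspection_cost:,.0f} hours")
print(f"Avoided effort:    {avoided_cost:,.0f} hours")
print(f"Net savings:       {net_savings:,.0f} hours")
print(f"ROI:               {roi:.1f} : 1")
```

The result is dominated by the assumed cost of a defect found later. Assume every inspected defect would otherwise have escaped all the way to the customer, plug in a large enough escape cost, and the "savings" can easily exceed the entire development budget--which is exactly the kind of result that should prompt a reader to ask how the calculation was done.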

Incomplete Information
One of the more interesting cases of incomplete information is the statement that the chance of an error in a one-line code change is 50 percent. This number is given without any reference to where it is measured--initial code, after unit testing, after system testing, or in use. To be fair, the source for this suggests this number is the initial error rate without any inspection or testing, but the article is not explicit. If you are trying to measure maintenance quality or convince your organization to change its maintenance process, knowing exactly where in the lifecycle this number was measured is important.
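A simple sketch shows why the measurement point matters so much. Treat the 50 percent figure as the initial error rate, before any inspection or testing, and assume some removal effectiveness for each downstream filter; the effectiveness values below are hypothetical assumptions, not measured data.

```python
# Why the measurement point matters: the same one-line-change error rate
# shrinks at each downstream filter. The removal effectiveness values are
# hypothetical assumptions, not measured data.
initial_error_rate = 0.50        # the quoted rate, taken as pre-inspection, pre-test
inspection_effectiveness = 0.80  # assumed fraction of errors removed by inspection
test_effectiveness = 0.50        # assumed fraction of the remainder removed by test

rate_after_inspection = initial_error_rate * (1 - inspection_effectiveness)
rate_shipped = rate_after_inspection * (1 - test_effectiveness)

print(f"Before inspection or test: {initial_error_rate:.0%}")     # 50%
print(f"After inspection:          {rate_after_inspection:.0%}")  # 10%
print(f"Shipped to the customer:   {rate_shipped:.0%}")           # 5%
```

The same claim, quoted at three different points in the lifecycle, yields 50 percent, 10 percent, or 5 percent--so a number quoted without its measurement point tells you very little.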

I have measured this number in a variety of organizations, starting with the defect rate of the fixes shipped to the customer, with rates varying from 0 percent to

About the author

Ed Weller

Ed Weller is an SEI certified High Maturity Appraiser for CMMI® appraisals, with nearly forty years of experience in hardware and software engineering. Ed is the principal of Integrated Productivity Solutions, a consulting firm that is focused on providing solutions to companies seeking to improve their development productivity. Ed is a regular columnist on StickyMinds.com and can be contacted at edwardfwelleriii@msn.com.
