Running Down Assumptions

Summary:

Do you think the assumptions you make about your software project are important? I do. One of the biggest sources of software project failure is hidden assumptions, especially about your requirements. These assumptions have a way of coming out of the woodwork, usually at the worst possible moment, to foil your projects. But there are ways to track down and expose assumptions.

Recently, as I was watching a cheap action flick, I was astounded to hear the chief villain utter what I consider to be one of the most important principles of system design. He said, "Assumption is the principal source of system design failure."

Okay, he didn't say it exactly like that. What he said was a bit coarser and quite a bit more profane, but it amounts to the same thing. He repeated this catchphrase over and over during the film. Of course, in true Hollywood style, he eventually fell victim to the very principle he was espousing all along. I believe the assumption that got him was "I have a foolproof plan." His final words were "I didn't think of that."

As with our villain, assumptions can be dangerous to software projects. In software design organizations, I've seen two common patterns of assumptions that lead to system design failure. The danger is compounded by the trait they share: the assumptions are unstated.

The first kind is hidden, false assumptions. This is where everybody assumes something to be true that isn't. A popular example is "Enough people will really like our great idea, and they'll pay us lots of money to use it."

The second kind is ambiguous assumptions, where key stakeholders are making different assumptions about an important issue and nobody knows it. A prime example is where development management thinks it has produced an early prototype, while marketing is out pushing it as a complete solution.

What assumptions are you making about the software you're producing? In my requirements workshops, I encourage participants to surface and document every single assumption that any critical player might be making, and then go off and make sure that those assumptions are both shared and true. Does that sound hard to do? There are some highly effective heuristic techniques, such as context-free questions, that make the assumption hunt far more likely to turn up what's hiding.

Some of the context-free questions I always ask are "What problems are we really trying to solve?" and "What problems might a highly successful solution create?" Answers to these questions can be eye-openers.

Although they aren't especially difficult, assumption expeditions can take some organizing. Despite the effort, participants usually confess that finding even one important assumption made the time spent worthwhile.

One way to start the hunt is to make it part of a requirements-discovery workshop. It's important that enough of the right stakeholders participate. You need a good cross section representing groups with differing interests, such as development, marketing, QA, and possibly support, manufacturing, legal, or others. The idea is to get people who will have to face the consequences of false or ambiguous assumptions in the same room to discover and examine their assumptions together.

But don't just stop with the hunt. Once you document your set of assumptions, somebody needs to manage them for the rest of the project. Expect to come across more assumptions that will need to be resolved. Be prepared for the exciting discovery that some assumptions you thought were true will, in fact, turn out to be false. If you fail to give somebody the job of managing your assumptions, I guarantee they'll sink back into the woodwork, hiding there until they become a crisis.
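You don't need special tooling for an assumptions log; a spreadsheet or a flip chart on the wall works fine. For teams that prefer code, here is a minimal, hypothetical sketch in Python of what such a log might look like. The fields, statuses, and sample entries are illustrative choices of mine, not a prescription:

    from dataclasses import dataclass
    from enum import Enum

    class Status(Enum):
        UNVERIFIED = "unverified"   # surfaced, but nobody has checked it yet
        CONFIRMED = "confirmed"     # checked and found to be true
        FALSE = "false"             # checked and found to be false; act on it

    @dataclass
    class Assumption:
        statement: str              # the assumption, stated explicitly
        owner: str                  # who is responsible for verifying it
        status: Status = Status.UNVERIFIED

    def unresolved(log):
        """Return every assumption nobody has verified yet."""
        return [a for a in log if a.status is Status.UNVERIFIED]

    # Two hypothetical entries from a workshop
    log = [
        Assumption("Customers will pay to use our great idea", "marketing"),
        Assumption("The early prototype is a complete solution", "dev lead"),
    ]

    for a in unresolved(log):
        print(f"Still unverified: {a.statement} (owner: {a.owner})")

Whatever form the log takes, the point is the same: every assumption gets an owner and a status, and the unverified ones stay visible instead of sinking back into the woodwork.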

At which point you, like our B-movie villain, might be heard saying, "I didn't think of that," as your project is foiled.

About the author

Brian Lawrence

Brian Lawrence is an author and a consultant who teaches and facilitates requirements analysis, peer reviews, project planning, risk management, life cycles, and design specification techniques. He is currently serving as the technical editor of Software Testing and Quality Engineering Magazine, and is on the editorial board of IEEE Software. Brian is a participant in Jerry Weinberg's 1996 Software Engineering Management Group and is a member of the ACM and the IEEE Computer Society. He is also an instructor at the University of California Santa Cruz Extension program in software engineering.
