Three Types of Requirements for Testing

Summary:

Requirements for software are usually grouped into a bewildering array of categories. Functional and nonfunctional requirements sit at the top, with a huge number of subcategories underneath. Here, Clint Hoagland boils it down to three categories, differentiated by the way they should be tested.

Let’s say you’re a new tester. You start your first day on the job, you are introduced to the team, and you are shown to your desk. You boot up your computer and meet your adversary: the software you are meant to test. Accompanying that software is a set of requirements that will guide you in your task.

A quick Internet search for “types of requirements” brings up various systems for categorizing requirements, including Hewlett-Packard’s FURPS+ model and the one advanced by the IEEE. These models can be helpful to those who gather requirements, but they’re not all that useful to a tester. For testers, I propose a different, much simpler system in which requirements are categorized by the way they should be tested. For testers, there are really only three categories: explicit, implicit, and latent requirements.

Explicit Requirements: The Things You Wrote Down
Our first type of requirement is the explicit requirement. This is the simplest type and the easiest to test. Explicit requirements are most commonly found in documents communicated by stakeholders to the development team. They might take the form of an elaborate design specification, a set of acceptance criteria, or a set of wireframes.

Is the software what was described by the explicit requirements? If so, that’s good. If it’s not, then that’s bad—log a bug. Simple, right?

What about cases where there isn’t a thorough specification? How do you know whether something is a bug or not?

The first thing to remember is that even when there isn’t an explicit design document, there will always be explicit requirements lying around somewhere. Look for explicit requirements in the form of claims—that is, communications to end users about things the software can do. These can typically be found in user documentation or in marketing materials.

The other thing to remember is that explicit requirements are only a piece of the puzzle.

Implicit Requirements: The Things Your Customers Will Expect
Implicit requirements are the second type. These are all the things that users are going to expect that were not captured explicitly. Examples include performance, usability, availability, and security. Users expect that their password will not be stored in plain text; that requirement need not be written down by anyone.
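
The plain-text-password expectation can be expressed as a check even though no one wrote it down. The following is a minimal sketch; the `UserStore` class is invented for illustration, and SHA-256 is used only to keep the example self-contained (real systems should use a salted, adaptive scheme such as bcrypt or Argon2):

```python
import hashlib

# Hypothetical user store that hashes passwords on save. Plain SHA-256
# is for illustration only; production code needs a salted, adaptive hash.
class UserStore:
    def __init__(self):
        self._records = {}

    def save(self, username, password):
        # Store a digest, never the raw password.
        self._records[username] = hashlib.sha256(password.encode()).hexdigest()

    def stored_value(self, username):
        return self._records[username]

# The implicit requirement, expressed as a check: whatever lands in
# storage must not be the password itself.
store = UserStore()
store.save("alice", "hunter2")
assert store.stored_value("alice") != "hunter2"
```

No stakeholder document needs to state this; the test encodes what every user would assume.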

Consider a cloud-based storage product that lets you store your files online. The product gets a new explicit requirement: Users should be able to share private content with other users via URL, using a share button. During testing, however, it is discovered that by modifying a value in the generated URL, other users can view all of the sharing user’s private content. This violates an implicit requirement that only shared content should be accessible to other users, resulting in a show-stopping bug.
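
The shape of that URL-tampering check can be sketched in code. The `SharingService` below is invented for illustration; the point is the pattern: take a legitimate share identifier, mutate it, and confirm access is denied rather than granted.

```python
# Hypothetical sharing service: content is only reachable by non-owners
# if it has been explicitly shared, not merely because its id is known.
class SharingService:
    def __init__(self):
        self._content = {}    # content_id -> (owner, data)
        self._shared = set()  # ids that were explicitly shared

    def add(self, content_id, owner, data, shared=False):
        self._content[content_id] = (owner, data)
        if shared:
            self._shared.add(content_id)

    def fetch(self, content_id, requester):
        owner, data = self._content[content_id]
        # The check the buggy product skipped: a valid-looking id
        # is not the same thing as a shared id.
        if requester != owner and content_id not in self._shared:
            raise PermissionError("not shared")
        return data

svc = SharingService()
svc.add("doc-1", owner="alice", data="public notes", shared=True)
svc.add("doc-2", owner="alice", data="private diary", shared=False)

# Shared content is reachable via its id...
assert svc.fetch("doc-1", requester="bob") == "public notes"

# ...but tampering with the id must not expose private content.
try:
    svc.fetch("doc-2", requester="bob")
    leaked = True
except PermissionError:
    leaked = False
assert not leaked
```

A tester probing the real product would perform the same mutation against live share URLs rather than against a toy class, but the expectation being verified is identical.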

Implicit requirements are sometimes called “nonfunctional” requirements, although I find that usage confusing. It’s certainly possible to capture business expectations about any of those “nonfunctional” areas explicitly, at which point they can be treated like any other explicit requirement.

Latent Requirements: Things That Will Delight Your Customers
Lastly, we have latent requirements. Latent requirements represent behaviors that users do not expect based on their previous experiences but which will make them like the software more. An example: My bank has an animated transition when I transfer money between accounts. I didn’t expect it to do that, but it does help me understand that I was successful, and it looks cool, so I’m delighted. Another example would be cloud-sync in gaming: When video games started allowing users to access their saved game files on any computer, users were surprised and delighted by that feature.

Some websites will auto-complete your username when you start to log in. Some will not. This is an example of a latent requirement that is, over time, becoming an implicit requirement.

Testing the Three Types of Requirements
Why break requirements down in this fashion? Is this a useful way to look at software?

For testers, breaking down requirements in this manner is useful because explicit, implicit, and latent requirements must all be treated differently with regard to the way they are tested and the ways in which failures to meet those requirements are handled. Let’s tackle each of them in turn.

Testing: Explicit Requirements
Any time you’re comparing software to any kind of written documentation, you’re testing using explicit requirements. Remember, though, that an explicit requirement usually implies one or more negative checks that should also be performed.
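
To make the idea of implied negative checks concrete, here is a minimal sketch. The requirement and the `validate_username` function are invented for illustration; suppose the explicit requirement reads "usernames must be 3 to 20 characters":

```python
# Hypothetical validator for the invented requirement:
# "usernames must be 3 to 20 characters".
def validate_username(name):
    return 3 <= len(name) <= 20

# Positive checks: the stated behavior works, including at the boundaries.
assert validate_username("abc")
assert validate_username("a" * 20)

# Negative checks implied by the same requirement: values just outside
# the stated range must be rejected, even though the requirement never
# says so in as many words.
assert not validate_username("ab")
assert not validate_username("a" * 21)
```

One written sentence yields at least four checks; the negative ones are just as much a part of the explicit requirement as the positive ones.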

When the software fails to match an explicit requirement, first examine whether it’s the software or the documentation that needs to change. It could be one or the other, or even both. If the software is reaching the testing stage without matching its explicit requirements, it’s worth taking a step back and examining your team’s process, too. Verification of explicit requirements should be handled by the developer writing the code, ideally by creating a set of automated checks demonstrating that those requirements have been satisfied.

Testing: Implicit Requirements
Testing for implicit requirements is a lot trickier, both on the bug discovery side and the bug reporting side. It also represents a tester’s best opportunity to help the development effort.

To test for implicit requirements, a tester must become an expert in the customer’s problem domain and in the technology the software uses to solve those problems. It’s also difficult to demonstrate “coverage” when testing for implicit requirements, though using heuristic testing methods can help.

When the software fails to match an implicit requirement, a report of that failure must also include an explanation of why a customer would expect the software to behave differently. What is the bug’s impact in terms of its effect on a customer’s experience?

Testing: Latent Requirements
Testing for latent requirements is the trickiest of all because it’s impossible to guess what those requirements will be until you get your hands on the software. To test for latent requirements, testers must deeply understand the customer’s preferences, while still keeping in mind that they are not the customer.

UX research techniques can help you increase your understanding. Find out how the customers are actually using the software, and use that information to design scenario tests to discover latent requirements. Remember, too, that end users don’t always know what’s possible and might not ask for everything that can make them happy.

When the software fails to match a latent requirement, that failure represents an opportunity to improve the software. It also represents unplanned work. Rather than treating these opportunities as bugs, find a way to work them into your team’s regular planning process. A team with a fluid mechanism for incorporating new latent requirements will produce a more satisfying product and happier customers.

Testing Is More Than Checking the Explicit Requirements
It can be seductive to see the testing job as a comparison between a specification and the actual software. However, even in situations where a specification exists—and one may not exist—testers must also consider implicit and latent requirements that aren’t written down. An expert tester is one who deeply understands those implicit and latent requirements and how to test for them.

User Comments

Kathy Iberle:

The FURPS and IEEE models can be very useful in test planning, because they provide a heuristic for finding those implicit requirements. A list of possibly relevant attributes helps the test planner be both fast and thorough. Many experienced testers have a favorite list, some based on requirements models and others based on a very similar concept often called "test types".

Here's an example: http://kiberle.com/wp-content/uploads/2016/01/2000-StepbyStepTestDesign.pdf

The same article is at: https://www.stickyminds.com/better-software-magazine/step-step-test-design but the sidebar about test types isn't there.

May 3, 2016 - 1:22pm
