Communicating with Context

Summary:
Danny Faught and Michael Bolton have observed that context-driven testers have a uniquely productive style of communication that they use amongst themselves. This context-driven culture might seem strange to outsiders or newcomers to the community. In this column, Danny and Michael highlight some of the specific traits that can help you communicate better, whether you consider yourself context-driven or not.

Many disagreements can be attributed to communication problems. The worst cases are those in which we don't even realize that our message isn't getting across the way we intended, and that's often because we're working in different contexts. Here are a few lessons that we've learned from years of discussions with context-driven testers.

Context-Driven Principles
In his blog, James Bach writes, "'Context-driven' means to match your solutions to your problems as your problems change." In the context-driven community, we don't believe that there are best practices; that is, there are no approaches that work universally for every project. It's fine to discuss practices that have worked in a particular context, but it's important to identify the aspects of the context that might be related to the success or failure of the practice, and how the context may color the ways in which the advice is given or received.

What's the Context?
There are dozens of possible dimensions of context that might vary from one development project to another. The context-driven principles state that people, working together, are the most important part of any project's context. Other key aspects of context include the product, customers, development group, schedule, budget, and resources. This isn't an exhaustive list by any means, and sometimes contextual elements can be subtle. In one company that Michael worked with, a change in the corporate email client had a profoundly negative impact on the company culture. eXtreme Programmers suggest that the arrangement of the workspace can make huge differences in team dynamics. Recognizing such differences and how they might matter is an important skill for people who are giving and getting advice.

What's the Problem?
A common trap is to argue about solutions without considering what the problem is in the first place. For example, someone who needs to efficiently execute a suite of functional tests through a Windows GUI every night might innocently ask "What are the best test tools?" That person might be surprised by recommendations for a unit test automation tool or a test management tool for manual test cases because he didn't mention the problem he's trying to solve.

Leaving the problem statement out of the discussion can limit the range of solutions that people offer, or can lead people to propose solutions to problems that you don't have. Open yourself up to creative solutions by keeping the problem in the open.

Avoid Hyperbole
Be careful about saying "always," as in "Always automate your tests." From a context-driven point of view, an absolute suggestion would have to make sense in all possible contexts. To make such a suggestion credibly, you would have to understand all of those contexts. Even if you had this kind of omniscience, it would be difficult to convince anyone else that you did.

Often when people say "always," they are referring to a particular context that isn't stated explicitly. If you're not sure what that context is, trying to find counterexamples to the assertion you're discussing can be helpful in framing the context. For example, when faced with the edict "Always automate your tests," you might ask:

  • If we're a week away from a big release and we've never automated a test before, should we start automating now? (Timing is part of the context.)
  • Should we automate usability tests? (The context might involve just one of many types of testing.)
  • Should we automate tests against an interface that is changing frequently? (In this context, automation may have a poor return on investment.)

Be Precise
When context-driven people talk to each other, we often ask clarifying or definitional questions. Why all the fuss about definitions? Because it's hard to agree on anything when we don't even agree on what we're talking about. As Toronto-based risk consultant Ian Heppell suggests, "All arguments are really about premises," even though they may look like they're about conclusions.

Instead of saying, "Always automate your tests," you might say, "Using automated unit tests has helped me quickly find coding problems on three of my last four projects. On one project, though, the other developers didn't help maintain the tests, and it took too much effort for me to maintain all the tests myself." Someone may ask, "What do you mean by 'unit test'?" Instead of pretending that there is universal understanding and consensus on what constitutes a unit test, you may offer, "All the tests that a developer runs, even if they're not at the unit level." When you are talking to people who are working in a different context, there's a risk that you might be using the same words to describe different things. In the context-driven community, "What do you mean by . . . ?" is a question that we expect to hear and to ask, and to answer.
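To make the definitional point concrete, here is a minimal, hypothetical sketch in Python (the function and test names are invented for illustration, and it relies only on the standard library's unittest and tempfile modules). Both tests are "tests that a developer runs," but only the first is a unit test in the stricter sense of exercising one function in isolation:

    # Hypothetical sketch: why "unit test" is worth defining.
    import tempfile
    import unittest


    def word_count(text):
        """Count whitespace-separated words in a string."""
        return len(text.split())


    def word_count_in_file(path):
        """Count words in a text file on disk."""
        with open(path) as f:
            return word_count(f.read())


    class WordCountTests(unittest.TestCase):
        def test_word_count_unit_level(self):
            # Strict unit test: no I/O, one function, one isolated behavior.
            self.assertEqual(word_count("to be or not to be"), 6)

        def test_word_count_against_a_real_file(self):
            # Also a developer-run test, but it touches the filesystem, so
            # some people would call it an integration test, not a unit test.
            with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
                f.write("one two three")
                path = f.name
            self.assertEqual(word_count_in_file(path), 3)


    if __name__ == "__main__":
        unittest.main()

Whether the second test "counts" as a unit test is exactly the kind of question that a quick "What do you mean by . . . ?" can settle before a discussion goes astray.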

In an online discussion, Danny said, "I'm working with a startup that has issued some of their releases without doing any testing." Michael responded as follows:

If testing is "questioning a product in order to evaluate it," did they really not do any testing? Perhaps you mean that they didn't have any people called "testers" on the project, or that they didn't schedule some time for a "testing phase," or that they didn't do testing off the developers' machines, or that they didn't do very good testing. Can you tell us more?

Danny, to his credit, didn't take this personally. He recognized that his statement had been vague and clarified that he meant that no testing specialists had tested the product before its release. It can be surprising that a word like "testing," so central to our work, can carry so many different meanings when we don't make the effort to be specific in how we use it. Be wary of words that are likely to be interpreted differently by different people. In the example above, if Michael had made assumptions about what "testing" meant instead of asking for clarification, the rest of the discussion might have been unproductive.

Keep It Real
When we are discussing the merits of a particular practice, context-driven testers value real examples from personal experience. There is a great temptation to talk in terms of "what ifs," but it can be helpful to sort your anecdotes into one of three bins: 1) things I did or observed, 2) things people told me about, and 3) things I think might have happened or might happen in the future. It can be humbling to realize that much of what you're trying to say is conjecture if you don't have a personal experience to back it up. It's OK to talk in conjectures, as long as you make it clear that that's what you're doing.

Don't Forget Your Context
To communicate well, remember that you're not omniscient, and neither are your listeners. If you make the extra effort to frame your statements well, your message will come across more clearly. After all, it's usually cheaper and easier to say it right the first time.
