What to Do with Too Much Test Documentation: An Interview with Fiona Charles

[interview]
Summary:

Fiona Charles is a Toronto-based test consultant and manager with thirty years of experience in software development and integration projects. In this interview, Fiona discusses excess test documentation and what to do about it as well as the art of delivering bad news.

Jonathan Vanian: This is Fiona Charles. Fiona, thank you so much for being here with us today.

Fiona Charles: You're welcome. Thank you.

JV: Why don't you just talk a little bit about yourself, your career so far, and just explain your expertise to some of our listeners and readers?

FC: I'm a consultant and sometimes contract test manager. I've been in the industry since 1978, so I've been around for a while. Mostly in the last few years, I've either been consulting on testing and test management or I've been managing testing, as I said. Typically, I manage testing on large multi-project programs.

JV: Ok.

FC: And I do rescues of testing efforts that have gone astray. I look for problems to solve.

JV: You look for problems to solve. Is that a good definition of a tester?

FC: Oh, I think so. Having been doing it and been around testing for a long time, I make sure that that's what I do, so I don't do boring stuff.

JV: (Laughs). Let's talk a little bit about test documentation and how it can sometimes bog down testers. Can you talk about an example where a team might be overwhelmed with documentation?

FC: I certainly can. I have a lot of examples, but I'll talk about one. I was brought into a project to "fix the testing," and when I got there, what I found was that the testers had been ... There were actually two leads and a couple of testers. They hadn't had very much to test, and they'd been sitting around for about a year, and they'd been churning out documents because that was what their management wanted them to do and it was what the standards set by their testing practice wanted them to do.

JV: Right.

FC: I can't remember how many documents now, but there was a strategy. There was a master test plan. There were a bunch of plans, because there was a plan for this and a plan for that and a plan for what happens when we get this. Just about every single one of those plans was twenty-five pages at least, and I started keeping count because it became quite consistent. Around about page twelve or thirteen, you got this tiny, tiny little bit of content about the project, but up to then, it was boilerplate. It was definitions. It was a whole lot of nonsense that had nothing whatever to do with the project. Then you'd get past that little bit of content, and there'd be more stuff.

JV: A lot of busy work.

FC: It is really busy work, and it's a terrible waste of time. Not only is it a waste of the tester's time and expertise, it's a waste of everybody else's time. You ask people to read this stuff because you want them to buy in. You want them to understand what it is that you're doing. They can't read it, and they don't read it. In this particular case, all of this stuff was completely ignored.

They had a strategy document. It was fifty pages long, and it had no strategy whatsoever in it, and it wasn't the testers' fault. It was because of what they were being asked to do, and I find that's very common. It's a result, I think, of certain kinds of standards. The IEEE standards are the classic ones that are typically adopted in organizations. That stuff really isn't appropriate for a lot of what people are doing, and those standards tend to focus on documentation, not on testing.

JV: How did this happen that those are no longer appropriate? Was there a time when it was appropriate or is this a different era that we're in now?

FC: Oh, it's partly that, but I think it's also that people want to control testing. They want to control testing in a way that they don't particularly want to control programming. You don't say to a programmer, "Sit down and document everything that you're going to do before you do it."

JV: Right.

FC: We ask people for designs, but we don't expect them to document every point and click. Somehow, we expect that of testers. Well, I say "we," but I don't. Management expects that of testers, and that's absurd. I was in an organization where I said we were doing fairly lightweight documentation and we weren't writing down the expected results, because the testers know what to expect; they'll interpret and make a call based on whether what they see is reasonable, on what they learn about the system, and so on. The manager said to me, "You're putting too much on the testers," to which the only answer has to be, "Is it putting too much on the programmers to expect them to program without those explicit directions?" The testers are just as bright. They're just as skilled, and if they're not, why not? Let's make sure they are.

JV: That's so interesting. I wonder if management just has a different view of testers than it does of programmers. Are there certain products or industries where mountains of test documentation just can't be avoided, or is it more or less the same across all industries?

FC: I think that the amount of documentation, the kind of documentation you need to produce for testing, is entirely context dependent. The one overriding principle is that the artifacts have to serve the work, i.e., they have to serve the testing work, not the work serve the artifacts. So if you're in a highly regulated industry, if you're in an industry where it's extremely important that there be evidence of the kind of testing you've done, then think about what is appropriate evidence for that.

It's not necessarily huge numbers of plans and documents done beforehand. It may simply be that you do a video of your testing or you keep notes about your testing. You demonstrate the results of your testing in various ways. I think some of this comes, as I say, out of people who want to control what testing is done, but I think it also comes out of a misunderstanding about what kind of documentation you need necessarily.

JV: How do you know whether or not you're spending too much time on test documentation to begin with? I mean, is it just you come out and you're like, "Wow, we haven't done anything, but we have filled out fifty pages?"

FC: I wrote a StickyMinds article about this a while ago. Your product, as a tester, is information for your stakeholders. If you're spending more time writing a whole lot of documentation, which is not your product, and not spending that time on testing, then clearly the balance is not there.

JV: Are there any easier ways for testers to notify stakeholders of these testing efforts without having to provide so much paperwork?

FC: Well, they can talk to them.

JV: They can talk to them.

FC: That's a really good start. I use the word "artifacts" deliberately, because a prose document, and often we're looking at acres and acres of continuous prose, is not always the best artifact. It could be a diagram that's annotated to demonstrate what you're going to do. I've used that often for test strategies. You could use mind maps for certain kinds of test documentation. You can do all kinds of things. You can use a wiki.

It really depends on what the requirements are of your situation. If you look at the standards, again, going back to the IEEE standards, typically some of those requirements come out of a need to demonstrate due diligence in testing. Well, you don't always need to demonstrate due diligence. If you are in a context where you might be called into court or where your software is particularly high risk, then sure, you do need to demonstrate that you've exercised your best professional judgment in how you test and that you have, in fact, carried through on what you said you would do, but again, you don't have to have acres and acres of continuous prose to do that. There are lots of options.

JV: Got you. We're winding down a little bit on our time, but I want to get a couple of questions in on how to deliver bad news, because it's another area you talk about, and it segues nicely from what we've been discussing. Let's say you have to deliver some bad news to some of these stakeholders. Can you describe a situation of how delivering bad news in a careless way can lead to a really negative outcome?

FC: I could give you an example ...

JV: I'd love to hear it.

FC: In fact, I talked about this. I did a webinar on this topic this morning, before yours started. A tester in a large organization that does software for the consuming public was at a product launch, a product-launch party. So here's this big celebration in this big company, with the vice-president who was the executive sponsor for this particular product, plus a whole lot of other vice-presidents ...

JV: A big deal.

FC: ... and making the speech and everybody's happy. An innocent and terribly honest tester blurts out, "But the thing's full of holes."

JV: Oh, man (laughs).

FC: Well, yes, true, but this is not the right forum for that.

JV: Right. I can tell that. That's a careless and awkward way to express that sentiment.

FC: I mean, "careless" is a bit harsh, but it is, in fact. It's not knowing better.

JV: Not knowing better. Yeah, there's a better way of saying it. What are some tips you can give our readers and listeners when they're preparing to deliver some bad news? Obviously, don't do it when all the vice-presidents are around.

FC: Well, pick where and when you're going to have the conversation. Pick a quiet place and don't ambush. If you have an unwelcome message to deliver, don't deliver it in a way that ambushes the other person. Don't corner the other person. Don't embarrass them or unnecessarily annoy them.

JV: People could feel that they're being attacked.

FC: Exactly. Think about what the message is and how you're going to substantiate it. Think about why you want to deliver it in the first place and what good outcomes you're trying to achieve and what bad outcomes you're trying to avoid or prevent. Think also about who it is you should talk to. It might not be the obvious person. It might be someone else. It might be someone who could be a better ally for you. It might be that you need to go over somebody else's head.

So you really need to think about that very carefully, and then, of course, how are you going to say it and how are you going to deliver it. Are you going to just sit in a chair and talk, or are you going to stand up at a white board and demonstrate with perhaps a diagram or a timeline or some other thing some of the things you're talking about? So, there's quite a bit of preparation, I think, that you can do.

JV: Once again, that's context. Given the context of the situation, you can sort of figure out the best way to say it. Get it quiet. Put a white board out. Maybe that's the best way to go, or maybe if you're dealing with someone who's a little bit more hotheaded, you talk in a little easier manner, I suppose.

FC: Oh, you need to watch your body language, too. You don't want to come on too strong and look aggressive. At the same time, you don't want to look like a wuss and as if you can be bullied. As I said, I gave a webinar on this this morning, and it'll be up on the EuroSTAR site. If people want to know more, they can go there and have a look.

JV: Very well. All right, Fiona. I think that about does it for our time. Thank you so much for being here with us. It was great having you on; a very informative conversation.

FC: Thank you, Jon.

About the author

Fiona Charles is a Toronto-based test consultant and manager with thirty years of experience in software development and integration projects. Fiona is the editor of The Gift of Time, featuring essays by consultants and managers of various professions about what they've learned from Gerald M. Weinberg.
