A Small Experiment

[article]
Summary:

Lisa Crispin's team recently faced a slowdown in the flow of requirements for a project. They were waiting on the product owner, who was swamped, but he was waiting on someone, too. To beat the waiting game, they tried a small experiment involving smiles, frowns, and deadlines. In this article, she describes how the experiment turned out and suggests looking to experimentation when things get tricky.

If you asked anyone on my team what agile practice is most responsible for our success over the past eight years, I bet that person would say, "Retrospectives." At the start of every two-week sprint, we spend time talking about the previous sprint, identifying areas that need improvement, and thinking of ways to overcome obstacles. But I wonder if it's not so much the retrospectives themselves leading to our success as the "small experiments" (to borrow Linda Rising's term) we try in order to address our problem areas.

Here's a recent example: Our product owner (PO) is awesome, but like many POs, he has many responsibilities and not enough time. Years ago, he came up with the idea of "story checklists." Before each iteration, he prepared a checklist for each user story, following a template that included information such as mockups for new UI pages or reports, whether a new story affected existing reports or documentation, whether third parties needed to be involved, and high-level test cases. This helped us get off to a running start with each story.

As our PO became burdened with more responsibilities, he started to run late on preparing the story checklists. The downward slide started slowly. At our sprint planning, he'd say, "Oh, I am still working on the checklist for this one story, but I'll have it ready soon." Or, "I'm waiting to hear from the head of sales to get the final requirements for this; I'll let you know as soon as I know." We're agile, we're flexible, and we have a lot of domain knowledge, so we felt we could cope.

But the one missing story checklist soon turned into two, then three. After a while, we weren't getting any story checklists, ever. We discussed each story with our PO at our sprint planning meetings and wrote requirements and high-level tests on the whiteboard, but that whiteboard also had outstanding questions for each story. We'd start working on the story with the best information we had, but then there would be changes. We spent a lot of time going back and forth to find the PO, ask questions, and update the requirements as they changed or were finalized. We still got our stories done, but it was costing the company more and slowing us down.

The PO had no motivation to reverse this change. It wasn't even his fault; he was usually waiting for other people.

Our frustration mounted until, finally, at a retrospective, we decided we had to do something about this problem. The company was spending extra money to finish each story, simply because the business people were not getting their ducks in a row before each iteration began. We decided to try an experiment.

We had recently begun using a product called MercuryApp to record our feelings about the progress of the sprint every day. (That is another experiment—a better way to keep track of how things go so our retrospectives can be more productive.) MercuryApp lets you rate your feelings on a five-point scale from a very sad face to a very happy face. This gave us an idea. At the end of our sprint-planning meeting, we put a "rating face" next to each story on the whiteboard. If we didn't have any requirements, we put a very sad face. If we had all the requirements we needed to complete the story, we put a very happy face. Most were somewhere in between—a sort-of-sad or sort-of-happy face, or a "meh" face.

About the author

Lisa Crispin

Lisa Crispin is the co-author, with Janet Gregory, of Agile Testing: A Practical Guide for Testers and Agile Teams (Addison-Wesley, 2009), co-author, with Tip House, of Extreme Testing (Addison-Wesley, 2002), and a contributor to Beautiful Testing (O'Reilly, 2009) and Experiences of Test Automation by Dorothy Graham and Mark Fewster (Addison-Wesley, 2011). She has worked as a tester on agile teams since 2000 and enjoys sharing her experiences via writing, presenting, teaching, and participating in agile testing communities around the world. Lisa was named one of the 13 Women of Influence in testing by Software Test & Performance magazine in 2009. For more about Lisa's work, visit www.lisacrispin.com.
