
Articles

Harvesting Stakeholder Perspectives to Organize Your Backlog

When Mary Gorman and Ellen Gottesdiener facilitated a game called The Backlog Is in the Eye of the Beholder for the Boston chapter of the International Institute of Business Analysis, both the players and the facilitators learned some important lessons in organizing a project requirements backlog. In this article, they describe the game and what it revealed, including the value of truly knowing your stakeholders.

Building Team Trust, Front to Back

Trust is more than a feeling. On a project, it is something that can be grown through careful planning and the development of good requirements. Ellen Gottesdiener describes three types of trust that can be built through good requirements and team management.

Ellen Gottesdiener
Acceptance Test-Driven Development: Not as Optional as You Think

The components of software processes work together in important and sometimes unrecognized ways. The removal of one of those components will affect the others. In this article, which originally appeared in the August 2010 issue of the Iterations eNewsletter, Jennitta Andrea takes a look at the value of acceptance test-driven development and the costs of making it an optional practice.

Jennitta Andrea
Slicing Requirements for Agile Success

Agile teams need to analyze product requirements in enough detail to build, test, and deliver the right requirements in short time frames. For the many teams that struggle to define "just enough, just in time" requirements, here's help.

9 Questions You Must Ask When Selecting the Right Tool and Vendor

The key to selecting the best vendor and tool is asking the right questions. The answers to these nine essential questions can mean the difference between satisfaction with your purchase and a giant waste of time and money.

Joe Townsend
Avoid Failure with Acceptance Test-Driven Development

One of the major challenges confronting traditional testers in agile environments is that requirements are incrementally defined rather than specified at the start. Testers must adapt to this new reality to survive and excel in agile development. C.V. Narayanan explains the Acceptance Test-Driven Development (ATDD) process that helps testers tackle this challenge. He describes how to create acceptance test checkpoints, develop regression tests for these checkpoints, and identify ways to mitigate risks with ATDD. Learn to map acceptance test cases against requirements in an incremental fashion and validate releases against acceptance checkpoints. See how to handle risks such as requirements churn and requirements that overflow into the next iteration. Using ATDD as the basis, learn new collaboration techniques that help unite testing and development toward the common goal of delivering high-quality systems.

C.V. Narayanan, Sonata Software Ltd.
Executable Specifications with FitNesse and Selenium

Dawn Cannan, DocSite LLC
Meet "Ellen": Improving Software Quality through Personas

Users are the ultimate judges of the software we deliver because it is critical to their success and the success of their business. However, as a tester, do you really understand their tasks, skills, motivation, and work style? Are you delivering software that matches their needs and capabilities, or yours? Personas are a way to define user roles: imaginary characters that represent common sets of characteristics of different users. David Elizondo shares how his team at Microsoft defined and used one persona named "Ellen" to help them design, develop, and test the first version of a new product. He shows before-and-after examples of the product, illustrating how it changed when Ellen joined the team, along with the robust test cases and acceptance scenarios the team defined from the unique insights Ellen provided.

David Elizondo, Microsoft Corporation
The Many Hats of a Tester

As testers, we must wear many hats to do our job effectively. Quite often, it is the pith helmet of an explorer, hacking through the vines and darkness of the unknown, or the baseball cap of the crime scene investigator, determining how the failure occurred. To make things even more interesting, the hats we need often differ from project to project and organization to organization. Adam Goucher begins with a general discussion of some hats testers typically wear and when they are appropriate or inappropriate. He then leads an "Art Show" exercise, a brainstorming process resulting in lots of "art" on the walls, illustrating the hats we all may wear in our daily testing activities. Through the Art Show process, you'll take away new insights into which hats you and other testers need, tips for wearing the beautiful ones with success, and ways to avoid putting on the ugly ones.

Adam Goucher, Zerofootprint
Stop Guessing About How Customers Use Your Software

What features of your software do customers use the most? What parts of the software do they find frustrating or completely useless? Wouldn't you like to target these critical areas in your testing? Most organizations get feedback, much later than anyone would like, from customer complaints, product reviews, and online discussion forums. Microsoft employs proactive approaches to gather detailed customer usage data from both beta tests and released products, achieving greater understanding of the experience of its millions of users. Product teams analyze this data to guide improvement efforts, including test planning, throughout the product cycle. Alan Page shares the inner workings of Microsoft's methods for gathering customer data, including how to know which features are used, when they are used, where crashes are occurring, and when customers are feeling pain.

Alan Page, Microsoft


StickyMinds is a TechWell community.