Reviews

Articles

How to Test Your Website on Multiple Browsers: Four Solutions Compared

Robbie Bridgewater writes about the difficulty of finding bugs during testing, since no single computer can run all of the major browsers, not to mention the added challenge of testing the various mobile operating systems. In this article, Robbie compares four possible solutions to this dilemma.
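To make the problem concrete, here is a minimal sketch of just one approach in this space: driving the same smoke test through every locally installed browser with Selenium WebDriver. This is an illustration, not one of the article's four solutions; it assumes the Python selenium package (4.6 or later, so driver binaries are managed automatically), locally installed browsers, and a placeholder URL and title check.

    # Run one smoke test across several browsers (assumes selenium>=4.6 and
    # that Chrome, Firefox, and Edge are installed locally).
    from selenium import webdriver

    BROWSERS = {
        "chrome": webdriver.Chrome,
        "firefox": webdriver.Firefox,
        "edge": webdriver.Edge,
    }

    def smoke_test(driver, url="https://example.com"):
        # One cross-browser check: the page loads and has the expected title.
        driver.get(url)
        assert "Example" in driver.title, f"unexpected title: {driver.title!r}"

    if __name__ == "__main__":
        for name, factory in BROWSERS.items():
            driver = factory()      # launches the browser via its driver
            try:
                smoke_test(driver)
                print(f"{name}: OK")
            finally:
                driver.quit()       # always release the browser session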

Erle Bridgewater
How to Break Embedded Software: An Interview with Jon Hagar
Video

Thirty-year system software engineer and testing consultant Jon Hagar details the challenges that embedded software testing poses. Learn how risks should feed attacks, especially when maintaining the safety of devices like pacemakers and braking systems.

Noel Wurst
What to Review If You Can’t Review Everything

Reviews can be messy. Sometimes it’s hard to know where to start, particularly when you are in triage mode and can only review a small sample. Payson Hall shares a useful list of review criteria via a case study of a troubled software development project.

Payson Hall
Writing Test Rules to Verify Stakeholder Requirements

Some organizations employ business analysts who are very good at specifying requirements at the beginning of a software project. The advantage of this step is that it reduces ambiguity for the developer and tester about what should be delivered.
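As a hedged illustration of the idea (not taken from Quinn's article), here is one way a stakeholder requirement can be turned into explicit, table-driven test rules with pytest. The free-shipping requirement, the threshold, and the shipping_fee() function are all invented for the example.

    # Turn a stakeholder requirement into executable test rules (pytest).
    # Hypothetical requirement: "standard shipping is free for orders of
    # $50.00 or more."
    import pytest

    def shipping_fee(order_total: float) -> float:
        # Stand-in for the system under test.
        return 0.0 if order_total >= 50.00 else 5.99

    # Each row is one test rule: input, expected output, and rationale.
    RULES = [
        (49.99, 5.99, "just below the threshold still pays shipping"),
        (50.00, 0.00, "the threshold itself qualifies (boundary case)"),
        (120.00, 0.00, "well above the threshold ships free"),
    ]

    @pytest.mark.parametrize("total,expected,rule", RULES)
    def test_shipping_rules(total, expected, rule):
        assert shipping_fee(total) == expected, rule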

Brendan Quinn
Taking the Risk: Exploration over Documentation

The loudest voice in the room might push for a stable, predictable, repeatable test process that defines itself up front, but each build is different. An adaptive, flexible approach could provide better testing in less time with less cost, more coverage, and less waste.

Matthew Heusser
Test Documenting Over the Cliff

Unless you're in a test role where full, complete documentation is necessary in order to be federally compliant or to keep people from dying or losing a limb, attempting to document every little thing is a fool's errand. Software changes. A lot. With constant change, what we document one day may be obsolete the next.

Bonnie Bailey
Sprint Reviews that Attract, Engage, and Enlighten Stakeholders
Slideshow

Are you suffering from chronic disinterest in what your team is delivering? Are your product owners unavailable or distracted? Are your sprint reviews ho-hum experiences with low attendance? If you answered yes to any of these questions, your agile teams are in trouble, and you need to attend this session. Experienced agile coach Bob Galen explores real-world patterns for how to increase the interest in, and the energy and value of, your sprint reviews. First, Bob explains how to prepare properly, the keys to dry runs, and the role of a Master of Ceremonies. Then he examines ways to orchestrate proactive reviews that include the whole team and engage your audience when demonstrating "working software." Next, Bob discusses how to perform a review follow-up and gather feedback for high-impact improvements. Finally, Bob wraps up by exploring ways to make sprint reviews a centerpiece of your agile adoption and transformation.

Bob Galen, RGalen Consulting Group, LLC
Performance Appraisals for Agile Teams

Traditional performance evaluations, which focus solely on individual performance, create a “chasm of disconnect” for agile team members. Because agile is all about team performance and trust, the typical HR performance evaluation system is not congruent with agile development. Based on his practical experience leading agile teams, Michael Hall explores how measurements drive behavior, why team measurement is important, what to measure, and what not to measure. Michael introduces tangible techniques for measuring agile team performance: end-of-sprint retrospectives, sprint and project report cards, peer reviews, and annual team performance reviews. To demonstrate what he’s describing, Michael uses role plays to contrast traditional, dysfunctional annual reviews with agile-focused performance reviews.

Michael Hall, WorldLink, Inc.
Evaluating the Quality of Requirements: A Primer for Independent Verification

Would you tell your publisher to stop editing in the middle of your manuscript and publish your novel now? Of course not! Then why would you tell your QA/test team to stop identifying problems with requirements documentation? All deliverables, and especially requirements, should undergo a rigorous assessment based on their quality attributes and measurable evaluation criteria. Mark Haynes describes quality models and attributes he has used to evaluate requirements documents. He shows how you can detect imprecision that will haunt you later and remove it with a set of formal criteria and informal heuristics. Discover how you can use quality attributes, even subjective ones, to conduct a quality dialogue within the development team. Mark shares examples of poorly written requirements for you to evaluate and try your hand at improving.
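The flavor of such informal heuristics can be sketched in a few lines: scan each requirement statement for unmeasurable "weasel words." The word list and the sample requirements below are assumptions made for illustration, not Haynes's actual criteria.

    # Flag vague, untestable wording in requirement statements.
    import re

    VAGUE_TERMS = ["fast", "quickly", "user-friendly", "easy", "flexible",
                   "robust", "timely", "adequate", "as appropriate", "etc."]

    def audit_requirement(text: str) -> list[str]:
        # Return the vague terms found; an empty list passes the heuristic.
        return [t for t in VAGUE_TERMS
                if re.search(r"\b" + re.escape(t), text, re.IGNORECASE)]

    if __name__ == "__main__":
        samples = [
            "The search page shall be fast and user-friendly.",
            "The system shall return search results within 2 seconds "
            "at the 95th percentile.",
        ]
        for req in samples:
            hits = audit_requirement(req)
            print(("REVIEW: " + ", ".join(hits)) if hits else "OK", "|", req)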

Donald Haynes, Modis
Help Technical Support Help Themselves

This article discusses how testing teams can improve their test coverage and communicate better with technical support to uncover issues earlier, before product implementation. This kind of collaborative work can stop many defects from reaching production.

Ipsita Chatterjee

