Testers come from a wide range of backgrounds and have complex, multifaceted roles; people who test are not “just testers…” At present, many testers do not feel well supported by their tools. As my research uncovered stories of frustration, fear, and anger, I realized how illusory the role of usability is in tool adoption, and how important it is to understand who is actually using those tools.
When you’re designing a dashboard to track and display metrics, consider both the needs and expectations of the dashboard's users and the information that is available. Several qualities make a new dashboard a genuinely useful tool; to remember them, use the mnemonic “VITAL.”
Testers gather lots of metrics about defect counts, test-case execution results, and test velocity, but this information doesn't necessarily answer questions about product quality or how much money test efforts have saved. Testers can deliver more business value by combining test automation with regression analysis and using visual analytics tools to process the data and see what patterns emerge.
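As a rough illustration of that idea (a sketch of mine, not material from the article), the Python snippet below fits a simple linear regression to per-build defect counts and plots the trend so patterns stand out; the file name and column names are hypothetical.

    # A minimal sketch, assuming a hypothetical defects.csv with
    # "build" and "defects_found" columns.
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("defects.csv")

    # Simple linear regression: how is the defect discovery rate trending?
    slope, intercept = np.polyfit(df["build"], df["defects_found"], deg=1)
    trend = slope * df["build"] + intercept

    plt.scatter(df["build"], df["defects_found"], label="defects per build")
    plt.plot(df["build"], trend, label=f"trend (slope {slope:.2f})")
    plt.xlabel("Build number")
    plt.ylabel("Defects found")
    plt.legend()
    plt.show()

A downward-sloping trend line, for instance, is the kind of pattern a spreadsheet of raw counts hides but a plot makes obvious.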
It's easy to make simple mistakes in data analysis. But these little mistakes can result in rework, errors, and, in the worst case, incorrect conclusions that lead you down the wrong path. Making small process changes can help you steer clear of these mistakes and can have a real impact on both the time you spend and the quality of your results. Here are some tips for avoiding rookie mistakes in data analytics.
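One such small process change, sketched below as a hypothetical Python example (the file and column names are made up for illustration), is to validate your data before computing anything from it.

    # A minimal sketch: catch duplicate rows and missing values before
    # they silently skew your metrics.
    import pandas as pd

    df = pd.read_csv("test_results.csv")

    if df.duplicated().any():
        raise ValueError("duplicate rows found; deduplicate before analysis")
    if df["pass_rate"].isna().any():
        raise ValueError("missing pass_rate values; investigate before analysis")

    print(df["pass_rate"].mean())  # now a more trustworthy aggregate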
With priorities coming in from just about everybody, how in the world can you and your team prioritize? Brandon shows you some innovative techniques you can use to turn chaos into order. One surprising approach is simply to handle priorities on a first-in, first-out basis.
The loudest voice in the room might push for a stable, predictable, repeatable test process that is fully defined up front, but each build is different. An adaptive, flexible approach could provide better testing in less time and at lower cost, with more coverage and less waste.
Children can teach us some extremely profound things, often when we least expect it. Jennitta Andrea shares sage advice about project retrospectives that she learned while perusing the well-known children's stories on her daughter's bookshelf. These insights will help improve the way you plan, facilitate, and participate in project retrospectives.
Combinatorial testing is effective for testing multiple, non-sequential inputs that affect a common output in complex software. But it's easy to misapply the technique or to follow its output blindly. Learn to overcome its limitations and benefit fully from this technique.
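To make the technique concrete (this example is mine, not the article's), the Python sketch below enumerates every combination of three hypothetical test parameters; pairwise tools such as PICT can then shrink such a set while still covering every two-way interaction.

    # A minimal sketch using only the standard library; the parameters
    # and their values are hypothetical.
    from itertools import product

    browsers = ["Chrome", "Firefox", "Safari"]
    locales = ["en-US", "de-DE"]
    payments = ["card", "paypal", "invoice"]

    # Exhaustive combinations: 3 * 2 * 3 = 18 test cases. Pairwise
    # selection typically covers all two-way interactions with far fewer.
    for browser, locale, payment in product(browsers, locales, payments):
        print(browser, locale, payment)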
In this interview, Geoff Meyer, a test architect in the Dell EMC infrastructure solutions group, explains how test teams can succeed by emulating sports teams in how they collect and interpret data. Geoff also discusses how analytics can better prepare you for the changing nature of software.
In this interview, Daria Mehra, the director of quality engineering at Quid, explains how people can use machine learning to better contextualize data, details the complexity of test automation and how to be sure you have enough test coverage, and defines the term “artisanal testing.”
In this interview, Kevin McCaffrey, the founder and CEO of Tr3Dent, details why digital transformations have become so important in the software industry and why companies need to understand how to utilize the data they’re getting from internet of things devices.
In this interview, Isabel Evans and Julian Harty reveal an area of software where we're ignoring the user experience. They cover how we're not thinking carefully enough about the tool sets used on IT projects, and dig into why test tools are often put on the shelf.
Successful agile testers collaborate with programmers as code is written, isolating problems, troubleshooting defects, and debugging code all along the way to getting the product to done. But modern systems are scaling beyond what traditional teams can understand using familiar tools. A new appreciation for systems and complexity theory, along with disciplines and tools from emerging areas such as observability and resilience engineering, is offering ways for teams to actively debug their systems and explore properties and patterns they have not defined in advance. Chris will share the basics of the theory behind these new ideas, as well as some tools that support this type of work, and will show how dynamic analysis can be used to isolate and understand puzzling system behavior.
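As a toy illustration of dynamic analysis (my sketch, not Chris's material), the Python snippet below uses the standard sys.settrace hook to record which functions actually execute while reproducing a behavior; the traced functions are hypothetical stand-ins.

    # A minimal sketch: record the runtime call sequence so you can see
    # what the system actually did, not what you assumed it did.
    import sys

    calls = []

    def tracer(frame, event, arg):
        if event == "call":
            calls.append(frame.f_code.co_name)
        return tracer

    def helper(x):
        return x * 2

    def flaky_operation(x):
        # Hypothetical behavior that differs by input parity.
        return helper(x) if x % 2 else x

    sys.settrace(tracer)
    flaky_operation(3)
    sys.settrace(None)

    print(calls)  # ['flaky_operation', 'helper']

Real observability tooling works at much larger scale, but the principle is the same: gather evidence from the running system rather than reasoning only from the code.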
Sabermetrics turned the baseball world upside down by challenging decades-old measures of individual performance and their perceived linkage to team success. After cementing their legacy as the Lovable Losers for 108 years, the Chicago Cubs were able to leverage a data-driven...
Test code is generally developed with more lenient standards and less scrutiny than production code. As a result, rather than providing valuable feedback on software quality, tests can produce inconsistent results and false outcomes. Team productivity is...
Agile: a single word that sparked unprecedented confusion in the technology world. When it went agile, did your organization throw out its business analyst team? Has it banned all requirements documentation? Are teams struggling to see the big picture? Brian Watson has encountered...