Articles

Responsibly Reporting Performance Test Results: Trends, Noise, and Uncertainty

For performance test results to have value, you should report them in context. There are two main considerations: How do these results compare to previous ones? And how can we report on performance early while emphasizing that the results are preliminary and may change significantly as the work progresses? Here are some ideas for responsible reporting.

Michael Stahl
Reporting Automated Test Results Effectively

The modern iterative software development lifecycle has developers checking in code to version control systems frequently, with continuous integration building and running automated tests at an almost equally fast rate. This can generate an enormous amount of test data. Here’s how you can ensure you are reporting results effectively across your team and realizing all the benefits of that information; a rough sketch of one way to summarize such results follows below.

Ajeet Dhaliwal
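As a rough illustration of the kind of aggregation the article above alludes to, here is a minimal sketch, assuming the CI jobs drop JUnit-style XML files into a results/ directory, that rolls one run's output into a single pass/fail summary a team can scan at a glance. The file layout and the plain-text report format are assumptions for illustration, not the article's own approach.

```python
# Minimal sketch: summarize JUnit-style XML test results (assumed to live in
# results/*.xml) into one line a whole team can read. Uses only the standard library.
import glob
import xml.etree.ElementTree as ET

def summarize(pattern="results/*.xml"):
    total = failures = errors = skipped = 0
    for path in glob.glob(pattern):
        root = ET.parse(path).getroot()
        # JUnit output may or may not wrap suites in a <testsuites> element;
        # iter() matches the root element too, so both layouts are handled.
        for suite in root.iter("testsuite"):
            total += int(suite.get("tests", 0))
            failures += int(suite.get("failures", 0))
            errors += int(suite.get("errors", 0))
            skipped += int(suite.get("skipped", 0))
    passed = total - failures - errors - skipped
    print(f"tests: {total}  passed: {passed}  failed: {failures}  "
          f"errors: {errors}  skipped: {skipped}")

if __name__ == "__main__":
    summarize()
```

A summary like this is only a starting point; the article's argument is about making that information visible and useful to the whole team, not just producing it.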
Solving Production Issues Using Testing Tools

Standard web-monitoring tools can ping webpages and verify that they’re responding, but a page that responds isn’t necessarily working correctly. You can use load testing technology to monitor your sites by running an interactive script that detects issues and generates email alerts as needed. It runs constantly, like a silent sentry that never sleeps or takes a vacation, improving your sites' reliability; a minimal sketch of the idea appears below.

Nels Hoenig
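For flavor, here is a minimal sketch of that "silent sentry" pattern, assuming plain Python and a local mail relay rather than any particular load testing tool: poll a page, check that expected content is actually present (not just that the server responds), and send an email when the check fails. The URL, marker text, check interval, and addresses are placeholder assumptions.

```python
# Minimal sketch of a monitoring sentry: fetch a page, confirm a key piece of
# content is present, and email an alert on failure. All names are placeholders.
import smtplib
import time
import urllib.request
from email.message import EmailMessage

URL = "https://example.com/checkout"        # hypothetical page to watch
MARKER = "Place your order"                 # text that should appear when the page works
CHECK_INTERVAL_SECONDS = 300

def page_is_healthy():
    try:
        with urllib.request.urlopen(URL, timeout=30) as resp:
            body = resp.read().decode("utf-8", errors="replace")
            return resp.status == 200 and MARKER in body
    except Exception:
        return False

def send_alert():
    msg = EmailMessage()
    msg["Subject"] = f"Monitoring alert: {URL} failed its check"
    msg["From"] = "monitor@example.com"      # placeholder addresses
    msg["To"] = "oncall@example.com"
    msg.set_content(f"The page {URL} did not return the expected content.")
    with smtplib.SMTP("localhost") as smtp:  # assumes a local mail relay
        smtp.send_message(msg)

if __name__ == "__main__":
    while True:                              # runs constantly, like the article's sentry
        if not page_is_healthy():
            send_alert()
        time.sleep(CHECK_INTERVAL_SECONDS)
```

In practice the interactive script would come from your load testing tool's scripting support, which can exercise logins, forms, and multi-step flows that a simple HTTP fetch like this cannot.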
Has Continuous Deployment Become a New Worst Practice?

Software development has been moving toward progressively smaller and faster development cycles, and continuous integration and continuous deployment are compressing delivery times even further. But is this actually good for businesses or their users? Just because you can deploy to production quickly and frequently, should you?

John Tyson

Better Software Magazine Articles

Measure the Measurable: Improving Software Quality Through Telemetry

Observing customers in a usability lab can be invaluable for improving product design. But, once your software leaves the lab, do you know what your customers are actually doing and whether or not your software meets their expectations? Learn how engineers on the Microsoft Office team apply a variety of software telemetry techniques to understand real-world usage, how the results drive product improvements, and how you can apply similar techniques.

Jamie Campbell
Learning from Experience: Software Testers Need More than Book Learning

People often point to requirements documents and process manuals as ways to guide a new tester. Research into knowledge transfer, as described in The Social Life of Information, suggests that there is much more to the process of learning. Michael Bolton describes his own experiences on a new project, noting how the documentation helped ... and didn't.

Michael Bolton
The Missing Measurement

In these times, many of us are being told to "do more with less." A more useful approach is "invest our organization's scarce resources where the return is the greatest." To do so, we must define the financial benefits sought when developing a system in addition to its requirements.

Lee Copeland
Metrics that Motivate

To implement a meaningful incentive system for your team, you need to select metrics that encourage the behaviors you need and the results you want. But first you have to decide what you need and want.

Linda Hayes

Interviews

Data Visualization in Test Automation: A Conversation with Greg Paskal
Video

Greg Paskal, test automation lead at Ramsey Solutions, talks about data lakes and how to use data visualization effectively. Done well, data visualization should help practitioners, managers, and stakeholders easily consume, understand, and act on the information the visualization presents.

Owen Gotimer
Best Practices for Lean Documentation: An Interview with Craeg Strong
Podcast

In this interview, Craeg Strong speaks about his upcoming presentation, meeting strict documentation requirements in agile, how agile documentation differs from traditional governance, the advantages and disadvantages of taking your documentation agile, and the art of a company turnaround.

Cameron Philipp-Edmonds

Conference Presentations

STARCANADA Shift Your Perspective on Data and Influence Stakeholders
Slideshow

With all the open source tools available in the market, it can be overwhelming to determine which might meet your needs and which will work best in your environment. Join Jennifer Bonine as she explains the relationship between data and your environment, and using data to help make...

Jennifer Bonine
Lessons Learned from Forty-five Years of Software Measurement
Slideshow

Counting is easy. However, what makes measurement really valuable, and really hard to get right, is knowing what to count and what to do with the results. If your organization is mostly tracking resource usage, costs, and schedule data, it is making a big mistake. What about the users? The customers? The overall business strategy? Sharing the lessons he has learned from fighting, and surviving, many software measurement battles, Ed Weller offers a step-by-step approach for implementing a practical and valuable metrics program. After understanding what measures are most important to the business strategy and all stakeholders, the next step is to decide what data supports those measures and how to capture it. With data in hand, you can create simple and informative ways to make the resulting metrics visible and easy to digest. The biggest challenges come next: avoidance, disbelief, and rationalization.

Edward Weller, Integrated Productivity Solutions, LLC
