Automated Test Results Processing

Summary:

Automating the test results analysis process provides an efficient method for producing problem reports that uniquely identify product failures by standardizing the format and content of failure summaries and their supporting details.

The success of QA automation often hinges on the ability to perform continuous, non-stop testing. When one test completes, its results are collected and a new test is begun. The collected results may include test logs, product logs, Dr. Watson dumps, user mode process dumps, and kernel mode memory dumps.
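The collect-and-restart cycle described above can be sketched as a simple runner loop. This is a minimal illustration, not the author's implementation; the `RESULTS_DIR` location, artifact patterns, and function names are all assumptions for the example:

```python
import shutil
import subprocess
from pathlib import Path

RESULTS_DIR = Path("results")           # hypothetical archive location
ARTIFACT_PATTERNS = ["*.log", "*.dmp"]  # test/product logs and crash dumps

def run_one_test(test_cmd: list, workdir: Path) -> int:
    """Run a single test to completion and return its exit status."""
    return subprocess.run(test_cmd, cwd=workdir).returncode

def collect_results(workdir: Path, run_id: str) -> Path:
    """Copy the artifacts a test run left behind into a per-run folder."""
    dest = RESULTS_DIR / run_id
    dest.mkdir(parents=True, exist_ok=True)
    for pattern in ARTIFACT_PATTERNS:
        for artifact in workdir.glob(pattern):
            shutil.copy2(artifact, dest / artifact.name)
    return dest

def continuous_loop(test_cmd: list, workdir: Path) -> None:
    """When one test completes, collect its results and begin the next."""
    run = 0
    while True:
        run += 1
        run_one_test(test_cmd, workdir)
        collect_results(workdir, f"run-{run:06d}")
```

In practice the collection step would also capture environment details (build number, machine name) so the downstream analysis can tie each artifact back to its run.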

QA engineers then process the results, distilling them down into problem reports. Done manually, this process makes it difficult to achieve any efficiency in processing results or any consistency in problem reporting. Consistency in problem reporting is key to distinguishing new problems from ones that have already been reported, and to providing the details development engineering needs to isolate and identify a defect.
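One common way to get that consistency, sketched here under assumptions of my own (the normalization patterns and function names are illustrative, not from the article), is to strip the run-specific noise out of a failure summary and hash what remains, so the same underlying defect always produces the same signature:

```python
import hashlib
import re

def failure_signature(summary: str) -> str:
    """Normalize the parts of a failure summary that vary between runs
    (addresses, timestamps, process ids), then hash what remains."""
    normalized = re.sub(r"0x[0-9a-fA-F]+", "0xADDR", summary)          # pointers
    normalized = re.sub(r"\d{2}:\d{2}:\d{2}", "HH:MM:SS", normalized)  # timestamps
    normalized = re.sub(r"\bpid \d+\b", "pid N", normalized)           # process ids
    return hashlib.sha1(normalized.encode()).hexdigest()

known_reports = {}  # signature -> problem report id

def file_or_match(summary: str):
    """Return (signature, is_new): a repeated signature means the failure
    was already reported, so no duplicate report is filed."""
    sig = failure_signature(summary)
    is_new = sig not in known_reports
    if is_new:
        known_reports[sig] = f"PR-{len(known_reports) + 1}"
    return sig, is_new
```

Two crashes at different addresses in the same code path then collapse into one report, which is exactly the new-versus-already-reported distinction the manual process struggles to make consistently.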

Automating the results analysis process provides an efficient method for producing problem reports that uniquely identify product failures by standardizing the format and content of failure summaries and their supporting details.

About the author

Edward Smith

Edward Smith is a Consulting Engineer at Mangosoft Incorporated, where he has been specializing in automated test and infrastructure design for complex distributed systems since 1996. His designs for distributed cache and file system tests are used to evaluate data coherency, quality, and robustness of Mangosoft products. His automation infrastructure designs keep a test lab of over 100 systems running automated tests 24x7. Ed also studies ways to optimize the workflow between product development and product testing, applying automation where it makes sense. Ed has authored several papers on his automation techniques, including papers on the Continuous Testing model and on Automated Test Results Processing (www.mangosoft.com/technology). He has presented his work at the TSC2000 and STAREAST 2001 conferences. Prior to Mangosoft, Ed worked at Digital Equipment Corporation for 19 years in a variety of Quality Assurance and Customer Support roles. He designed and developed tests for OpenVMS Clusters and their distributed I/O subsystems, worked as a Software Serviceability Engineer on OpenVMS, and as a Systems Support Engineer for Digital Equipment's General International Area. Ed can be contacted at edsmith@mangosoft.com.
