Conference Presentations

Movin' On Up: Making the Transition from Test Lead to Manager

Want to be a test lead? Ready to take on the responsibilities of test management? Making the transition to a lead, then a management position, takes more than just guts; it takes preparation. This presentation illuminates some of the technical aspects you'll encounter when transitioning to test lead or test manager, including organizing and managing the testing effort; working with the project manager and the rest of the project team; and deciding how, when, and what to invest in your test infrastructure. You'll also explore some of the nontechnical aspects, such as coaching and mentoring, giving feedback, and providing work direction.

Johanna Rothman, Rothman Consulting Group
Unified Test Automation Using XML

Are you looking to reduce the maintenance costs of your testware? Unified Test Automation (UTA) is one approach that has demonstrated cost-saving success. UTA reduces costs by centralizing test resources and minimizing the overhead of maintaining the different components of testware, such as test documents, test software, and test data. The advent of XML and the third-party tools available for editing XML content provides an ideal framework for centralizing all testware. Rodrigo Geve cites specific examples and explains how this has been achieved.
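As a purely illustrative sketch of the idea (the schema, element names, and the toy system under test below are assumptions, not UTA's actual format), a single XML document can hold the test description, input data, and expected results, while a small harness reads them all from one place:

```python
# Minimal sketch of XML-centralized testware; the element names and the
# toy compute_tax() function are illustrative assumptions only.
import xml.etree.ElementTree as ET

TESTWARE = """
<suite name="tax-calculator">
  <testcase id="TC-001" description="Standard rate applied to taxable income">
    <input gross="50000" deductions="5000"/>
    <expected tax="9000"/>
  </testcase>
  <testcase id="TC-002" description="No tax below the threshold">
    <input gross="8000" deductions="0"/>
    <expected tax="0"/>
  </testcase>
</suite>
"""

def compute_tax(gross, deductions):
    """Stand-in for the system under test."""
    taxable = max(gross - deductions, 0)
    return 0 if taxable < 10000 else round(taxable * 0.20)

def run_suite(xml_text):
    """Read description, data, and expected results from one XML artifact."""
    root = ET.fromstring(xml_text)
    for case in root.iter("testcase"):
        inp = case.find("input").attrib
        expected = int(case.find("expected").attrib["tax"])
        actual = compute_tax(int(inp["gross"]), int(inp["deductions"]))
        status = "PASS" if actual == expected else "FAIL"
        print(f'{case.attrib["id"]}: {status} (expected {expected}, got {actual})')

if __name__ == "__main__":
    run_suite(TESTWARE)
```

Because documentation, data, and expected results live in one artifact, a change to the system under test touches a single file rather than several separately maintained pieces of testware.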

Rodrigo Geve, Geve & Associates
Testing Your Software's Requirements

Many testing organizations focus primarily on software executable code, but that's not the only thing you can test. For instance, did you ever consider testing your software requirements? When you test only code, you face some big disadvantages, not to mention that design defects often aren't even fixable because they demand too much effort, too late in the release cycle. In fact, it's difficult to even report some requirements defects since the developers have already committed to the design strategy. But if you test your requirements early in the game, you can discover defects before they're cast into designs and code, consequently saving your organization potentially huge rework costs.

Brian Lawrence, Coyote Valley Software
Looking Ahead: Testing Tools in 2010

It's May 15, 2010, and you're in a triage meeting reviewing the testing status and bugs in your telemedical software. The system uses real-time voice, video, graphics, and an expert knowledge base to support expert medical procedures in remote locations. As the test manager, you're using trace diagrams, deployment diagrams, runtime fault injection, coverage views, test patterns, built-in self-test, and other modern, agile techniques to review the bugs, diagnose faults, assign priorities, and update your test plans. Sam Guckenheimer contrasts the methods available to you in 2010 with the techniques you used years ago when you were starting out as a test manager.

Sam Guckenheimer, Rational Software ATBU
Beyond Record and Playback: The Behind-the-Scenes View of Web Test Automation

Web-based test automation goes well beyond merely recording manual test scripts and replaying them. Test automation is more of a development process than a typical quality assurance or testing effort. This presentation takes an in-depth look at what it takes to truly automate Web site testing. You'll explore the following building blocks: planning/analysis, design/development, implementation, and support.

Michael Prisby, UPS
Applying Testing Expertise to the Retrospective Goldmine

Digging up postmortem project data is like mining for gold. The returns can be significant and long-term, because this is where your best (and worst) practices really shine. When your test groups drive the retrospective activities, improvements can finally be built into the product lifecycle model instead of rotting in a postmortem report. By improving retrospective facilitation and follow-up, you'll ultimately improve your software lifecycle process. Nick Borelli delivers a practical and proven approach to the retrospective process and shows you how to build consensus for process improvements uncovered during retrospective analysis.

Nick Borelli, Microsoft Corporation
Enterprisewide Testware Architecture

Testware: the stuff of which tests are made. The term comprises a bewildering range of artifacts including data files, scripts, expected results, specifications, and environment information. It also implies how these artifacts are arranged, where they're stored and used, and how they're grouped and referenced. Since testware architecture has not traditionally been considered an important issue, individual projects and teams, even individual testers, have evolved their own approaches to the arrangement of their testware, resulting in much wasted effort. Of course, different applications and environments may demand unique testware architectures, but do they have to be so different? Isn't there a single, unified, flexible, and expandable approach that fits most, if not all, situations within an enterprise? Perhaps not, but the goal of uniform testware architecture across projects is certainly worth striving for.
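As one purely hypothetical illustration of what a uniform approach could look like (the directory names and helper below are assumptions, not Fewster's proposal), every project could resolve its testware artifacts through the same layout convention:

```python
# Hypothetical unified testware layout; the directory names are
# illustrative assumptions, not a prescription from the presentation.
from pathlib import Path

LAYOUT = {
    "specs":       "specs",        # test specifications
    "scripts":     "scripts",      # automated test scripts
    "data":        "data",         # input data files
    "expected":    "expected",     # expected results
    "environment": "environment",  # environment and configuration info
}

def artifact_dir(root, project, kind):
    """Resolve the standard location of one kind of testware artifact."""
    if kind not in LAYOUT:
        raise ValueError(f"unknown artifact kind: {kind!r}")
    return Path(root) / project / LAYOUT[kind]

# Every project answers "where do expected results live?" the same way.
print(artifact_dir("/testware", "billing", "expected"))  # /testware/billing/expected
print(artifact_dir("/testware", "web-portal", "data"))   # /testware/web-portal/data
```

The point of such a convention is that a tester moving between projects never has to relearn where specifications, data, or expected results live.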

Mark Fewster, Grove Consultants
Automated Web Testing Strategies

As Web applications move from static content to dynamic transactions, the risk of failure increases while cycle time collapses. Although automation is the ideal solution for this combination, those who've ventured into automated Web testing have discovered a whole new world of unexpected challenges. For instance, dynamic page layouts and content frustrate test automation's requirements for predictability and repeatability, while the lack of meaningful (let alone consistent) object names further complicates consistent execution. Ultimately, this leads to excessive maintenance and lower productivity. This presentation shows you how to identify the potential issues that come with automated Web testing, then offers ways for you to incorporate site and test development strategies to overcome them.
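One common mitigation for unhelpful object names, offered here only as a hedged sketch rather than as the strategy Hayes presents, is to hide fragile locators behind a single object map so that maintenance is confined to one place (the page names and selectors below are hypothetical):

```python
# Hypothetical object map: logical names on the left, fragile page-specific
# locators on the right. Test scripts reference only the logical names.
OBJECT_MAP = {
    "login.username": "input#ctl00_txt1",        # auto-generated, meaningless id
    "login.password": "input#ctl00_txt2",
    "login.submit":   "form[name='f1'] button",
}

def locator(logical_name):
    """Look up the current locator for a logical control name."""
    try:
        return OBJECT_MAP[logical_name]
    except KeyError:
        raise KeyError(f"no locator registered for {logical_name!r}") from None

# A script asks for "login.submit" regardless of how the page names the control.
print(locator("login.submit"))
```

When the markup changes, only the map entry is edited; the test scripts that reference the logical names stay untouched.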

Linda Hayes, WorkSoft
STAREAST 2002: A Case Study in Automating Web Performance Testing

Key ideas from this presentation include: define meaningful performance requirements; recognize that changing your site (hardware or software) invalidates all previous predictions; reduce the number of scripts through equivalence classes; don't underestimate the hardware needed to simulate the load; evaluate and improve your skills, knowledge, tools, and outsourced services; document your process and results so that others may learn from your work; and use your new knowledge to improve your site's performance, focusing on progress, not perfection.
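To illustrate the equivalence-class point, one reading (an assumption, not necessarily Copeland's technique) is to group recorded user transactions that exercise the same part of the site into a single representative load script per class:

```python
# Illustrative only: group user transactions into equivalence classes so
# one load-test script can represent each class. The classes are invented.
from collections import defaultdict

# (transaction name, class it belongs to)
TRANSACTIONS = [
    ("view product page",  "browse"),
    ("search by keyword",  "browse"),
    ("add item to cart",   "purchase"),
    ("checkout with card", "purchase"),
    ("update profile",     "account"),
]

def equivalence_classes(transactions):
    """Collect transactions by class; each class needs only one script."""
    classes = defaultdict(list)
    for name, cls in transactions:
        classes[cls].append(name)
    return classes

for cls, members in equivalence_classes(TRANSACTIONS).items():
    print(f"script for class '{cls}' covers: {', '.join(members)}")
```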

Lee Copeland, Software Quality Engineering
Investing Wisely: Generating Return on Investment from Test Automation

Implementing test automation without following best practices and tracking your ROI is a prescription for failure. Still, many companies have done so seeking the elusive promise of automated testing: more thorough testing done faster, with less error, at a substantially lower cost. However, fewer than fifty percent of these companies realize any real success from these efforts. And even fewer have generated any substantial ROI from their automated testing initiatives. This presentation takes an in-depth look at the specific pitfalls companies encounter when implementing automated functional testing, and offers proven best practices to avoid them and guarantee long-term success.
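As a rough sketch of what "tracking your ROI" can mean in practice, here is a generic calculation; the formula and sample figures are assumptions for illustration, not a model from the presentation:

```python
# Generic ROI sketch for test automation; formula and sample figures are
# assumptions for illustration, not figures from the presentation.
def automation_roi(tool_cost, development_cost, maintenance_cost_per_cycle,
                   manual_cost_per_cycle, automated_run_cost_per_cycle, cycles):
    """Return ROI as a fraction: (savings - investment) / investment."""
    investment = tool_cost + development_cost + maintenance_cost_per_cycle * cycles
    savings = (manual_cost_per_cycle - automated_run_cost_per_cycle) * cycles
    return (savings - investment) / investment

# Example: 20 regression cycles over the life of the suite.
roi = automation_roi(tool_cost=50_000, development_cost=80_000,
                     maintenance_cost_per_cycle=2_000,
                     manual_cost_per_cycle=15_000,
                     automated_run_cost_per_cycle=1_000, cycles=20)
print(f"ROI: {roi:.0%}")
```

A positive result indicates the automation effort more than paid for itself over the measured regression cycles.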

Dale Ellis, TurnKey Solutions Corp.
