
Conference Presentations

Performance Testing 101

Organizations are often so eager to "jump in" and use load testing tools that the critical steps necessary to ensure successful performance testing are sometimes overlooked, leading to testing delays and wasted effort. Learn the best practices and tips for successful automated performance testing in areas such as assembling a proper test team, planning, simulating a production environment, creating scripts, and executing load tests.

David Torrisi, Mercury Interactive
Mining the Gold from Your Web Server Logs

How often have you wished that you knew what your customers really thought of your Web site? You can extract a gold mine of information from your Web server's log to reveal how your site is used. Learn ways for your team to use this information to organize browser testing based on user statistics, improve testing coverage of your Web site, and plan more realistic load testing.

Karen Johnson, Peapod, Inc.
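
As a rough illustration of the kind of log mining this session describes, here is a minimal sketch (not material from the presentation) that tallies browser families from a combined-format Web server access log so browser testing can be prioritized by real user statistics. The log file name and the crude user-agent buckets are illustrative assumptions.

    # Tally browser families from a combined-log-format access log.
    from collections import Counter
    import re

    # In the combined log format, the user-agent string is the last quoted field.
    UA_PATTERN = re.compile(r'"([^"]*)"\s*$')

    def browser_family(user_agent: str) -> str:
        """Very rough user-agent bucketing; real log analyzers are far more precise."""
        ua = user_agent.lower()
        if "msie" in ua or "trident" in ua:
            return "Internet Explorer"
        if "firefox" in ua:
            return "Firefox"
        if "chrome" in ua:
            return "Chrome"
        if "safari" in ua:
            return "Safari"
        return "Other"

    counts = Counter()
    with open("access.log") as log:  # hypothetical log file name
        for line in log:
            match = UA_PATTERN.search(line)
            if match:
                counts[browser_family(match.group(1))] += 1

    total = sum(counts.values()) or 1
    for family, hits in counts.most_common():
        print(f"{family:20s} {hits:8d}  ({100 * hits / total:5.1f}%)")

The same loop can just as easily count requested URLs or peak-traffic hours, which feeds directly into coverage decisions and realistic load profiles.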
Wireless Application Testing

Putting the Web on cellular phones, PDAs, and other wireless devices is all the rage. Still in its infancy, the idea of doing online transactions via mobile devices has created a new buzzword: "M-Commerce." However, some companies, in their quest to be first to market, have overlooked the fact that this new technology still needs basic testing for quality, performance under load, and usability. Discover the importance of testing wireless applications, and learn how to identify common bottlenecks and problems.

Scott Moore, CommerceQuest
When Test Drives the Development Bus

Once development reaches "code complete," the testing team takes over and drives the project to an acceptable level of quality and stability. This is accomplished through weekly build cycles, or "dress rehearsals." The software is graded based on found, fixed, and outstanding errors, and development strives to raise the grade with each build, improving the quality and stability of the software. Learn how to use this "dress rehearsal" process to build team morale, develop ownership by the entire development team, and ensure success on opening night.

Cindy Necaise, MICROS Systems, Inc.
Advanced Data Driven Testing (ADDT)

Learn how the Convergys Test Automation Team developed an Advanced Data Driven Testing (ADDT) approach using a test automation engine. Gain insight into how this technique was successfully implemented to improve the reliability and quality of their software products and reduce the number of testing man-hours. Shakil Ahmad gives a high-level description of the engine design, functionality, and benefits as he shares his company's successes and frustrations.

Shakil Ahmad, Convergys
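
The Convergys engine itself is not reproduced here; the sketch below only illustrates the general data-driven idea the session builds on, where test inputs and expected results live in a data file and one small driver executes every row against the code under test. The function under test, the file name, and the column names are all illustrative assumptions.

    # Minimal data-driven test driver: one loop, many test cases read from a file.
    import csv

    def apply_discount(price: float, percent: float) -> float:
        """Stand-in for the application logic under test."""
        return round(price * (1 - percent / 100), 2)

    def run_data_driven_tests(path: str) -> None:
        failures = 0
        with open(path, newline="") as handle:
            for row in csv.DictReader(handle):  # assumed columns: price, percent, expected
                actual = apply_discount(float(row["price"]), float(row["percent"]))
                expected = float(row["expected"])
                if abs(actual - expected) > 0.005:
                    failures += 1
                    print(f"FAIL {row}: got {actual}, expected {expected}")
        print(f"{failures} failure(s)")

    if __name__ == "__main__":
        run_data_driven_tests("discount_cases.csv")  # hypothetical test-data file

Adding a test case then means adding a row of data rather than writing another script, which is where the savings in testing man-hours come from.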
The Role of Information in Risk-Based Testing

With risk-based testing, you identify risks and then run tests to gather more information about them. Formal risk analysis is often necessary for identifying and assessing risks with new domains or technologies. A common problem, however, is how to assess risks when you have little information. Learn how to use testing to identify risks, reach team agreement on risk magnitude, and identify actions which allow these risks to be understood and mitigated.

Bret Pettichord, Satisfice
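
One common way to quantify risk magnitude, shown here only as a generic illustration and not necessarily the scheme used in this talk, is to score each risk as likelihood times impact and direct testing at the largest exposures first. The sample risks and scores below are invented for the sketch.

    # Rank candidate risks by exposure = likelihood x impact (both on a 1-5 scale).
    risks = [
        # (risk description, likelihood 1-5, impact 1-5)
        ("Checkout fails under peak load",        4, 5),
        ("New payment gateway mishandles input",  3, 4),
        ("Report totals rounded incorrectly",     2, 3),
    ]

    ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
    for description, likelihood, impact in ranked:
        print(f"exposure {likelihood * impact:2d}  {description}")
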
STAREAST 2001: Bug Hunting: Going on a Software Safari

This presentation is about bugs: where they hide, how you find them, and how you tell other people they exist so they can be fixed. Explore the habitats of the most common types of software bugs. Learn how to make bugs more likely to appear and discover ways to present information about the bugs you find to ensure that they get fixed. Drawing on real-world examples of bug reports, Elisabeth Hendrickson reveals tips and techniques for capturing the wiliest and squirmiest of the critters crawling around in your software.

Elisabeth Hendrickson, Quality Tree Software, Inc.
Failure is Not an Option: 24 x 7 on the Web

This paper discusses the factors involved in determining the cost of a twenty-four-hours-a-day, seven-days-a-week (24 x 7) e-commerce or internal Web site going offline for any length of time. After determining these costs and showing a real-life example calculation, the paper covers several ways to minimize this risk through hardware architecture, software architecture, and stress testing.

Ed Bryce, Reality Test
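
As a rough illustration of the kind of downtime-cost arithmetic the paper walks through, the sketch below multiplies outage length by hourly revenue and recovery cost. All figures are purely hypothetical assumptions, not the paper's real-life example.

    # Back-of-the-envelope cost of an outage for a 24 x 7 site.
    hourly_revenue = 12_000.0        # assumed average revenue per hour for the site
    hourly_recovery_cost = 1_500.0   # assumed staff and support cost per outage hour
    outage_hours = 4.0               # assumed length of the outage being costed

    direct_cost = outage_hours * (hourly_revenue + hourly_recovery_cost)
    print(f"Estimated direct cost of a {outage_hours:.0f}-hour outage: ${direct_cost:,.0f}")

Even this simple estimate, before counting lost customer goodwill, makes it easy to weigh the cost of downtime against the cost of redundant hardware or additional stress testing.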
Software Customer Satisfaction Surveys

Satisfying our customers is essential to staying in business in this modern world of global competition. We must satisfy and even delight our customers with the value of our software products and services to gain their loyalty and repeat business. Customer satisfaction is therefore a primary goal of process improvement programs. So how satisfied are our customers? One of the best ways to find out is to ask them, using customer satisfaction surveys. This paper includes details on designing your own software customer satisfaction questionnaire and tracking survey results, along with example reports that turn survey data into useful information.

Linda Westfall, The Westfall Team
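
As one simple illustration of turning raw survey data into information, and not the paper's own report templates, the sketch below averages assumed 1-5 Likert-scale responses per question.

    # Summarize Likert-scale survey responses into a mean score per question.
    from statistics import mean

    responses = {  # invented sample data for illustration
        "Overall product quality": [4, 5, 3, 4, 5, 4],
        "Ease of installation":    [3, 2, 4, 3, 3, 2],
        "Support responsiveness":  [5, 4, 4, 5, 4, 5],
    }

    for question, scores in responses.items():
        print(f"{question:26s} mean {mean(scores):.2f}  (n={len(scores)})")
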
Managing Iterative Development: Avoiding Common Pitfalls

The Rational Unified Process (RUP) advocates an iterative or spiral approach to the software development lifecycle, an approach that has repeatedly proven superior to the waterfall approach in many respects. But do not believe for one second that the many benefits an iterative lifecycle provides come for free. Iterative development is not a magic wand that, when waved, solves every problem or difficulty in software development. Projects are not easier to set up, plan, or control just because they are iterative. The project manager will actually face a more challenging task, especially during his or her first iterative project, and most certainly during the early iterations of that project, when risks are high and early failure is possible.

Per Kroll, Rational
