
Conference Presentations

Evolve Your Testing the Pokémon Way

How can you know that your services will handle the requests of millions of users a day? Or that making a fundamental change to one of your technologies won’t break your user experience? The answer: by having your entire team use and build on a phased approach to testing the right pieces at the right time. The Pokémon Company International services group develops and updates the services used for logging into the Pokemon.com website and applications like Pokémon Go and Pokémon TV. Since the launch of Pokémon Go in 2016, its quality-focused team has worked to develop strategies that reduce the number and duration of customer issues in the face of millions of daily active users. Join Paul Grimes as he presents the phases their test engineers go through and the tools used to ensure that the services Pokémon is delivering are capable of meeting users' demands.

Paul Grimes
Make Your UI Tests Resilient with the Next Generation of Frameworks

A big problem with test automation on any platform or operating system is synchronizing test automation interactions with the UI. It is challenging to know when the UI is ready for the next automated click().
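The usual remedy for this synchronization problem is an explicit (polling) wait rather than a fixed sleep before each interaction. A minimal sketch of the idea in Python — the `wait_until` helper and `FakeButton` class are invented for illustration and are not part of any framework the talk covers:

```python
import time

def wait_until(condition, timeout=5.0, poll_interval=0.1):
    """Poll `condition` until it returns a truthy value or `timeout` elapses.

    Instead of sleeping a fixed amount before each automated click, the
    test polls a readiness predicate and proceeds as soon as it holds,
    failing fast with a clear error otherwise.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll_interval)
    raise TimeoutError(f"condition not met within {timeout}s")

# Illustrative usage: a fake UI element that becomes clickable after a delay.
class FakeButton:
    def __init__(self, ready_at):
        self.ready_at = ready_at

    def is_clickable(self):
        return time.monotonic() >= self.ready_at

button = FakeButton(ready_at=time.monotonic() + 0.2)
wait_until(button.is_clickable)  # returns as soon as the button is ready
```

Production frameworks apply the same pattern with richer readiness predicates (element visible, animations settled, network idle), which is where next-generation tools differ most.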

Satyajit Malugu
Driving Quality with the "Yes, If ..." Mentality

It can be easy to feel like the villain when you work in testing. After all, part of the job is to point out when things are broken, people have made mistakes, timelines aren't realistic, or a plan can't work. But if your team feels like you're a frequent naysayer, trust can and will erode.

Jane Jeffers
Testers: The Unsung Data Heroes

Connor Dodge believes that data is the most valuable commodity in the world, and that testers generate some of the most valuable data in product development organizations. Test data can inform release schedules, aid in decision-making, and shape the direction of the product. 

Connor Dodge
Internet of Things: Changing the Way We Test

The internet of things (IoT) brings connectivity to a range of previously non-internet-enabled physical devices and real-world objects. This shift has an impact on testing—changing what we test, when we test, and the way we test. For one thing, once you’re in the real world, the number of possible issues explodes due to environmental conditions. Just as a race car must adjust its tires for different track conditions, IoT devices must account for environmental factors such as temperature and humidity to prevent unanticipated failures. Jane Fraser believes that for the IoT to be successful, we must focus on developing testing methods, analytics tools, and SDKs that help teams automate activities such as checking connection strength and robustness, verifying mobile compatibility, and testing various hardware capabilities. This includes Wi-Fi, BTLE, radio, natural language processing technologies, and more.

Jane Fraser
Getting to Continuous Testing

Max Saperstone tells the story of how a health care company striving to get to continuous releases built up its automation to secure confidence in regular releases. Because no test automation existed, Max was able to seize a greenfield opportunity and, in the span of twelve months, develop over two thousand test cases. A pipeline was created to verify the integrity of the automated tests and build Docker containers for simplified test execution. These containers could be easily reused by developers and the DevOps team to verify the application. Join Max as he walks through the feedback loop that was created to allow application verification to go from hours to minutes. Max will share his choices of BDD tooling, integrated with WebDriver solutions, to verify the state of web and mobile applications.

Max Saperstone
Testing AI-Based Systems: A Gray-Box Approach

Testing artificial intelligence- and machine learning-based systems presents two key challenges. First, the same input can trigger different responses as the system learns and adapts to new conditions. Second, it tends to be difficult to determine exactly what the correct response of the system should be. Such system characteristics make test scenarios difficult to set up and reproduce and can cause us to lose confidence in test results. Yury Makedonov will explain how to test AI/ML-based systems by combining black-box and white-box testing techniques. His "gray-box" testing approach leverages information obtained from directly accessing the AI’s internal system state. Yury will demonstrate the approach in the context of testing a simplified ML system, then discuss test data challenges for AI using pattern recognition as an example and share how data-handling techniques can be applied to testing AI.
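To make the gray-box idea concrete, here is a toy Python sketch — the `ToyClassifier` and its `_last_score` attribute are invented for illustration and are not the talk's actual system. The test asserts on the observable prediction (black box) and on an internal confidence score (white box):

```python
class ToyClassifier:
    """A stand-in for an ML model that exposes internal state."""

    def predict(self, x):
        # Fake "inference": confidence decays with distance from a
        # learned decision point at x == 10.
        self._last_score = 1.0 / (1.0 + abs(x - 10))  # internal confidence
        return "match" if x >= 10 else "no match"

def gray_box_test(model, x, expected_label, min_confidence):
    """Black-box check on the output plus white-box check on internal state."""
    label = model.predict(x)        # black box: observable output
    score = model._last_score       # white box: internal confidence
    assert label == expected_label, f"wrong label: {label}"
    assert score >= min_confidence, f"low confidence: {score:.2f}"

model = ToyClassifier()
gray_box_test(model, x=10, expected_label="match", min_confidence=0.9)
```

The white-box assertion is what distinguishes this from a pure output check: a correct label produced with anomalously low internal confidence is surfaced as a failure instead of passing silently.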

Yury Makedonov
What's That Smell? Tidying Up Our Test Code

We are often reminded by those experienced in writing test automation that code is code. The sentiment being conveyed is that test code should be written with the same care and rigor that production code is written with. However, many people who write test code may not have experience writing production code, so it’s not exactly clear what is meant. And even those who write production code find that there are unique design patterns and code smells that are specific to test code. Join Angie Jones as she presents a smelly test automation code base littered with several bad coding practices and walks through every one of the smells. She'll discuss why each is considered a violation and, via live coding, demonstrate a cleaner approach. While all coding examples will be done in Java, the principles are relevant for all test automation frameworks.
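The talk's examples are in Java, but a test-specific smell can be sketched in any language. Here is a minimal Python illustration of one common smell — magic numbers that hide a test's intent — and a cleaner rewrite; the `discounted_price` function and its values are hypothetical, not taken from the talk:

```python
STANDARD_PRICE = 100    # hypothetical product price
MEMBER_DISCOUNT = 0.1   # hypothetical member discount rate

def discounted_price(price, discount):
    return price * (1 - discount)

def test_member_discount_smelly():
    # Smell: magic numbers. A reader can't tell where 90 comes from,
    # and the test communicates nothing about the business rule.
    assert discounted_price(100, 0.1) == 90

def test_member_discount_clean():
    # Cleaner: named constants document intent, and the expected value
    # is derived from the same inputs rather than hard-coded.
    expected = STANDARD_PRICE * (1 - MEMBER_DISCOUNT)
    assert discounted_price(STANDARD_PRICE, MEMBER_DISCOUNT) == expected

test_member_discount_smelly()
test_member_discount_clean()
```

Both tests pass, but only the second one still reads correctly when the price or discount changes — which is the kind of distinction the talk draws between working test code and clean test code.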

Angie Jones
Capturing Testing with 3 Magic Words

Testers tend to be innately curious creatures. Being curious and evaluating risks—that is what the testing job is about. Often it is the statement “I don’t know” that drives our curiosity in testing. Find out not only how to push past the fear of not knowing but how to embrace your curiosity.

Janna Loeffler
How Infrastructure as Code Can Help Test Organizations Achieve Automation

For many test organizations, the first hurdle to automating the testing of a product is deployment of that product in its test environments. Infrastructure as code can be used to facilitate the basic processes of provisioning servers, from bare metal to virtual to cloud, as well as configuration management of the software that resides on the servers. Off-the-shelf infrastructure-as-code tools such as AWS CloudFormation, Chef, Puppet, and Ansible provide less expensive alternatives to developing proprietary in-house deployment solutions. Join Kat Rocha to learn how infrastructure as code can better align test and production environments and reduce problems that arise from configuration drift. Kat will explore how to use some infrastructure-as-code tools to facilitate automation and improve testing.
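As a toy illustration of the configuration-drift idea — real tools such as Chef, Puppet, and Ansible express desired state declaratively and reconcile it for you — here is a Python sketch; the package names and versions are made up:

```python
# Desired state expressed as data ("as code"): the versions every test
# and production host is supposed to run.
DESIRED = {"nginx": "1.24", "python3": "3.11", "openssl": "3.0"}

def detect_drift(desired, actual):
    """Return packages whose installed version differs from the declared one,
    mapped to a (wanted, found) pair. A missing package shows up as found=None.
    """
    return {
        pkg: (want, actual.get(pkg))
        for pkg, want in desired.items()
        if actual.get(pkg) != want
    }

# Illustrative "actual" inventory, e.g. gathered by a config-management agent
# from a test host.
actual = {"nginx": "1.24", "python3": "3.10", "openssl": "3.0"}
drift = detect_drift(DESIRED, actual)
# drift == {"python3": ("3.11", "3.10")}
```

Because the desired state lives in version control alongside the product code, the same declaration can drive both environment provisioning and drift checks, which is what keeps test environments aligned with production.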

Kat Rocha


StickyMinds is a TechWell community.
