We recently had a chance to speak with Clark Malmgren, senior engineering manager (applications) for Comcast Video Services, about his role in test automation and how his experience and practices can lead to success in many areas of software testing.
Clark Malmgren is the senior engineering manager (applications) for Comcast Video Services. He’ll be presenting a session titled “Building a 21st Century Test Automation Framework” at the upcoming STARWEST 2012 in Anaheim, California. Noel Wurst recently had a chance to speak with him about his presentation and the future of cable technology.
Noel Wurst: How long have you been with Comcast, and how rapidly has the set-top box (STB) evolved just in the time you've been there?
Clark Malmgren: I started working for TVWorks straight out of college in 2006. Since then, TVWorks was purchased and is now a part of the Comcast family. The STB has a tendency to evolve very slowly. This can be very difficult, since “Next Generation” devices only allow for Java 1.2, which is well over ten years old. Recently, however, we have been moving toward more of a cloud-based solution with Comcast’s Xcalibur platform.
Noel Wurst: You mentioned the pitfalls of the STB in relation to test execution. Can you go a little more into what those pitfalls are and, secondly, how you tackle or address them?
Clark Malmgren: Testing pitfalls with the STB generally fall into four categories. The first is that the OS and hardware vendor solutions are proprietary and are often incorrect, but will not change. Because our software must work in that environment, we cannot do our testing in an emulator, which means we need to create a system or integration test environment that includes that software. We have built a suite of tools to deal with this called CATS (Comcast Automated Test Solution). This is a fully integrated solution that puts the STBs on universal racks that have IR blasters, serial output ports (for logging), power control and video processing. This is all available using either Java APIs or a remote access tool similar to a Slingbox.
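To make the rack setup concrete, here is a minimal sketch of what driving a rack-mounted STB through a Java API might look like. Every class, method, and IR code below is hypothetical and invented for illustration; the real CATS APIs are proprietary and are not described in the interview.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of a rack-control helper in the spirit of CATS.
// Class names, method names, and IR code values are all illustrative.
public class RackStb {
    // Illustrative mapping of remote-key names to IR code values.
    private static final Map<String, Integer> IR_CODES = new LinkedHashMap<>();
    static {
        IR_CODES.put("POWER", 0x0C);
        IR_CODES.put("GUIDE", 0x30);
        IR_CODES.put("OK",    0x25);
    }

    // Build the textual command an IR blaster controller might accept
    // for a given rack slot and key name.
    public static String irCommand(String slot, String key) {
        Integer code = IR_CODES.get(key);
        if (code == null) {
            throw new IllegalArgumentException("Unknown key: " + key);
        }
        return "IR " + slot + " 0x" + Integer.toHexString(code).toUpperCase();
    }

    public static void main(String[] args) {
        // Ask the blaster on slot A3 to send the GUIDE key.
        System.out.println(irCommand("A3", "GUIDE"));
    }
}
```

A test would then combine calls like this with the serial-log and video-processing channels to confirm the box actually reacted.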
The second problem is that the boxes are slow. Running thousands of tests on the devices can take an extremely long time. To get around this, we have made a significant effort to optimally execute our testing in parallel while still executing every test on every device type. I will talk about this specifically during my presentation.

The third pitfall is that some interfaces with the STB are unreliable. Specifically, IR is sporadic at best. Sometimes key presses get dropped, or stray IR may cause problems on the box. To address this, we have implemented an extensive failure analysis layer to determine if a failure may be a false negative. As a more proactive measure, we have created tools to send keys to the box using UDP or other protocols that are more stable than an IR blaster.
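A failure analysis layer like the one Clark describes could, at its simplest, pattern-match failure messages against known flaky-IR signatures and flag those runs for retry rather than reporting a defect. The sketch below is an assumption about how such a classifier might work; the signature strings and class names are invented, not Comcast's actual heuristics.

```java
import java.util.Arrays;
import java.util.List;

// Hedged sketch of a failure-analysis pass that flags likely false
// negatives (e.g., dropped or stray IR key presses) so the harness can
// retry instead of filing a defect. All patterns are illustrative.
public class FailureAnalyzer {
    private static final List<String> FLAKY_SIGNATURES = Arrays.asList(
        "screen did not change after key press",   // dropped IR key
        "unexpected menu opened",                  // stray IR interference
        "timed out waiting for video"              // slow box, not a bug
    );

    // Return true if the failure message matches a known flaky signature.
    public static boolean isLikelyFalseNegative(String failureMessage) {
        String msg = failureMessage.toLowerCase();
        for (String sig : FLAKY_SIGNATURES) {
            if (msg.contains(sig)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(isLikelyFalseNegative(
            "FAIL: screen did not change after key press GUIDE"));
        System.out.println(isLikelyFalseNegative(
            "FAIL: assertion mismatch in guide data"));
    }
}
```

In practice such a layer would also weigh evidence from the serial logs and video capture before deciding a failure is retryable.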
The last problem is that almost no testing tools work for us. As a result, we end up building all of our own tools. This can be painful, but it is a blessing in disguise because we always have the opportunity to build exactly what we want. For example, there is no solution for remote procedure calls that works on any of our STBs. So I ended up building my own tools, which can run on Java 1.2. This has made a huge number of previously un-automatable tests now automatable.
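A remote procedure call tool that runs on a Java 1.2 VM has to avoid generics, varargs, and modern libraries, sticking to `StringBuffer`, arrays, and plain sockets. The fragment below sketches only the wire-framing part of such a tool under those constraints; the format and names are assumptions for illustration, not the tool Clark built.

```java
import java.util.StringTokenizer;

// Hedged sketch of a wire format for a minimal RPC tool that could run on
// a Java 1.2 era VM: no generics, no varargs, pre-1.4 APIs only.
// The "method|arg1|arg2" framing is invented for illustration.
public class TinyRpc {
    // Encode a call as "method|arg1|arg2|..." for a plain socket stream.
    public static String encode(String method, String[] args) {
        StringBuffer sb = new StringBuffer(method);
        for (int i = 0; i < args.length; i++) {
            sb.append('|').append(args[i]);
        }
        return sb.toString();
    }

    // Recover the method name from a received frame.
    public static String decodeMethod(String frame) {
        StringTokenizer st = new StringTokenizer(frame, "|");
        return st.nextToken();
    }

    public static void main(String[] args) {
        String frame = encode("pressKey", new String[] { "GUIDE" });
        System.out.println(frame + " -> " + decodeMethod(frame));
    }
}
```

Keeping the frame a plain delimited string is the kind of design choice the platform forces: it needs nothing beyond `java.lang` and `java.util` classes that already existed in 1998.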
Noel Wurst: In setting up a modern-day test automation framework, are there any precautions that must be taken to ensure the highest level of testing quality and efficiency?
Clark Malmgren: It is essential to test the test framework. We use unit tests, component tests, and integration-level tests to validate the framework before releasing it for use by the group.
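"Testing the test framework" can be as simple as giving each framework utility its own unit checks before the wider team depends on it. The example below is an invented illustration: a tiny serial-log matcher (the kind of helper a framework like CATS might contain) validated by self-contained checks; none of these names come from the interview.

```java
// Hedged illustration of testing the test framework: a small framework
// utility (a serial-log matcher) exercised by its own unit checks.
// The utility and the log strings are hypothetical.
public class LogMatcherTest {
    // Framework utility under test: does a serial log line signal that
    // the STB finished booting?
    static boolean isBootComplete(String line) {
        return line != null && line.indexOf("BOOT COMPLETE") >= 0;
    }

    public static void main(String[] args) {
        // Unit-level checks on the framework code itself.
        check(isBootComplete("[00:12] BOOT COMPLETE"), "detects boot marker");
        check(!isBootComplete("[00:01] loading guide"), "ignores other lines");
        check(!isBootComplete(null), "handles null safely");
        System.out.println("framework checks passed");
    }

    static void check(boolean ok, String name) {
        if (!ok) {
            throw new AssertionError(name);
        }
    }
}
```

Only once checks like these pass at the unit, component, and integration levels would a framework build be pushed out to the group.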