Test Tools, IT Projects, and User Experience: An Interview with Isabel Evans and Julian Harty


In this interview, Isabel Evans and Julian Harty reveal the area of software where we're ignoring the user experience. They cover how we're not thinking strongly enough about the tool sets used by IT projects, and dig into why test tools are often put on the shelf.

Jennifer Bonine: We are back with our virtual interviews at STARWEST 2016. I'm pleased to have two individuals this time with me, Isabel and Julian. Thanks for joining me, both of you.

Isabel Evans: Pleasure.

Julian Harty: Thank you.

Jennifer Bonine: For folks who haven't had a chance to interact yet with Isabel or Julian, I'll have them each give you a brief introduction. You'll also notice Julian's wearing this quite lovely lab coat. Yes. He is part of the test lab here at the conference. Some of you in the virtual world may have heard of the test lab. It's a very neat experience where people can come and actually engage with what we talk about. We talk about all these things at the conference, and you actually get an opportunity to engage with testing, engage with the lab, and do some hands-on work, for those of you who like that hands-on opportunity. For those of you watching virtually, Rachelle or one of the folks from the conference, maybe Lee, may have mentioned that you can get engaged in the test lab as well. Hopefully you have an opportunity to do that and can be virtually part of what's going on in that lab. Then at the end they announce the winners, so there's always an incentive to be a winner at the end and to get engaged in that. Hopefully you're participating.

Maybe Isabel and Julian, give us a little background on the two of you, for the folks watching.

Isabel Evans: Should I go first?

Julian Harty: Yes.

Isabel Evans: I'm Isabel Evans. I've been in software testing for more than thirty years, software testing and quality, and I work independently now, working on process improvement, user experience, usability, and general software quality areas.

Jennifer Bonine: Then Julian?

Julian Harty: I'm also independent. I worked in tech for a bit too long, for years in global roles at companies like Google and eBay, but I spend most of my time now helping people use technology in education in a bunch of countries.

Jennifer Bonine: Very nice.

Julian Harty: And helping refugees too. Helping them use technology.

Jennifer Bonine: Wow. Amazing. If you haven't met either of these two, they're wonderful people to know in your network of folks and to get to know. The two of them have a shared passion, I've heard, that they're working on that we'd like to share with all of you out there, because it may be a passion of yours as well and you may want to join in on some of the things that they're trying to do, and it may help you in your organizations. Would you guys mind sharing a little bit about the project that the two of you have embarked on as a team?

Isabel Evans: He's looking at me, so I'll go. One of my interests and one of Julian's interests is user experience. It occurred to us when we were talking about this one day that we talk about user experience, and the industry's getting increasingly engaged in looking at user experience, particularly around the ubiquity of mobile apps and so on and so forth. Yet there's a whole area of software where it doesn't seem as though we're thinking strongly about the end-user experience and usability, and that is the tool set used by IT projects themselves. Within that, if we look at test tools, one of the reasons they've become shelfware is that people find them hard to use and hard to engage with. Not all testers are technical, and not all testers should be expected to be technical, because some of them are coming out of the business and user groups and so on and so forth.

We started talking about that and investigating whether we should do something around a tool kit, a work box, to help tool providers improve that. Then as we started looking at it, we were finding it wasn't just the test tools that had that problem, wasn't it?

Julian Harty: That's right. The challenge for us is that we use software to help make software. When there are flaws in the software that we're using, we're not going to do quite as good a job: it may take us longer, and we may make mistakes, or even bad decisions, because of our challenges working with the tools. This is a perennial problem across the industry. It's good that we can find ways to help people focus on improving the tools so that people can then get their jobs done. All of us use software tools.

Jennifer Bonine: Right. I think it's interesting, we had talked about sometimes the barrier to entry for people is feeling like an outsider and they don't understand what they're supposed to be using and they're intimidated by the tools or the technologies. A lot of times, if those tools are being created by a certain group of people who have a certain mindset, they all think alike, but if you have someone with a different mindset, like a user or someone who comes from outside of technology, they come in and feel like an outsider and it's a barrier to entry.

Isabel Evans: There are a number of things which make the tool sets difficult to use, and it's surprising in the ways they do it. I was at an innovation meeting in London, and there were a lot of people there working on innovative tech projects. Highly, highly intelligent people. I was talking about this idea, and I was getting comments like, "I spend 50 percent of my time wrestling with the tool set instead of solving the problem." One person said to me, "It's like the tool set was designed to be used by a 12-year-old boy in his bedroom in the 1980s." For me as well, Julian was encouraging me to get involved with Git and learn a bit about that. I was instantly put off by the name because it's a slightly insulting name. It just doesn't feel right to be using it. There are all sorts of areas like that which are obviously off-putting. I think if we had a broader group of people working in IT, one of the outcomes would be a broader understanding of the end-user experience for all the software that's being produced, and actually better quality, better usability for all the software.

Julian was talking about making software quality better by making the tools we ourselves use better, which is influence in one direction, but I think there's actually the possibility of a wider influence on the industry as a whole. Making it actually less geeky.

Jennifer Bonine: Yep, and broadening that perspective. I think that when you engage diverse perspectives, you get better outcomes. Instead of just one perspective tackling a problem, engaging those other folks will help you improve quality, because they come from a different mindset in how they approach the problem. You're inherently improving quality from two angles: one is an improved tool kit and tool set that's more accessible, and the other is diversifying the folks getting into the industry and having access to influencing what is developed, how it's developed, and how it's tested. That would be phenomenal. Again, I think we're seeing some challenges in hiring. Companies are having issues with not enough testers, not enough developers, not enough people applying for jobs. This may also solve that challenge for organizations.

Isabel Evans: It could make a difference there. I think one of the bits of research that the silent man here was getting into, talking about diversity, well, let's give the man a chance to say something. You were talking about some of the areas where developers also find the tools a problem. The Developer Liberation Front and things like that.

Julian Harty: Yes. As an example, static analysis tools have been around in the industry for decades, but we've had so many problems with adoption of these tools. The first time people use them, they get frightened because the tool may say 100,000 things are wrong with your code, and no one wants to even dig into this, let alone make sense of it. What we've noticed is that the tool vendors are now starting to integrate some of the feedback, so it's in-the-moment feedback. As you're still writing the line of code, it's a little gentle hint saying, "Just so you know, there are a few things you might want to look at improving now." We get the feedback when we need it to make the changes. You at least have the option there.
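To make the idea concrete, here is a minimal, purely illustrative sketch of that "gentle hint" approach, not any particular vendor's tool. It uses Python's standard `ast` module to walk a snippet of code and surface at most a few friendly suggestions, rather than an overwhelming list; the specific checks (bare `except:`, long argument lists) and the `gentle_hints` function name are illustrative choices, not an established API.

```python
import ast

def gentle_hints(source, max_hints=3):
    """Return up to `max_hints` friendly suggestions for a Python snippet.

    A toy "in the moment" check: instead of dumping every issue at once,
    it caps the feedback at a handful of gentle hints.
    """
    hints = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        # A bare `except:` swallows every error, including KeyboardInterrupt.
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            hints.append(f"line {node.lineno}: a bare 'except:' hides errors; "
                         "consider catching a specific exception")
        # Long parameter lists are a common readability smell.
        if isinstance(node, ast.FunctionDef) and len(node.args.args) > 5:
            hints.append(f"line {node.lineno}: '{node.name}' takes many "
                         "arguments; a small object might read better")
    return hints[:max_hints]

snippet = """
def load(a, b, c, d, e, f):
    try:
        return open(a)
    except:
        return None
"""

for hint in gentle_hints(snippet):
    print(hint)
```

Real IDE linters are far richer, but the design point is the same one Julian describes: deliver a small amount of feedback at the moment the code is being written, so the developer can act on it instead of being buried.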

It's a massive problem for developers. Talking to some of the experts in the industry who've been around long enough to know this world, they're saying that the tools haven't really improved much in the last few decades for software developers, even with all the beautiful ideas. We're looking for a step change in the tools, and, as already mentioned, improving diversity by making the tools usable without learning the specifics and the weird language of executing code. How many people execute someone in a day? We need to think about these things as well.

Jennifer Bonine: Exactly. If people are interested in this, where you guys are going and getting more information and having access to some of the developments that you're making, how would they best get involved or contact the two of you about this?

Isabel Evans: We're both on LinkedIn. We haven't specifically set up anything like a website yet because we're at the early stages now: we're thinking, we're gathering ideas, and we've been trying to get a feel for whether it was just us thinking this or whether people generally are thinking it. We're getting some really interesting feedback and some very positive feedback. I think what I'd like is this: if you're out there and you're feeling like you've experienced what, if you were looking at another piece of software, you'd be calling a usability problem or a user experience problem, I'd be really interested in just a comment through to me on LinkedIn to say, "Actually, yes. Us too," and maybe some examples. Because if we build up a body of evidence, then we've got something to take back into the industry. You were thinking about open source, weren't you particularly?

Julian Harty: Yes. One place where we can experiment without worrying too much about proprietary tools is with open source tools such as, say, Selenium, where people can go and download the code, modify it themselves and create quick and easy prototypes to try out ideas. Many of these ideas will have flaws with them, but we'll learn together from doing this. An open place is a great place to experiment together and collaborate.

Jennifer Bonine: Absolutely.

Julian Harty: You'll find in most cases, most of the other repositories are truly naughty.

Isabel Evans: I think what we want to do, well, what I'd like to do first is try and draw up some sort of very simple guidelines. I'm thinking if we can get people experimenting with those and feeding back. As we're just starting on the project now, I guess the other thing is, if anybody else is already working on it, because we don't want to be reinventing the wheel.

Jennifer Bonine: Let you know. For those folks out there, you can contact both of them through LinkedIn: send them a message if you've experienced some of these issues, and also let them know if you're already thinking about this or have looked at it, so they know that as well. I think that's one of the great parts about this conference as well, understanding what people are thinking about, what their passion is, and what they're working on, to get that exposed to a larger audience. I thank both of you for being with me today.

Isabel Evans: Pleasure. Thank you for having us.

Jennifer Bonine: Yeah, and exposing this great project you guys are embarking on. I'll look forward to hearing how it progresses.

Isabel Evans: Thank you very much.

Jennifer Bonine: Thank you.

Julian Harty: Thank you. For those of you who want to get involved in the test lab remotely, then go to starwest.atthetestlab.com and you should find the page to get you started.

Jennifer Bonine: Awesome, thank you. Thank you both.

Isabel Evans: Thank you.

Isabel Evans

Independent quality and testing consultant Isabel Evans has more than thirty years of IT experience in quality management and testing in the financial, communications, and software sectors. Her quality management work focuses on encouraging IT teams and customers to work together via flexible processes designed and tailored by the teams that use them. Isabel authored Achieving Software Quality Through Teamwork and chapters in Agile Testing: How to Succeed in an eXtreme Testing Environment; The Testing Practitioner; and Foundations of Software Testing. A popular speaker at software conferences worldwide, Isabel is a Chartered IT Professional and Fellow of the British Computer Society, and has been a member of software industry improvement working groups.





Julian Harty

Julian has been working with mobile apps for many years, and increasingly with mobile analytics, to find ways to improve various mobile apps and how mobile apps are developed and tested. He both researches these topics as part of a PhD and applies the ideas in practice, both directly and through working with various industry leaders including HP and Appachhi.
