To SME or Not To SME

Summary:

Subject matter experts (SMEs) serve important roles on a project and are especially pivotal during the testing phase. In this week's column, Dion Johnson explores how SMEs positively and negatively affect testing and what you can do to make sure you have the right number of SMEs on your testing team.

To SME or not to SME, that is the question. This line, borrowed from a famous Shakespearean work and altered, is as poetic and perplexing as its original incarnation from centuries ago. The original question, "To be or not to be," was in reference to human existence and hardly seemed to need asking. Why wouldn't someone want to 'be'? Why wouldn't someone want to exist?

Our software testing incarnation of Shakespeare's query, "To SME or not to SME," refers to the existence of subject matter experts (SMEs). This question also seems unnecessary, because it's pretty apparent that being a SME is a good thing. Why wouldn't someone want to be a SME on the application under test (AUT) or the tools used to implement and test that application?

To SME
Effectively testing an application typically requires a certain degree of application and environmental expertise. For example, I worked on a project that combined Web, Web services, and Java to create a front-end application that utilized a database that sat on a UNIX server. This system also interfaced with several applications not directly accessible during testing; therefore, the test team had to use and help maintain Java-based simulators to test the primary system's interfaces and outputs. The testers clearly needed a wide range of skills. By the time I joined the team, there were already a few SMEs in various areas who were also responsible for test development and execution. I began my tenure on the team by reading the vague requirements documents, then ramping up on the simulators, tools, and technology pertinent to the system, as well as the existing test bed. What I found interesting about the existing test bed was that, in contrast to the high degree of environmental complexity, the tests themselves were remarkably simple and seemed mainly to exercise the most basic scenarios. The team had very bright people, but members seemed to spend so much time and effort on being SMEs on the application and its associated tools, technologies, and systems that little effort was put into truly probing the various application components. This is a classic example of not being able to see the forest for the "SMEs."
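
To make that environment a little more concrete, here is a minimal sketch of what one of those Java-based interface simulators might look like. It is an illustration under my own assumptions, not the project's actual code; the class and method names (PartnerSystemSimulator, stubResponse, handleRequest) are hypothetical.

// Hypothetical sketch of a Java-based simulator standing in for an external
// system that the test environment cannot reach directly. All names here are
// illustrative, not from the actual project.
import java.util.HashMap;
import java.util.Map;

public class PartnerSystemSimulator {

    // Canned responses keyed by request type, prepared by the test team.
    private final Map<String, String> cannedResponses = new HashMap<>();

    // Testers register the response the simulated system should return.
    public void stubResponse(String requestType, String response) {
        cannedResponses.put(requestType, response);
    }

    // The application under test calls this instead of the real interface;
    // the simulator logs the call and returns the prepared response.
    public String handleRequest(String requestType, String payload) {
        System.out.println("Simulator received " + requestType + ": " + payload);
        return cannedResponses.getOrDefault(requestType, "NO_STUB_DEFINED");
    }

    public static void main(String[] args) {
        PartnerSystemSimulator simulator = new PartnerSystemSimulator();
        simulator.stubResponse("CREDIT_CHECK", "APPROVED");
        System.out.println(simulator.handleRequest("CREDIT_CHECK", "customerId=42"));
    }
}

Simple as such a stand-in is, someone still has to build it, feed it realistic responses, and keep it in sync with the real interfaces, which is exactly the kind of SME work that consumed so much of the team's attention.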

Not to SME
Too much focus on becoming a SME has its problems, but too little focus on gathering knowledge can be equally, if not more, problematic. I worked on a project on which testers, in order to gain greater insight into the system, sought out members of the development and database teams to help fill in gaps left by the existing project documentation. The project managers concluded, however, that such information gathering improperly influenced the development and implementation of the application tests, so they completely cut off direct communication between the test team and the development and database teams. This "brilliant" move, while well intentioned, brought testing to a crawl. Given the vague documentation and the lack of detailed information from the user on how the system should operate, there were few avenues for sufficiently increasing system knowledge and expertise other than the teams that had built the system. Once direct communication was severed, test team requests for information about specific system rules were often ignored, overlooked, misunderstood, or seriously delayed, so activities that should have taken minutes often took days or even weeks.

About the author

Dion Johnson

As a senior test consultant and managing partner for DiJohn IC, Inc. and advisor to the Automated Testing Institute, Dion Johnson provides IT consulting services that focus on the overall system development lifecycle, with a particular focus on quality assurance, quality control, requirements analysis, and automated testing. He has presented at numerous SQE conferences and contributed to StickyMinds.com and Better Software magazine. In addition, he is an editor for the Automated Software Testing magazine. Email Dion at dionjohnson@dijohn-ic.com or dionjohnson@automatedtestinginstitute.com.

