When is Done Really Done?

[article]
Summary:

When your idea of a completed task is significantly different from that of your team members, you're asking for trouble. In this week's column, Peter Clark outlines some steps you can take to ensure that everyone on your team understands your expectations when you ask them if they're "done."

One of the biggest problems I have had over the years with the people who report to me is a fundamental disagreement about what the word "done" means. When I was a brand-new supervisor, I would ask my people how they were doing on whatever tasks they were working on. They would tear themselves away from their computer screens and say, "Fine. I'm almost done." I would walk away feeling good about them and about the project, confident that they had things covered. A week or two later, I would check in with them again. They would look up at me and say, "Fine. I'm almost done."

Another variation of this is the high-precision answer. How are you doing? "Fine. I'm 95% done." The purpose of this degree of precision is to convince the interrogator that everything is under control. In fact, nothing could be further from the truth. I have formulated "Clark's Law of Misappropriated Precision": When it comes to estimating completeness, the greater the precision, the greater the inaccuracy. I have been told that an activity was 97% done and required no more than two weeks to complete. I have then waited four months for completion.

The problem isn't that my people are being mendacious. The problem is that we have a disagreement on what the meaning of "done" is. When I, as a manager, say "done," I mean that the task is complete and this person is ready for her next assignment. Therefore, when people say that they are 97% done, I expect that only 3% of the work remains. On a 100-hour task, I foolishly assume that only 3 hours remain and that I should be able to assign her something new tomorrow.

What do my people mean when they say "done"? It varies. If I have asked them to create a new module, "nearly done" may mean:

• I have thought about it, and I am ready to start coding.
• I have coded it, and I am ready to think about testing it.
• I have tested it, and I just have to fix all of the bugs.
• I am tired of working on this, and I am ready for something new.
• I have found a horrible bug, and I am ready to rewrite the module from scratch.

Projects can float along in a dream-like haze for months in this manner. Oftentimes you don't know whether something is really done until integration. When you go to put all the pieces together, you then find out that a lot of them are missing or wrong.

There are things you can do to avoid this. For common tasks, like coding a module, I create a description of the meaning of "done." This includes:

• A description of what is included in this activity: for example, coding, unit testing, inspection, documentation, and insertion of the module into the revision control system
• A brief list of related activities that are not included in this activity, to prevent gold plating and distractions
• A list of any predecessor tasks to this one
• A list of inch stones, along with the percentage of the overall task that is finished when that inch stone is complete
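A "done" description like the one above can even be kept as data, so that progress is read off the last inch stone actually completed instead of a self-reported guess. Here is a minimal sketch; the milestone names and percentages are hypothetical, not from the column:

```python
# Each inch stone pairs a milestone name with the cumulative percentage
# of the overall task that is finished when that inch stone is complete.
# These names and numbers are illustrative assumptions.
INCH_STONES = [
    ("design reviewed", 15),
    ("code written", 40),
    ("unit tests passing", 70),
    ("inspection complete", 85),
    ("module checked into revision control", 100),
]

def percent_done(completed):
    """Report the percentage of the last completed inch stone.

    Inch stones must be finished in order, so progress stops at the
    first one that is not yet complete.
    """
    done = 0
    for name, pct in INCH_STONES:
        if name in completed:
            done = pct
        else:
            break
    return done

# With only the first two inch stones finished, the task is 40% done --
# no matter how "almost done" it feels.
print(percent_done({"design reviewed", "code written"}))  # 40
```

The point of the sketch is that the percentage comes from the agreed-upon list, not from the developer's optimism.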

Predecessor tasks are important because your staff will often jump the gun to work on something that they like. For example, they will begin working on the code before the requirements have been finalized. You need to establish the gate they must pass through before they can start working on a task or your sweet daydream will quickly turn into a nightmare of blown schedules and budgets.

The inch stones are


Peter Clark

Peter Clark has twenty years of experience in industrial automation. He currently manages teams working in materials handling, especially baggage-handling systems. A regular columnist on StickyMinds.com, Peter can be reached at pclark@jerviswebb.com.
