"I thought you said you were done with the Cranfragle last week!" Sue demanded. "I just talked with Alan and he said that you haven't even started on the Green Flop."
"Yes", Joe answered, "I'll be starting on that tomorrow. What does that have to do with the Cranfragle?"
Sue's jaw dropped. "How can you ask that? You're not done until the Green Flop is done!"
What does it mean when a member of your team says that something is "done"? Taking the time to define this critical term does much more than avoid disagreements. It can also save critical project time and avoid embarrassing oversights.
"Done-ness" Points of View
My father was an accountant, and that perspective drove his definition of "done". He had a sign on the wall of his office that was aimed at the hapless engineers who might wander in. On the sign, above an image of a roll of toilet paper were these words: “The job isn't done until the paperwork is done.” His definition of "done" included time cards that were complete and expense reports that had been signed off on.
Your definition of done is driven by your area of expertise or specialization. You know what must be completed, and you attach specific value to those activities. By the same token, other people who have different areas of expertise will focus on other things and probably value them differently than you do. Many of us are configuration management professionals, so while we may agree with developers that done requires compiled code and properly built executables, our definition extends to include activities that developers may not pay as much attention to: things like code checked in with changes described, baselines updated, and change requests closed. Meanwhile, the testers among us will be concerned about unit and integration testing performed and defects addressed.
A "Done-ness" Standard
These differences of opinion must be resolved, and yielding to the loudest voice will probably not produce the best result. The best standard comes from collaboration among all of the specialties in the organization. Yes, that even includes the accountant!
While your organization's priorities and unique challenges will have some effect on the "done-ness" standards you ultimately adopt, the following are a set of things that most software organizations include in well-thought-out standards.
Development based on appropriate baseline code: Latest is not always greatest. We must ensure that the changes were made to the correct code base.
Requirements met: In traditional approaches, this means conformance to specification. If you are using an Agile method, it means that appropriate collaboration with the customer role has taken place, and that person is satisfied with the result.
Coding standards met: This presupposes that you have coding standards, and that some method has been used to verify that the developers followed them. Methods could include peer reviews, software tools, or pair programming.
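Automated standards checking can be as simple as a small script. The sketch below enforces a single hypothetical rule, a maximum line length; the 79-character limit and the function name are illustrative assumptions, not something the article prescribes, and a real team would more likely run an established linter.

```python
# Minimal sketch of one automated coding-standards check.
# The 79-character limit is a hypothetical example rule;
# real teams would typically run an established linter instead.

def check_line_length(source: str, limit: int = 79) -> list:
    """Return (line_number, length) pairs for lines exceeding the limit."""
    violations = []
    for number, line in enumerate(source.splitlines(), start=1):
        if len(line) > limit:
            violations.append((number, len(line)))
    return violations

sample = "short line\n" + "x" * 100 + "\nanother short line"
print(check_line_length(sample))  # one violation, on line 2
```

Whatever tool you choose, the point is the same: "standards met" should mean a check was actually run, not that everyone promises they complied.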
Code under revision control: Appropriate code management tools or procedures have been applied so that the new version of the code can be distinguished from the original and you can get back to the original if needed.
Code checked in: This is actually part of the prior item. The code is in the library with all necessary supporting information. Naturally, this would force you to agree on the information that is necessary. Examples include pointers to the specification section, change request or defect report that prompted the change and a description of what was done.
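One way to make "all necessary supporting information" concrete is to agree on a check-in record. The field names below (change_request, spec_section) are hypothetical examples of the pointers the article mentions; adapt them to your own change-tracking tools.

```python
# Sketch of the supporting information a team might require at check-in.
# The field names are illustrative assumptions, not a standard.
from dataclasses import dataclass

@dataclass
class CheckinRecord:
    change_request: str   # the change request or defect report that prompted the change
    spec_section: str     # pointer to the specification section addressed
    description: str      # what was done, in plain language

    def commit_message(self) -> str:
        """Format the record as a commit message for the code library."""
        return f"[{self.change_request}] {self.description} (spec: {self.spec_section})"

record = CheckinRecord("CR-1234", "3.2.1", "Fix rounding in invoice totals")
print(record.commit_message())
```

A template like this turns "agree on the information that is necessary" into something enforceable: a check-in missing any field is visibly incomplete.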
Unit testing done: All programmers do some unit testing, so this item is more concerned with the completeness of unit testing. Many organizations look for documented unit test plans, libraries of automated unit tests, evidence of test results, records from code coverage monitors, and the like.
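"Evidence of test results" can come straight from a standard test framework. The sketch below uses Python's unittest module; the function under test is a made-up example, but the shape is what a reviewer would look for: named cases, both a normal path and an error path.

```python
# A minimal documented unit test using the standard unittest module.
# The function under test is a hypothetical example for illustration.
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 15), 85.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()  # the runner's output serves as a test-results record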
Executable built: Again, this is a given. But you must ensure that the build was done using repeatable procedures, that it used all of the correct versions of the appropriate components, and that everything that should have been done in conjunction with the actual build was done.
Software integration testing complete: As with unit testing, the issue with integration testing concerns the completeness of the tests. Organizations often require similar evidence of appropriate testing; refer to unit testing, above.
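One hedged way to check that a build "used all of the correct versions" is to fingerprint the component-to-version map that went into it. The components below are invented for illustration; a real system would read them from its own build manifest.

```python
# Sketch: record exactly which component versions went into a build,
# so a later rebuild can be verified against the original.
# The component names and versions are hypothetical examples.
import hashlib
import json

def build_fingerprint(components: dict) -> str:
    """Deterministic hash of the component->version map used in a build."""
    canonical = json.dumps(components, sort_keys=True)  # order-independent
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

components = {"parser": "2.1.0", "ui": "1.4.3", "core": "5.0.1"}
print(build_fingerprint(components))
```

Because the fingerprint is deterministic, two builds from the same versions match, and a mismatch is an immediate signal that the build was not repeatable.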
Smoke test passed: This is a test that ensures the executable actually functions at some very basic level. Smoke tests are generally part of the build process, or are executed immediately after the build process. They are designed to ensure that the system is ready for whatever comes next in the process (testing, installation, use, etc.).
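A smoke test at "some very basic level" can be as simple as launching the freshly built program and confirming it exits cleanly. In this sketch the Python interpreter itself stands in for your real build output; that substitution is just so the example runs anywhere.

```python
# Minimal smoke test sketch: launch the built executable and confirm
# it starts and exits with status 0. The Python interpreter stands in
# for a real build output here, purely so the example is self-contained.
import subprocess
import sys

def smoke_test(executable: str, args: list) -> bool:
    """Return True if the program runs and exits with status 0."""
    result = subprocess.run([executable] + args, capture_output=True, timeout=30)
    return result.returncode == 0

print(smoke_test(sys.executable, ["--version"]))
```

Hooking a check like this into the end of the build process gives an immediate, automatic answer to "is the system ready for whatever comes next?"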
Baseline established: Any time the system is being handed off to someone else (testers, customer, etc.), establishing a baseline is called for (with all of the documentation and controls that this implies). This new baseline then serves as the basis for whatever comes next.
When "done" means ready for production, then its definition must include the following additional considerations:
Functional/operational/acceptance testing complete: Full testing is a must, and the organization must agree on the types and standards that are necessary for production systems.
Installation procedures ready: Who will be doing installation, and what will they need to do it effectively?
Documentation/training ready: Who will be using the system, and what help and handholding will they need?
Consensus Is Worth the Effort
Getting all of the necessary people together to define "done-ness" will be difficult, and facilitating such a meeting will probably be a challenge as well. Still, there is nothing like working in an organization that runs like a well-oiled machine, where everyone knows what is expected of him or her and just naturally does it.
It's not just a dream. This ideal is within reach!