During the last 6-8 years, I've seen companies purchase a variety of sophisticated tools supporting software development, some of which claim to integrate the strategic activities and artifacts of requirements, design, production, test, and deployment into releasable software products. Any tool that can do all that is certainly worth considering for the software engineering environment.
Such tools can make life easier for many while assuring customers of product integrity. But why do these tools so rarely find their way into that environment? Are they too complex? Do they really have what it takes to do the things they claim? Are organizations fully prepared to implement and use such tools? Are people prepared and committed to making some very significant changes in the way they do software development? All are interesting questions.
Perhaps the most likely reason such tools are not successfully implemented is an overall lack of training in, or exposure to, the specific techniques on which the tools are based. Training budgets tighten, and companies hire people they feel "should have" current knowledge of newly released tools - even before those tools are fully tested and proven worthy. People are also typically resistant to change; when introduced to a tool that requires them to learn "something new" or to produce artifacts and work products supporting a different methodology, there is usually some initial pushback. Consider the following example of how people tend to resist change:
A group of software developers who are accustomed to interpreting functional requirements from a requirements specification document or similar work product, and who have little knowledge of object-oriented methodology, are suddenly required to do object modeling. That is, management decides that object modeling is sexy and institutes a change to the way system or software design is done. Tools are purchased, and the company methodologist or technologist gives a presentation to roll out the new design method. The next day, the analysts and the design group look at each other and try to sort through what they have "learned" and apply it to what needs to be built.
It's hard to lay blame on anything specific companies are or are not doing when it comes to automating, but it boils down to training, acceptance, gradual change, guidance, mentoring, patience, and some degree of enforcement. A clearly understood system or software development lifecycle (SDLC) model is key to understanding what, why, and when certain activities occur and certain work products are produced throughout the lifecycle. Without knowledge of SDLC elements (phases, milestones, baselines, and their associated inputs, processes, and outputs), automating processes for the sake of automating is a death march - that is, there is a high degree of certainty that the automation will not succeed.
At many organizations, configuration management (CM) activities are "half-vast" at best, and clearly chaotic. This may be because the wrong people have been assigned to perform CM tasks, or because the project manager is either clueless or knows very little about the benefits of CM. Then at some point, perhaps out of utter frustration, some genius gets the bright idea to "automate" - thinking this will solve the dilemma - and the organization focuses on automating its chaos! Am I the only one who has witnessed this undertaking?
If you haven't yet read the many articles written on this topic, please do yourself a favor and research why an elaborate analysis should be performed before acquiring a solution to automate CM processes. When considering automation, ask the following questions (and any others you can think of):
1. Are software builds performed regularly?
2. If so, are the builds successful? If not, your problem may not be an SCM problem, but a project management issue.
3. Are defects tracked from identification through fix, build, test, verification, and closure?
4. If branching techniques are practiced, are merges done often?
5. Is parallel development practiced?
6. Are codeline policies prepared, communicated, and used?
7. Is there sufficient smoke, unit, and integration testing?
8. Are there current, defined, and well-documented CM processes in place?
9. If not, why not? (The answers to this question may be a significant part of the problem domain.)
10. If so, have they been in place long enough to prove that they work? (That is, has the team used these processes for a sufficient period, and do those processes work?)
11. If not, then consider using the processes for 12 months. (Nine months may be sufficient in some working environments.) This should be sufficient time to sort through most issues.
12. Is everyone agreeable to the CM processes currently in place? That is, are more changes needed to the CM processes to improve the way things are currently done?
13. If so, then collaborate with all affected parties on the proposed changes, make all appropriate changes, and allow for sufficient usage time (see step 10).
14. If not, then the current CM process must be working - and automation should be considered only if the current processes are beginning to hinder or otherwise create bottlenecks.
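The checklist above amounts to a gating decision: automate only when the existing CM process is healthy, proven over time, agreed upon, and starting to become a bottleneck. As an illustration only (the class, field, and function names here are hypothetical, not taken from any tool), that decision flow can be sketched as:

```python
# Hypothetical sketch of the readiness checklist as a decision function.
# Field names map loosely to the numbered questions; none come from a real tool.

from dataclasses import dataclass

@dataclass
class CMReadiness:
    builds_regular: bool          # Q1: are builds performed regularly?
    builds_successful: bool       # Q2: are the builds successful?
    defects_tracked: bool         # Q3: defects tracked from identification to closure?
    merges_frequent: bool         # Q4: frequent merges, if branching is practiced?
    codeline_policies_used: bool  # Q6: codeline policies prepared, communicated, used?
    testing_sufficient: bool      # Q7: sufficient smoke/unit/integration testing?
    processes_documented: bool    # Q8: current, defined, well-documented CM processes?
    months_in_use: int            # Q10/Q11: how long the processes have been in use
    team_agrees: bool             # Q12: is everyone agreeable to the processes?
    processes_bottleneck: bool    # Q14: are the processes hindering the team?

def ready_to_automate(r: CMReadiness, min_months: int = 9) -> bool:
    """True only if the team has reached 'step 14' AND a bottleneck exists."""
    process_healthy = all([
        r.builds_regular, r.builds_successful, r.defects_tracked,
        r.merges_frequent, r.codeline_policies_used,
        r.testing_sufficient, r.processes_documented,
    ])
    proven = r.months_in_use >= min_months  # 9-12 months, per steps 10-11
    return process_healthy and proven and r.team_agrees and r.processes_bottleneck
```

Note that a healthy, proven process with no bottleneck still returns False: per step 14, working processes alone are not a reason to automate.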
Only upon reaching step 14 should the organization consider automating CM processes. Automating does not necessarily make things function more efficiently - especially if the automation creates problems in other activities such as defect resolution or the tracking of builds, changes, and baselines. Be smart when considering automating CM processes by determining the exact needs of the software team. Form a group consisting of key developers, network and system administrators, librarians, build-meisters, release engineers, SQA engineers, object modelers, designers, testers, source code checkers, the SCM group, software managers, DBAs, technical writers, and others as appropriate to discuss CM issues. Determine each person's or function's needs, then document those needs, along with known constraints, complaints, and problems, in a requirements document. This document will serve as the basis of need for an SCM tool.
What must you have? What can the group live without? What is the budget or how much is the organization willing to invest in a solution that protects the integrity of their product(s)? Are there plans to increase the software development group? Will the solution be required to support the enterprise or the organization at multiple locations? Can Web-based applications, legacy applications, and shrink-wrap components be supported by the same SCM solution?
All these questions and more must be asked, and the answers known, before an SCM solution is identified, because vendors will need to know these things to ensure their solution is right for you. Only after you and your software team have performed sufficient due diligence will you really know the answer to the question, "When is it appropriate to automate SCM?"
Dick Carlson is a consultant with 20+ years of software engineering experience focused on software development training and mentoring, development and implementation of software lifecycle methodologies, software configuration management, software quality assurance, project management, requirements development and management, risk management, business modeling, business re-engineering, and software process improvement. Dick has trained and mentored teams and individuals in conducting strategic SCM activities and creating critical work products. He has also been involved in software process improvement initiatives preparing organizations for SEI CMM Levels 2 and 3 compliance. Dick is the VP of Education with the Association for Configuration and Data Management (ACDM) and can be contacted by e-mail at [email protected].