Models are useful in different settings in different ways. Models can test facts, ideas and understanding, simulate operation, and aid coordination between systems and people. In this column, Becky Winant lists six model patterns she has seen in practice in software development organizations, talking about where each is appropriate, and the strengths and weaknesses of each.
Last week, Joe (a client) expressed frustration about disputes over modeling practice. "You hear people argue over requirements models, analysis models, executable models, architecture models, design models, and 'we don't need' models. Why can't they agree?" he asked. My years of working with people to build models made Joe's complaint familiar. Some people build models to analyze a problem, others to plan software design. People choose different levels of detail and different quality criteria. Some projects require models as documentation, while others don't.
Here are six patterns of modeling practice I've seen, along with the conditions under which each is used.
1. Artisan

Tool and core component developers are usually artisans. My friend, Doug, managed a group that built internal and external interfaces for all corporate products. Innovation and pride of craft drove the group. On one high-visibility project they eclipsed all expectations. Doug gleefully recalls how executives would come by the room, peeking in to glimpse the group's magic.
Artisan practice is model-less. An artisan needs only a mental concept. Products are directly crafted using personal skill and discussions with customers. The artisan might say, "the code is the requirement."
The hallmark of this pattern is personal craftsmanship. This is both a strength and a weakness. While products remain small, with complexity and risks low, the individual craftsman shines. As products grow, requiring expansion and maintenance to meet market demands, the artisan approach breaks down. Having no recorded legacy or overall design for the product also makes revisions expensive and challenging.
2. Improv

Irv, an IT manager for a banking group, hired me to create diagrams. "But, please," he urged me, "don't teach anyone to build models." The project involved three departments of a bank developing an integrated system to support international commercial accounts. Irv wanted to use design diagrams to bridge communication among the divisions and also to keep departmental boundaries clear. When I finished the diagrams and definitions, Irv said, "That's what I need! The operational detail we already know."
Improv modeling practice consists of diagrammatic sketches, not formal or technical models. System knowledge is ingrained in the culture and its systems. Diagrams align design concepts. Detail is expanded and explored with code. The whole motivation for this practice is communication; large systems take coordination to be successful. Improv modeling facilitates mutual agreement of purpose and success criteria but contains little elaboration of functional or technical detail.
Improv's forte is team synergy and communication. It succeeds because people know their market and systems well. This practice disintegrates when radical change forces rethinking requirements. At that point, the modeling practice would need to extend beyond a tool for agreement into one that exposes what participants don't know: the gaps and anomalies in interpreting requirements.
3. Contract

My colleague, Linda, knew a contract practice where model walkthroughs only began after models were declared "done." Senior management approved the models, and while the full-color, oversized, plotted diagrams were displayed, no one suggested changes. Later the models were posted in the hallway outside the senior systems manager's office and covered in glass.
Contract modeling practice dictates detailed formal models. Automated tools shape model practice using fill-in templates and syntax checking. While they are detailed, contract models aren't required to pass tests for well-formed structure, only readability. Requirements often are literally a contract signed by all parties.
Contract practice's thoroughness is both good and bad. It builds modeling sense and skill; in time, people produce better and more useful models. The danger is time wasted producing useless models if people only go through the motions. The practice fails when the cost of rebuilding unwieldy, poorly crafted models becomes intolerable.

4. Illustrator
Elaine manages a software product that is sold to laboratories. New releases take about six months each year. Most are technology changes. Elaine's staff and I built models capturing industry knowledge and other models for design concepts. The models formed baselines for design discussions. Revisions to them became new baselines. By reviewing the knowledge models separate from the implementation, Elaine figures she has saved two months of rehashing and reconstructing information.
Illustrators model to explore the problem statement and uncover implied and derived requirements. Illustrator practice applies technical rules for model completeness, quality, and usefulness. Because good requirements are stressed, illustrator requirements usually include business goals, problem statements, and measures of success.
Illustrator practice reflects a mature modeling practice, integrating modeling and project planning processes. But this practice falters when model or library size exceeds the limits or capabilities of the tool technology being used.
5. Kinetic

Tim, an automotive engineer, and his colleagues have learned to track errors caught in simulation, comparing them to errors and debug time from their previous release. This early testing carves hours from expensive debugging and recoding. Management is now exploring integrated software and hardware simulation for vehicle testing, reducing full reliance on costly physical prototypes.
Kinetic practice centers on executable models. For highly complex systems with significant risks, model simulation exposes failure points early. Algorithms, logic paths, performance, conformance to requirements, and industry and regulatory standards get tested. Kinetic practice stresses model integrity and industrial-strength tools. Requirements may be detailed like the contract pattern, but emphasis is on identifying and containing risks. Kinetic practice brings engineering discipline to software development. This approach proves itself with high-risk, complex systems. But this approach won't work if management won't budget for the tools, schedule time, or library support.
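To make the idea of an executable model concrete, here is a minimal hypothetical sketch: a tiny state-machine model of a heater controller that can be simulated against a trace of sensor readings and checked against a requirement before any production code exists. The controller, thresholds, and requirement are invented for illustration; they are not from the column.

```python
# Hypothetical executable model: a heater controller as a state machine.
# All names and thresholds here are invented for this sketch.

class HeaterModel:
    """Model rule: heat below LOW, stop above HIGH, hold state in between."""
    LOW, HIGH = 18.0, 22.0

    def __init__(self):
        self.heating = False

    def step(self, temperature):
        if temperature < self.LOW:
            self.heating = True
        elif temperature > self.HIGH:
            self.heating = False
        return self.heating

def simulate(model, readings):
    """Run the model over a sequence of sensor readings; return its trace."""
    return [model.step(t) for t in readings]

# Early testing: simulate and check a requirement ("never heat above HIGH")
# long before hardware or production code exists.
trace = simulate(HeaterModel(), [17.0, 19.0, 23.0, 20.0, 16.5])
assert trace == [True, True, False, False, True]
```

Simulating even a model this small exposes logic and boundary errors (for example, what happens exactly at 18.0) at the cost of a test run rather than a debugging session.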
6. Vanguard

Kim manages a group of environmental control engineers that relies on reusable domain models and model translation. They've released three products based on this process, using translator tools. Last year a new product line used the team's models. The new software was ready for system test three months ahead of schedule. This year Kim told me she was assigned to a sister engineering group in France to share the control models and to teach them the process.
Vanguard practice recognizes models as corporate assets. Reusable models are cataloged by subject. Models are one ingredient for code translation. The vanguard modeler would say, "the models are the code." The models capture subject knowledge and system service abstractions. Design patterns, implementation rules, libraries, and model elements are fit to a particular architecture. Translator tools use this information to produce code. Vanguard practice changes a technical process into a business practice, and will succeed as long as the culture upholds that belief and its investment in tools.
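A toy sketch may help show what "the models are the code" means in practice: a domain model captured as data, and a translator that emits implementation code from it. The model contents, entity name, and target language are all assumptions made for this illustration, not details from the column or any particular translator tool.

```python
# Hypothetical model-to-code translation. The domain model below is
# data, not code; a translator turns it into an implementation.

DOMAIN_MODEL = {
    "entity": "Account",
    "attributes": [("owner", "str"), ("balance", "float")],
}

def translate(model):
    """Translate an entity model into Python class source code."""
    lines = [f"class {model['entity']}:"]
    params = ", ".join(name for name, _ in model["attributes"])
    lines.append(f"    def __init__(self, {params}):")
    for name, _ in model["attributes"]:
        lines.append(f"        self.{name} = {name}")
    return "\n".join(lines)

# The generated source can be compiled and used directly:
namespace = {}
exec(translate(DOMAIN_MODEL), namespace)
acct = namespace["Account"]("Kim", 100.0)
assert acct.balance == 100.0
```

Because the model, not the generated code, is the maintained asset, a change to `DOMAIN_MODEL` (or to the translator's implementation rules) regenerates consistent code everywhere the model is reused.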
One of my clients with an artisan modeling practice hopes to step up its CMM rating, which would enable it to compete in new areas. We used these patterns in discussing changes to their processes and measures. You may recognize more than one of these patterns in your organization and notice that practice varies according to organizational style, requirements process, and system complexity.
I would like to express appreciation for the SEI's CMM/CMMI and Gerald M. Weinberg's Software Engineering Cultural Patterns.