e-Talk Radio: DeMarco, Tom, 22 February 2001


Staff turnover, mis-sizing a project, inflation of size during the project, failure to specify, and variation in delivery rates are the top risks common to all IT projects. Listen as Ms. Dekkers and Mr. DeMarco talk about coming face to face with risks in software development projects.

TEXT TRANSCRIPT: 22 February 2001
Copyright 2001 Quality Plus Technologies and Carol Dekkers. All rights reserved.

ANNOUNCER: Welcome to Quality Plus e-Talk with Carol Dekkers, brought to you by StickyMinds.com, your online resource for building better software. This program will focus on the latest in the field of technology. All comments, views, and opinions are those of the host, guests, and callers. Now let's join your host, Carol Dekkers.

CAROL: Welcome to Quality Plus e-Talk! with Carol Dekkers. I am Carol Dekkers, and welcome to show number eight. To anyone who is listening through the Internet or around the world, we would like to say welcome to our audience. I would like to mention that for last week's show number seven, with Elizabeth Hendrickson, there was a mixup: we had pre-taped the show. We will be re-airing it on March 29 as show number thirteen, so on March 29 Elizabeth Hendrickson will be talking about how to evaluate software tools. This week I've got an incredible guest. I have had people send me emails and phone me asking, "When is Tom DeMarco, the great Tom DeMarco, going to be on your show?" and this is it. This is the show, and I am tickled pink, I guess I should say. Welcome to the show, Tom.

TOM: Well, thank you Carol.

CAROL: I would like to introduce you by saying Tom is a principal of the Atlantic Systems Guild and a Fellow of the Cutter Consortium. He has made outstanding contributions to the theory and practice of software development through his writing, lecturing, and consulting. From his early work on structured analysis to his later contributions, he epitomizes to me the world of software quality and the world of risk management, and the awards just go on and on: the 1986 Warnier Prize for lifetime contributions to the field of computing, and the 1999 Stevens Award for contributions to the methods of software development. To me that is like the lifetime achievement awards for the Grammys, the Academy Awards, and the Golden Globes all rolled into one, and I am absolutely thrilled to have you on the show, Tom.

TOM: Well, thank you, Carol. I am not sure if that is as great as all that...I think the right way to think of me is someone with scars all over his body from having personally made all the mistakes that characterize software projects and management of software projects. I just had a good memory of what caused each one of those scars.

CAROL: And one thing that I always remember about one of the first times that I met you, Tom, was in, I think it was Minneapolis, and you came out for dinner with us, and I was so stunned that Tom DeMarco would grace us at a dinner table, and it was great! Today we're going to talk about risk management, making success possible, an area where you absolutely have battle scars, and you have seen a lot of projects that have been good, bad, and failed. Really, I think that risk is something that you and Tim Lister in your seminars have really focused on and made a lot of headway in. Would you agree?

TOM: Software is an intrinsically risky business, so it's not just myself, Tim, and other experts on the subject who have come face to face with risk. In fact, all projects incur a lot of risks, and paradoxically, the real risk doesn't have a tremendous amount to do with building software. That's not a very difficult thing. The difficult thing is understanding what software will correctly fit into an organization and then causing that software to do its work on the organization, which means changing the organization. Organizations are resistant to change, and when an executive decides, "Gee, we ought to change in the following way: we ought to centralize sales or decentralize accounting," or something like that, the very first thought he has about how to do that is, "We'll call in the software folks and get them to build a piece of software that does the dirty work." So building the software is the easy part; doing the dirty work (i.e., effecting a change) on an organization is what you built the software for, and that's where all the risk lies. So it's not just the experts who tangle with risk; virtually every software project leader has to come face to face with it, because software effects change, and organizations have change-resistance, change-reluctance.

CAROL: So, you are talking about more than just doing project management. The project manager who has taken all the project management courses and goes through and lays out their schedule in Microsoft Project or in some other thing, and...and goes through and has the user meetings... that's not risk management.

TOM: No, no. In fact, a very simple test for risk management is to look at the project plan laid out in Project or in any other scheme, and ask the people involved, the people who understand the project, to point to a task that might not happen at all. On most projects, the personnel will stare at you in complete incomprehension. They'll say, "Well, why would we have something on the project plan that wouldn't happen at all?" And my answer is: because that's a risk. It's something that may happen or may not happen, and if you don't have a plan for what happens if it happens, what your response is going to be, then you're not doing risk management. So putting together a schedule of all of the things that must necessarily happen on the project doesn't include all of the things that might happen on the project, and therefore it's not really managing risk.

CAROL: Right, and there are probably people saying, "Well, if we had to lay out in a project plan every possible eventuality that could possibly happen, we would never get software developed in the first place."

TOM: No, that's true. There are always risks that you cheerfully ignore. One of them was cited as an example by a participant in our risk management course. We asked, "What's a risk that you would be willing to ignore?" and he said, "Sun goes nova." The sun explodes...

CAROL: Right...

TOM: There's a certain probability it would happen, and it would have disastrous consequences, but I'm going to ignore that. So there are lots of low-probability situations that you ignore. On the other hand, there are some core risks common to all projects; I think there are five or six of them. For instance, inflation of the product: the requirement grows between the time it's first specified and the time when the first delivery is made. Now, that core risk is common to all projects, and if you had a risk list that didn't include that one, or any of the other core risks, then I would say you're crossing your fingers and just hoping too much. Let me give you an example of how you get into that position, where something has happened to most projects in the past, and yet you never considered the possibility that it might happen on your project. People leaving in the middle of the project, for instance. How could you ever not have a plan for that? How could you ever not work that into your plan?

Well, let me give you an example. Your boss has just come to you, Carol, and said, "We need this new backbone system in two years." You've been privately saying it would take six people three years to do it, and they are suggesting you do it with four in two years. And they say, "We absolutely have to have it in two years, in fact, there's a good chance that we are going to have to have it in eighteen months." And then finally they come back and say, "Yeah, that's it, eighteen months and with the four of you and that's it." And they say, "Only you can pull this off. We're really counting on you because it's really important." Now, you're looking at your project plan and you're saying, "Well, should we expect that of the four people at least one of them is going to leave over the course of the eighteen months?" Should we? I mean that is a pretty likely probability, but your inclination is to look at this whole thing and say, "Well, hell, eighteen months is barely doable anyway. If I'm gonna succeed, I've gotta catch some breaks along the way, and one of the breaks I've gotta catch is my key people don't leave." Now, look what's happened, you have made the catching of breaks along the way an integral part of your project plan; that's how you end up not doing risk management, and it's because you are so challenged that you know that luck is going to break for you in order for you to succeed. It puts you in the completely wrong mode for dealing with risks.

CAROL: Now, why do you think we try and avoid risk? Is it just an inherently human thing to...is it kind of like people don't get insurance because they don't want to think about actually dying, so they don't get insurance? Is that the same type of thing that goes on, that we just avoid risks because if we don't think about them, maybe they'll never happen.

TOM: Well, let's take that in two parts. You're talking about why people avoid risk management, but the way you said it was why they avoid risk. There are actually two totally different things that go on in the course of a project. One is that people avoid thinking about the awful possibilities; that's exactly what you said. They don't do risk management, which is the equivalent of not having insurance to protect your family if you drop dead, because you yourself are too gutless to think about the possibility of your own demise. But then there is also something that people call risk avoidance, and since you used that term, you said "avoid risk," I want to talk about that as well.

Risk avoidance is entirely different. It says, "Don't do a risky thing." And very often people will assert, having not thought about it for very long, that the proper approach to risk is risk avoidance. The problem with that is that the risky things are the only things that are really worth doing. So, for instance, if you're considering a project to bring packet-switching technology into the air traffic control arena, you look at it and you say, "Boy, it's really risky, but it has some tremendous advantages." The tremendous advantage is that packets give you basic accountability: you can always tell if a packet hasn't arrived, and accountability is a really important thing for conveying information back and forth between aircraft. So it's a risky project because it gets involved in this very delicate matter called air traffic control, but the risk comes packaged with an important benefit, that you'd have some accountability for information flow back and forth between the aircraft and the tower.

I made that example up, but I made it up in such a way as to demonstrate a basic truth: risk is always packaged with opportunity. The choice not to do a risky thing, "Gee, let's not mess with the air traffic control system, it's so fragile and so much work; there's so much risk associated with it, let's not do it," always means that you forego the opportunity, the advantage, whatever benefit you might have built the system for. You forego it if you do risk avoidance.

Today, there is risk avoidance going on everywhere. Think of a company like AT&T, which during the '90s laid off, you know, a third of its IT workforce, just before IT people became almost impossible to get, and AT&T was still laying them off. During that time, they were letting go people who could have solved, for instance, their Year 2000 problem...

CAROL: And I'm going to stop you right there.

TOM: Okay.

CAROL: We are going to take a short break. If anyone would like to call in on our toll-free number and talk to Tom DeMarco, the number is 866-277-5369. And we'll be back after these short messages...Welcome back to Quality Plus E-Talk! and this week's guest is Tom DeMarco of the Atlantic Systems Guild. We've been talking about risk, risk avoidance, risk management, and just before we went into the break Tom had started to tell us about the situation at AT&T in the '90s.

TOM: And many others, Carol. Companies that laid off people very often laid off people who were CMM Level III, and the amazing thing is: how come they invested so much in these people, and then, when it came time to tighten ship, they had to lay off these very high-priced people who had been process-improved? Part of the reason, I think, was that part and parcel of their whole business of becoming CMM Level III was that they had stopped doing really risky projects. You know, if you're really under the gun to improve your CMM level, one of the things you can do is do easy projects, and so when the tough choice came, they looked at this group of people and said, "Yeah, they're very good at what they're doing, but what they're doing isn't very valuable to us." Again, because they were avoiding risk; they were doing easy projects. Everywhere I go, I see people doing, you know, dumb conversions, a 105th conversion of a mainframe database to a PowerSoft distributed client system, and they could do those in their sleep. What they're doing is avoiding the tough projects that really are worth something. So risk avoidance is not the solution to risk.

CAROL: And is it similar to the saying that you always miss 100% of the shots you do not take?

TOM: That's right.

CAROL: And we have a caller, Patricia.

PATRICIA: Hi, Carol. Hi, Tom. I'm glad you're taking the call. I have a question about The Deadline.

TOM: Patricia who?

CALLER: Patricia McQuaid. I am a professor out at Cal Poly State University in San Luis Obispo, and I teach classes related to software quality, software testing, project management, and rapid development, so a lot of what I teach deals with risk. One of the books that I happen to use in my class is your book, The Deadline, and I have used it for several quarters. I really enjoy it. The students just love it. One question that we all have is this: we're talking about risk, and one of the big risks is that the project can get out of control and, worse yet, management may not even be aware of it. In your book, you have the technique of going to confession, where employees who are afraid to tell the manager that the project is going to be behind the deadline pretend they're going to confession. The manager really knows who it is, but pretends not to, so that she gets the information and there is no adverse reaction to the employee. The question is: is that true? Have you ever seen that done, or did you just make that up?

TOM: That's 100% true. It's impossible to make up a lovely story like that. It was a young manager who worked at Apple, her name was Maura, and she actually had a confessional outside of her door. Somebody would knock on her office door and say, "Maura, a guy's gotta go to confession." And then the person would disappear and go inside. Really, you wouldn't know that it was the same person, except, of course, she did know it. And then she would go in and sit there and listen to his confession, and he'd say, "We're just not gonna be done on June 1st." And she'd say, "Aw, that's very bad. Say two Hail Marys." It was a way that she could, in some sense, remove herself from knowing what the guy was saying and who was saying it, and take the bad news without associating the person with it. Other companies use anonymous email. They use these anonymizers to send messages to the boss, because the boss has said, "I want you to send me messages even when you're not comfortable sending them over your own name." And they have anonymous email: they go to a Web site and type in a message, and it gets sent without any trace of who actually sent it. So there are different ways that people anonymize, or pretend to anonymize, to make it okay to give bad news to the boss.

CALLER: Well, that's great. It seemed too good to make up. And I always end the class, of course, telling them that ideally you would like to have an environment in which communication is such that you could tell management this without repercussions. But if that doesn't work, I think this is a great idea. Well, thank you.

CAROL: Thanks for calling in, Pat.

TOM: Thank you, Pat.

CALLER: Thanks, 'bye, Carol.

CAROL: I think a confessional is a very innovative approach. It basically opens up communication with something that is partially humorous, partially innovative, and if it really works, then it's probably worth a try.

TOM: Well, you know, it was Maura's solution. I am not sure it would be right for anyone else, or right for me, but the interesting thing was that that box that sat in front of her door was there all the time. People would ask about it, and it was a constant symbol that it was going to be okay to give bad news to Maura. If you don't do that, then people say, "Oh, the boss is willing to hear bad news, but only of the correctable variety": you know, "If you don't give me another person, I'm not going to be able to make the date"; as opposed to the uncorrectable variety, which is, "The date was stupid, we never had a prayer at it, and sure enough, we're not gonna make it."

CAROL: Right. Should we take a caller, Tom? Would you like to?

TOM: Sure.

CAROL: Okay. Diana? Welcome to the show.

CALLER: I'm here. This is Diane. Oh, thank you very much. I appreciate it, Carol, and I appreciate you calling me back. My question for Tom: we here in Melbourne, Florida, are struggling with equally challenging deadlines, and I'll call them opportunities, because I like the way you phrase that. Introducing an integrated schedule, to marry all the pieces together, is somewhat of a new concept for us, and I'm a testing manager. How can I easily depict on a Microsoft Project picture where my risks lie, without giving too much detail? Management often cannot deal with the details, but they can deal with the concept in a picture format.

TOM: Um, I guess. I'm not sure. Microsoft Project is not exactly set up to deal with the probabilistic nature of things. A better tool might be something like SLIM, QSM's SLIM product, which is explicitly set up for that. You know, I think the right way to call these things to management's attention is to have a very simple tool as a complement to your project schedule, which is a risk list, that says, "Look, there are ten risks that we're looking at," things that could happen, or could happen to some extent. For instance, people leaving: we've got 45 people, we have seen this departure rate in the past, and we might expect during the course of the project to lose 6 of the 45, and this is what it would cost us, and if we lose only 2, it will cost us less. So: a list of risks, and associated with each one a probabilistic assessment of its cost in time and money.
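A risk list of the kind Tom describes, each risk carrying a probability and a cost, can be sketched in a few lines of code. The risks and numbers below are purely illustrative, not figures from the show:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float     # chance the risk materializes at all
    cost_if_occurs: float  # schedule cost in person-months if it does

    @property
    def exposure(self) -> float:
        # Expected cost = probability times impact.
        return self.probability * self.cost_if_occurs

# Illustrative risk list in the spirit of the example above:
# 45 people, with historical turnover suggesting several departures.
risks = [
    Risk("staff turnover (~6 of 45 leave)", 0.8, 12.0),
    Risk("requirement inflation (scope creep)", 0.9, 8.0),
    Risk("original mis-sizing of the project", 0.5, 20.0),
]

total_exposure = sum(r.exposure for r in risks)
for r in risks:
    print(f"{r.name}: exposure {r.exposure:.1f} person-months")
print(f"Total expected exposure: {total_exposure:.1f} person-months")
```

The point of the list is not the arithmetic but the discipline: every risk gets named, and each gets an explicit probabilistic cost rather than being silently absorbed into a single optimistic schedule.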

CAROL: And we'll be back shortly. Diane, you're welcome to stay on. We'll be back shortly after these short messages, and we'll talk more about what those risks are, the core risks, and risk management. Hi, welcome back to Quality Plus E-Talk, I'm Carol Dekkers. If you're just joining us, you're joining us for the right show of the season. As one of my email senders said to me, "You keep getting the cream of the crop guests," and today's guest is no exception to that rule at all. I run a company called Quality Plus Technologies, and we specialize in software measurement. I have mentioned for the past couple of weeks that we have a calendar that links the CMM, the Capability Maturity Model, which Tom DeMarco has been talking about, and function point analysis, one of the metrics that you can use. If anyone is interested in receiving one of these paper calendars, please send your full mailing address to: [email protected]. I am going to give out the toll-free number one more time: 866-277-5369, and that's toll-free anywhere in the United States or Canada.

And I would like to welcome Tom DeMarco back to the show. We had a caller who was asking about how to manage risks. I absolutely love the way you put it, Tom, which is, "Risk is always packaged with opportunity." So say we embrace risk and we say, "Yes, we'll go into these risky projects." How do we manage risk? Isn't that a really hard thing to do?

TOM: Well, it is a hard thing to do. The most important aspect of managing risk is to face up to uncertainty. In order to say that you really have managed risk, you have to be able to point to explicit declarations of what you're uncertain about. In particular, to say, "I'm uncertain about this delivery date. It may be as early as June of next year, but it could well stretch out until January of 2002," when indeed that is how uncertain you are. In some organizations it is absolutely impossible to show uncertainty. It's okay to be wrong, but it's not okay to be uncertain. Or, if it's okay to be uncertain, it's only okay to be uncertain within a very, very narrow window. So, for instance, if you said, "I can't tell you for sure whether we'll be done June 15th or June 30th," people would accept that as a reasonable window. Unfortunately, that's not a reasonable window at all. Given the deviations from plan that we've experienced in our recent history as IT managers, a deviation from plan of 2% is unthinkable. We are much more likely to have deviations from plan on the order of 50%. So saying it will take from 18 to 30 months to get this job done would declare uncertainty that is consistent with the kind of uncertainty we've seen in the past. But that would be politically unacceptable: your boss might well be willing to hear you say, "It will take us from 18 months to 18 months and two weeks," but not to hear you say from 18 to 30 months. The truth of the matter is, there is a lot of uncertainty. And the thing that is really hard about risk management is that it forces you to declare your uncertainty, to show the entire range. And that is a risky business in some companies. In some companies, you just can't admit to being uncertain; it is better to be certain that you'll do it in 18 months and then actually do it in 30 months. That's okay.
But declaring your uncertainty up front is unacceptable. And if that's true in your company, that you're not allowed to show any uncertainty, that the machismo of management doesn't allow it, then you're in the situation you'd be in if I showed you how to play an octave with one hand on the piano, but your hand wasn't big enough. In other words, you understand what is required, but you just can't do it. And I'm afraid that this is the situation in many blaming cultures, where risk management is effectively impossible.
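The gap Tom describes between a politically acceptable window and an honest one can be made concrete. A minimal sketch, with illustrative figures, treating the point estimate as the optimistic end of the range (one simple modeling choice, since overruns dominate underruns):

```python
def honest_window(estimate_months: float, past_deviation: float) -> tuple:
    """Widen a point estimate into a range consistent with past deviation.

    Treats the estimate as the optimistic end of the window, on the
    assumption that historical deviations from plan are mostly overruns.
    """
    return (estimate_months, estimate_months * (1 + past_deviation))

# The politically acceptable 2% window vs. the window history supports:
narrow = honest_window(18, 0.02)  # about (18, 18.4): roughly two weeks' slack
wide = honest_window(18, 0.50)    # (18, 27.0): nine months' slack

print(f"2% window:  {narrow[0]:.1f} to {narrow[1]:.1f} months")
print(f"50% window: {wide[0]:.1f} to {wide[1]:.1f} months")
```

The arithmetic is trivial on purpose: the hard part is not computing the wide window but being allowed to say it out loud.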

CAROL: Now, in a lot of those cultures, I've seen these estimating tools, and I won't mention any in particular. But there are estimating tools where you put in what I would consider somewhat risky, uncertain variables in the first place, and they crank through this elaborate equation, and out the tail end comes a work breakdown structure that says it will take you 42.36 hours to do acceptance testing. And I always look at those and think, "Oh, my gosh, what if I smoked? If I went out for 15 minutes, I'd blow this whole 18-month project."

TOM: Right.

CAROL: And I think we sometimes do it to ourselves in terms of when the numbers come out to two decimal places, we assume that they might be right.

TOM: Well, a basic rule of engineering is that you shouldn't allow precision that is greater than accuracy. And yet, while people understand that in the abstract, unfortunately they violate it all the time. For instance, the British project to automate the Bond Exchange in London was budgeted with an amount that went down to the penny. So, 638,477,912 pounds and 61 pence. Something like that.

CAROL: Oh, my gosh!

TOM: But when they were done, it had overrun by more than half. And they didn't actually finish. And not only that, but they couldn't say within a hundred million pounds what they had actually spent. So this is the kind of accuracy the project demonstrated, and the precision of the original costing was ludicrous compared to that accuracy. In fact, if you really did declare uncertainty consistent with the accuracy you had demonstrated in the past, then it wouldn't be unreasonable to have a safety factor of 2 in either direction.
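The rule that precision should never exceed accuracy can be turned into a mechanical rounding discipline. A hypothetical sketch (the rounding rule below is one reasonable choice, not a standard): keep only the digits of a figure that are meaningful given your historical error band.

```python
import math

def round_to_accuracy(value: float, relative_error: float) -> float:
    """Round a figure so its precision doesn't exceed demonstrated accuracy.

    relative_error: typical past deviation, e.g. 0.5 for +/-50%.
    """
    if value == 0 or relative_error <= 0:
        return float(value)
    # The error band tells us which decimal place is the last meaningful one.
    uncertainty = abs(value) * relative_error
    step = 10.0 ** math.floor(math.log10(uncertainty))
    return round(value / step) * step

# Quoting a budget down to the penny implies near-perfect accuracy;
# with +/-50% history, only the leading digit means anything.
budget = 638_477_912.61
print(round_to_accuracy(budget, 0.50))  # 600000000.0
print(round_to_accuracy(budget, 0.02))  # 640000000.0
```

Under a 50% error history, the down-to-the-penny budget collapses to "roughly six hundred million", which is all the original estimate was ever entitled to claim.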

CAROL: What would you say that the tools are of risk management? What kind of things can we rely on? What can we look at?

TOM: Well, the basic, the most important tool is a census: a simple list of the outstanding risks, which doesn't have just one or two things; most typically it has 20 or 30. If you've just got one or two things, you know, "we might be late" or "we might overrun the budget," you're not getting at the underlying risks, the causal risks. You're getting at the resultant consequences of those risks, but not at the things that cause them. So I would say a risk list that has 20 to 30 things, including the core risks common to all software projects, each one of them assessed to some extent, plus all of the risks that are unique to your project. Just a list; that would be number one. Number two would be risk diagrams for each one of those risks. There ought to be an explicit statement of how uncertain you are about that risk and its causal factors, showing what the most optimistic and the least optimistic situations would be. So: explicit declaration of uncertainty. Beyond that, there would be risk brainstorms or risk identification sessions, a basic brainstorming exercise that you go through on a fairly regular basis to give people a chance to articulate risks they hadn't thought of before, so that there is an ongoing process.
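A risk census with explicit uncertainty windows can be combined by simulation into an overall delivery distribution, one way to draw the kind of risk diagrams described here. All risks and figures below are invented for illustration:

```python
import random

def simulate_delivery(base_months: float, risks, trials: int = 10_000, seed: int = 1):
    """Monte Carlo sketch: each risk is (probability, min_delay, max_delay).

    Returns a sorted list of simulated total durations in months.
    """
    rng = random.Random(seed)  # fixed seed for reproducibility
    outcomes = []
    for _ in range(trials):
        total = base_months
        for probability, lo, hi in risks:
            if rng.random() < probability:
                # The risk fired: add a delay somewhere in its declared window.
                total += rng.uniform(lo, hi)
        outcomes.append(total)
    return sorted(outcomes)

# Three illustrative risks, each with an explicit uncertainty window (months):
risks = [
    (0.8, 1.0, 4.0),  # staff turnover
    (0.9, 0.5, 3.0),  # inflation of size (scope creep)
    (0.3, 2.0, 6.0),  # original mis-sizing
]
outcomes = simulate_delivery(18.0, risks)
median = outcomes[len(outcomes) // 2]
p90 = outcomes[int(len(outcomes) * 0.9)]
print(f"median {median:.1f} months, 90th percentile {p90:.1f} months")
```

Instead of one date, the output is a range with probabilities attached, which is exactly the "explicit declaration of uncertainty" a single bar on a Gantt chart cannot express.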

CAROL: I know people are sitting out there saying, "Okay, now we've heard that there are risks, and we've heard that there are common risks." And they're waiting for me to ask you the pertinent question, which is: what are those core risks, the top risks that are common to all IT projects?

TOM: Well, I'm not sure people are waiting on that, because I think they know the core risks. But I can give you some information that quantifies them a little bit. The five most important risks that are common to all software projects are these. One: loss of staff, staff turnover. Two: original mis-sizing of the project. In other words, what happened on the project was a direct result, as in the case of the British project I mentioned, of the fact that the original statement of how long it ought to take was completely off the wall. It was based on wishful thinking, and it had no connection to what actually had to be done. Three: inflation of size during the project. You set out with a requirement at the beginning, and then function creep sets in; new features get added as the project goes on. Four: failure to specify. You can talk about the system in the abstract: "Oh, yeah, we want a system that gives us a reasonable strategy for covering our foreign purchases with leads and lags on the foreign currency market." That is, in the abstract, what you want. But if you have no idea what a reasonable policy for that would be, then you can never get down to the details of the specification. You can never say, "These are the inputs and these are the outputs." So: failure to specify, where you can talk about a system in the abstract but you can't say what the system actually does. And that most often is the result of a group of people who have different ideas of what it ought to do and haven't resolved their differences. And then five: people under different circumstances work at different rates. So: variation in delivery rate due to, in most cases, a lack of skill, or an abundance of skill, that causes you to deviate from the norm in your work rate.

CAROL: Right. And those are things that I think we've all encountered. They may not all happen on a given project, but probably at least one of them is going to happen on every project.

TOM: No, they all happen to some extent, unless you've got a tiny little project. But if you've got 100 people on your staff, you'll have some staff turnover.

CAROL: Right.

TOM: Will six people leave during the year, or sixteen, or thirty? Obviously, the impact depends on that. These particular risks all happen to some extent, and the extent is what determines how drastically they affect the project result.

CAROL: And we have a caller. Caller, are you on the air?

CALLER: Hi, there.

CAROL: Hi and welcome to the show.

CALLER: Well, thank you. I was...

TOM: Who is this, please?

CALLER: This is Danny Faught.

CAROL: Hi, Danny.

CALLER: I'm calling in from Ft. Worth, Texas.

TOM: Okay.

CALLER: I've had the privilege of seeing Bob Glass come down here for a talk recently. He talked about how in the '80s the big thing was for productivity; in the '90s the big thing was quality...

CAROL: And Danny, I'm gonna have to have you hold that thought. Remember what you're saying, and we'll be back after these short messages, and you can ask Tom DeMarco your question. Welcome back to our final segment. We've been talking to Tom DeMarco, who is a principal with the Atlantic Systems Guild. We've got a caller who had just begun asking his question. Danny, would you like to rephrase your question?

CALLER: Okay, sure. I have listened to Bob Glass talk about fads, the themes of the decades: the '80s with productivity, and the '90s, in his opinion, with quality. So I got up later and asked, "Well, what about the naughties? What about this decade; do you see what theme is coming?" He said no, he didn't see the theme. Actually, I see sort of an emerging theme of going back to fundamentals; I just wrote an article about that. But he asked me whether I thought there might be a theme this decade, and I said, "Well, what about risk management?" He said, "Well, maybe, if people would actually do it." So I wanted to get your opinion.

TOM: I'm not sure the industry can ever come to grips with uncertainty, so I don't think it is going to be the theme of the period. I think the theme of the period will continue to be more self-delusionary, and it will have to do with widened access and with systems built for the Evernet; I think that is going to be on everybody's mind during the period. I think that risk management will be something that people continue to pay lip service to but don't take very seriously, except, of course, in great companies where very important software is built; and typically small companies also do well at risk management. So, with the exception of those sectors, I wouldn't expect it to be a dominant theme.

CALLER: I guess as long as companies succeed without doing it, there is not going to be a strong case for doing it.

TOM: No, I think companies succeed all the time without doing much risk management, but what do they succeed at? What they typically succeed at is picking, from the whole portfolio of projects they might take on, the relatively easy ones. Those companies succeed in the near term; in the long term, they become weaker and weaker, because they don't move the world, they let the world move them.

CALLER: Interesting. Well, I appreciate the comment.

TOM: Thank you.

CAROL: Thank you for phoning in, Danny. I have a question, Tom. Does risk management pay for itself?

TOM: Risk management costs next to nothing, so you might ask the question, "Does being realistic about things pay for itself?" I think the answer is that it does, but it pays for itself in ways that are unsettling. In particular, people are always saying to me, "Gee, if I ever told a user that there was this much of a window of possibilities, he'd never enter into the project in the first place." So one of the ways risk management pays for itself is that users, the people being asked to fund these projects, sometimes bail early and say, "Well, no. Given that the possibilities are so grim, I think I won't fund this project." That is a success. Risk management is therefore paying for itself, because you avoided doing something whose payback wasn't sufficient for the risks you were running. But it's an unsettling way for it to pay itself back, and many people who have a stake in undertaking projects, in making sure we get plenty of projects to do, would look at that as a failure.

CAROL: And I'd like to go back to something you said earlier about SLIM, the product that Quantitative Software Management, QSM, puts out. One thing that absolutely struck me when I took the SLIM training (our company resells SLIM on behalf of QSM) is that SLIM, as an estimating tool, will allow you to set tolerance and set probability. You can say, "I want an 80% assurance that I am not going to exceed this number of resources," and visually, at a glance, it will take the combination and tell you that you can't possibly succeed on this project. Rather than coming up with a work breakdown structure to, you know, two decimal places of accuracy or precision, it comes up and says, "Here's how much time the whole thing should take, with these confidence levels." My question is this: in most of the companies I have seen, the other estimating tools make it look like you've got 100% accuracy. They give you one number and that's it. Isn't that really just a 50/50 bell curve?

TOM: Yes, I think that's the right way to look at it. In fact, very often it's worse than that: what it really tells you is your most optimistic date. In other words, if none of the risks happen, if none of the risks make their way into the plan, you could finish by such and such a date. That's why we miss the date so often. When companies deal with that, they deal with it with a huge fudge factor: "All risks taken together give us a fudge factor of 38%, so we add 38% to the most optimistic." I made that number up, but that's the kind of thing they might say. They add 38% to the most optimistic number, call that the number, and present it as a fait accompli, but it isn't, as you say.
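[Editor's aside: the point Tom and Carol are making, that a single-number estimate is really just one point on a probability curve, and usually the optimistic end of it, can be illustrated with a small simulation. The task durations and the triangular overrun distribution below are invented for illustration only; they are not SLIM's actual model.]

```python
import random

random.seed(42)

# Hypothetical per-task optimistic estimates, in months. A naive plan
# simply adds these up and presents the sum as "the" schedule.
optimistic = [2.0, 3.0, 1.5, 4.0]
TRIALS = 100_000

def simulate_once():
    # Model "risks sometimes happen" as a right-skewed overrun on each
    # task: triangular(low=optimistic, high=2x, mode=+20%). Purely
    # illustrative; any right-skewed distribution makes the same point.
    return sum(random.triangular(t, 2 * t, 1.2 * t) for t in optimistic)

durations = sorted(simulate_once() for _ in range(TRIALS))

naive_total = sum(optimistic)            # the single "100% accurate" number
p50 = durations[TRIALS // 2]             # even odds of finishing by here
p80 = durations[int(TRIALS * 0.8)]       # an 80%-confidence date, SLIM-style

print(f"naive (optimistic) total: {naive_total:.1f} months")
print(f"50% confidence date:      {p50:.1f} months")
print(f"80% confidence date:      {p80:.1f} months")
```

Run it and the naive total (10.5 months here) lands well below both the 50% and 80% confidence dates, which is exactly Tom's point: the one number most tools report is closer to the best case than to even odds.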

CAROL: And we are running out of time quickly. I could talk to Tom DeMarco for hours and hours. We will take a short break and be back to sum up with Tom DeMarco and risk management, making success possible. Welcome back to Quality Plus E-Talk! I'm Carol Dekkers, and I have four minutes to sum up everything. I have about an hour of things that I want to be able to tell you. I'd like to thank our guest this week, Tom DeMarco, who is a principal with the Atlantic Systems Guild. If you want to get in touch with Tom, you can go to www.atlsysguild.com or you can go onto my Web site at www.qualityplustech.com and you can click on the radio show schedule and link directly up with that same site for Tom DeMarco.

We've been talking about risk management and risk avoidance. One thing that I was really interested in was the common risks in all IT projects. Tom told us that it was staff turnover, mis-sizing a project, inflation of size during the project, failure to specify, and variation in delivery rates. I'd like to thank you very much, Tom, for taking the time out of what I know is an extremely busy life to spend it with us today.

TOM: Well, thank you, Carol.

CAROL: I'd like to mention that Tom has two exciting new books. He is quite a prolific writer, in addition to being an excellent presenter and software expert. He actually has a mainstream novel out on www.amazon.com, and he has been doing a book tour. I am waiting to see him on Rosie O'Donnell, David Letterman, Jay Leno, and at the top of the New York Times Bestseller List.

TOM: But not on Oprah, because Oprah only has depressing books and this is a comedy; it's the story of a house party that takes place on a little island off the coast of Maine in the late 1940s. It's called Dark Harbor House.

CAROL: And you can actually download Chapter One, which is intriguing and has a wonderful review at the end of it. That is available off of the Atlantic Systems Guild Web site. The other book that's coming out, I guess in April...

TOM: It is in April, yes.

CAROL: Okay. It's called Slack: Getting Past Burnout, Busywork, and the Myth of Total Efficiency. Dark Harbor House is a completely nonsoftware, totally fiction book, and Slack gets back into some of the things that are very near and dear to all of our software development hearts. So, I'd like to say thank you to Tom for being on the show. I'd like to tell you a little bit about who is coming up on next week's show. We have Dr. David Zubrow, who is with the Software Engineering Institute; this week he is actually at the Software Engineering Process Group Conference in India. So, he will be fresh back from the Indian conference, hopefully over his jet lag. He will be talking to us about high-maturity organizations, a worldwide perspective; that's Level 3's and up. We've also got Dr. Alan Davis, who is going to be on March 8 to talk about requirements in Internet time. Dr. Davis has been an IEEE Software editor for a number of years and is quite well known in the requirements area. We've also got Jim Highsmith coming up on March 15; on March 22 we have a treat for you, "How to choose a process standard in layperson's terms"; and we are also going to be having the famous, or infamous I guess, Elisabeth Hendrickson interview, which was supposed to be broadcast last week, at the end of the month. Tom, any final five-second words to our audience?

TOM: Just that risk is something you can't run away from, you've got to run toward it. And if you run toward risk and take on risky projects, you have to manage your risks.

CAROL: And risk, as Tom said, "is always packaged with opportunity." So, I'd like to take this opportunity to tell you to have a wonderful week, a wonderful week of embracing risk, embracing change, and do visit our Web site and take a look at our articles. We will E-talk to you next week with David Zubrow. Thanks for listening. This is Carol Dekkers signing off for now.

