KPIs are for the Lazy

Matthew Heusser

It was in graduate school, in CS 641 (Management of Software Development), about fifteen years ago. Our professor, Dr. Roger Ferguson, was explaining the great benefit of the waterfall model for software:

The great benefit of the waterfall model is that it is easy for management.

At the beginning of the project, management sets the deadlines. Once the deadlines are set, management can breathe a sigh of relief; the hard work is over. After that, all you need to do is ask each week if things are on track. If they are, breathe a sigh of relief and go back to your golf game. If not, kick over some chairs, perhaps assemble a war council, make someone responsible for fixing this mess, then go back to the golf game, secure in the knowledge that you have done your job.

That's really about it: The waterfall is easy on management.

Those of us who take software seriously know the truth: The beginning of the project is when you know the least about its complexities and effort. Plans and requirements change over time, and the best way to run a complex project is to actually manage, not play games.

This reality of the waterfall model is no huge surprise. What might be more surprising is the realization that key performance indicators, or KPIs, have some of the same problems.

Management by Spreadsheet

When most of my clients talk about KPIs, they want a small set of numbers, perhaps three to five, forming some sort of balanced scorecard. Automate it, put it on the company dashboard, and every employee can see how the team is doing in real time. You can do this at any level: from test cases per week to team velocity to sales volume.

Sounds wonderful, doesn't it?

But what does this dashboarding allow management to do? Why do we want it?

Sadly, all too often, the metrics allow management either to breathe a sigh of relief and get back to that golf game or, if the metrics are yellow or red, to learn that something is wrong. Management can then kick over some chairs, perhaps assemble a war council, ask for updates every hour until the thing is fixed ... and then go back to the golf game, secure in the knowledge that they have 'handled' the situation.

This sounds more than a bit familiar to me.

The dashboard ideal is not my ideal, and not just because numerical targets distort performance.

The Heart of Lean Management Is Actually Managing

Before there was Lean (an English word, coined after American methodologists went to Japan, studied what the Japanese were doing, and turned it into rules and systems), there was Taiichi Ohno, the chief engineer at Toyota who created the Toyota Production System.

Ohno did a lot of things, and we don't have time to tell his story properly here, but one of the things he suggested was for plant managers to get out of the office. In one of his more famous moves, Ohno would draw a circle in chalk in the middle of the factory floor - perhaps three feet in diameter - and insist the new manager spend an entire day in that circle.

Study the system and you'll see the waste, goes the logic.

Gee, that worker has to carry parts from point A to point B all day long. What if we just moved those stations together?

That's called transportation waste, and it is one of the seven wastes, or "muda," that Ohno identified. You can google "seven wastes Toyota" if you'd like, but that's not the point today: The point is that improving a system starts with studying it, and KPIs all too often are used as tools to get us away from the system.

The Japanese word here is gemba, which means "the real place." To improve the factory, go to the factory. To improve the software organization, study what is actually happening - don't manage by spreadsheet!

Am I saying that numbers are themselves evil and you should avoid them? Certainly not! The course I teach on software testing, called Lean Software Testing, or LST, is full of measures, many of them numeric.

I am saying that the belief that some small set of numbers tells the entire story, enabling management "by spreadsheet" (or, in more socially acceptable terms, "data-backed decisions" or "management by the numbers"), is a naive approach to managing organizations.

Like the waterfall model, the idea of management "by the numbers" makes the work sound easy, but it is flawed. Actually managing is more work, but it will lead us to more pride in work, better teamwork, more shared understanding, and better results.

I know the way I would like to lead my tiny little company, and a bit about how things are going at SQE.

How are you doing?

And, perhaps more importantly - what are you going to do about that?

User Comments

Matthew Eakin

It is hard to argue with the evil of "management by spreadsheet." To me it is about caring. If a manager cares so little about a project or a team that the only interaction they have is via KPIs, the project (and ultimately the team and the manager) will fail, because the lack of caring will trickle down, then up - regardless of how "good" or "bad" the KPIs are. Conversely, if the manager is on the "factory floor" talking with developers and testers about test coverage, the full automation stack, team velocity, etc., the result will be a team that cares and numerous successful projects - again, regardless of how "good" or "bad" the KPIs are. My point: KPIs in reality have zero effect on product delivery.

If management still insists on KPIs, I try to treat them like an onion. I'll give them Red, Yellow, Green in the summary, but also give them the ability to drill down to the metrics driving each color under the hood, so they know why there is a Yellow. I also try to make these very public (charts on walls) rather than an email no one looks at.

July 1, 2014 - 9:54am
Srinivas Kadiyala

What are Key Performance Indicators for Testers?

-
Srinivas

July 1, 2014 - 10:59am
John Arrambide

I think KPIs are a critical component for bringing visibility and transparency to any project. I think everyone would agree that KPIs alone are meaningless without the context (story) explaining them. KPIs are a great tool for predicting which projects will get into trouble and great for historical reference during retrospectives. It's hard to have discussions about project cause-and-effect events without any data (evidence) backing them up; without data it is just unsubstantiated opinions, which are best guesses and don't help improve or move things forward.

August 13, 2014 - 6:05am