The Art of the Unintended Consequence.
Besides the danger of telling the wrong story and all the unintended consequences that follow (wrong decisions, improving the wrong things, not improving things that need to be improved, and so on), misuse leads to other dangerous consequences.
We"ve already discussed the damage to morale caused by showing only the bad metrics. Besides demoralization of the workforce, you also run the risk of creating, continuing, or increasing the following: Fear of metrics being abused Anger over misuse and abuse Uncertainty of what to do or what will be done Doubt of the validity of the metrics Mistrust of those collecting, a.n.a.lyzing, and reporting the metrics Avoidance of activities which could positively or negatively affect a metric Reluctance to partic.i.p.ate in future improvement efforts Each of these are worthy of discussion, but first it is important to realize who we"re talking about-those who are providing the data and those who feel the data is about them. Note that I said data. It doesn"t matter if the metric is an aggregate or tells a bigger story. Those who provide the data won"t care about your plans for the metric if they believe the data may be misused or abused. These reactions to the metrics are warranted. Fear, uncertainty, doubt, anger, mistrust, and avoidance are emotions that can"t be dismissed or debated.
Emotional reactions to the misuse of data cannot be dismissed.
Let"s go into more depth.
Fear of metrics and their abuse. Metrics are dangerous and can cause more harm than good. Employees may fear the misuse and abuse of metrics before you collect the first data point. Even if you do everything right-show how the data will be used (to tell a complete story, which in turn will be used to improve processes) and how it won't be used (to punish or control staff)-fear may still exist. Fear that you will not live up to your promises or will change your mind about how you use the metrics. Fear that others will get access to the data and then misuse or abuse it. This fear is real and warranted. Your mission is to find ways to build enough trust to overcome it. In short, you'll build trust by explaining how you will and won't use the metrics and keeping those promises.
Uncertainty of what to do or what will be done. When those providing the information for your metrics are uncertain about how it will be used (or whether it will be used at all), they may hesitate to provide you with data. Uncertainty leads to many other potential problems, such as eventual doubts about the accuracy of your information.
Doubt of the validity of the metrics. Rather than truly doubting the validity of the metrics, some people decide to call the validity of the data into question so that they won't have to deal with the metrics. If the metrics are invalid, then they can ignore them. And if the metrics are proven to be invalid or inaccurate, the power that metrics should wield is lost.
Anger over misuse and abuse. This, in my opinion, is a better response to misuse than fear or uncertainty. Anger shows a level of concern and involvement, which is a good foundation for improvement. Anger, however, also usually creates a defensive reaction and can bring out the worst in others. Leadership may want to punish those who show anger, or label them "disgruntled," a "non-team player," or "disruptive." Rather than simply condemning those who react with anger, appreciate their passion. Passion, involvement, and self-motivation cannot be taught or instilled.
Mistrust of those collecting, analyzing, and reporting the metrics. Mistrust is a deeper problem than anger, and it requires a deep effort to overcome. Mistrust as a whole can't be fixed easily, but mistrust about the use of the metrics can be addressed. If you can build trust in this effort, then perhaps you can use it as a foundation for improving overall trust. Building trust requires steadfast dedication to the principles: use the metrics only in the ways you offered, and never use them in the ways you said you would not.
Avoidance of activities that could positively or negatively affect a metric. Fear, uncertainty, doubt, anger, and/or mistrust can lead to passive resistance toward your metric efforts. The simplest form of resistance is to avoid anything to do with the metrics-avoiding involvement in the identification, collection, analysis, or reporting of the metric. This avoidance may not even be noticed until you need involvement. Again, you will have to deal with this proactively. If you want a successful program, it is just as critical to deal with passive resistance as with the more overt behaviors.
Reluctance to participate in future improvement efforts. This consequence is often overlooked when dealing with metrics or any organizational improvement effort. When you misuse the power afforded to you, the simplest retort available is to resist future improvement efforts. Even when the misuse was not intentional, it is hard to support another improvement effort or technique if the previous one was mishandled.
These unintended consequences don"t have to happen-if you"re careful and respect the power of metrics.
Recap.
Metrics are a powerful tool-and like most powerful tools, they can do some serious damage. You have to take precautions, use the proper safety equipment, and finally you have to respect that power.
The rules of thumb are as follows: Metrics can do more harm than good.
Metrics should never replace common sense or personal involvement.
Metrics are not facts; they are indicators.
What you say may not be what others hear.
Damage from misuse of metrics hurts everyone in the organization.
Constant diligence is required to ensure metrics are used properly.
We covered the following: The Power of Metrics. Even innocent misuse of metrics can cause damage, such as low team morale. If word spreads through your organization about the misuse or abuse of metrics, irreparable damage will be done.
Misuse of Metrics: The Good, the Bad, and the Ugly. These include:
Sharing only part of the story
Not sharing the story at all
Sharing only the good metrics
Sharing only the bad metrics
Showing the raw data
Using metrics for a personal agenda
Using metrics to control people
Using metrics to make decisions
Using metrics to win an argument or sway opinion
The Art of the Unintended Consequence. The major types of consequences you can expect from misusing metrics are as follows:
Fear of metrics and their abuse
Uncertainty of what to do or what will be done
Doubt of the validity of the metrics
Anger over misuse and abuse
Mistrust of those collecting, analyzing, and reporting the metrics
Avoidance of activities that could positively or negatively affect a metric
Reluctance to participate in future improvement efforts
Conclusion.
Respecting the power of metrics essentially means being cautious and a little fearful of the harm that can be caused by metrics. This fear shouldn't paralyze you; it should instead energize you to handle the metrics with care. Put on your safety gear, take precautions, and ensure that others are kept out of harm's way.
Avoiding the Research Trap.
Let"s start with my take on research. By research I don"t mean the focused, directed investigation I suggest you do to determine root needs. Nor do I mean the further investigation we perform once we"ve identified anomalies in our data. I also don"t mean the deeper dive you should perform when finding the data, measures, information, standards, or benchmarks for the specific metrics you"ve designed.
By research, I am referring to the non-directed exploration of information.
This type of research can be broken into many different categories. They include scientific and historical methods; qualitative and quantitative views; exploratory, constructive, and empirical research; primary and secondary research; and many others. All of these types of research have the following commonalities:
High expense
A long time to conduct
Considerable effort to conduct
Unknown applicability
The government and other research supporters find it useful to fund research in many areas, expecting that a certain percentage of the analysis garnered will result in breakthroughs. Many times the resulting innovations or insights are in areas totally unrelated to the original intent; Velcro and microwave ovens are examples. Many of our new technologies resulted from military research that ended up in uses other than combat-related activities. Research is an excellent means of giving us new ways to see old problems, and sometimes research uncovers new problems that would otherwise have gone unnoticed. Research is an essential part of humankind's progress and future.
Research can also be a great tool for businesses seeking information to help them improve. In First, Break All the Rules by Marcus Buckingham and Curt Coffman (Simon & Schuster, 1999) and Good to Great by Jim Collins (HarperCollins, 2001), the authors conducted some serious research on business performance improvement. Collins had a team of researchers sort through the performance histories of nearly 1,500 companies, and Buckingham and Coffman used more than 80,000 interview results from 25 years of research done by the Gallup Organization. The authors used their results to develop concepts and theories for organizational development. Their ideas were born from their research, and many businesses benefited from them.
But this halo effect causes some to believe that they should partake in research, or at least follow the principles used in research. Just because research has been put to very good use doesn't mean a business should try to replicate these efforts, especially not for practical application.
It"s important not to ignore or forget all of the failed attempts at innovation, all of the research that proved not to be useful, and all of the results that led researchers down the wrong paths.
Many leaders need examples of others' success to push them to try a new idea. They cherish books like In Search of Excellence by Tom Peters and Robert Waterman (Warner Books, 1982), where success stories fill the pages.
Most leaders want to see metrics that others are using. They don't want to undertake a venture without proven success from similar organizations or competitors-to some degree ensuring that they are never in the lead or on the cutting edge. This aversion to risk-taking was addressed in more detail in chapter 13 on benchmarking.
Research can be a very good thing. But most organizations really can't afford to conduct non-directed research (that is, any research not directly related to product development). It is not that research is bad-it's simply too costly in terms of time and expense.
Can you afford to perform non-directed research?
The Cost of Research.
Research involves gathering data using interviews, surveys, observations, experimentation, documentation, and instrumentation. Research takes a lot of time for gathering the data: not only the researcher's time, but also the time of those providing the information. Experiments take time, especially when you consider the need for control groups and double-blind techniques.
You can"t afford to create metrics without a purpose. You can"t afford to gather data, create measures, and compile information without a direction for it. You can"t afford to build a structure and hope that later someone will come to fill it or use it.
So you may be thinking, "But I never wanted to do research, and I doubt that my company will ever want to do research!"
Although you may not want to do it, I assure you that you'll end up doing research.
If you are collecting, analyzing, and reporting data without a root question, you're performing non-directed research.
Research in Disguise.
A common task assigned to a metrics analyst by well-meaning management is to come up with some "interesting" data. Basically, the leader is admitting he doesn't know what he wants, but he is willing to let you spend the time and effort to try to come up with something he likes. Like Justice Potter Stewart's assertion regarding pornography, he'd know it when he saw it.
Management wrongly believes that the right metrics, the metrics they want, will be revealed through just a bit of trial and error: if you simply collect enough information, management will be able to separate the wheat from the chaff.
To satisfy a request to go out and find some useful, interesting measures, you can:
Examine existing products
Read books on your industry and the processes involved
Ask a lot of industry experts about their opinions and experiences
Study how processes are carried out
Choose from any one of the many research tools available (online databases, search engines, indexes, and publications)
That's right, you are conducting research. So while I'll agree that none of us go into our daily tasks wanting (or expecting) to conduct research, we all end up doing so anyway.
It is essential to drive to a root need. When you are presented with an ambiguous request for research in disguise, you have to push back. You may need to push back in a manner appropriate for your organization, but you must push back nonetheless. The best means of doing this is to carefully use the five whys, repeatedly asking "why?" until the underlying need surfaces. Rather than seek the elusive answers to questions you haven't identified, spend that energy "researching" the root question.
Most organizations (including all of the ones I've worked for) can't afford to conduct wide-ranging research and then process and utilize enormous amounts of data. It's just not cost effective. Most of us don't have the time or resources to engage in non-directed research.
Even when we try to use the results of research conducted by outside sources, many times it doesn't end well. It is rare that I've found research data that fits the specific root questions my organization was working with.
Are You Already Trapped?
If you find yourself seeking answers to unknown questions, you're probably trapped. You may find yourself blocking out four or more hours a day for a few months so you can gather information. You then find yourself determining what data is available. You start mining data from numerous sources.
One of the benefits, and problems, of technology is the proliferation of data. When you know what you're looking for, it's great to have it all "at your fingertips." When you don't know what you're looking for, the mountains of data can bury you.
When you"re firmly in the trap and you"ve gathered a lot of data, you"ll be confronted with the daunting task of a.n.a.lyzing that data. You"ll have to figure out how to pull them together-which measures have relationships with others and which don"t.
When you share the results of your hard work, your boss may tell you that you've missed the mark (the mark he himself couldn't describe, point out, or identify). You won't know if the data was the wrong data or if your analysis of that data was off. Even if you get past that, you run the real risk of inspiring your boss. He may think of other "interesting" data you could collect. This will require more searching. If you can't locate any secondary data that's already been published, you will probably have to find other ways to get the data. This may include conducting surveys, focus groups, or observations, or creating automated collection tools. This in itself can be extremely expensive, given the cost of buying tools, or of developing them, and of learning to use them. You realize that this path is an expensive one-with ever-diminishing returns.
The best advice I can give is this: if you find yourself in a hole, stop digging. Instead, try to climb out of the hole-preferably by asking for a little help from your manager. I usually start by asking him to stop shoveling dirt on my head. Then I ask for a hand in climbing out. This help should take the form of a willingness to work on the root needs. The problem is that you have to convince your boss that you need his help. You need him to give you some of his time and effort.
Ask your boss for help out of the hole (after you get him to stop shoveling dirt on your head).
Stop Digging.
"We need three key metrics." My boss had called me to his office to give me a strong motivational speech.
"Three?" It amazed me how many organizational development things came in threes. Three goals. Three process improvement ideas. Three metrics.
"Yes. But, if you come up with four or five, that"s all right."
"Uh huh."
"It shouldn"t be too hard. We have a contract with a consulting organization that has a lot of data on IT services." I knew of this arrangement, but I hadn"t seen much useful data, measures, or information in their databases.
"What information do you want exactly?" I tried not to get frustrated.
"I"m open to whatever you come up with. Just use their existing data as a benchmark..."
"And find three, right?"
"Right!" I could tell he was happy with his clear direction.
I did as asked (I wanted to keep my job). I looked at the consulting firm's data, without any specific question in mind, trying to find three metrics we could use as benchmarks. Unfortunately, the data they had didn't relate to our problem areas. We had specific areas we were trying to improve. These were not unique problems, but they weren't issues that the consulting firm had previously researched. They had researched the general areas of interest, but those weren't the areas we were having trouble with. The three I found to use-the availability of servers, abandoned call rates, and customer satisfaction ratios-were defined differently than we would have defined them.
For example, they defined availability on a 24-hours-a-day, 7-days-a-week basis, while we had scheduled downtime during low-usage timeframes (our customers did not expect 24/7 availability).
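The gap between the two definitions is easy to quantify. Here is a minimal sketch in Python; all of the figures are invented for illustration, assuming one week of service with a single unplanned outage and a single planned maintenance window.

```python
# Minimal sketch: the same outage history scored under two definitions
# of availability. All figures are invented for illustration.

WEEK_MINUTES = 7 * 24 * 60      # 10,080 minutes in a week

unplanned_outage = 120          # minutes of unexpected downtime
scheduled_downtime = 480        # minutes of planned, low-usage maintenance

# The consultant's 24/7 definition: every minute of downtime counts.
consultant = (WEEK_MINUTES - unplanned_outage - scheduled_downtime) / WEEK_MINUTES

# Our definition: scheduled downtime is excluded from the service window.
service_minutes = WEEK_MINUTES - scheduled_downtime
ours = (service_minutes - unplanned_outage) / service_minutes

print(f"24/7 basis:            {consultant:.2%}")  # 94.05%
print(f"Scheduled-hours basis: {ours:.2%}")        # 98.75%
```

The identical week of operation scores almost five points lower under their definition, which is exactly the kind of mismatch that makes the benchmark misleading.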
Their abandoned-call figures did not take into account the time before the caller hung up. We always had a message you had to listen to before a technician would pick up. This message was updated daily (at a minimum) and informed the caller of any current issues. For example, if we had a problem with e-mail, the message might say, "We are experiencing e-mail connectivity issues; we hope to have them resolved by 1:00 p.m." If the purpose of the call was to let us know about this issue, the caller could hang up, confident that we already knew about the issue and were working on it. This call, in our opinion, shouldn't be counted as abandoned. The research didn't differentiate the amount of time spent on the line before the caller disconnected, so it didn't actually match our information.
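Again as a hedged sketch with invented call records: the consultant's definition counts every unanswered call as abandoned, while ours would exclude callers who hang up during the informational message. The 30-second grace period stands in for the message length and is my assumption, not a figure from the original study.

```python
# Minimal sketch: abandoned-call rate with and without a grace period
# for callers who hang up during the informational message.
# The call records and threshold are invented for illustration.

calls = [
    {"seconds_on_line": 12, "answered": False},  # hung up during the message
    {"seconds_on_line": 95, "answered": False},  # gave up waiting in the queue
    {"seconds_on_line": 40, "answered": True},
    {"seconds_on_line": 8,  "answered": False},  # hung up during the message
    {"seconds_on_line": 60, "answered": True},
]

MESSAGE_GRACE_SECONDS = 30  # assumed length of the recorded message

total = len(calls)

# Consultant's definition: any unanswered call counts as abandoned.
raw = sum(1 for c in calls if not c["answered"])

# Our definition: quick hang-ups during the message are excluded, since
# those callers likely got their answer from the recording.
adjusted = sum(1 for c in calls
               if not c["answered"]
               and c["seconds_on_line"] > MESSAGE_GRACE_SECONDS)

print(f"Raw abandoned rate:      {raw / total:.0%}")       # 60%
print(f"Adjusted abandoned rate: {adjusted / total:.0%}")  # 20%
```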
It was easy to see the disconnect in the customer satisfaction measures. The consultant's research used a four-point scale; we had used a five-point scale for the last three years. Their questions were also not an exact match. Finally, they gave a value of "1" for "very satisfied" and a "4" for "very dissatisfied." We used the scale in the other direction.
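You can mechanically map one scale onto the other; the sketch below flips the direction and linearly stretches their 1-4 range onto our 1-5 range. The linear mapping is my assumption, and notice what it cannot fix: the underlying survey questions were still different.

```python
# Minimal sketch: converting the consultant's reversed 4-point scale
# (1 = very satisfied) to our 5-point scale (5 = very satisfied).
# The linear mapping is an assumption; it aligns the numbers in form
# but cannot reconcile differently worded survey questions.

def convert_score(score_4pt: float) -> float:
    """Map 1..4 (1 = best) linearly onto 1..5 (5 = best)."""
    flipped = 5 - score_4pt            # reverse direction: 1 -> 4, 4 -> 1
    return (flipped - 1) / 3 * 4 + 1   # stretch 1..4 onto 1..5

for s in (1, 2, 3, 4):
    print(f"4-point {s} -> 5-point {convert_score(s):.2f}")
# 1 -> 5.00, 2 -> 3.67, 3 -> 2.33, 4 -> 1.00
```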
In a forced effort to have comparison data, my boss made me use the data anyway, comparing it although it wasn't an exact match. He would say, "It may not be Macintosh to Macintosh, but it's still apples to apples."
Finding a cache of data can mislead you into force-fitting your questions to align with the available answers.
To stop the madness, you'll need to admit to your boss that you can't do what he's asked. You can't perform the research because you don't have the time, skills, or energy to chase the unknown. Instead, you need his help. You need his help to direct your efforts and allow you to be more efficient and effective. You need his help to ensure you're productive.
What"s Wrong with Research?