Surveys are probably the most common data-gathering tool. They are used in research (Gallup Polls), predictive analysis (exit polls during elections), feedback gathering (customer satisfaction surveys), marketing analysis (like the surveyors walking in shopping malls, asking for a few minutes of your time), and demographic data gathering (the US census). Surveys are used whenever you want to gather a lot of data from a lot of people-people being a key component. Surveys, by nature, involve people.
The best use of surveys is when you are seeking the opinions of the respondents. Any time you collect data by "asking" someone for information, the answer will lack objectivity. In contrast to automated collection tools, which offer a high (if not total) degree of objectivity, surveys are by nature highly subjective. So, the best use of the survey is when you purposefully want subjectivity.
Customer satisfaction surveys are a good example of this. Another is marketing analysis. If you want to know whether someone likes one type of drink over another, a great way to find out is to ask. Surveys, in one way or another, collect your opinion. I lump all such data gathering under surveys-even if you don't use a "survey tool" to gather it. So, focus groups and interviews fit under surveys. We'll cover the theories behind the types of surveys and survey methods later.
Use People.
So far I"ve recommended avoiding human provision of data when accuracy is essential. I"ve also said that when you want an opinion, you want (have) to use humans. But, how about when you decide to use people for gathering data other than opinions? What happens when you use people because you can"t afford an automated solution or an automated solution doesn"t exist?
I try to stay fit and get to the gym on a regular basis. I've noticed that a gym staff member often walks around the facility with a log sheet on a clipboard. He'll visually count the number of people on the basketball courts. He'll then take a count of those using the aerobic machines. Next, the free weights, the weight machines, and finally the elevated track. He'll also check the locker room, and a female coworker will check the women's locker room.
How much human error gets injected into this process? Besides simply miscounting, it is easy to imagine how the counter can miss or double-count people. During his transition between rooms, areas, and floors of the facility, the staff member is likely to miss patrons and/or count someone more than once (for example, Gym-User A is counted while on the basketball court, and by the time the staffer gets to the locker room, Gym-User A is in the locker room, where he is counted again). Yet, it's not economically feasible to utilize automated equipment to count the facility's usage by area.
We readily accept the inherent inaccuracy in the human-gathered form of data collection. Thus we must ask the following: How critical is it to have a high degree of accuracy in our data?
Is high accuracy worth the high cost?
How important is it to have the data at all? If it's acceptable to simply have some insight into usage of the areas, a rough estimate may be more than enough.
Many times we collect data using humans because we need human interaction to deal with the situation that generates the data. A good example is the IT help desk. Since you choose to have a human answer the trouble call (vs. an automated system), much of the data collected (and later used to analyze trends and predict problem areas) is captured by the person answering the phone. Even an "automated" survey tool (e-mails generated and sent to callers) is dependent on the technician correctly capturing each phone caller's information.
Another Example.
I want to provide you another example of how to develop a metric. This one is from a work experience.
I once worked with a web and teleconferencing technician. His boss wanted to know the answer to the following question: "Is the service worthwhile to maintain?" The technician's service cost the equivalent of a fully salaried employee, plus expensive equipment, a dedicated room, and monthly fees. The boss wanted to know if the costs were worth the benefits.
By now you"re probably demanding that I define "the service," "worthwhile," and "maintain." You should be! The service could be all forms of con-ferencing, or it could be only web conferencing that requires the technician"s time and has a recurring fee. If the teleconference method is a sunk cost and doesn"t require intervention by the technician, this may not be a factor.
When we ask if something is "worthwhile"-what exactly do we mean? Is it simply a question of monetary return on investment? Does goodwill count? Do employee productivity, effectiveness, and efficiency matter? And finally, what do we mean by maintain? To have it at all? To pay the salary of the technician? To have our own facility vs. using a contracted or hosted solution? Does "maintain" include hardware maintenance? Does it include upgrades to software? How about repairs and replacements?
Once we have clear definitions for the terms that make up the root question, we will have a much better picture! Remember the importance of a common language. It is equally important that everyone fully understands the language used to create the root question.
Figure 2-2 shows the picture the technician and I drew of his service metric.
Figure 2-2. Web/teleconferencing value
The picture we drew depicts the value gained (costs avoided) by using the web-conferencing system. We anticipated showing information on money savings (travel and hotel costs), time savings (the time to travel), environmental savings (fuel consumption and CO2 emissions), and the happiness of the clients who were able to more easily meet "face-to-face" with others. These factors would be compared to the actual costs incurred.
This is not a perfect metric. There is no such thing in my experience. You can't prove that the costs would have been incurred without the system. It's like assuming that if we didn't have telephones, we'd write or visit family more often. While we can't categorically say this would happen, for the purposes of determining the avoided costs, or the value of the service, we have to make these assumptions.
Information.
We asked a lot of clarifying questions-seeking definitions for all of the parts of the root question. The definitions led us to the following information decision (we only wanted to answer one aspect of the root question): How much do we save? How much money, jet fuel, CO2 emissions, and time do we potentially save by maintaining the conferencing center? This clarification made the next phase purposeful. Rather than chase all manner of data, we could focus our efforts only on the measures and data we needed.
Measures.
We designed measures that would reveal the following:
The amount of time saved for each conference.
The amount of money saved for each conference:
Travel funds saved (plane fare and taxis).
Hotel funds saved (when the distance dictates an overnight stay).
The amount of CO2 emissions saved for each conference.
Data.
To build the measures, we needed data like the following:
Locations participating in the web/teleconference.
Number of participants at each location.
Distance from each location to the "host" location. For purposes of the metric, we had to determine a "host" location that participants would travel to. If our location were the host, we wouldn't gain the savings-but our colleagues could claim them.
If the meeting is held at location X (because of protocol, for example), the distance from each participating starting point.
If protocol doesn't dictate a specific meeting location, which location has the most participants.
Plane fare amounts to and from the host location.
The CO2 emissions from airplanes for these flights.
If the location is not at an airport: the distance between the location and the airport, and the cost of ground travel to and from the airport.
The average nightly cost of a hotel room at the host site, both at international and at domestic locations.
The cost of the web/teleconference system, whether it has recurring fees or annual fees.
The salary of the technician.
The amount of time the technician spends on each conference.
The total number of web/teleconferences.
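To make the rollup concrete, here is a minimal sketch (in Python) of how data items like these could feed the measures and, finally, the quarterly information. It is not the tool we actually built; every name, rate, and figure in it (the Location fields, AVG_HOURLY_RATE, the sample Chicago numbers, and so on) is a hypothetical placeholder that a real implementation would replace with your organization's data.

```python
# A minimal sketch, assuming hypothetical rates and field names.
from dataclasses import dataclass

@dataclass
class Location:
    name: str
    participants: int
    round_trip_fare: float      # plane fare to/from the host location
    flight_co2_kg: float        # estimated airplane CO2 for the round trip
    travel_hours: float         # one-way travel time in hours
    nights_required: int        # overnight stays the distance would dictate
    nightly_hotel_rate: float
    ground_travel_cost: float   # taxi/shuttle to and from the airport, if any

@dataclass
class Conference:
    host: str                   # where participants would otherwise travel to
    remote_locations: list      # everyone who avoided the trip
    technician_hours: float

# Assumed unit costs; in practice these would come from HR and finance.
AVG_HOURLY_RATE = 40.0          # value of a participant's hour
TECHNICIAN_HOURLY_COST = 35.0   # loaded hourly cost of the technician
MONTHLY_SYSTEM_FEE = 500.0      # recurring fee for the conferencing system

def conference_savings(conf):
    """Measures for one conference: time, money, and CO2 avoided."""
    locs = conf.remote_locations
    hours_saved = sum(2 * l.travel_hours * l.participants for l in locs)
    travel_funds = sum((l.round_trip_fare + l.ground_travel_cost) * l.participants
                       for l in locs)
    hotel_funds = sum(l.nights_required * l.nightly_hotel_rate * l.participants
                      for l in locs)
    co2_kg = sum(l.flight_co2_kg * l.participants for l in locs)
    return {"hours_saved": hours_saved,
            "money_saved": travel_funds + hotel_funds + hours_saved * AVG_HOURLY_RATE,
            "co2_kg_avoided": co2_kg,
            "direct_cost": conf.technician_hours * TECHNICIAN_HOURLY_COST}

def quarterly_report(conferences):
    """Information: total savings vs. total costs for the quarter."""
    totals = {"hours_saved": 0.0, "money_saved": 0.0,
              "co2_kg_avoided": 0.0, "direct_cost": 0.0}
    for conf in conferences:
        for key, value in conference_savings(conf).items():
            totals[key] += value
    totals["direct_cost"] += 3 * MONTHLY_SYSTEM_FEE   # fees for the quarter
    totals["net_value"] = totals["money_saved"] - totals["direct_cost"]
    return totals

if __name__ == "__main__":
    chicago = Location("Chicago", participants=3, round_trip_fare=450.0,
                       flight_co2_kg=180.0, travel_hours=4.0, nights_required=1,
                       nightly_hotel_rate=160.0, ground_travel_cost=60.0)
    demo = Conference(host="Headquarters", remote_locations=[chicago],
                      technician_hours=2.0)
    print(quarterly_report([demo]))
```

The structure is the point of the sketch: raw data per location feeds per-conference measures (time, money, and CO2 saved), and those measures aggregate into the single piece of information the root question asked for-how much do we save, and at what cost?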
This metric is a good example of how to build from a root question to an abstract picture and finally to the data, measures, and information needed to tell the story. The best part of this example, though, is that it was created to satisfy the request of the service provider. Our web-conferencing technician requested the metric to answer the question his boss had been asking him. I warned him that if the answer came back "No, get rid of the service," I wasn't going to hide the results. He agreed. Partly because he is a loyal employee-and if the data accurately showed that the service was not worth the cost, he'd be the first to advocate dropping it. And partly because he "knew" that the service was a worthwhile one. He knew its worth because he worked with it every day.
The service provider usually already knows the answers that you'll build metrics to validate.
This metric is published quarterly to show the benefits of web/teleconferencing for the organization.
Recap.
In this chapter, we covered the following:
Getting to the root question: It is imperative to get to the root question before you start even "thinking about" data. The root question will help you avoid waste. To get to the real root, I discussed using Five Whys, facilitating group interventions, and being willing to accept that the answer may not include metrics. Make sure you define every facet of the question so you are perfectly clear about what you want.
Testing the root question: I provided some suggestions on how you can test if the question you've settled upon is a true root question. Even with the tests, it's important to realize that you may not have reached it when you draw your picture. You may have to do a little rework.
Developing a metric: This is more about what you shouldn't do than what you should. You shouldn't think about data. You shouldn't design charts and graphs. You shouldn't jump to what measures you want. Stay abstract.
Being an artist: The best way I've found to stay abstract is to be creative. The best way to be creative is to avoid the details and focus on the big picture. One helps feed the other. Draw a picture-it doesn't have to be a work of art.
Identifying the information, measures, and data needed: Once you have a clear picture (literally and figuratively), it's time to think about information, measures, and data. Think of it like a paint-by-numbers picture. What information is required to fill the picture in? What color paints will you need? And make sure you don't leave out any essential components.
Collecting measures and data: Now that you know what you need, how do you collect it?
How to collect data: I presented four major methods for collecting data: Using automated sources, employing software and hardware, conducting surveys, and using people.
Conclusion.
This chapter covered how to create a root question and, based on that, how to design a metric. I also covered how to identify and collect the information, measures, and data needed to turn the metric picture into a usable metric.
Bonus Material.
A different way of thinking about metrics. A different way of approaching metrics. These are part of what I'm trying to share with you.
The school my six-year-old attends uses fundraisers to supplement its funding. Recently, the school was selling coupon books for local fast-food restaurants at the low cost of five dollars. Not a bad deal-I easily got back my investment by using just one coupon.
I like this fundraiser. In my opinion, it gives a more-than-fair value for the price and the school gets a generous share of the funds. The principal liked it, too, but wanted confirmation of how well the program worked-ostensibly so he could decide whether to keep doing this fundraiser or investigate a replacement.
After I dropped my daughter off at class one day, the principal approached me. "Marty, we collected some data..." He knew I was a metrics expert. The fact that he was already "collecting data," and that he led with that, gave me pause.
"Really?" I tried to stay non-committal. I love talking metrics, and like most people I also like talking about my profession. But, I can"t remember anyone who has ever been happy with the answers I give to their questions. Most people don"t have a root question. Most don"t want to be creative. Most simply want data to prove them right and that"s "simpler than possible."
"Yes. We collected data on how successful the coupon book program was this year."
"Uh huh." I bit my tongue.
"I wanted to know if it was successful, so we counted how many books we sold."
Without defining "success" and without looking for the root question, he did a couple of things he could have avoided. The good news was that I knew it didn't take much effort to collect the data-he just had to count how many books were left at the end of the year compared to how many the school had purchased.
He even built a nice chart showing the percentage of the total sold by week and overall.
"Nice graph," I said.
"As you can see, we sold all of our coupon books a week before the planned deadline. So, I"m doubling our order for next year."
"Really?"
"Sure, it was a tremendous success!"
"O-kay..." I couldn"t hold back a small frown.
In the short pause that followed, I actually imagined that I was going to escape with no further discussion.
"I know you well enough to know that you think I did something wrong. Come on, tell me."
If this were a seminar instead of a book, I'd give you a test at this point. I'd ask you, "What did the principal miss? What should he have done? How might his conclusions be wrong? How might those errors be a problem?"
If this were a seminar, I'd give you five minutes to capture the answers. Since this is a book, if you want to try and figure it out, close the book now and take a shot.
I took the time to explain, as kindly as I could, where he went wrong.
"John, it's not that you did something wrong, but you might have missed something. I'd at the least ask you not to make a decision before you investigate further. The first thing I would do before going further is to define what you mean by 'success.' Is it simply selling a lot of coupon books?"
"Sure. Isn"t it?"
"I truly don"t know. Why are you selling the books? What"s your purpose?" He nearly rolled his eyes. He felt I was making him state the obvious.
"To raise funds."
"Ok. And you succeeded at that, right?"
"Yes."
"And is that all you wanted to know?"
"Sure."
"Really? Then why are you deciding to double your order next year? It seems that you actually wanted to know how many books you should order next year..."
"Well, yeah. If it was a success, I figured we"d increase our order."