TMCnet - World's Largest Communications and Technology Community
October 1998


Why Cookie-Cutter Performance-Measurement Metrics Can No Longer Stand Alone

BY DEBRA N. DISBROW, CALLCENTER TECHNOLOGY, INC.

What is performance in a call center and how should it be measured? While performance is one of the most commonly implemented metrics, the ways in which it is measured rarely reflect the true status of the call center. Performance can be many things to many people, but there's one aspect most experts will agree upon: it's not measured as effectively as it could be. This is due, in part, to the increase in the number of metrics being generated in call centers. Most metrics, like quality monitoring evaluation scores, are used as stand-alone indicators without regard to the synergies possible from combining multiple metrics. The practice of utilizing single, generic metrics to define performance of agents, groups or sites has become inadequate. Before offering a solution, let's take a look at some typical performance measurements.

Productivity is one of the oldest and most generic measures of performance. Anyone in a call center can tell you how to measure agent productivity. Average handle time, number of calls handled and first call resolutions are all typical ways to assess agent productivity. The data used for this measurement usually comes from an ACD.

At the other end of the spectrum, customer satisfaction is one of the newest measures of performance. One might assume that customers are satisfied if wait and handle times are short and service levels are high. Monitoring a related metric, such as sales volume, might reveal a completely different picture. Call centers have realized that direct and frequent measurements of customer satisfaction need to become the norm to accurately assess the situation.

Customer satisfaction tracking by call centers is conducted in a variety of ways. Some call centers send postcards to recent customers. Others conduct random phone calls and request that customers complete online surveys. Some call centers select customers for follow-up based on initial call type, agent or skill set. Still others use word-spotting techniques from recorded conversations as the basis for generating customer callbacks. Recognizing customer satisfaction as a valid performance metric means another measure of performance is generated. The metrics used for customer satisfaction are much more subjective than those used for productivity.

The experiences customers have with call centers are often measured using cradle-to-grave call-tracking software. These products manage customer interactions, provide agent scripting and define agent workflow. Wait times, the number of call transfers, problem-resolution speeds, call-escalation frequency, callback times and a host of other metrics are available from these applications to assess performance. Customer experience variables are a new set of metrics that can be added to our growing list of performance variables.

The measurements discussed up to this point quantify the customer experience before and after the call. Conducting performance measurements during the call is the role of quality monitoring systems. Quality monitoring can be performed either manually or by means of an automated recording system; the end result of either type of system is yet another metric. Automated recording systems have recently become a widespread tool in even the smallest call centers. They are used to silently monitor and/or record agent conversations so that management can generate overall performance scores. Quality monitoring scores are yet another variable by which agent performance can be measured.

Despite the obvious value of these individual metrics, variables are rarely combined to reflect all the elements of agent performance.

Your customer service representatives are not one-dimensional.
Why are the metrics you use to measure them?

Imagine the following situations:

  • A telephone service representative (TSR) handles more calls than anyone, but he or she forgets to verify customer order information. The calls-handled data from the ACD report would indicate the TSR is performing well. The quality monitor evaluation scores would indicate poor performance in verifying customer order information. The same individual reviewing the calls-handled data would need to review the quality monitoring scores to identify the performance problem.
  • A TSR has consistently high quality monitoring scores and consistently low customer satisfaction scores. This situation may indicate that the criteria used for quality monitoring evaluations are questionable and need to be reconsidered. Quality monitoring scores and customer satisfaction survey results would need to be evaluated at the same time to reach the conclusion that the standards used for monitoring may be suspect.
  • Industry-standard service level takes into account only offered calls that are answered within a predetermined number of seconds. If the abandoned-call rate is high, that figure will not reflect the service customers actually receive. Redefining service level to include the percentage of abandoned calls along with the percentage of answered calls would create a more meaningful metric.
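As a sketch of the third example, a blended service-level metric might subtract the abandonment rate from the rate of calls answered within threshold. The function name, the simple subtraction, and the sample numbers below are illustrative assumptions, not an industry standard:

```python
def blended_service_level(answered_within_threshold, abandoned, offered):
    """Combine answered-within-threshold and abandonment into one rate.

    Classic service level ignores abandoned calls; this variant counts
    them against the center. All arguments are call counts for one
    reporting interval.
    """
    if offered == 0:
        return 0.0
    pct_answered_fast = answered_within_threshold / offered
    pct_abandoned = abandoned / offered
    # Reward fast answers, penalize abandons; equal weighting is an
    # assumption a center would tune to its own priorities.
    return max(0.0, pct_answered_fast - pct_abandoned)

# 1,000 calls offered: 800 answered within threshold, 100 abandoned
print(blended_service_level(800, 100, 1000))  # ~0.70 rather than 0.80
```

A center that answered 80 percent of calls quickly but lost 10 percent to abandons would report 70 percent under this definition, instead of the flattering 80 percent the classic metric shows.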

While these may be simplistic examples of the value of combining measurements, the need to simultaneously look at all elements of performance is obvious. It is unlikely that you will obtain an accurate and comprehensive picture of the TSR, department, or call center without the ability to analyze combined sets of data.

Why Cookie-Cutter Statistics Can No Longer Work
Most call centers still rely on individual statistics. Creating new metrics that cross multiple data sources involves resources and time - time that few have to spend and resources that often seem scarce. It is not uncommon to see personnel sitting at a keyboard, typing data into a spreadsheet in an attempt to combine data from multiple sources. While data mining and warehousing are beginning to provide some solutions, in-house expertise is limited and the cost can be significant. If you haven't managed to overcome all these reasons for keeping the status quo on metrics, inertia and other human frailties will surely guarantee that one-dimensional measurements limp along well into the next century.

Tremendous growth in call center technology has produced data overload in call centers. New customer interaction touch points, such as e-mail, the Internet, fax response and video kiosks, will funnel increasing amounts of data into call centers that are already ill-equipped to manage what they have. These new touch points will generate mountains of additional data, and buried within it will be unique performance-related metrics. Even the most diligent analyst would rapidly become overwhelmed.

An environment filled with uninformed decision makers is inevitable if a continued reliance on the old standby metrics of performance persists. Failure to view the call center holistically will negate the benefits of implementing new technologies. Ironically, the reason for analyzing metrics in the first place, a reality check on call center operations, will remain unsatisfied amidst the clutter of data.

Turning data into information can be simplified.
Revisit performance criteria that make sense for your operation.

If you are brave enough to redefine performance, consider doing the following:

  • Be the visionary. The need to view the call center holistically and to integrate your data will only grow. Recognize that Web and e-mail interactions have become permanent fixtures for which performance data is collected and that having immediate access to all your data is critical. The quality as well as the quantity of customer interactions must become the yardsticks for performance in the future.
  • Redefine all the measures of performance. Identify the source of all meaningful data. How quickly agents respond to customer requests from all sources, how much time at work is spent responding and how satisfied customers are represent just a few of the many absolute performance variables (APVs) you may want to include. APVs are those variables that are independent of call center type and are considered important in most call centers. Typical cookie-cutter APVs include the number of calls handled, average handle time, quality monitoring scores and schedule adherence.

    There are also relative performance variables (RPVs) that call centers use when defining performance objectives. In travel reservations, for instance, the number of tickets sold would be considered an RPV. In teleservices, setting up sales appointments or completing research interviews are typical examples of RPVs. Every call center has its own unique RPVs that can be used as variables in assessing performance.
  • Develop the equations that reflect your call center's priorities. The result will look something like this:

[Figure 1]
PL = (APV1 + APV2 + ... + APVn) + (RPV1 + RPV2 + ... + RPVm)

PL = Performance level.
APV = Absolute performance variable.
RPV = Relative performance variable.

Each variable should carry a weight factor to determine its overall importance in the formula. For instance, if APVs carry three times the weight of RPVs, the above formula could be rewritten as follows:

[Figure 2]
PL = 3 × (APV1 + APV2 + ... + APVn) + 1 × (RPV1 + RPV2 + ... + RPVm)

By assigning weight factors, the less significant variables (in this example, the RPVs) make a much smaller contribution to the overall performance level.
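A minimal sketch of such a weighted formula follows. The variable names, the normalization of every variable to a 0-to-1 scale, and the division by total weight (so the PL index itself lands between 0 and 1) are all illustrative assumptions layered on the article's formula:

```python
def performance_level(apvs, rpvs, apv_weight=3.0, rpv_weight=1.0):
    """Weighted performance level from normalized variables.

    apvs, rpvs: dicts mapping variable name -> score normalized to 0..1
    (e.g. quality score / 100, or calls handled / group average).
    Default weights follow the article's 3:1 APV-to-RPV example.
    """
    weighted = (apv_weight * sum(apvs.values())
                + rpv_weight * sum(rpvs.values()))
    total_weight = apv_weight * len(apvs) + rpv_weight * len(rpvs)
    # Normalize so the index stays on a 0..1 scale regardless of
    # how many variables a center chooses to include.
    return weighted / total_weight if total_weight else 0.0

# Hypothetical reservation-center agent: three APVs, one RPV
agent_pl = performance_level(
    apvs={"calls_handled": 0.90, "quality_score": 0.85, "adherence": 0.95},
    rpvs={"tickets_sold": 0.60},
)
print(agent_pl)  # the RPV drags the index down only slightly
```

Because the APVs outweigh the RPV three to one, the agent's weak ticket sales pull the index down far less than an equally weak quality score would.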

Let's take a closer look at the way one might redefine performance level for an agent in a reservation center using the formula above: 

[Figure 3: sample performance-level calculation for an agent in a reservation center]

A single-number "performance level" (PL) metric like the one above can be generated and used in the same manner as "service level." Service level is a commonly accepted measure of performance that refers to the speed at which calls are answered. Each call center develops its own service level objectives and staffs in support of that service level. Performance levels could be generated that reflect combined variables indicating the overall performance of your call center.

The enormity of the task of developing performance-level metrics should be obvious. It is extremely difficult, time-consuming and costly to rapidly combine disparate data into new performance metrics. A further complication emerges when trying to obtain disparate real-time data, such as data from an ACD. Necessity drives innovation, however, and some call centers have already found ways to combine multiple data sources to create this type of performance metric for all aspects of the call center, from agents to the enterprise.

While the process may fall short of a mathematician's dream, it nevertheless allows users to develop a comprehensive single-number index that reflects overall performance. By assigning tolerances or thresholds to this index, only the out-of-tolerance performance conditions need to be addressed, saving call centers both time and money.
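Thresholding such an index might look like the sketch below, so that supervisors review only the exceptions. The function name and the tolerance band of 0.70 to 1.0 are assumptions for illustration; a real center would set its own limits:

```python
def out_of_tolerance(performance_levels, low=0.70, high=1.0):
    """Return only the agents whose PL index falls outside tolerance.

    performance_levels: dict mapping agent ID -> PL index (0..1).
    Agents inside the [low, high] band are filtered out, leaving
    just the exceptions that need a supervisor's attention.
    """
    return {agent: pl for agent, pl in performance_levels.items()
            if not (low <= pl <= high)}

scores = {"TSR-101": 0.92, "TSR-102": 0.64, "TSR-103": 0.81}
print(out_of_tolerance(scores))  # {'TSR-102': 0.64}
```

Management by exception is the payoff: with three agents in tolerance out of every four, reviewers handle a quarter of the volume.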

Performance metrics are not the only ones that can benefit from the combination of data from multiple sources. Dozens of new metrics will support the call center. Considering the complexities involved, one would wonder if the agony of implementing new metrics might ever subside. The obstacles most definitely can be overcome. Technology will continue to drive the need for multidimensional measurement systems. Holistic call center management can become the winning way. And if you are a visionary, clarity in this otherwise muddled environment can be yours.

Debra Disbrow is the director of marketing for CallCenter Technology Inc. (CCTI). CCTI provides information management software that enables managers to combine data simultaneously from all call center sources. CCTI's products use the combined data to generate real-time and/or historic information for display, reporting and decision support.

 







Technology Marketing Corporation

2 Trap Falls Road Suite 106, Shelton, CT 06484 USA
Ph: +1-203-852-6800, 800-243-6002
