Unified Communications

Measuring the Success of UC

By TMCnet Special Guest Kevin Kieller | May 22, 2012

This article originally appeared in the May 2012 issue of INTERNET TELEPHONY magazine.


Unified communications solutions can provide significant business benefits for many organizations. However, most organizations have neither the planning skills nor the discipline to prove that a previous investment in UC is yielding organizational dividends. I think it is time we commit to measuring the impact and results of UC solutions.

Establishing Objectives

It is often said “be careful what you measure” because, as conventional wisdom dictates, measurement may spur behavior changes that improve your measured metrics, perhaps to the detriment of the overall business. However, given the complex trade-offs required to select a particular UC solution, I would suggest that you must define, document and prioritize project objectives if you want to prove your UC project has been successful.

Perhaps most IT and telecom professionals are simply trying to be careful, but I suspect that many UC projects fail to define and document objectives not out of caution but out of carelessness.

Sometimes you are given the easier assignment of deploying objects instead of delivering outcomes. For example, you may be asked to replace TDM phones with newer VoIP phones. This is a much easier task than being asked to reduce costs by deploying UC or, more challenging still, to improve business efficiency by deploying new collaboration tools.

Unless you can claim success through the simple deployment of specific technology objects, it is in your best interest to spend the time at the beginning of a project working with the business sponsors to define and document measurable objectives. To do this, you can ask questions such as: “When we are done with this project, how will we know we are successful?” or even more broadly “How will we know when we are done?” Ask the questions; keep a record of the answers.

In my experience, senior leaders are more willing to discuss, and are expected to deliver, outcomes. Middle management is often preoccupied with, and allowed to focus on, the specific technology selection. Stated another way, senior leaders more often focus on the what and middle managers more often focus on the how.

My advice is to make sure you first understand the what for your UC project, regardless of your level in the organization. It is important to remember that results are not specific to a particular vendor or a specific technical architecture; if you find yourself debating vendor selection or deployment models, you are caught up in the how.


Measuring UC

Traditional voice systems, and many add-on tools for VoIP systems, focus on capturing data primarily to support long-distance billing reconciliation or departmental chargebacks, most often using call detail records (CDRs), or to measure call quality based on mean opinion scores (MOS), a test that has been used for decades in telephony networks to obtain the human user's view of network quality.

Newer UC platforms have the capacity to measure and monitor all of the different modalities of communication: IM, voice, video, desktop sharing and conferencing.

I am going to use Microsoft Lync as an example of a new platform that provides fantastic tools to measure UC. I chose Lync as an example primarily because many organizations have implemented Lync or Office Communications Server (OCS), the previous version of Lync, for IM and presence.

Since the original version of OCS, released late in 2007, Microsoft has provided a monitoring server role that captures detailed usage and quality metrics for all OCS conversations. Granted, many customers chose not to install the monitoring server role, as it required an additional server and space in a SQL database in which to store the collected metrics.

The current Lync monitoring server role acts as a super CDR, in this case really a communication detail record, recording the details of all communication sessions, whether they involve IM, voice, video, desktop sharing, file transfers or web conferencing.

The original reports provided by OCS, which made use of SQL Server Reporting Services, were very techie-focused, and it was difficult to decode what all the details meant – and yet the platform's commitment to logging all of the communication metrics was significant. This commitment meant the data was at least being recorded and was accessible, via SQL, to anyone who had the skill and inclination to extract and summarize it.

With the release of Lync in late 2011, Microsoft greatly improved the standard reports, and as a result it is much easier to extract important UC metrics with Lync. Still, properly interpreting the Lync reports, along with their strengths and limitations, requires a significant investment of time. The key is that, with the Lync monitoring server role and its detailed data, answers can be found to common executive questions such as the following (a short sketch after this list illustrates how one such answer might be computed):

What is the average audio conference size?

How many people are using desktop sharing?

How much did we save moving audio conferencing in house?

How much are the boardroom roundtable videoconferencing devices used?

How many users made or received calls when not in the office?

Are some offices not using the new UC features?
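To make that concrete, here is a minimal Python sketch for the first question, the average audio conference size. It assumes you can export per-participant conference records from the monitoring data; the record layout is invented purely for illustration and is not the actual Lync monitoring database schema.

```python
# Minimal sketch: answering "What is the average audio conference size?"
# from per-participant conference join records. The record layout is
# hypothetical and is not the actual Lync monitoring database schema.

conference_joins = [
    # (conference_id, user) -- one row per participant join
    ("conf-001", "alice"), ("conf-001", "bob"), ("conf-001", "carol"),
    ("conf-002", "dave"),  ("conf-002", "erin"),
]

participants = {}  # conference_id -> set of distinct participants
for conf_id, user in conference_joins:
    participants.setdefault(conf_id, set()).add(user)

average_size = sum(len(p) for p in participants.values()) / len(participants)
print(f"Average audio conference size: {average_size:.1f}")  # -> 2.5
```

The same pattern of grouping and summarizing session records applies to the other questions in the list.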

This is in stark contrast to other UC solutions, where answering these types of questions would require executing an end user survey and then tallying respondents' imperfect recollections of their own usage. For example, two years ago I deployed a UC voice and IM solution and had a senior vice president ask, “How many people are using IM?” Unfortunately, the UC platform did not provide any detailed metrics, so this very reasonable question required conducting an end user survey, which provided imperfect results representing only a single moment in time.

Of course, even having access to all of the detailed unified communication data still requires you to convert the data into information. And during this transformation of data into useful and actionable information, it is important to understand and make the distinction between usage and adoption. Usage simply shows how much a service is being used. Usage tells you how many minutes of audio conferencing were used this week or this month, how many IM messages were sent, how many files were transferred, how many web conferences were held. In contrast, adoption indicates who is making use of a service: how many people in the organization used IM, audio conferencing, desktop video or web conferencing.
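To make the distinction concrete, here is a small Python sketch over hypothetical session records: usage sums the total minutes per modality, while adoption counts the distinct users of each modality.

```python
# Hypothetical session records: (user, modality, minutes).
# Usage aggregates volume (total minutes); adoption counts distinct users.
sessions = [
    ("alice", "audio_conf", 45), ("alice", "audio_conf", 30),
    ("bob",   "audio_conf", 10), ("carol", "im",          5),
    ("bob",   "im",           2),
]

usage = {}     # modality -> total minutes used
adopters = {}  # modality -> set of distinct users

for user, modality, minutes in sessions:
    usage[modality] = usage.get(modality, 0) + minutes
    adopters.setdefault(modality, set()).add(user)

for modality in sorted(usage):
    print(f"{modality}: usage = {usage[modality]} minutes, "
          f"adoption = {len(adopters[modality])} distinct users")
```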

Adoption rates allow you to understand who is making use of a specific UC feature. Are there specific roles, locations or departments in your organization that are not using a new UC feature? This type of information can point to the need for specialized training or can indicate a deficiency at a particular location. Understanding the top adopters can help you identify champions who can assist in modifying long-entrenched behaviors. For instance, if desktop videoconferencing is being deployed in an attempt to reduce travel costs, then knowing and leveraging the people who have figured this out can accelerate your overall project’s success. Similarly, understanding users who have abandoned a new UC service can help you proactively manage a potential technology, communication or training issue.

Committing to measuring the results of your UC project can provide unexpected wins.

Just today, as I was finalizing this article, I was asked to help a customer identify the best 12-hour window in which to perform some significant infrastructure upgrades. The upgrade process would likely cause intermittent voice and audio conferencing outages, so the intent was to select a period of time that would minimize the end user impact. This customer’s UC deployment serves approximately 9,000 individuals who make use of more than 1 million minutes of audio conferencing per week, so there was a risk that this upgrade would impact a significant number of users.

The good news was that the customer had taken the time to install UC monitoring services. This meant that, based on a historical analysis, we were able to predict that a maintenance window spanning Saturday night and Sunday morning would likely impact on average only seven users. Because this customer measured UC, we were able to use the collected UC data to minimize and quantify the risk and impact associated with this planned outage.
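As an illustration of the kind of analysis involved (not this customer's actual data or tooling), here is a minimal Python sketch that scans historical hourly activity counts for the 12-hour window with the fewest expected affected users; all numbers are hypothetical.

```python
# Sketch: choosing the 12-hour maintenance window that minimizes expected
# user impact, based on historical hourly activity. The counts below are
# hypothetical; in practice they would be derived from the monitoring data.

HOURS_PER_WEEK = 7 * 24

# active_users[i] = average number of active users in hour i of the week
# (hour 0 = Monday 00:00), e.g. summarized from historical session records.
active_users = [0] * HOURS_PER_WEEK
for day in range(5):                      # weekdays: busy business hours
    for hour in range(8, 18):
        active_users[day * 24 + hour] = 400
active_users[5 * 24 + 10] = 12            # light Saturday activity
active_users[6 * 24 + 20] = 5             # light Sunday activity

def window_impact(start):
    """Total expected affected user-hours for a 12-hour window starting at hour `start`."""
    return sum(active_users[(start + i) % HOURS_PER_WEEK] for i in range(12))

best_start = min(range(HOURS_PER_WEEK), key=window_impact)
day, hour = divmod(best_start, 24)
print(f"Lowest-impact window starts on weekday {day} at {hour:02d}:00 "
      f"(~{window_impact(best_start)} affected user-hours)")
```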

Sharing Your Success

In many cases, if a business case was created to justify a UC project, once the project is started the business case is tossed in the trash, never to be seen again. However, in a few cases an executive actually asks you to compare reality with the previously approved business case.

I am working with a CIO now who, after two years, is interested in understanding how an original business case to implement an on-premises UC solution is faring compared to the projected numbers. A key part of the cost savings expected from this project related to shifting audio conferencing from a per-minute hosted bridge to a (predominantly) fixed-price on-premises solution.
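A simple way to re-check such a business case is to apply the measured conferencing volume to both pricing models. The Python sketch below shows the arithmetic; every rate and volume is hypothetical and is not drawn from this customer's actual business case.

```python
# Sketch: re-checking a business case by comparing a per-minute hosted audio
# bridge against a (predominantly) fixed-price on-premises solution. Every
# rate and volume below is hypothetical, purely to illustrate the arithmetic.

hosted_rate_per_minute = 0.04        # hypothetical hosted bridge rate, $/minute
on_prem_fixed_monthly = 25_000.00    # hypothetical amortized monthly on-prem cost
on_prem_rate_per_minute = 0.005      # hypothetical residual per-minute PSTN cost

measured_minutes_per_month = 4_000_000   # e.g. ~1M minutes/week from monitoring data

hosted_cost = hosted_rate_per_minute * measured_minutes_per_month
on_prem_cost = on_prem_fixed_monthly + on_prem_rate_per_minute * measured_minutes_per_month

print(f"Hosted bridge cost:   ${hosted_cost:,.0f} per month")
print(f"On-premises cost:     ${on_prem_cost:,.0f} per month")
print(f"Projected savings:    ${hosted_cost - on_prem_cost:,.0f} per month")
```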

A commitment to establishing objectives for your UC project and then to implementing the tools and processes to measure outcomes allows you to prove your project was a success. Further, as noted above, the same tools used to measure usage and adoption can be used to make better ad hoc business decisions related to operational matters.

Over the years, analysts have compiled a number of standard industry financial ratios that can be used to compare the strengths and weaknesses of one business to others within its industry. I look forward to the day when we can compare communication metrics in the same way. I hope you will consider measuring and sharing the metrics of your next UC project for the benefit of us all.

Kevin Kieller, who provided this piece courtesy of UCStrategies, is a member of the UCStrategies Experts group providing key insights and analysis to help you better understand UC technologies, products, and trends. He is also a partner at enableUC, a company that helps measure, monitor and improve UC and collaboration tool usage and adoption.




Edited by Stefania Viscusi