Saturday, December 14, 2013

The Difference Between User Experience and Customer Experience

Here are some of the key differences between these two established fields.

Difference #1: Scope

UX professionals typically focus on the design and development of digital interfaces—today that translates primarily into websites, tablet apps, and mobile apps. And, as the name “UX” implies, UX practitioners typically refer to the people who interact with those interfaces as “users.”
To belabor the obvious, CX professionals hardly ever mention “users”—they talk about “customers” instead. They focus on the interactions that customers have at every stage of the customer journey: discover, evaluate, buy, access, use, get support, leave, and re-engage. CX practitioners are interested not only in digital touch points, but also in marketing communications, product packaging, checkout counters, receipts, face-to-face conversations with sales reps, and phone calls to customer service.
That’s why at Forrester Research, we define CX as how customers perceive their interactions with your company. We’re not talking about some subset of customer interactions. We’re talking about all of them.

Difference #2: Educational & Professional Background

UX professionals typically hail from one of three primary backgrounds: behavioral sciences (into which I’d lump fields like anthropology, psychology, and cognitive science), design, or technology. Degrees and/or professional experience in these fields prepare UX professionals for tasks like determining what types of products and services people need, designing the appropriate interactions, and bringing them to life.
This background is also relevant for a career in customer experience. After all, non-digital interactions need to be defined, designed, and implemented, too. And yet surprisingly few of today’s CX professionals can claim a UX background. That’s probably because in addition to designing customer interactions, CX professionals must also engage in what we lovingly refer to as customer experience management. This translates into myriad tasks that collectively look much more like massive organizational change management than anything that resembles traditional UX work.
Change management requires one of two qualities: authority or influence. Because most customer experience professionals don’t have overarching organizational authority (yet)—and perhaps because influence is more effective in the long run, anyway—most companies appoint candidates to CX positions based on the quality of their existing internal relationships, not on pedigree.
In Forrester’s recent analysis of 177 chief customer officers, we found that 55% were internal hires. And the most common backgrounds for these professionals included marketing, operations, sales, service, and strategy. Notably missing from that list? UX.

Difference #3: Tools & Methodologies

If you’re a regular reader of UX Mag, you probably don’t need me to get into the details of UX tools and methodologies. You know this like the back of your hand. So let me instead focus on the tools and methodologies of customer experience. Forrester’s CX maturity model describes six disciplines that companies need to master in order to create and sustain high quality customer experiences.
The first three disciplines help companies define the right customer experience—the experience that will meet (or exceed) the needs of customers and that will support the business and brand. Those disciplines are strategy, customer understanding, and design.

Strategy: When business people talk about strategy, they’re often referring to a roadmap or plan of some sort. But a CX strategy is a description of the experience that a company intends to deliver. For example, Holiday Inn defined a CX strategy dubbed The Social Hub. It set the stage for an innovative lobby experience that was rooted in the hotel’s key brand attributes (purposeful, inclusive, social, and familiar) and in the activities that its guests wanted to do (eat and drink, relax, and have fun). The heart of the Social Hub strategy states: “We give our guests flexible options so they can be themselves. That way they don’t have to leave the hotel to get what they want. They can find it at the Holiday Inn.”

Customer Understanding: A company’s CX strategy is only effective if it’s rooted in a clear and accurate understanding of who its customers are, how they’re interacting with the company today, and what they want and need from the company tomorrow. CX professionals sometimes employ research methodologies—like ethnographic research and usability studies—that are familiar in UX land. In addition, CX professionals use surveys and focus groups to solicit customer feedback; dig into analytics and big data; mine social media, phone calls, email, and chats to determine customer sentiment; and tap into the knowledge of frontline and backstage employees.

Design: This is the same mindset and problem-solving process that UX professionals apply every day in their jobs. Here, it’s just applied to a wider range of customer interactions. For example, Mayo Clinic prototyped new outpatient exam rooms with foam core and cardboard, and service design agency live|work redesigned call center interactions for Gjensidige, Norway’s largest insurance company.
Again, the three disciplines above help CX professionals create the right experience. The second set of disciplines helps companies manage those experiences effectively. Those disciplines are measurement, governance, and culture.

Measurement: CX professionals use three types of metrics to determine the business impact of customer interactions. First, we’ve got perception metrics: these tell a company what their customers think and feel about their interactions. Then, we’ve got descriptive metrics: this is the operational piece that tells a company what really happened.
For example, a customer might feel that she was on hold “forever,” while the descriptive metric shows that she was really on hold for two and a half minutes. In tandem, these two metrics enable CX professionals to set benchmarks for CX quality. Finally, we layer on outcome metrics, which indicate what customers will do as a result of their experience, like purchase again or recommend to a friend. In total, these three metrics enable CX professionals to build financial models that tell them what’s going right, what’s going wrong, and what kind of business benefits they can expect from making specific improvements.

Governance: We typically talk about two types of CX governance: reactive and proactive. Reactive governance involves listening to customers talk about their problems, prioritizing their issues, fixing the ones that will have the biggest impact, and then closing the loop (telling customers what’s been done to make their lives better). Proactive governance involves making sure that CX problems don’t get introduced in the first place. For example, FedEx employees who want to introduce a new project, process, or technology must fill out a form to identify which touch points their proposed initiative will impact and how. This helps to keep problems from bubbling up to customers as an unintended consequence of other initiatives.

Culture: Culture is about driving customer-centricity into an organization’s DNA, and there are three primary levers you can pull to make this happen. The first is hiring. Companies need to hire people who have an innate desire to serve customers. When hiring new call center agents, American Express doesn’t look for call center or financial services experience; instead, it looks for applicants from cruise lines, retail stores, and restaurants. The second lever is socialization, which translates into activities like training, storytelling, and rituals that celebrate customer-centric attitudes and behavior. The third lever is rewards. This includes informal rewards, like movie tickets and recognition at company meetings. It also includes formal rewards, like bonuses and promotions based on performance against CX metrics.

Conclusion

So what does all of this mean for you, dear reader? I know that many UX professionals don’t give a hoot about CX. They’d rather immerse themselves in the details of the latest technology, or geek out over typefaces and Photoshop shortcuts, or surround themselves with thousands of Post-It notes from their latest ethnographic research study. And that’s OK. In fact, it’s more than OK. (Honestly, there are lots of days when I’m right there with you.) But maybe, somewhere, there’s a UX professional who’s looking for something a little different. And to you, I say: Consider the field of CX. You’d be an awesome fit.

Refer: http://uxmag.com/articles/a-deep-dive-into-customer-experience
 

Top 20 Contact Center Metrics

In the 21st century, the call center has evolved into a multichannel contact center. Customers have heightened expectations of service, and frontline staff face new demands and requirements.
As your contact center evolves, you must ask yourself whether the measures of performance that have served you well over the last decade are the same ones that will determine how well your contact center is operating today. This article will examine the top performance measures commonly associated with personnel and processes in today’s multichannel contact center.
We’ll take the approach of looking at metrics that supply you with critical information related to each contact center stakeholder group. In other words, as you think about the three main groups of people you need to keep happy every single day, you’ll want to make sure you have measures in place to track how well you’re satisfying each group.

The three main stakeholder groups are pretty obvious. The most important group is, of course, your customer base. The second group is the senior management team. And the third group is your contact center workforce.

We’ll explore measures of service and quality related to customers, efficiency and profitability for senior management, and workplace and satisfaction concerns for the frontline staff.

Service measures

Customer concerns come first, so let’s begin with some of the metrics associated with how we define service to the caller.
Blockage
Blockage is an accessibility measure that indicates what percentage of customers will not be able to access the center at a given time due to insufficient network facilities in place. Measures indicating blockage (busy signals) by time of day or occurrences of “all trunks busy” situations are utilized by most centers. Failure to include a blockage goal allows a center to always meet its speed of answer goal by simply blocking the excess calls. This can have a negative effect on customer accessibility and satisfaction while the call center looks like it is doing a great job in terms of managing the queue.
The contact center must also carefully determine the number of facilities needed in terms of both bandwidth and email server capacity to ensure that large quantities of emails do not overload the system. While this provisioning is typically monitored by the IT or telecom department and not by the contact center, it should still be a measure that is reviewed regularly to make sure callers are not being turned away at the front door.

Abandon rate
Call centers measure the number of abandons as well as the abandon rate since both correlate with retention and revenues. It should be noted, however, that abandon rate is not entirely under the call center’s control. While abandons are affected by the average wait time in queue (which can be controlled by the call center), there are a multitude of other factors that influence this number, such as individual caller tolerance, time of day, availability of service alternatives, and so on. Abandon rate is not typically a measure associated with email communications, since the email does not abandon the “queue” once it has been sent, but it does apply to web chat interactions.  
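The arithmetic itself is a single division; here is a minimal sketch (the function name and sample counts are illustrative, not a standard):

```python
def abandon_rate(offered, abandoned):
    """Abandon rate: abandoned contacts as a share of contacts offered."""
    if offered == 0:
        return 0.0
    return abandoned / offered

# 1,000 calls offered in the interval, 62 hung up while still in queue
rate = abandon_rate(offered=1000, abandoned=62)
print(f"{rate:.1%}")  # 6.2%
```

Many centers also exclude very short abandons (a caller who hangs up within a few seconds) before computing the rate; that refinement is omitted here.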

Self-service availability
More and more contacts are being off-loaded today from call center agents to self-service alternatives. In the call center, self-service utilization is an important gauge of accessibility and is typically measured as an overall number, by self-service methodology and menu points, and by time of day or by demographic group. In the contact center, self-service utilization should also be tracked. In cases of Web chat, automated alternatives such as FAQs or use of help functions can reduce the requirement for the live interaction with a Web chat agent.

Service level/ASA
Service level, the percentage of calls that are answered in a defined wait threshold, is the most common speed of answer measure in the call center. It is most commonly stated as x percent of calls handled in y seconds or less, while average speed of answer (ASA) represents the average wait time of all calls in the period. In the contact center, speed of answer for Web chat should also be measured and reported with a service level or ASA number. Many centers measure for both initial response as well as the back-and-forth times, since having too many open web chat sessions can slow the expected response time once an interaction has begun. The speed of answer for email transactions on the other hand is defined as a “response time” and may be depicted in terms of hours or even days, rather than in seconds or minutes of elapsed time.
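The two speed-of-answer measures described above can be sketched directly from a list of per-call queue waits (function names and sample data are illustrative):

```python
def service_level(waits, threshold_seconds=30):
    """Share of calls answered within the wait threshold (the 'x% in y seconds' figure)."""
    if not waits:
        return 0.0
    return sum(1 for w in waits if w <= threshold_seconds) / len(waits)

def average_speed_of_answer(waits):
    """ASA: mean queue wait in seconds across all calls in the period."""
    return sum(waits) / len(waits) if waits else 0.0

waits = [5, 12, 28, 45, 90, 10, 33, 8]            # seconds in queue per call
sl = service_level(waits, threshold_seconds=30)   # 5 of 8 calls -> 62.5%
asa = average_speed_of_answer(waits)              # 231 / 8 = 28.875 seconds
```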

Summary of service measures
The most critical of these availability and speed of service measures is the service level number, and it’s worth noting that this metric has evolved in recent years to provide a more practical, realistic view of the service being delivered to callers. Traditionally, service level (or ASA) was measured and reported as an average number – typically an average for the day. However, there are many problems with this measurement approach. Since most contact centers have peaks and valleys of calls throughout the day, the service level from one period to the next can vary greatly. The overstaffed periods of the day generate a very high service level, while understaffed periods have very low numbers. The end result is an average for the day that may come close to the goal, but one that does not reflect the actual picture of service for the day. Overstaffing results in needless expense and low productivity, while understaffing means long delays, overworked staff, and higher costs. The measure of service for the day needs to reflect that reality better than a simple daily average (or worse, an average for the week or month).

A better approach for measuring service level is to have a measure that notes the number of periods of the day where service level was acceptable. If the goal is 80% in 30 seconds, then a reasonable measure may be to look at the number of half-hour periods of the day that were between 75% and 85%. This measure provides more of a view of the consistency of service being delivered, which in turn affects customer perceptions, employee workload, and bottom-line efficiency and cost.
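That consistency measure is easy to compute once per-period service levels are available; a minimal sketch, assuming half-hourly numbers and the 75% to 85% band from the example:

```python
def periods_in_band(period_levels, low=0.75, high=0.85):
    """Count reporting periods whose service level landed inside the target band."""
    return sum(1 for sl in period_levels if low <= sl <= high)

# Half-hourly service levels for one day (goal: 80% in 30 seconds)
day = [0.92, 0.81, 0.78, 0.60, 0.83, 0.95, 0.74, 0.80]
hits = periods_in_band(day)       # 0.81, 0.78, 0.83, 0.80 fall in band -> 4
consistency = hits / len(day)     # half the periods were on target
```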

Quality measures

In addition to the “how fast” measures outlined above, perhaps a more significant indicator of customer satisfaction is “how well” the contact was handled, indicated by the following measures.

First call resolution rate
The percentage of transactions that are completed within a single contact, often called the “one and done” ratio or first call resolution (FCR) rate, is a crucial measure of quality. It gauges the ability of the center, as well as of an individual, to accomplish an interaction in a single step without requiring a transfer to another person or area, or needing another transaction at a future time to resolve the customer issue. The FCR rate is a crucial factor in customer perception of quality. The satisfactory resolution of a call is tracked overall in the center, as well as by type of call, and perhaps by time of day, by team, or by individual.

The one-contact resolution rate should likewise be tracked in the contact center for email transactions and Web interactions. The resolution rate will likely be lower for emails, as it generally takes multiple messages between two parties to resolve a matter to completion.
Recent studies have shown that the FCR rate is the single number most closely correlated with customer satisfaction. Nothing impacts customers’ perceptions more than simply getting their question answered or problem resolved on the first try. Therefore, this FCR rate should rank very high on your list of contact center KPIs.

It’s not always easy to figure out and may take some piecing together of information from your ACD and contact management system, but it’s worth the extra effort to track it and do whatever it takes to increase the rate. Remember, the higher your rate, the happier your customers!
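The “piecing together” might look something like the sketch below, assuming you can export contact records keyed by customer and issue (the record layout here is hypothetical, not a feature of any particular ACD or contact management system):

```python
from collections import Counter

def first_contact_resolution(contacts):
    """FCR: share of distinct issues that were closed with exactly one contact."""
    touches = Counter(contacts)   # how many contacts each (customer, issue) pair took
    if not touches:
        return 0.0
    one_and_done = sum(1 for n in touches.values() if n == 1)
    return one_and_done / len(touches)

log = [
    ("cust1", "billing dispute"),
    ("cust2", "password reset"),
    ("cust1", "billing dispute"),   # second contact for the same issue
    ("cust3", "order status"),
]
fcr = first_contact_resolution(log)   # 2 of 3 issues were "one and done"
```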

Transfer rate
The transfer percentage is an indication of how many contacts have to be transferred to another person or place to be handled. Tracking transfers can help fine-tune the routing strategies as well as identify performance gaps of the staff. Likewise, tracking emails that must be transferred to others or text chat interactions that require outside assistance is useful to identify personnel training issues or holes in on-line support tools. This transfer rate is an important number to track as it plays an important part in the FCR rate that impacts customer satisfaction so highly.

Communications skills
One of the critical factors that impact the caller’s perception of how well the call was handled is simple etiquette and customer service skills. The degree to which general communications skills and etiquette are displayed is generally measured via observation or some form of quality monitoring as an individual gauge of performance. Email and web chat etiquette should also be observed. Both channels have standard wordings that should be carefully followed, reviewed, and reported as a quality measure of performance. This is particularly true since a written record of the interaction will exist. One of the keys to measuring the effectiveness of communications skills is to have specific guidelines and definitions of what content and behaviors look like when done right. Define the wording you want to hear (or see) and the processes that should be followed, then watch and listen for compliance with the expectation.
Be wary of quality forms that don’t clearly define what a quality transaction contains, such as ones that merely specify “demonstrated professional attitude.” Define the specific content that should be used, and what should be avoided, in customer conversations so that call reviews and coaching can continually fine-tune skills in the right direction.

Adherence to procedures
Adherence to procedures such as workflow processes or call scripts is another essential element of quality. This is particularly important to perceived quality in terms of the customer receiving a consistent interaction regardless of the contact channel or the individual agent involved in the contact.
In the call center, adherence to processes and procedures is typically measured for individuals through simple observation and through the quality monitoring process.

Adherence to processes and procedures is also important for other channels of contact. Written scripts and pre-approved responses are generally created, and adherence to these is monitored and recorded via observation or screen capture capabilities in a quality monitoring system.
Customer satisfaction surveys
Many of the numbers and metrics discussed so far focus on internal metrics – measuring inside the contact center and judging how well you’re doing based on your own standards of performance. But it’s also important to look outside the center and go straight to the source for measures of customer satisfaction.

Ask your customers regularly how they think your call center is performing. While your company may have regular customer satisfaction surveys to gather feedback on a wide range of questions about products, pricing, etc., it’s important to gather specific feedback related to the service customers received in their interaction with the call center.

Most organizations can benefit greatly from some professional help in writing and fine-tuning their survey instrument, administering it in a way that ensures data validity and reliability, and analyzing survey results. A good starting place to help you understand the important elements and design surveys that maximize customer feedback is Fred Van Bennekom’s book, Customer Surveying.


Efficiency measures

Executives in every type of organization are concerned with how well the company’s resources are being put to use. That is especially true in a call center environment where the overwhelming majority of operating expenses are related to personnel costs.

Agent occupancy
Agent occupancy is the measure of actual time busy on customer contacts compared to available or idle time, calculated by dividing workload hours by staff hours. Occupancy is an important measure of how well the call center has scheduled its staff and how efficiently resources are being used. If occupancy is too low, agents are sitting around idle with not enough to do. If occupancy is too high, the personnel may be overworked.
Agent occupancy is the end result of how staffing is matched to randomly arriving workload in a call center. However, the desired level of occupancy may actually drive staffing decisions in a sequential work environment like processing emails. Since web chat interactions are essentially random events like incoming calls, the same measures of occupancy apply here as in an incoming call scenario.
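As the definition above states, the calculation itself is a single division (sample numbers are illustrative):

```python
def agent_occupancy(workload_hours, staff_hours):
    """Occupancy: handled workload hours divided by staffed hours."""
    if staff_hours == 0:
        return 0.0
    return workload_hours / staff_hours

# 34 hours of talk time plus after-call work, handled by 40 staffed hours
occ = agent_occupancy(workload_hours=34, staff_hours=40)
print(f"{occ:.0%}")  # 85%
```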

Staff shrinkage
Staff shrinkage is defined as the percentage of time that employees are not available to handle calls. It is classified as non-productive time, and is made up of meeting and training time, breaks, paid time off, off-phone work, and general unexplained time where agents are not available to handle customer interactions. Staff shrinkage is an important number to track, since it plays an important role in how many people will need to be scheduled each half-hour. The same measures of shrinkage that are used for call center calculations apply to the multichannel contact center as well.

It is important to track shrinkage by individual category. While some time categories are unavoidable, such as paid time off and training time, other categories should be tracked with an objective of controlling the loss of available hours over time.
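Tracking shrinkage by category can be sketched as follows (the category names and hours are illustrative):

```python
def shrinkage_by_category(lost_hours, scheduled_hours):
    """Per-category and total shrinkage as a share of scheduled hours."""
    rates = {cat: hrs / scheduled_hours for cat, hrs in lost_hours.items()}
    rates["total"] = sum(lost_hours.values()) / scheduled_hours
    return rates

lost = {
    "paid time off": 120,
    "training": 40,
    "breaks": 60,
    "meetings": 20,
    "unexplained": 10,
}
rates = shrinkage_by_category(lost, scheduled_hours=1000)
# total shrinkage: 250 / 1000 = 25%; "unexplained" (1%) is the category to chase down
```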

Schedule efficiency
Workforce management is all about getting the “just right” number of people in place each period of the day to handle customer contacts—not too many and not too few. Schedule efficiency measures the degree of overstaffing and understaffing that exist as a result of scheduling design. Net staffing may be measured by half-hour as an indication of how well the resources in the center are being utilized.

Schedule efficiency for responding to the randomly arriving web chats should be measured just like that for incoming call centers. Since emails typically represent sequential rather than random workload, the work fits the schedule and therefore overstaffing and understaffing measures are less relevant. Just like for measures of service, it is likely that schedule efficiency varies over the day and week as peaks and valleys of incoming contacts make it difficult to get the exact right number of staff each half-hour. Rather than looking at the plus and minus status averaged out over the day, it is important to look at the variation that occurs by half-hour so that schedule plans can be adjusted to best match workforce to workload.
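A half-hour net staffing view can be sketched as the signed difference between scheduled and required staff (sample data illustrative):

```python
def net_staffing(required, scheduled):
    """Per-period net staffing: positive means overstaffed, negative understaffed."""
    return [s - r for r, s in zip(required, scheduled)]

required  = [12, 18, 25, 25, 20, 14]   # agents needed per half-hour
scheduled = [14, 18, 22, 24, 22, 14]   # agents actually on the schedule
variance = net_staffing(required, scheduled)   # [2, 0, -3, -1, 2, 0]
```

Looking at this per-period list, rather than its average, is exactly what surfaces the mid-day understaffing that a daily summary would hide.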

Schedule adherence
Schedule adherence measures the degree to which the specific hours scheduled are actually worked by the agents. It is an overall call center measure and is also one of the most important team and individual measures of performance, since it has such a great impact on productivity and service.
Schedule adherence is one of the most important measures in the multichannel contact center as well. Specific hours worked is less of an issue in a group responding to emails than in one handling the real-time demand of calls and web chats, but it is still relevant to processing the work in a timely manner, especially if response time guarantees exist.

AHT/ACW
A common measure of contact handling is the average handle time (AHT), made up of talk time plus after-call work (ACW). To accommodate differences in calling patterns, it should be measured and identified by time of day as well as by day of week.
Average handle time is also important in determining the other types of multichannel contact workload. It is much harder to calculate, however, given the difficulty of truly measuring how long it takes to handle an email or a web chat transaction. An email may be opened and put aside for varying amounts of time before it is completed. Likewise, a web chat session may appear to take longer than it actually does, since a web agent typically has several sessions open at once; measured by start and end times, each session looks longer than the time actively spent on it. Automated tracking of these actual handle times is difficult, and the numbers coming from email management systems often overstate them.

While AHT is almost always one of the top metrics on any contact center’s list, it’s critical not to focus coaching efforts too directly on the AHT number itself. While it is often desirable to correct procedures that lengthen AHT, you don’t want to coach to AHT numbers. When this is done, AHT goals may be reached, but at the expense of proper call-handling techniques. It’s best to identify the specific steps, words, and behaviors needed on a call and coach to those, not to an AHT number.
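On the call side, where talk time and after-call work are both directly measurable, the definition reduces to simple arithmetic (sample times are illustrative):

```python
def average_handle_time(talk_seconds, acw_seconds):
    """AHT: mean of (talk time + after-call work) per call, in seconds."""
    if not talk_seconds:
        return 0.0
    totals = [t + w for t, w in zip(talk_seconds, acw_seconds)]
    return sum(totals) / len(totals)

talk = [180, 240, 300]   # seconds of talk time per call
acw  = [30, 45, 60]      # seconds of after-call work per call
aht = average_handle_time(talk, acw)   # (210 + 285 + 360) / 3 = 285 seconds
```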
System availability and accessibility
When response time from the computer system is slow, or if it is cumbersome to move from application to application, it can add seconds or minutes to the handle time of a transaction. In the call center, system speed, uptime, and overall availability should be measured on an ongoing basis to ensure fast response times and efficiency as well as good service to callers. For example, if the IVR typically handles 50% of calls to completion but goes out of service, many more calls than normal will require agent assistance, causing overtime costs, long delays, and generally poor service. Or, if multiple applications are needed and it’s difficult to move from one to another, it can add considerable handle time. Often this measure of performance resides in the IT department, but it is also a crucial measure of contact center performance.

Profitability measures

Another category of performance measures near and dear to your executives’ hearts includes those that indicate the inbound and outbound flow of money in the center, as indicated by the measures below. These next two measures are particularly important to catalog centers, where the calls typically focus on the placement of an order.

Conversion rate
The conversion rate refers to the percentage of transactions in which a sales opportunity is translated into an actual sale. It can be measured as an absolute number of sales or as a percentage of calls that result in a sale. Conversion rate should be tracked and measured for incoming calls, as well as outgoing calls, email transactions, and other web interactions.
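Tracked per channel, the conversion rate is the same division everywhere; a sketch with illustrative channel names and counts:

```python
def conversion_rate(sales, opportunities):
    """Share of sales opportunities that closed as an actual sale."""
    return sales / opportunities if opportunities else 0.0

# (sales, opportunities) per channel -- hypothetical figures
by_channel = {"calls": (480, 4000), "email": (55, 1100), "web chat": (90, 900)}
rates = {ch: conversion_rate(s, o) for ch, (s, o) in by_channel.items()}
# calls convert at 12%, email at 5%, web chat at 10%
```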

Up-sell/Cross-sell rate
The up-sell rate or cross-sell rate is measured by many organizations as a success rate at generating revenue over and above the original order or intention of the call. It is becoming an increasingly common practice, not just for pure revenue-generating call centers but for customer service centers as well.
Although more prevalent in the telephone center, it is also an appropriate measure of performance for other communications channels.

Cost per call
The flip side of revenues involves the cost of running the organization. A common measure of operational efficiency is cost per call or cost per minute to handle the call workload, both in a simple call center as well as in a multichannel contact environment. This cost per call can be simply a labor cost per call, or it can be a fully loaded rate that includes wage rates in addition to telecommunications, facilities, and other services costs.

In setting cost per call, it is critical to define the components being used, and to use them consistently in evaluating how well the center is making use of financial resources over time. While commonly used to compare one company or site to another in benchmarking, this is not a good practice as the components included and the types of contacts will often vary.
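The labor-only versus fully loaded distinction is just a matter of which cost components go into the numerator (the figures below are illustrative):

```python
def cost_per_call(calls, labor_cost, other_costs=0.0):
    """Cost per call: labor only by default; fully loaded when other costs are added."""
    if calls == 0:
        return 0.0
    return (labor_cost + other_costs) / calls

# 50,000 calls; $250k labor; $75k telecom, facilities, and other services
labor_only = cost_per_call(calls=50_000, labor_cost=250_000)        # $5.00 per call
fully_loaded = cost_per_call(50_000, 250_000, other_costs=75_000)   # $6.50 per call
```

Whichever components you choose, the key (as noted above) is to keep them consistent from one reporting period to the next.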

Employee measures

Unfortunately, many lists of call center KPIs end with the above measures. However, it’s vitally important to include measures of success with one more stakeholder group – the frontline staff. Here are two final measures in our guide that address how happy the workforce is. These are critical measures, since a happy workforce works more efficiently, provides better service, and stays around longer.

Staff turnover/retention
One way to measure the satisfaction of your workforce is to look at the percentage of staff who are leaving. There can be some telling information in these numbers and you will want to track and analyze the turnover rates in many ways.
Look at the rate associated with different call types, as it may be more stressful or less satisfying to handle certain types of calls. Look at turnover by team to see if there are any supervisory influences in keeping people or driving them away.
And you’ll definitely want to look at the level of performance of the people leaving. If it’s primarily the worst performers leaving, turnover is not such a bad thing, but if it’s your better performers leaving the center, it may be time to re-examine your compensation, recognition programs, and career path opportunities to see what’s preventing the retention of these staff.
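Slicing turnover by team (or by call type, or by performance band) is the same calculation applied to each segment; a sketch with illustrative numbers:

```python
def turnover_rate(leavers, average_headcount):
    """Annualized turnover: departures divided by average headcount for the period."""
    return leavers / average_headcount if average_headcount else 0.0

by_team = {"billing": (6, 30), "orders": (2, 25)}   # (leavers, avg headcount)
rates = {team: turnover_rate(l, h) for team, (l, h) in by_team.items()}
# billing is churning at 20% while orders holds at 8% -- worth a closer look
```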

Employee satisfaction scores
The final metric on our list is one of the most important ones. We stated earlier that the one metric most closely associated with customer satisfaction was first call resolution rate. However, running a very close second in terms of correlation with customer satisfaction is employee satisfaction. The happier your employees are, the better they’re treating your customers.
Once again, it’s important here to do your own employee satisfaction survey, as the general company one from HR (assuming they do one at all) may not address all the important areas that impact the satisfaction of your call center staff.

An employee satisfaction survey directed at call center staff should ask questions in the following areas: demographics, nature of the work, training and development, performance metrics, work schedules, physical work environment, health and wellness, supervisory support, compensation, and general attitudes toward the center and company.

You will want to perform these surveys regularly and share overall results with the staff so they can see how areas of concern are being addressed.

Cloud Ecosystem

Cloud Computing Represents The New Delivery Model For Internet Based IT Services
Technology veterans often observe that new mega trends emerge every decade.  The market has evolved from mainframes (1970s); to mini computers (1980s); to client server (1990s); to internet based (2000s); and now to cloud computing (2010s).  Many of the cloud computing trends do take users back to the mainframe days of time sharing (i.e., multi-tenancy) and service bureaus (i.e., cloud-based BPO). What’s changed since 1970? Quite a lot: usability improves, connectivity improves with the internet, storage costs continue to plummet, and processing performance keeps increasing.
Cloud delivery models share a stack approach similar to traditional delivery.  At the core, both deployment options share four types of properties (see Figure 1):
  1. Consumption – how users consume the apps and business processes
  2. Creation – what’s required to build apps and business processes
  3. Orchestration – how parts are integrated or pulled from an app server
  4. Infrastructure – where the core guts such as servers, storage, and networks reside
As the über category, Cloud Computing manifests in the four distinct layers of:
  • Business Services and Software-as-a-Service (SaaS) – The traditional apps layer in the cloud includes software as a service apps, business services, and business processes on the server side.
  • Development-as-a-Service (DaaS) – Development tools take shape in the cloud as shared community tools, web based dev tools, and mashup based services.
  • Platform-as-a-Service (PaaS) – Middleware manifests in the cloud with app platforms, database, integration, and process orchestration.
  • Infrastructure-as-a-Service (IaaS) – The physical world goes virtual with servers, networks, storage, and systems management in the cloud.
Figure 1. Traditional Delivery Compared To Cloud Delivery


Cloud Computing Encourages Users And Vendors To Focus On Value Added Solutions

Applying The Software Insider Tech Ecosystem Model to Cloud Computing highlights where buyers, sellers, and partners can deliver value (see Figure 2).  As cloud computing adoption increases, users can expect that:
  • Solution providers and partners will invest in value added solutions over commoditized infrastructure. The continued commoditization of technology results in richer and more relevant Cloud stacks.  As a result, a handful of larger players will emerge to drive down the costs of computing while encouraging ecosystems to deliver value added solutions.  Buyers can expect packaged apps, vertical apps, last mile solutions, and implementation partners, to invest in specialized and higher value intellectual property (IP).
  • Customers will care more about service level agreements than the brand name of technology components. The cloud commoditizes the infrastructure components for both tools for creation and tools for distribution.  Users shift their priority from brand components in favor of outcomes-based delivery.  Consequently, users will not care about the brand name of the hardware, database, middleware, and even business intelligence systems in use.  Client success shifts to the monitoring of pre-agreed service level agreements (SLAs).
  • Integration will emerge as the key enabler and choke point. End users need an enterprise apps strategy for cloud computing that addresses the “I” word – Integration.  SOA principles must be enforced, including support for canonical data models and business process harmonization.  Integration must focus on data mapping, business process orchestration, quality of service, and master data management.
Figure 2.  The Software Insider Tech Ecosystem Model For The Cloud


The Bottom Line For Buyers  – Use The Tech Ecosystem Model To Build Out Your Technology Roadmap And Procurement Strategy.
The Software Insider Tech Ecosystem Model can serve as a key tool in mapping out a long-term apps strategy.  Use the suggested five-step approach to determine how cloud computing can support existing and future business requirements.

Telecom Industry - Key KPIs


  
1. Country telecoms sector (13 KPIs)

Mobile penetration
Tele-density
Penetration per household
Mobile market share index (MMSI)
Subscribers per km2
Prepaid relative penetration
Number of SIM cards per user
Competition intensity index (HHI)
Top 2 players share
Mobile revenue per GDP
Market ARPU
Data penetration
Pricing ratios (termination rate, In-out ratio)
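One of the indices above, the competition intensity index (HHI), reduces to a simple formula: the sum of squared market shares. A quick sketch with invented operator names and shares:

```python
# Herfindahl-Hirschman Index (HHI): sum of squared market shares.
# With shares in percent, the index runs from near 0 (fragmented market)
# up to 10,000 (monopoly). Operators and shares below are illustrative.

def hhi(shares_pct):
    """Competition intensity index from market shares in percent."""
    return sum(s ** 2 for s in shares_pct)

market = {"Operator A": 45.0, "Operator B": 35.0, "Operator C": 20.0}
index = hhi(market.values())
print(index)  # 45^2 + 35^2 + 20^2 = 3650.0 (a highly concentrated market)
```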


2. Marketing and Sales (38 KPIs)

Marketing and Sales metrics are split into 6 categories: subscribers, market position, brand performance, usage, revenue, distribution.

Metrics concerning Subscribers
Revenue generating subscribers (RGS)
Gross connections
Net additions
Churn rate (monthly or annual)
Rotational churn rate
Customer lifetime (in months or in years)
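Two of the subscriber metrics above are tightly linked: monthly churn is disconnections over the average base, and expected customer lifetime is roughly the reciprocal of the churn rate. A sketch with invented figures:

```python
# Churn rate and the customer lifetime it implies. Figures are illustrative.

def monthly_churn_rate(disconnections, avg_subscribers):
    """Monthly churn: subscribers lost as a share of the average base."""
    return disconnections / avg_subscribers

def customer_lifetime_months(churn):
    """Expected customer lifetime is approximately the reciprocal of churn."""
    return 1.0 / churn

churn = monthly_churn_rate(disconnections=20_000, avg_subscribers=1_000_000)
print(churn)                            # 0.02, i.e. 2% per month
print(customer_lifetime_months(churn))  # 50.0 months
```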

Market position metrics
Market share (subscribers)
Value share (revenue)
Share of talk (usage)
Relative market share
Marginal market share

Brand performance metrics
Top of mind awareness
Total spontaneous awareness
Aided awareness
Share of voice
Brand preference
Brand affinity
Brand health score
Brand index

Customer Usage metrics
Minutes Of Use per User (MOU per User)
Number of calls per user
Average call length
Sphere of influence (SOI)
Sphere of reception (SOR)
Sphere of activity (SOA)
Return call index
Average call distance
Call ratio

Metrics concerning Revenue
Average Revenue Per User (ARPU/ASPU)
Average Revenue Per Minute (ARPM)
Average Revenue Per Cell (ARPC)
Marginal ARPU (or ASPU)
Marginal Revenue Per Minute
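Two of the revenue metrics above have straightforward definitions: ARPU divides recurring service revenue by the average subscriber base, and ARPM divides it by billed minutes. A sketch with invented figures:

```python
# ARPU and ARPM for a reporting period (usually a month).
# Revenue, subscriber, and minute figures below are illustrative.

def arpu(service_revenue, avg_subscribers):
    """Average Revenue Per User over the period."""
    return service_revenue / avg_subscribers

def arpm(service_revenue, billed_minutes):
    """Average Revenue Per Minute over the period."""
    return service_revenue / billed_minutes

rev = 5_000_000.0  # monthly service revenue
print(arpu(rev, avg_subscribers=1_000_000))   # 5.0 per subscriber
print(arpm(rev, billed_minutes=120_000_000))  # ≈ 0.0417 per minute
```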

Sales and Distribution metrics
Share in shop handling
Numeric (and weighted) purchasing
Stocks volume
Stock cover days
Handling stock (numeric or weighted)




3. Quality Of Service (33 KPIs)

Metrics pertaining to Quality Of Service can be split into 3 categories: network, call center, distribution.

Network metrics
Call setup time
Call setup success rate (CSSR)
SDCCH congestion
Congestion rate (all-hours and busy-hours)
Radio network utilization (all-hours and busy-hours)
% cells > 2% congestion
Call drop rate
Half-rate utilization
Point of interconnection congestion
Prepaid service success rate
BTSs accumulation downtime
Handover success rate
SMS delivery success rate
International link availability
Critical link availability
Average Erlang per subscriber
Complaints on coverage per 1,000 subscribers

Call Center metrics
Average call handling time (CHT)
Average delay to answer (ADA)
First call resolution
% service level
Calls per subscriber per month
Call abandonment rate
Conversion rate
Occupancy
IVR completion rate
Agent utilization
Average subscribers/Call center employee
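Two of the call center metrics above are simple ratios: service level is the share of offered calls answered within a target threshold, and occupancy is the share of logged-in time agents spend actually handling calls. A sketch, with the threshold and all figures invented:

```python
# Call center service level and agent occupancy. Figures are illustrative.

def service_level(answered_within_threshold, total_offered):
    """% of offered calls answered within the target threshold (e.g. 20 s)."""
    return answered_within_threshold / total_offered * 100

def occupancy(handling_hours, logged_in_hours):
    """% of logged-in time agents spend on talk and after-call work."""
    return handling_hours / logged_in_hours * 100

print(service_level(answered_within_threshold=7_500, total_offered=10_000))
# 75.0 — i.e., 75% of calls answered within threshold
print(occupancy(handling_hours=6.5, logged_in_hours=8.0))
# 81.25 — agents busy about four-fifths of their shift
```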

Sales and Distribution
Out-of-stock (numerical and weighted)
Net numeric distribution
Stock cover days
Dealer satisfaction index



4. Operational Efficiency (36 KPIs)

Metrics related to operational efficiency measure the quality of a business’s receivables and how efficiently it uses and controls its financial, material and human resources. This set of metrics is very large and can be split into margin ratios, revenue-based ratios, unit-based ratios and billing metrics.

Margin ratios
Average Gross Profit Per User (AGPPU)
Contribution Margin Per User (CMPU)
Average Operational Margin Per User (AOMPU)
Average Operational Margin Per Minute (AOMPM)

Revenue-based ratios
Cost Of Sales/Revenue
OPEX/Revenue
CAPEX/Sales
Commissions (distribution)/Revenue
Business operations cost/Revenue
Network operating cost/Revenue
Marketing OPEX/Revenue
Interconnect cost/Revenue
Subscriber acquisition cost/Revenue
Labor cost/Revenue

Unit-based ratios
Cost of sales/average RGS
Cost of sales/billed minutes
OPEX/average RGS
OPEX/billed minutes
OPEX/number of sites
CAPEX/average RGS
CAPEX/billed minutes
CAPEX/number of sites
Marketing OPEX/Gross connections
Marketing OPEX/Net additions
Subscriber acquisition cost/Gross connections
Subscriber acquisition cost/billed minutes
Maintenance cost/number of BTS
Rent and utilities/number of BTS
Mobile switching centers/average RGS
Base transceiver stations/average RGS
Base transceiver stations/km2

Billing ratios
Number of days sales outstanding
Bad debt (% unpaid and % of revenue)
Cost to deal with errors (billing)
Cost of collecting revenue
Ratio of bills collected before due date
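The first billing ratio above, days sales outstanding, measures the quality of receivables directly: how many days of revenue are tied up in unpaid bills. A sketch with invented figures:

```python
# Days sales outstanding (DSO): receivables expressed as days of revenue.
# The receivables and revenue figures below are illustrative.

def days_sales_outstanding(receivables, revenue, days_in_period=365):
    """Average number of days it takes to collect a billed sale."""
    return receivables / revenue * days_in_period

print(days_sales_outstanding(receivables=12_000_000, revenue=146_000_000))
# ≈ 30.0 — about a month of revenue sits in receivables
```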

NB: For the metrics concerning Human Resources, please refer to the post: Top 100 KPIs for Human Resources.



5. Finance and Valuation (18 KPIs)

Metrics pertaining to Finance and Valuation can be split into 3 categories: return and profitability, solvency and liquidity, valuation.

Return and profitability
Gross Profit margin
EBITDA margin
PAT margin
Return-On-Invested Capital (ROIC)
Return-On-Assets (ROA)
Return-On-Equity (ROE)

Solvency and liquidity
Net gearing ratio
Net debt/EBITDA
Interest coverage ratio

Valuation
Enterprise value/EBITDA
EV/(EBITDA – Tax)
EV/Revenue
EV/Subscribers
Earnings per share (EPS)
Price earnings ratio (P/E)
Price-to-sales ratio (P/S)
Free cash flow/Revenue
Weighted Average Cost of Capital (WACC)
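The last valuation metric, WACC, blends the cost of equity and the after-tax cost of debt by their weights in the capital structure. A sketch, with the capital structure and rates invented:

```python
# Weighted Average Cost of Capital. All inputs below are illustrative.

def wacc(equity, debt, cost_of_equity, cost_of_debt, tax_rate):
    """WACC: capital-weighted cost of equity plus tax-shielded cost of debt."""
    total = equity + debt
    return (equity / total) * cost_of_equity \
         + (debt / total) * cost_of_debt * (1 - tax_rate)

# 60% equity at 10%, 40% debt at 6%, 30% corporate tax rate.
print(wacc(equity=600, debt=400, cost_of_equity=0.10,
           cost_of_debt=0.06, tax_rate=0.30))
# ≈ 0.0768, i.e. a 7.68% hurdle rate
```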

ABC's Customer Experience Metrics

A: Always define before measuring

The trickiest part of customer experience is its definition. That’s natural, though: customer experience is more about qualities than quantities. Experiences are subjective, hard to articulate, and as diverse and complex as your customers.
But that doesn’t mean it’s impossible. If it were, this post (and this blog, actually) would end right here. Fin. Finito! Yet it goes on. The lesson remains the same, though: if you plan on measuring anything, you’d better be certain that you know what a great customer experience looks like in your space. It will resemble other great customer experiences only in its outcome: you must wow people. Getting there is the fun, hard, empathetic work that you must suffer through on your own.

B: Build outcome metrics, not action metrics

At Dreamforce 2012, there was an awesome panel on customer experience metrics that focused, among other things, on one insight: actions are easy to measure, but their easiness is deceitful.
Since action-based metrics focus on an action that an employee can directly and (usually) immediately perform, they get picked as priorities more often than not. They’re easy! But their results start to diverge from your overarching business goals, especially when you’re looking at something as multi-faceted as customer experience, which requires empathy more than hard-and-fast rules.
Customer experiences simply aren’t reducible to, and are sometimes not even connected to, action-based metrics like “how long did a call last?” or “how many people did you speak with today?” Instead, when Zappos picks a metric like “did the agent try twice to develop a personal connection with the caller?” it’s part of answering a larger, more complicated question: did the call create a ‘wow’ moment for the customer?
“Wow” moments are outcomes. To create them, there is no button you can press, no phrase you can memorize, and no amount of time that you must spend in order to get there. In fact, you could do everything in the list of action-based metrics that Zappos measures, and you still might end up without a positive customer experience result. So in an important way, they don't matter: as long as you’re improving the outcome (great customer experiences), you’re winning, so make sure you focus on metrics that show the true scoreboard of business success. The actions your organization has identified may just be part of the path to that success.

C: Create feedback loops

You’ve got your outcome-based metrics, and you’ve got your ideas of what customer experience means. Now it’s time to motivate, refine, and improve what's measured until you’ve got a business that customers won't forget. The first step? Get your management team driving the metrics up. Every supervisor has a role in organizing the whole company culture around customer-focused priorities.
The metrics help on both ends of a team: managers can monitor them to give recommendations to lagging employees, and they can use them to build best practices from the best employees, who can be rewarded for their awesome abilities.

Rebuild your company from the customer up

It might look a little like metrics are what this corporate vision gets built on, but nope, that's not the case at all.
The metrics are only as good as the customer experiences they help to build. Measures of success are like binoculars: they’re only helpful for navigating a rough sea if they’re pointed in the right direction. So know what you're looking at, make sure that the metrics are measuring things that matter, and make sure that the metrics matter to the people who make your company work.