Metrics That Matter (and the Bathroom Exit Thermometer!)


Last month at SSOW North America in Orlando I had the pleasure of moderating three Interactive Discussion Groups (IDGs) on metrics. With so many makeovers in Shared Services, including changes in the scope and type of work as well as advancing automation in process delivery, the question arises: why are so many teams still focused on traditional KPI and metric reporting?

IDG participants shared best practices and challenges as we examined how to put a fresh coat of paint on the ‘what and how’ of measuring Shared Services performance and experience.

Measure What Matters to Your Customers

Each time I’m sitting on the tarmac on a delayed flight, watching the countdown timer hit zero, I reflect on how airlines tout their ‘on-time departure’ metric, measured when the plane pulls back from the gate. Yet every passenger who has to sprint to make a connection will tell you that on-time arrival is what really matters to them. The airlines show themselves as ‘green’ while all the passengers are thinking ‘red.’

In other words: engage your customers to discuss what measures and outcomes are important to them and devise metrics that are relevant.

Here are a few examples:

Ease of Doing Business. In a recent webinar session, Sheila March, Cx Leader, shared how her organization moved from NPS to a single measure: “How easy is it to do business with Shared Services?” Ease of Use (completing a task, getting the information needed) can also serve as a benchmark for the effectiveness of automation (find out more in: Human in the Middle – Automation for Xperience).

Hours Returned to the Business. Customers may be skeptical of automation ROI when they don’t see headcount reducing. Automation often takes over bits and pieces of a task or process, rarely an entire person’s job. Provide a specific measure of hours saved and demonstrate how those hours were utilized (e.g., we took on 3 new processes, 6 new legal entities, and 6,000 new employees with zero headcount added; no attrition backfill; assumed work for 2 new regions without additional resources).
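To make that number concrete rather than anecdotal, it can be derived straight from automation run volumes and an agreed manual baseline. Here is a minimal sketch in Python; the task names, volumes, and baseline minutes are hypothetical placeholders, not figures from any real SSO:

```python
# Minimal sketch: convert automation run volumes into "hours returned to the business".
# All task names, volumes, and baseline minutes below are hypothetical.

# Agreed baseline: manual handling time per transaction, in minutes.
baseline_minutes = {
    "invoice_matching": 6.0,
    "new_hire_access_setup": 25.0,
    "vendor_master_update": 12.0,
}

# Transactions completed by automation this quarter (e.g., from bot run logs).
automated_volume = {
    "invoice_matching": 18_400,
    "new_hire_access_setup": 950,
    "vendor_master_update": 3_200,
}

hours_returned = sum(
    automated_volume[task] * baseline_minutes[task] / 60
    for task in automated_volume
)
print(f"Hours returned to the business this quarter: {hours_returned:,.0f}")
```

Reporting it this way keeps the conversation on capacity created, and on where that capacity went, rather than on heads removed.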

If I have to call in, it’s a defect. Examine the effectiveness and adoption of self-service options.
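A back-of-the-envelope version of that defect measure might look like the sketch below; the volumes are made up purely for illustration:

```python
# Illustrative "call-in is a defect" measure; all volumes are hypothetical.
total_requests = 12_500            # requests where a self-service path was available
self_service_completions = 10_400  # resolved end-to-end in the portal
call_ins = 1_150                   # requests that ended up on the phone anyway

adoption_rate = self_service_completions / total_requests
defect_rate = call_ins / total_requests
print(f"Self-service adoption: {adoption_rate:.1%}")
print(f"Call-in defect rate:   {defect_rate:.1%}")
```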

Design measures that provide outcome-focused insights rather than transactional reporting. Example: most hiring managers aren’t especially interested in the number of CVs/resumes received or applicants screened. What matters most is the date a qualified candidate is onboarded and arrives with the accesses and tools needed to perform their role.

Survey Fatigue is Here – Use Data Instead

Nearly every service delivery touchpoint now results in a survey. I have six in my inbox as I write this article. My Amazon order yesterday, last night’s Help Desk ticket, and last week’s hotel stay each promise they’ll need just a few minutes of my time. To supplement, or in some cases replace, surveying customers for feedback, consider alternate channels to gauge their experience.

Leverage the data and analytics generated by automation, information that was previously too time-consuming to capture or analyze.

Systematically capture the customer journey as a key element of Customer Experience (Cx); a rough sketch after the list below shows how these measures can be pulled from a portal event log.

  • How many screens, drop-downs, menu selections and searches did it take for a user to get the information they were looking for?
  • What is the number of minutes from entry into a process or application until the user is able to solve a problem, submit a request, or perform the desired action?
  • At what point in a process, phone tree, or website journey do users ‘bounce out’ to call or use another resource?
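If your portal or case tool writes a clickstream or event log, all three of those questions can be answered with a few lines of analysis rather than another survey. A rough sketch, assuming a hypothetical export with session_id, timestamp, event_type and screen columns:

```python
import pandas as pd

# Rough sketch: journey metrics from a hypothetical portal event log.
# Assumed columns: session_id, timestamp, event_type, screen.
events = pd.read_csv("portal_events.csv", parse_dates=["timestamp"])

per_session = events.groupby("session_id").agg(
    screens_visited=("screen", "nunique"),   # how many screens it took
    clicks=("event_type", "size"),           # total interactions
    started=("timestamp", "min"),
    ended=("timestamp", "max"),
)
per_session["minutes_to_complete"] = (
    per_session["ended"] - per_session["started"]
).dt.total_seconds() / 60

# Sessions with no 'request_submitted' event are treated as bounce-outs
# (an assumption; substitute whatever completion event your platform records).
completed = events.loc[events["event_type"] == "request_submitted", "session_id"].unique()
per_session["bounced"] = ~per_session.index.isin(completed)

print(per_session[["screens_visited", "clicks", "minutes_to_complete"]].describe())
print("Bounce-out rate:", round(per_session["bounced"].mean(), 3))
```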

Process Mining tools now enable teams to identify processes ripe for automation and to capture both current-state and post-automation performance. These tools also provide near real-time visualization of where bottlenecks and process exceptions are occurring. SSOs no longer need to wait for the phone to ring or for their Case Management tool to fill up with tickets from upset customers after a change has been implemented. Process Mining tools can follow a task on its journey through a process and highlight where things go awry (see also SSON's Report on Process Discovery).
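Under the hood, most process mining rests on the same idea: each case is a sequence of timestamped activities, and bottlenecks show up as long waits between particular activity pairs. The sketch below is not any vendor's actual API, just an illustration over a hypothetical event log with case_id, activity and timestamp columns:

```python
import pandas as pd

# Simplified bottleneck scan over a hypothetical case event log.
# Assumed columns: case_id, activity, timestamp.
log = pd.read_csv("case_events.csv", parse_dates=["timestamp"])
log = log.sort_values(["case_id", "timestamp"])

# Build activity-to-activity transitions within each case.
log["next_activity"] = log.groupby("case_id")["activity"].shift(-1)
log["wait_hours"] = (
    log.groupby("case_id")["timestamp"].shift(-1) - log["timestamp"]
).dt.total_seconds() / 3600

transitions = (
    log.dropna(subset=["next_activity"])
       .groupby(["activity", "next_activity"])["wait_hours"]
       .agg(["count", "median", "mean"])
       .sort_values("median", ascending=False)
)

# Transitions with the largest median waits are the bottleneck candidates.
print(transitions.head(10))
```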

Don’t Create Metrics that are ‘Taxes You Impose and Never Take Off’

Too often we create KPIs, metrics or monthly reporting dashboards as part of an annual process or during the launch of a new center or process, and then never revisit them. Make a concerted effort to retire measures that are no longer relevant and to test out new ones as your service delivery model and scope evolve.

Tips for designing new metrics:

Simple, actionable and with an expiration date.
Create a method to separate the ‘what’ and the ‘how’ on satisfaction scores.

  • What: dissatisfaction with the outcome (policy disagreement, number of contacts to resolve, difficulty getting help, time to resolve the issue)
  • How: behavior, knowledge, efficiency, quality and passion.

Bathroom Exit Thermometer: Happy Face, Sad Face, Neutral Face

  • Using simple measures can increase response rates. In the end, if you haven’t delighted the customer enough to earn a Happy Face, you have room to improve.

 

Finally...

No metric discussion would be complete without a look at benchmarking. Taking quick snapshots to measure your organization internally, across regions, or against Shared Services industry peers provides guideposts that show whether you’re making headway and a leading-performance target to strive for.

Check out SSON Analytics' Top 20 Most Admired SSOs to learn how your organization matches up.

 

