A financial software company strives to provide the best service to its customers, which include both individuals and corporations. To facilitate an environment of innovation, the company partners with several customer experience management companies that specialize in operating call centers.
This promotes friendly competition between the partner companies, pushing each of them to operate their call centers as efficiently as possible. However, measuring performance across partners was manual, time-intensive, and thus untimely.
The company began an engagement with Keyrus to automate a monthly scorecard that evaluates the performance of each partner. Prior to this engagement, the scorecard was developed manually, requiring 18 resources from all areas of the organization and consuming ~32 hours each month.
The final output was a PowerPoint deck, sent via email to each partner, that provided partners with their overall performance scores for the month as well as the scores of the other partners.
Due to the manual consolidation of the customer experience data for the scorecard, many of the scores were opinion-driven rather than formulaically derived, resulting in a rather arbitrary scorecard. The KPIs built into the scores were collected by hand from a variety of disparate sources.
Then business leaders would conduct a “round-robin” to arrive at what they thought each partner’s score should be. Upon receiving their scores, partners had no insight into how they were derived. Automating the scorecard was necessary from a time-savings perspective, but even more so from a consistency and credibility standpoint.
How could partners be held accountable for their customer service performance when it was subjectively determined?
It was vital to collect all data points systematically and use a consistent methodology each month to translate the raw data into rolled-up scores. It was just as important to provide partners with visibility into the detailed data behind the scores, eliminating the recurring debate over why a score was given.
Keyrus implemented a fully automated data extraction and processing system to collect Call Data, HR, Compliance, Technology, and Survey data that feed into the performance scoring. This data, sourced from several different tables and source systems, was consolidated into a data model that integrated all the relevant data sources into a single source of truth.
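The consolidation step can be pictured as merging each feed into one model keyed by partner and month. The sketch below is purely illustrative; the feed names, field names, and key structure are assumptions, not the client's actual schema.

```python
def consolidate(*feeds):
    """Merge records from multiple source feeds into a single model
    keyed by (partner, month) — a minimal "single source of truth" sketch."""
    model = {}
    for feed in feeds:
        for record in feed:
            key = (record["partner"], record["month"])
            # Later feeds add their metrics onto the same (partner, month) row.
            model.setdefault(key, {}).update(
                {k: v for k, v in record.items() if k not in ("partner", "month")}
            )
    return model
```

In practice each feed would be a table extracted from its source system on a schedule, with the merged model loaded into the BI layer.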
Using Qlik Sense, the organization’s BI tool of choice, we developed a comprehensive dashboard that displays each partner’s Scorecard Score as well as the detailed customer experience data and KPIs that comprise the score. The dashboard can be sliced and diced by a variety of dimensions to quickly identify specific areas of low performance.
The dashboard includes the following key performance areas, all contributing to an overall score, weighted based on importance:
Transactional Net Promoter Score (TNPS) – How likely a customer is to recommend the customer service experience, based on survey results.
Staffing to Forecast – Total number of days per month in which the number of Agents staffed at a call center did not meet the forecasted number of Agents needed.
Average Handle Time – Average time an Agent spends on activities while interacting with a customer.
Transfer Rate – Percentage of calls transferred to other Agents.
Attrition Rate – The number of Agents who left compared to the total number of Agents.
Percent Availability – Downtime due to technical incidents as a share of the total minutes in the month.
Background Investigations Completion % – Tracks whether Agents who onboarded within the month had a background investigation conducted.
Offboarding % – Tracks whether terminated Agents had their access to all systems revoked within 24 hours.
Training Completion % – The percentage of Agents who have completed the required training.
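The roll-up described above is a weighted average of the KPIs. A minimal sketch follows; the weights and the assumption that each KPI is first normalized to a 0–100 score are illustrative, not the client's actual methodology.

```python
# Illustrative weights — the real importance-based weighting is not public.
KPI_WEIGHTS = {
    "tnps": 0.25,
    "staffing_to_forecast": 0.10,
    "average_handle_time": 0.15,
    "transfer_rate": 0.10,
    "attrition_rate": 0.10,
    "percent_availability": 0.10,
    "background_investigations": 0.05,
    "offboarding": 0.05,
    "training_completion": 0.10,
}  # weights sum to 1.0

def overall_score(kpi_scores: dict) -> float:
    """Weighted average of a partner's normalized (0-100) KPI scores."""
    return round(sum(KPI_WEIGHTS[k] * kpi_scores[k] for k in KPI_WEIGHTS), 1)
```

Because the same formula runs every month over systematically collected data, two partners with identical KPIs always receive identical scores, which is exactly the consistency the manual round-robin lacked.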
One of the key requirements of the automated scorecard solution was to enable open access to all scores, driving friendly competition between partners, while locking down the comments section, in which the organization's business leaders provide candid feedback to each partner on their performance (collected automatically via Google Sheets).
We implemented Row Level Security based on the partner affiliated with each user's login, so that users from each partner could view only their own comments while still seeing the scores and underlying KPIs of all other partners.
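In Qlik Sense this kind of rule is typically enforced in the data/security layer, but the access logic itself is simple to state. The sketch below expresses it in Python with a hypothetical row shape: scores are visible to everyone, comments only to the owning partner.

```python
from dataclasses import dataclass, replace

@dataclass
class ScorecardRow:
    partner: str
    kpi: str
    score: float
    comment: str  # business leaders' candid feedback — restricted

def visible_rows(rows, user_partner):
    """All scores are open to every partner; comments are blanked
    unless the row belongs to the viewing user's own partner."""
    return [
        row if row.partner == user_partner else replace(row, comment="")
        for row in rows
    ]
```

The same rule applied at the BI layer means a single dashboard serves every partner, with each login seeing an appropriately reduced view.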
Centralizing the performance management under one reporting solution has enabled business leaders to instantly identify areas of low performance and make data-driven decisions related to partner operations.
The organization spends less time each month explaining the rationale behind the scores to partners and more time focusing on tactical opportunities for partners to improve performance, leading to better customer service.
This solution has opened the door to a new program in which partners are financially rewarded or penalized based on their monthly scores. That initiative would not have been possible without the automated scorecard and the confidence it builds in the underlying customer experience data and scoring system.