
e-Book Preview: Measuring performance in a support team

Our upcoming eBook, Scaling a customer support team, is launching in the next few days, and we just can’t wait. So here’s a sneak peek at one of our favourite chapters – which metrics you should be paying attention to and what they mean for your surfers.
As your team evolves, you will change which metrics matter and which are used for individual agent (or surfer) performance, as well as for reporting to the broader organisation and leadership team. Every leader we spoke to referenced the need to fight complexity. With that in mind, make sure your metrics stay simple and focus on your team’s core objective. For most companies that means responding quickly, fully solving customer problems, and doing so at a manageable cost.
Sign up to get early access to the full eBook, or check out chapter 1 now
Team metrics
At a team level, the first thing you need to do is define how quickly you want to respond to your customers; these targets are known as your Service Level Agreements (SLAs). They can be set by channel (phone, email, chat, social media) or by ticket type and query urgency.
At the beginning, SLAs are usually set using some sort of finger-in-the-air methodology. As you scale, and depending on your industry, you’ll need to put more analysis into this: evaluating the maximum time you are willing to keep your customers waiting, the cost you are willing to incur, and how quickly you can reasonably scale your team or build automations to help you meet your SLAs.
The measurement of how well you are doing relative to your SLAs is called Service Level: the percentage of tickets you respond to within your defined SLAs. Generally in support, above 80% is considered good performance, and you will see variation at different times of day and on different days of the week, based on the patterns of when and how customers get in touch.
SLAs and Service Level don’t always tell the whole story, because the metrics are binary. If you set an SLA of responding to phone calls in 10 seconds and you respond in 15 seconds, you’re still doing well, but that won’t be reflected in your metrics. You also need to look at average wait time to understand the distribution.
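To make the relationship concrete, here is a minimal sketch of how Service Level and average wait time could be computed from a set of response times. The 10-second SLA and the sample data are illustrative assumptions, not real figures.

```python
SLA_SECONDS = 10  # hypothetical phone SLA, illustrative

# Response times per call in seconds (illustrative sample data)
response_times = [4, 8, 15, 6, 12, 9, 3, 11]

# Service Level: the percentage of contacts answered within the SLA
within_sla = [t for t in response_times if t <= SLA_SECONDS]
service_level = 100 * len(within_sla) / len(response_times)

# Average wait time: captures how far misses overshoot the SLA,
# which the binary Service Level figure hides
average_wait = sum(response_times) / len(response_times)

print(f"Service Level: {service_level}%")
print(f"Average wait: {average_wait}s")
```

Note how the 15-second call counts as a plain miss in Service Level, while average wait time shows it was only a few seconds over target.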
At a team level, you also want to be measuring quality. This could be Net Promoter Score (NPS), Customer Satisfaction (CSAT) or Customer Effort Score (CES). You may decide to use all three, or you may decide your customer support team just needs one. This depends on how holistically you want to measure the impact your brand has on your customer beyond a specific interaction.
Customer effort is useful but underutilised – rather than just asking whether a customer would recommend your brand based on their experience, it specifically asks how easy it was to solve their problem. This is critical feedback for both your support and your product teams.
Revolut introduced a WOW rate as a quality metric to take CSAT one step further. This is the percentage of customers that rate their experience as five stars. Instead of a normal CSAT rating, they repurposed the framework into three gradings: poor, great and WOW.
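A WOW rate is simple to calculate: the share of ratings landing in the top bucket. The sketch below assumes the three-way grading described above; the sample ratings are illustrative, not Revolut data.

```python
# Illustrative survey responses using the poor / great / WOW grading
ratings = ["great", "wow", "poor", "wow", "great", "wow", "great", "wow"]

# WOW rate: percentage of customers whose rating was the top grade
wow_rate = 100 * ratings.count("wow") / len(ratings)

print(f"WOW rate: {wow_rate}%")
```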
It’s easy to provide quick, high-quality service if you have an unlimited budget. That’s why you also need to measure cost to serve, which represents the overall efficiency of your organisation. Productivity is an input into cost to serve, so the metric gives you a holistic picture while also helping you assess how ambitious or realistic your service level targets are.
Agent level metrics
While SLA and Service Level are team metrics, individual agent metrics are inputs into understanding your SLA performance.
Average handle time is a ubiquitous measure for customer service teams and a key input into capacity planning. Rona Ruthen said this was the most important metric at Monzo because it reflected how many COPs (customer operations) hours were needed to serve their customers, and it helped them identify which tasks and channels were taking longer, which in turn focused product teams more effectively.
Dan Sheldon suggests that, in addition to average handle time, you should also look at first contact resolution to check that problems are actually being solved, rather than obsessing over productivity without a focus on customer outcomes. First contact resolution can and should be looked at per agent, but also at a team level, as it impacts customer experience. If a customer has to get in touch multiple times to solve the same problem, there are issues with either your training or your incentive models.
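One way to sketch first contact resolution: count the share of issues that needed only a single contact. Grouping contacts by a (customer, issue) pair is an illustrative assumption about how your helpdesk might link repeat contacts; the data is made up.

```python
from collections import Counter

# Each entry is one contact, keyed by (customer_id, issue_id) — illustrative
contacts = [
    ("c1", "login"), ("c2", "refund"), ("c1", "login"),
    ("c3", "card"), ("c2", "refund"), ("c4", "billing"),
]

# How many times each distinct issue was raised
touches_per_issue = Counter(contacts)

# FCR: percentage of issues resolved with exactly one contact
resolved_first_time = sum(1 for n in touches_per_issue.values() if n == 1)
fcr_rate = 100 * resolved_first_time / len(touches_per_issue)

print(f"First contact resolution: {fcr_rate}%")
```

Here two of the four issues needed a second contact, so FCR comes out at 50% – a signal worth investigating at both the agent and team level.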