The KPI Library defines SLA Compliance as “The total number of incidents resolved within SLA time divided by the total number of incidents.” In fact, this is the calculation used by many service centers to determine how well they’re meeting their Service Level Agreements. But the simplicity of the metric prevents it from revealing greater detail about how well SLAs are being met.
This article proposes an alternative calculation that can reveal subtle yet meaningful differences in SLA compliance, differences the traditional calculation leaves managers blind to.
Let’s say in January, your service center closed out 1,000 service requests, and 850 of them were closed within their respective SLA timeframes. That gives you an 85% SLA compliance rate.
Then in February, your service center closes 1,100 cases, with 935 closing within the SLA. Using that same SLA compliance calculation, your compliance rate is once again 85%. Based on that, you conclude that February’s performance was no better, and no worse, than January’s. But that may not be true...
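For reference, the traditional calculation is a single division. Here’s a minimal Python sketch using the numbers above (the function name is just illustrative):

```python
def sla_compliance_rate(resolved_within_sla: int, total_resolved: int) -> float:
    """Traditional SLA compliance: share of cases closed within their SLA."""
    return resolved_within_sla / total_resolved

print(sla_compliance_rate(850, 1_000))   # January:  0.85
print(sla_compliance_rate(935, 1_100))   # February: 0.85
```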
The problem with the standard SLA Compliance calculation is that it doesn’t consider the actual time it took to close the case. It’s a simple black-and-white metric that doesn’t measure shades of grey.
Let’s look at January again:
To keep this simple, we’ll assume that all cases have a 48-hour SLA.
Let’s say the 850 cases closed within the SLA were closed with an average of 2 hours to spare (in 46 of the allotted 48 hours). That’s 1,700 “positive” SLA hours (850 × 2 = 1,700).
Now let’s say the 150 cases that exceeded the SLA did so by an average of 4 hours. That’s a total of 600 “negative” SLA hours (150 × 4 = 600).
Subtracting 600 from 1,700 leaves us with 1,100 positive SLA hours.
Now, let’s run the same calculation for February.
Let’s say February’s 935 compliant cases closed an average of 1 hour less than the allotted time (47 of 48 hours). That works out to 935 positive SLA hours in February.
But the 165 cases that exceeded the SLA did so by an average of 3 hours. That’s a total of 495 negative hours (165 × 3 = 495). Subtracting 495 from 935 leaves February with 440 positive SLA hours.
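Here’s the same bookkeeping as a short Python sketch, using the aggregate figures above (the function and variable names are just illustrative, and the averages are treated as exact):

```python
def net_sla_hours(compliant_cases: int, avg_hours_to_spare: float,
                  breached_cases: int, avg_hours_over: float) -> float:
    # Positive hours earned by compliant cases, minus negative hours
    # accumulated by cases that blew past their SLA.
    return compliant_cases * avg_hours_to_spare - breached_cases * avg_hours_over

january  = net_sla_hours(850, 2, 150, 4)   # 1,700 - 600 = 1,100
february = net_sla_hours(935, 1, 165, 3)   #   935 - 495 =   440
print(january, february)                   # 1100 440
```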
The two scores tell different stories. The traditional metric shows January and February performance as identical at 85%. The “Net Case Compliance Score,” calculated as net positive SLA hours divided by total cases, reveals the difference: January averaged +1.1 hours of SLA margin per case (1,100 ÷ 1,000), while February averaged only +0.4 (440 ÷ 1,100). In other words, the average case took seven-tenths of an hour longer to close in February.
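In practice, you would compute the score from case-level data rather than monthly averages. Below is a sketch of how that might look; the Case record, its field names, and the per-case definition (net SLA hours divided by case count) are my assumptions based on the math above, not an established formula:

```python
from dataclasses import dataclass

@dataclass
class Case:
    sla_hours: float      # time allowed by the SLA
    actual_hours: float   # time actually taken to resolve the case

def traditional_compliance(cases: list[Case]) -> float:
    """Share of cases resolved within their SLA."""
    return sum(c.actual_hours <= c.sla_hours for c in cases) / len(cases)

def net_case_compliance_score(cases: list[Case]) -> float:
    """Average SLA margin per case: positive means closed with time to
    spare, negative means closed past the deadline."""
    net_hours = sum(c.sla_hours - c.actual_hours for c in cases)
    return net_hours / len(cases)

# Example: two cases under a 48-hour SLA, one 2 hours early, one 4 hours late.
cases = [Case(48, 46), Case(48, 52)]
print(traditional_compliance(cases))      # 0.5
print(net_case_compliance_score(cases))   # (2 - 4) / 2 = -1.0
```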
The reason for the difference might be a new agent at the start of the learning curve, a higher percentage of difficult cases, or agents not aligning their efforts with SLA objectives. Those are just a few possible explanations for February’s weaker performance.
But the main point is this:
If you’re relying only on the traditional SLA compliance metric, you won’t notice issues like these that may require your attention. SLA compliance scores as we’ve known them may be selling us short, limiting our ability to manage areas that could use improvement.
How do you measure SLA compliance? And what do you do with that information?