According to an IDC study, the amount of global data is projected to grow tenfold, to 163 zettabytes, by 2025. That’s a lot of data. And it’s good news if you’re the type who uses data to make decisions that improve your organization’s future. But there are three common mistakes people make when using data.
The more we use data, the more opportunities there are to make any one of these common mistakes. So, let's get them out on the table, so we know what to avoid!
Archana Madhavan, Content Strategist at Interana, differentiates correlation from causation in the blog post, “Correlation vs Causation: Understand the Difference for Your Business.”
The article quotes Ben Yoskovitz, founding partner of Highline Beta:
“Correlation helps you predict the future, because it gives you an indication of what’s going to happen. Causality lets you change the future.”
When we use analytics in the HR Service Center, we do it with the end goal of making changes, either to the employee experience or to HR productivity. Sure, we like to know if our Tier Zero Resolution rate will be higher next quarter, but ideally, we want to know that we have the power to cause TZR to increase.
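To make the distinction concrete, here’s a minimal Python sketch. The numbers are invented for illustration (they aren’t from the article or any cited source): two HR metrics that both trend upward over a quarter correlate almost perfectly, yet that correlation alone says nothing about whether changing one would change the other.

```python
# Hypothetical weekly counts, both rising over open enrollment season.
# They correlate strongly because they share a common driver (the season),
# not because one causes the other.

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

portal_visits = [100, 120, 140, 160, 180, 200]  # made-up weekly counts
hr_inquiries  = [ 40,  48,  55,  66,  71,  80]  # made-up weekly counts

r = pearson_r(portal_visits, hr_inquiries)
print(f"Pearson r = {r:.3f}")  # very close to 1.0
# A high r lets you predict one metric from the other, but it doesn't
# tell you that reducing portal visits would reduce inquiries.
```

The correlation here is a prediction tool; establishing causation would take something stronger, such as a controlled experiment.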
Most of us read “To Kill a Mockingbird” at some point in school. In the story, Atticus Finch said, “People generally see what they look for, and hear what they listen for.”
That’s confirmation bias, in a nutshell.
We all have ideas and opinions, and we tend to seek out data that validate our thoughts. But when we focus on data that validates our current view, we make ourselves blind to data that may reveal what’s actually happening.
Let’s say you simplify your open enrollment process. And you’re convinced that the simplifications will reduce the number of employee inquiries to the HR service center. After the first week of the open enrollment period, reports show an increase in inquiries from some business units, but not others.
Your bias causes you to focus on the lower-inquiry business units, and ignore the ones that show an increase. Influenced by your bias, you decide not to increase call center staffing for the remainder of the enrollment period.
The next week, the call center gets slammed, and you don’t have enough agents to handle the load. Your confirmation bias caused you to make a decision that resulted in what’s become a bad employee and agent experience. %^$#&!!
Sometimes, we’re quick to draw conclusions from graphs that present a very clear picture. It’s like, “Hey, who could argue with this?!”
The problem is, even in graphs, looks can be deceiving. A graph is only as reliable as its underlying data. And if there’s not much data behind the picture, there’s not much reliability either.
Here’s a good example: The cartoon below (thank you Kunal Jain) demonstrates what can happen when you try to draw conclusions from an inadequate sample size.
The woman in the cartoon – let’s call her Zsa Zsa – just got married. Yesterday she had 0 husbands. Today she has one.
A simple graph plotting the number of husbands over time makes it appear that Zsa Zsa will have dozens of husbands one month from now. The thin-data issue is that the graph includes only a couple of data points, which is hardly enough to predict even Zsa Zsa’s future nuptial numbers.
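The cartoon’s logic boils down to a straight-line extrapolation. Here’s a toy Python sketch of that reasoning, using the numbers from the example above:

```python
# A toy illustration of extrapolating a "trend" from two points.

def extrapolate(x0, y0, x1, y1, x):
    """Linearly extrapolate through (x0, y0) and (x1, y1) to a future x."""
    slope = (y1 - y0) / (x1 - x0)
    return y0 + slope * (x - x0)

# Day 0: zero husbands. Day 1: one husband. Where does the "trend" point?
husbands_in_30_days = extrapolate(0, 0, 1, 1, 30)
print(husbands_in_30_days)  # → 30.0
```

The math is flawless; the forecast is absurd. Two points can’t distinguish a one-time event from a trend.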
It’s easy to make these “thin data” errors when you feel the need to take action.
For example, a company posted an open survey on their HR self-service portal where employees could provide feedback on their portal experience. After 30 days, the portal administrator identified a trend in the responses, and considered making a change to the portal design – but decided to hold off.
After 60 days, there was significantly more response data. This time, the report showed a different trend. Knowledge base use was up significantly, and the original portal design was validated.
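A quick simulation makes the same point. In this hypothetical sketch (the true rating, sample sizes, and noise level are all invented for illustration), survey ratings are drawn around a fixed average: a 10-response sample can drift well away from the truth, while a 500-response sample settles near it.

```python
import random

random.seed(42)
TRUE_MEAN = 4.0  # assumed "real" average portal rating (invented for the sketch)

def sample_mean(n):
    """Mean of n simulated survey ratings drawn around TRUE_MEAN."""
    return sum(random.gauss(TRUE_MEAN, 1.0) for _ in range(n)) / n

print(f"after  10 responses: {sample_mean(10):.2f}")
print(f"after 500 responses: {sample_mean(500):.2f}")
# The small sample can suggest a trend the larger sample contradicts;
# acting on the 10-response picture risks redesigning a portal that
# was working fine.
```

Run it a few times with different seeds and the small sample swings widely while the large one barely moves.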
Our burning desire to take action can lure us into making decisions before we have enough information.
Today’s abundance of data enables us to make more data-driven decisions. But proceed with caution: the more decisions we make from data, the more we expose ourselves to the common errors of Correlation vs. Causation, Confirmation Bias, and Concluding from Thin Data.
Maintaining an awareness of these common errors makes us more likely to avoid them when we make decisions from data. And those decisions will then be more likely to cause the results we’re after.