Getting Executives to Care About Data Quality
In the last post of this blog series on being successful with digital analytics, I shared my thoughts on why I see so many data quality issues and why data quality is so important to digital analytics programs. Much of it has to do with the perception of your data and of your analytics team. But addressing data quality takes time and money, and that time and money often come from executives. I have seen many cases in which executives who are willing to spend money on websites, apps, and analytics tools are not as open to devoting money and resources to data quality. In this post, I will share my thoughts on why this is the case and offer some suggestions on how to win over these executives.
Many executives, rightly so, focus on major projects and milestones. For example, launching a new website with increased functionality is a highly visible effort within an organization. Delays or cost overruns in major efforts like this make executives look bad, so they normally do whatever it takes to cross the finish line on time and on budget. But once those major milestones are met, the follow-on work, like data quality, isn't as sexy as the project itself. There are no key dates to hit and no tangible deliverable, like a new website, that you can see. This is one of the reasons why data quality doesn't get the attention it deserves.
The other key reason I believe that data quality doesn't get executive focus is that executives don't directly feel the pain of data quality issues. In many (not all) cases, executives are not logging into digital analytics tools themselves. Unlike you or your team, who can see that a metric is broken, executives only learn about data quality issues if those reporting to them complain about it at staff meetings. While you may be frustrated that an Adobe Analytics eVar has been broken for weeks, it is likely that your executives don't even know what an eVar is!
So how can you get your executives to feel your pain and care more about data quality? While there is no silver bullet, I do have a few approaches that have helped me over the years. The first is to leverage the business requirements work that we have been discussing throughout this blog series. Back in the fourth post of the series, we identified your website objectives: the high-level reasons why your website or app exists. Then, in the sixth post, we identified your business requirements and tied each of them to a high-level business objective. Then, in the eighth post, you identified the data points that were needed for each business requirement. Without realizing it, you have applied the transitive property to map high-level business objectives to the detailed data points in your implementation. Business requirements serve as the conduit between objectives and data points.
So why does that matter? Your executives care more about your high-level website objectives than they do about individual data points. But if you can show your executives that data quality issues are negatively impacting your team's ability to answer questions about those high-level objectives, you can make them share your pain! For example, imagine that there is a major data quality issue with the tagging for real-time chat, and you are having a tough time getting your development team to focus on fixing it. Using your requirement-driven SDR (solution design reference), you can show your executive that you are essentially flying blind on the "Improve Customer Sat Score" objective that she cares about.
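To make the transitive mapping concrete, here is a minimal sketch of how an SDR could be traversed to flag at-risk objectives. This is only illustrative: real SDRs are usually spreadsheets, and the objective, requirement, and data-point names below (including `eVar12` and the events) are hypothetical examples, not values from any actual implementation.

```python
# Hypothetical, simplified SDR: each row maps an objective, through a
# business requirement, to the data point that supports it, plus a flag
# recording whether that data point passed your data quality checks.
sdr = [
    # (objective, business requirement, data point, healthy?)
    ("Improve Customer Sat Score", "Measure chat engagement", "eVar12: chat session ID", False),
    ("Improve Customer Sat Score", "Track survey completions", "event5: survey complete", True),
    ("Increase Online Sales", "Track checkout funnel", "event1: checkout start", True),
]

def objectives_at_risk(sdr):
    """Return each objective mapped to the broken data points undermining it."""
    at_risk = {}
    for objective, requirement, data_point, healthy in sdr:
        if not healthy:
            at_risk.setdefault(objective, []).append((requirement, data_point))
    return at_risk

# Walk the mapping and report, per objective, what is effectively broken.
for objective, issues in objectives_at_risk(sdr).items():
    print(f"Objective at risk: {objective}")
    for requirement, data_point in issues:
        print(f"  - {requirement} ({data_point})")
```

The point of the structure is the chain itself: a broken data point rolls up through its business requirement to an objective an executive cares about, which is exactly the story this mapping lets you tell.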
In many cases, I have seen this get the attention of executives so they can help you to get your data back on track.
Your homework for this post is to:
- Make sure your SDR has all of your business objectives mapped to data points, and identify which high-level objectives might be at risk due to the data quality checks you performed in the previous post.
- If you have time, you can hear me talk about some of the ideas presented above in this video.
In the next post, I will switch gears and talk about the analysis part of digital analytics.