The Data Literacy Triad

“Data literacy” has become a hot topic. At least, in the world of business data it has. There are multiple good reasons for that.

There is a data explosion, and there is a corresponding explosion of tools for accessing and analyzing that data.

With more data and more ways to work with that data, it is no surprise that organizations are increasingly embracing the need to be knowledgeable and competent — literate — when it comes to the application of data.

Do a little Googling and you will find any number of definitions of data literacy, like this one (the one Google bubbles up and summarizes) and this one (which includes a nice little 2-minute video) and this one (which has tips for closing the “data literacy divide”) and even this one (because Wikipedia — it’s a surprisingly brief, yet broad, entry).

When it comes to a practical discussion about data literacy within an organization, it seems like the term is often defined situationally (and narrowly). While that may meet an immediate need, it can be short-sighted. Data literacy is a multi-faceted thing, and it’s important to think through, plan, and educate along all of those facets:

  • Metrics Literacy — knowing what different data means
  • Tool Literacy — being able to self-service data needs efficiently and to an appropriate extent
  • Conceptual Literacy — approaching and applying data with clarity and sophistication

The first two facets above are, by far, the most commonly used definitions. The third, though, is equally critical and is where many — if not most — organizations fall short.

Metrics Literacy

Metrics literacy is, at its core, having a strong understanding of the data actually being used within an area of the business.

There are, essentially, two levels of metrics literacy:

  • A general understanding of a particular dimension or metric — even if an employee does not know the exact mechanics and definition of a value that appears in a report (lead, opportunity, page view, session, bounce rate, channel, open rate, conversion rate), he may still have a fairly accurate conceptual idea of what the data means. This can be dangerous. In the world of website analytics, for instance, many business users think they have a strong conceptual idea of what the metrics “time on site” and “time on page” mean, when, in reality, those are two consistently misunderstood metrics (see the sketch after this list).
  • A true and accurate understanding of a particular dimension or metric — this is genuine metrics literacy, and it requires an understanding of the underlying process that created the data. As an example, a salesperson who uses Salesforce.com as a CRM on a daily basis likely creates and maintains leads, contacts, accounts, and opportunities regularly. So, when she sees data from the CRM — regardless of whether that report is within the CRM itself, in a spreadsheet, in a dashboard within a BI tool, or even simply referenced in an email — she will likely interpret any of that data intuitively and correctly.
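
To make the “time on page” confusion concrete, here is a minimal sketch (using invented session data) of how web analytics tools typically derive that metric: time on page is simply the gap between consecutive page views in a session, which means the last page of every session records no time at all.

    from datetime import datetime

    # Invented page views from a single session, in order.
    # Analytics tools derive "time on page" from a similar stream of hits.
    session_hits = [
        ("/home",    datetime(2019, 5, 1, 9, 0, 0)),
        ("/pricing", datetime(2019, 5, 1, 9, 1, 30)),
        ("/signup",  datetime(2019, 5, 1, 9, 4, 0)),  # the visitor exits here
    ]

    # Time on page = gap between consecutive page views, so the final
    # page of the session gets no measurement at all.
    for (page, ts), (_, next_ts) in zip(session_hits, session_hits[1:]):
        print(f"{page}: {(next_ts - ts).seconds} seconds")
    print(f"{session_hits[-1][0]}: unknown (no subsequent page view)")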

The challenges of achieving metrics literacy are many:

  • A single employee may be highly literate with some data and not literate at all with other data
  • It is human nature to assume one has achieved a true and accurate understanding of the data when, in reality, one only has a high-level, intuitive understanding of the metric, or, much worse, one’s understanding is actually incorrect.
  • The best way to become deeply literate with a data set is to work — operationally — with the underlying system that generates the data (salespeople when it comes to CRM data, digital analysts when it comes to web analytics data, bookkeepers when it comes to cost accounting data, etc.), but that isn’t practical in an environment where collaboration occurs (as it should) across many different groups.

Because metrics literacy is actually more involved than it seems on the surface, data definitions need to be readily available (embedded in reports and analyses to the extent possible), and they need to highlight the data that is most prone to being misinterpreted.
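
One way to make those definitions “readily available” is to manage them as structured data that the reporting layer can surface. Here is a minimal, hypothetical sketch; the metric names, definitions, and footnote helper are all illustrative rather than a reference to any particular tool.

    # A hypothetical metric glossary; a reporting layer could surface these
    # entries as tooltips or footnotes alongside the figures they describe.
    METRIC_DEFINITIONS = {
        "time_on_page": {
            "definition": "Gap between a page view and the next page view in "
                          "the same session; undefined for a session's last page.",
            "source_system": "web analytics",
            "commonly_misread": True,
        },
        "opportunity": {
            "definition": "An open, qualified deal record created in the CRM.",
            "source_system": "CRM",
            "commonly_misread": False,
        },
    }

    def footnote(metric: str) -> str:
        """Build a report footnote, flagging metrics most prone to misreading."""
        entry = METRIC_DEFINITIONS[metric]
        flag = " (frequently misinterpreted)" if entry["commonly_misread"] else ""
        return f"{metric}: {entry['definition']}{flag}"

    print(footnote("time_on_page"))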

Tool Literacy

Unlike metrics literacy, tool literacy is not, definitionally, something that varies based on the type of data. Every company has that person everyone knows is “really good with Excel.” That person is highly literate with Excel (or, at least, is sufficiently more literate than others in the organization to be perceived as such).

Tool literacy is a measure of how efficiently, intuitively, and effectively an employee uses the tools that are needed to access, explore, and present the data.

Which tools an employee will benefit most from mastering varies based on the organization and the employee’s role. Microsoft Excel (or, these days, Google Sheets) is often still a core tool in an organization, but many organizations have started adopting second-generation business intelligence (BI) platforms such as Tableau, Domo, Qlik, Power BI, and the like. Business users do not necessarily need to be adept at the ins and outs of getting enterprise data into these platforms, but the ability to quickly filter, drill down, combine, and visualize data within the platform is a sign of a high degree of tool literacy with a BI platform. Simply knowing how to log in and view reports or dashboards created by someone else is only the barest level of tool literacy with BI tools.

Historically, employees with “analyst” in their titles have tended to be fairly tool-literate. That makes sense, as they spend more time with analytics tools, and they would be ineffective in their roles if they used those tools inefficiently (although, sadly, there are many analysts out there who, while “way better with the tool than anyone else in the organization,” are simply benefitting from an excruciatingly low level of tool literacy in the organization overall). Increasingly, though, tool literacy is being expected of a broader set of employees, for a couple of reasons:

  • The analytical tools built into the operational systems they use are much richer (see Einstein Analytics from Salesforce.com, Analysis Workspace from Adobe Analytics, Bizible Discover from Bizible, etc.)
  • The organization has invested in a BI platform that was sold as a means for “democratizing the data” — the executive who made the decision to invest in the platform bought into the vendor’s assurances that all employees would quickly be digging deeply into the data through its intuitive user interface.

The challenge is that more powerful data analysis tools are, inherently, more complex. Certainly, the user experience varies drastically from tool to tool, but no amount of UX work can change the fact that complex systems make for complex data, and offering a high degree of flexibility adds complexity in its own right.

And, to add to the messiness, many analysts are finding that their toolsets are expanding to include the use of open source programming languages like Python or R.
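
For a sense of what that shift looks like, here is a minimal sketch, with invented data, of a pivot-style summary an Excel-literate analyst might instead express in Python using pandas:

    import pandas as pd

    # Invented example data; in practice this might be a CSV export or a
    # query against the organization's BI platform.
    df = pd.DataFrame({
        "channel":     ["email", "email", "search", "search", "social"],
        "sessions":    [1200, 950, 3100, 2800, 640],
        "conversions": [36, 24, 62, 70, 8],
    })

    # The pivot-table staple, expressed in pandas: aggregate by channel,
    # then derive a conversion rate from the totals.
    summary = df.groupby("channel").sum()
    summary["conversion_rate"] = summary["conversions"] / summary["sessions"]
    print(summary.sort_values("conversion_rate", ascending=False))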

On the one hand, tool literacy is the “easiest” aspect of data literacy to manage, as most tools have a wealth of training resources available. And, as long as users are regularly using the tools, they’ll inherently get better with them through repetition. But, on the other hand, tool literacy requires a degree of diligence on the part of the employee to ensure that his use of the tool does not stop at the “good enough to get by” point and, instead, is a steady and continuous progression.

Conceptual Literacy

Some organizations focus almost exclusively on metrics literacy and tool literacy without tackling the murkier — but critical — area of conceptual literacy. Conceptual literacy is the ability to think about data in ways that are appropriate, meaningful, and actionable.

At a basic level, this aspect of data literacy means:

  • Truly understanding the purpose of KPIs, and being well-versed in approaches for establishing them
  • Being able to clearly articulate and prioritize hypotheses
  • Having a reasonable degree of comfort with the different approaches for using data to validate hypotheses (historical data analysis, qualitative research, quantitative research, split testing, etc.) and knowing where each is most appropriate; a minimal split-testing sketch follows this list
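
As a concrete illustration of the split-testing approach referenced above, here is a minimal sketch of a standard two-proportion z-test with invented results; the numbers are hypothetical, and the point is less the formula than knowing that a raw lift, by itself, is not an answer.

    from math import sqrt

    # Invented split-test results: visitors and conversions per variant.
    n_a, conv_a = 5000, 150   # control: 3.0% conversion rate
    n_b, conv_b = 5000, 190   # variant: 3.8% conversion rate

    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)

    # Standard two-proportion z-test with a pooled standard error.
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    print(f"lift: {p_b - p_a:.2%}, z = {z:.2f}")  # |z| > 1.96 -> significant at ~95%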

Beyond that basic level is a more advanced level of conceptual literacy: internalizing the relationship between cost and uncertainty, as well as the inherent variability in the data.
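
A small simulation with invented numbers makes both ideas tangible: the same underlying process produces noticeably different observed rates at small sample sizes, and halving the uncertainty costs roughly four times the sample, which is the cost-versus-uncertainty trade-off in miniature.

    import random
    from math import sqrt
    from statistics import mean

    random.seed(42)
    TRUE_RATE = 0.03  # the "real" conversion rate behind the simulated data

    # Observe the same process at increasing sample sizes: the observed
    # rate wobbles less, but halving the margin costs ~4x the sample.
    for n in (100, 1_000, 10_000, 100_000):
        observed = mean(random.random() < TRUE_RATE for _ in range(n))
        margin = 1.96 * sqrt(TRUE_RATE * (1 - TRUE_RATE) / n)
        print(f"n={n:>7,}: observed {observed:.2%}, +/- {margin:.2%} at 95%")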

This is a tricky set of ideas, but they are important. With all of the buzz and excitement about machine learning and artificial intelligence, it’s clear that many business users are hoping that they can shortcut the need for conceptual literacy and “just throw the data into the machine.” This is dangerous territory, and every time I hear a business user make a version of this claim, I cringe. Despite the increasingly powerful analytics platforms, there remains a critical need to understand where the machines fit in an organization’s overall analytics ecosystem — and where they cannot or should not. And that requires a high level of conceptual literacy.

The seed of this post was a set of internal discussions we’ve been having at Search Discovery about the different ways we help companies develop and activate strategies for effectively using their data to drive meaningful business outcomes. Ensuring that stakeholders have a sufficient degree of data literacy across all three of these aspects is a critical component of that work. If you’d like to discuss how we can help you assess your data strategy, we’d be happy to chat with you.
