
A Data-Informed Framework for Decision-Making

Tim Wilson, Senior Director of Analytics
Jan 25, 2021

“Be data-driven!” This buzzword directive guides many organizations. Unfortunately, while they have access to a lot of data, few have a data-informed framework for how to think about putting that data to use. Here’s a way to think more clearly about what you’re actually trying to do with the data, including some straightforward constructs to internalize and put to use.

Data-Informed Decision-Making

As much as organizations and the marketing press tout data-driven decision-making, the case has been made for years that data-informed decision-making is a more useful mindset. Rather than expecting that simply amassing broad and deep datasets will readily reveal “actionable insights,” organizations that consistently put their data to meaningful use do so by establishing clarity about the different uses and expectations they have for their data. They also establish processes that put human thought and creativity first, with data then used to refine, validate, and inform the actions that can be taken from those ideas.

Consider two distinct ways that data can be used to inform decision-making:

  • Performance Measurement—this is the objective and quantified measurement of how marketing activities are performing relative to expectations
  • Hypothesis Validation—this is about making recommendations in support of decision-making and future actions

The following diagram summarizes the distinction between performance measurement (where KPIs are used, as well as a limited set of supporting metrics for context and preliminary diagnostic work) and hypothesis validation (ad hoc analysis and experimentation that drives recommendations, decisions, and action):

[Diagram: a measurement framework that combines performance measurement and hypothesis validation to create a measurement plan]

Hypothesis validation includes historical data analysis, but it also includes experimentation (e.g., media tests, A/B/n tests, etc.), and it is not limited to any particular data source: it can draw on spend data, behavioral data, search data, customer data, and voice-of-customer data as appropriate.

Performance Measurement

Performance measurement is about objectively and quantitatively being able to answer a simple question in the future:

Where are we today relative to where, at some point in the past, we expected to be by now?

At its core, performance measurement is about establishing appropriate key performance indicators (KPIs). Unfortunately, “KPI” is a term that often gets treated as a synonym for “metric” or “possible metric of interest,” which is a mistake.

One way to establish a finite and manageable list of meaningful KPIs is by answering two business-oriented questions:

  1. What are we trying to achieve (in business terms)? This is a succinct articulation of what the initiative is actually trying to accomplish for the business. It does not need to include any data or metrics (although it can). The answer to this question is the elevator pitch response to the executive who asks, “What are you working on?” and then, “What’s that going to do for us?” on a quick ride up from the lobby.
  2. How will we know if we’ve done that? This question doesn’t get asked until after the first question is clearly answered. The operative word in this question is “that.” What metric(s) hitting what target(s) will indicate if the business need that’s articulated in the first question is being met? And that’s it.

This is the lens through which we operate when establishing KPIs with our clients, be it for a channel, a website, a campaign, or some other initiative or activity. It sounds simple—and it should be—but the exercise often provokes vigorous discussion among the stakeholders. This is actually a good thing! It’s much cheaper to identify misalignments before investing in an effort than to find out, after the initiative is underway or completed, that the stakeholders were not on the same page as to what it was intended to accomplish!

KPIs serve a fairly narrow purpose. They’re most effective when they:

  • Are limited in number (3-5 is our recommendation)
  • Always have a target established (this is also why we push to limit the number of KPIs)
  • Are not expected to provide “insight,” but, rather, are in place to monitor performance—as objective indicators of how the campaign/site/initiative is performing relative to expectations

Performance measurement is then realized through automated or partially automated reporting and dashboards. This can be by way of a BI platform, automated (or pseudo-automated) slide decks or PDFs, or even through automation built into spreadsheets (although this is generally not ideal).
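
As a minimal sketch of what that automation can look like—using Python for illustration, with hypothetical KPI names, values, and targets rather than real data—each KPI’s current value is simply compared to its pre-established target and flagged accordingly. A BI platform or a scheduled script would run the same comparison on a recurring basis:

    # Minimal sketch: compare each KPI to its pre-established target.
    # KPI names, values, and targets are illustrative, not real data.
    kpis = [
        {"name": "Qualified Leads",       "actual": 1180, "target": 1250},
        {"name": "Revenue per Visit ($)", "actual": 3.42, "target": 3.25},
        {"name": "Email Signup Rate (%)", "actual": 2.1,  "target": 2.5},
    ]

    for kpi in kpis:
        pct_of_target = kpi["actual"] / kpi["target"] * 100
        status = "on/above target" if pct_of_target >= 100 else "below target"
        print(f"{kpi['name']}: {kpi['actual']} vs. target {kpi['target']} "
              f"({pct_of_target:.0f}% of target, {status})")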

These dashboards should follow data visualization best practices in their design so that the audience gets a clear, objective, at-a-glance understanding of how the initiative is performing relative to expectations (KPI performance against their targets!).


(As a side note, performance measurement dashboards and reports often do include some limited data beyond simply the KPIs. It’s important to be judicious with this—limiting that additional data to the handful of metrics that stakeholders will most likely want to check if a KPI is wildly off target. Remember: performance measurement dashboards are not about providing a means to answer any and all questions the business might have.)

Hypothesis Validation

Consider this definition of a hypothesis (emphasis added) from Merriam-Webster:

A tentative assumption made in order to draw out and test its logical or empirical consequences.

Breaking that down a bit:

  • tentative assumption—hypotheses start with an idea. Those ideas may be inspired or informed by data, but they don’t have to be. They can come from experience, instinct, or even a lightning flash of creativity. (This doesn’t mean every hypothesis is worth validating. It just means that it’s cheap to capture one and write it down.)
  • draw out and test—it’s only after the tentative assumption is made that the data comes into play! The idea comes first!
  • consequences—we don’t want to validate hypotheses that turn up things that are interesting but not actionable. Hypotheses that have the potential to drive a decision or action if they hold up when they are “drawn out and tested” should be prioritized.

Every ad hoc analysis or test/experiment should be hypothesis-based. This means that every question, idea, assumption, or data request should be translated (this is that “human thought” thing again) into a structured hypothesis!

We can think about hypotheses as having two distinct types:

1. An idea about something that is occurring that, if so, could be used to drive action.

For this type of hypothesis, we use the following construct:

We believe [the idea] because [evidence or observation supporting the idea]. If we are right, we will [take some specific action].

2. An idea to test a change to the current experience in order to measure its impact.

For this type of hypothesis, we use the following construct:

If we [make some proposed change] then we will see a positive impact on [one or more key metrics] because [evidence or observation supporting the idea].

By using these fill-in-the-blank constructs, we can ensure that we have:

  • Clearly articulated a specific idea (hypothesis)
  • Interrogated the idea to determine the strength of the existing evidence supporting it
  • Confirmed that the idea is truly actionable—ensuring that resources are not expended validating hypotheses that are interesting but that don’t actually have the potential to positively impact the business
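
One lightweight way to operationalize these constructs is to capture every hypothesis as a structured record before any analysis or test is scoped. The sketch below (in Python; the field names and the example hypothesis are illustrative assumptions, not a prescribed tool) encodes the first construct so that a backlog of hypotheses can be reviewed and prioritized:

    from dataclasses import dataclass

    # Sketch of the "We believe... because... If we are right, we will..." construct
    # captured as a structured record; the example content is purely illustrative.
    @dataclass
    class Hypothesis:
        belief: str          # "We believe [the idea]..."
        evidence: str        # "...because [evidence or observation supporting the idea]."
        planned_action: str  # "If we are right, we will [take some specific action]."

        def statement(self) -> str:
            return (f"We believe {self.belief} because {self.evidence}. "
                    f"If we are right, we will {self.planned_action}.")

    example = Hypothesis(
        belief="mobile visitors abandon checkout because the form is too long",
        evidence="mobile checkout completion runs well below the desktop rate",
        planned_action="prioritize a shortened mobile checkout form for an A/B/n test",
    )
    print(example.statement())

The second construct (“If we [make some proposed change] then we will see a positive impact on [one or more key metrics] because [evidence]”) can be captured the same way, with the proposed change and the expected metric impact as additional fields.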

This approach ensures alignment across all stakeholders before the more expensive work of analysis and experimentation is initiated, and it enables effective prioritization of analysis and experimentation activities on an on-going basis.

Putting it All Together

A process that incorporates these concepts can feel foreign to an organization that has been amassing vast volumes of data and wondering why repeated drilling into that data has not yielded multiple gushers of actionable insights.  It does require a fundamental mindset shift: from starting with the data and expecting it to deliver answers…to realizing that it can…but someone still has to come up with smart and productive questions!

If you are interested in this approach, as well as some tips about establishing meaningful targets, check out this 1-page PDF!

We can help you get started with clear, data-informed decision-making today. Reach out here.
