Earlier this month I recorded “Achieving Actionable Insights: Attribution Best Practices,” a webinar for ObservePoint’s Annual Marketing Attribution Symposium, with my colleague Lisa Altshul and with Chris Thornton and Molly Pilgrim from Ashton Woods Homes.
We discussed attribution, the importance of investment in our data, best practices around getting started, and pitfalls to avoid. Listen to our panel for the full discussion, but what follows is a quick recap of our journey towards full attribution and better data-driven decision making.
Some helpful context: We started working with Ashton Woods Homes back in 2013 (read more about our engagement here), and since the very beginning, we’ve worked in lockstep to push actionable insights and data-driven marketing.
This wasn’t an overnight transition or a quick fix, but a thoughtfully derived plan that took a lot of individuals working together. It’s taken years of conversations and education to convince the right people of the importance of data in digital marketing. I say this not to scare anyone away from doing the work, but to encourage you through the hard parts, to say that the grass is greener on the other side, and that your business will be so much stronger for it.
First steps: Advocating for end-to-end attribution
Our client began by speaking to his role in advocating for end-to-end attribution and evangelizing the need for data-driven decision making. What this means in action is:
- Following our guts and investigating the tactics we felt were pulling their weight in the media mix, ruthlessly cutting those that underperformed, and investing the lion’s share of our budget in proven winners.
- Transitioning investment from traditional tactics like billboards and newspapers to more online-focused tactics. At Search Discovery, we never managed the client’s traditional advertising; however, as we moved further down the path of data-driven decision making, our budgets continued to increase because performance was trackable and scalable, while investment in offline tactics, which are historically difficult to track, continued to decrease.
- Figuring out which attribution model to lean on and understanding the importance of combining a few models to get a comprehensive view of the data and the end user’s journey.
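To make the model comparison concrete, here is a minimal sketch of three common attribution models applied to one converting user journey. The channel names and credit rules are illustrative, not any specific vendor’s implementation:

```python
# Three common attribution models applied to a single converting journey.
# Channel names below are examples only.

def first_touch(journey):
    """All credit goes to the first touchpoint."""
    return {journey[0]: 1.0}

def last_touch(journey):
    """All credit goes to the last touchpoint before conversion."""
    return {journey[-1]: 1.0}

def linear(journey):
    """Credit is split evenly across every touchpoint."""
    credit = {}
    for channel in journey:
        credit[channel] = credit.get(channel, 0) + 1 / len(journey)
    return credit

journey = ["paid_search", "display", "email", "organic_search"]
print(first_touch(journey))  # paid_search gets all the credit
print(last_touch(journey))   # organic_search gets all the credit
print(linear(journey))       # each touchpoint gets an even share
```

Each model tells a different story about the same journey, which is exactly why combining a few of them gives a more comprehensive view than any single one.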
This part of the process of getting our full data stack in order taught us a lot about how our data was being tracked and revealed potential pitfalls and areas to clean up. But it also gave us a deeper ability to trust the data and to know that we were making decisions on a solid foundation.
Overcoming initial roadblocks
We next discussed different roadblocks that will likely sound familiar to anyone who has worked in data attribution or has taken steps to simplify their organization’s approach to data in digital marketing.
- Ensuring clean data through tagging regulations and naming convention protocols.
- Continuing internal education and advocacy for making data-driven decisions.
- Cleaning up internal reporting and mapping everything successfully to the right channels, tactics, and campaigns.
- Laying the groundwork for better lead nurturing by clearly defining the parts of the funnel and training sales teams on best practices around data entry.
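One practical way to enforce naming conventions is to validate campaign tags automatically before they ship. The sketch below assumes a hypothetical convention (lowercase tokens separated by underscores) and a hypothetical set of required UTM parameters; it is not Ashton Woods’ actual spec:

```python
import re

# Hypothetical convention: lowercase alphanumeric tokens joined by
# underscores, e.g. utm_campaign = "2024_spring_promo". Both the
# pattern and the required parameters are assumptions for illustration.
REQUIRED_PARAMS = ["utm_source", "utm_medium", "utm_campaign"]
TOKEN_PATTERN = re.compile(r"^[a-z0-9]+(_[a-z0-9]+)*$")

def validate_tags(params: dict) -> list:
    """Return a list of naming-convention violations for one tagged URL."""
    errors = []
    for key in REQUIRED_PARAMS:
        value = params.get(key)
        if value is None:
            errors.append(f"missing {key}")
        elif not TOKEN_PATTERN.match(value):
            errors.append(f"{key}={value!r} violates naming convention")
    return errors

print(validate_tags({"utm_source": "google", "utm_medium": "cpc",
                     "utm_campaign": "2024_spring_promo"}))  # no violations
print(validate_tags({"utm_source": "Google Ads", "utm_medium": "cpc"}))
```

Running a check like this in a CI step or a tag-audit tool catches messy tags before they pollute channel mapping downstream.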
Where we have seen the biggest area of opportunity from the agency side has been in holding our client partners, agency partners, and ourselves accountable for the continued success of this effort. Not only is data at the very core of what we do, but we use it in every conversation we have—so we relentlessly ensure that we are educating everyone we interact with on this account. We want to ensure that we’re all speaking the same language around data. We align reports to the most important business questions, we press for more information on how we can tie online data to offline sources, and we always listen to the data.
Immediately actionable insights
There are a lot of things that can change about how you run your business when you have attribution in place—media mix changes, how a sales team nurtures leads, onsite testing, and more. On the webinar, we talked through a few of the approaches we took with this client.
- We’ve shifted the media mix and the way we spend money on behalf of our client.
- In redefining and calculating new KPIs, we saw what channels drove the best quality leads, a metric that was impossible to truly quantify prior to seeing full attribution in action. We also started calculating velocity, a channel’s ability to convert a lead to a sale quickly, as well as the cost-efficiency of that tactic.
- We changed how we approach testing. With the granularity of data, we can determine the winners of a test not just on front-end metrics like the efficiency of driving traffic to the site or even onsite engagement, but on bottom-line impact to sales.
- Longer-term, we have adjusted our annual budget forecasting process, and we now consider larger budget shifts throughout the year.
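The velocity and cost-efficiency KPIs described above can be sketched roughly as follows. The lead records, channel names, and spend figures are made-up illustrations, not the client’s actual data:

```python
from datetime import date

# Illustrative lead records: channel, date the lead was created, and the
# date it converted to a sale (None if it never converted).
leads = [
    {"channel": "paid_search", "created": date(2024, 1, 5),  "sold": date(2024, 2, 4)},
    {"channel": "paid_search", "created": date(2024, 1, 10), "sold": None},
    {"channel": "email",       "created": date(2024, 1, 8),  "sold": date(2024, 1, 18)},
    {"channel": "email",       "created": date(2024, 1, 12), "sold": date(2024, 1, 20)},
]
spend = {"paid_search": 500.0, "email": 100.0}  # example channel spend

def channel_kpis(channel):
    """Cost per lead, cost per sale, and velocity (avg days lead-to-sale)."""
    chan_leads = [l for l in leads if l["channel"] == channel]
    sales = [l for l in chan_leads if l["sold"]]
    velocity = sum((l["sold"] - l["created"]).days for l in sales) / len(sales)
    return {
        "cost_per_lead": spend[channel] / len(chan_leads),
        "cost_per_sale": spend[channel] / len(sales),
        "velocity_days": velocity,
    }

print(channel_kpis("paid_search"))
print(channel_kpis("email"))
```

Comparing channels on velocity and cost per sale together, rather than cost per lead alone, is what surfaces the trade-offs discussed below.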
We have more data at our fingertips and can therefore make better-informed decisions. Our work is rooted in the data, and tapping into the full potential of attribution makes all the difference in delivering excellent results to our client.
Surprises in the data
There were many surprises in the data, but it was the first time we could say that we truly understood just how complex our users’ journeys were, and it was the first time we had the data to back the hunches and assumptions we had been working from.
- We were surprised by the length of the user journey and the number of touchpoints involved.
- Certain tactics we thought were top performers turned out to have lower velocity, meaning those investments were taking too long to pay out. On the flip side, some tactics with a higher cost per lead converted to sale much more reliably and much faster.
Understanding these key pieces of information changed the way we organized our media mix and how we went about suggesting new media opportunities when a specific challenge arose. It ultimately changed how conversations with the client went. The script transitioned away from spending incremental dollars in a low cost-per-acquisition channel. Instead, we would put those incremental dollars into a high-velocity channel if sales were needed quickly, or spend those same dollars laying the groundwork in an upper-funnel tactic that drives high-quality leads that take a bit longer to convert.
What’s next: Hypothesis testing and user experience improvements
This part of our conversation focused primarily on onsite testing and optimization. Now that we understand where our conversions are coming from, and the nuances of how the quality and velocity of those channels stack up, we can start testing hypotheses and making changes on the site to improve the user experience. We have dabbled in this work as our agency/client relationship has evolved, but now that the data is all there, we are empowered to start rolling out those hypotheses and tests on the site. Some other steps we’ve taken and that other companies can try out:
- Media mix adjustments – Cut spend immediately from tactics that don’t impact your bottom line. Invest in testing new channels and tactics that you haven’t tried out before.
- Differentiate creative by platform and tactic – Now that you know where this touchpoint falls in the typical user journey, use that information and speak to the customer like you truly understand their needs.
- Ad copy testing and personalization
- Audience testing and personalization
- Understand where your traffic is coming from—what sources, tactics, and campaigns are truly pulling their weight for your business.
- Define the categories you want to use long-term as you are structuring your data. This helps in keeping the data clean, but it also means your reporting will be consistent and your data will be actionable.
- Determine what touchpoints are important to you as an organization and invest in those. You don’t have to continue to invest in a platform just because your competitors are present there. If it doesn’t work for your customers and the data proves that, then invest where it makes sense for your business.