What have you done the same way for the past 20 years? If you’re into analytics, the answer to this question is a lot. But change is finally coming to the way we manage implementations.
When I started in digital analytics (back in 2003), implementing code for tools like Omniture SiteCatalyst (now Adobe Analytics) and Urchin (now Google Analytics) was pretty hard. You had to manually add code to your website, sometimes doing it page by page. Most of us in the field back then didn’t know what we should track, so we reactively tagged things as they arose.
Many years later, when I worked in the Omniture consulting group, we were responsible for getting new clients tagged right after the contract was signed. At first, a client implementation was heavily influenced by who was assigned as the consultant. If the client got an experienced consultant, they likely had a better implementation than if they were assigned a more junior one.
After a while, Omniture noticed that clients, in general, were not doing all they could with the product (I am sure Google Analytics customers had the same issue). To remedy this, we created “Omniture Fusion” documents, in which we documented the main things that each industry vertical (retail, finance, etc.) should do with the product. This started to bring some semblance of standardization to client implementations, but since it was just an MS Word document and wasn’t mandatory, the approach never really took hold.
The next big shift was tag management. Tools like Ensighten, Tealium, and Satellite (from Search Discovery) appeared on the scene, with the latter being acquired by Adobe and renamed DTM. While tag management tools didn’t necessarily improve the quality of the implementation from a business standpoint, they did improve the quality from a data integrity and speed standpoint. In other words, you could still have a crappy analytics implementation, but the tagging of your crappy implementation would be very robust!
Unfortunately, there haven’t been many changes in analytics implementations since tag management systems arrived. Reporting tools have become more robust (Data Studio, Analysis Workspace), but when it comes to the quality of digital analytics implementations, each organization has been left to fend for itself. As I mentioned in my last blog post, most organizations have built custom implementations. These implementations are often not great for a few reasons:
- Lack of Business Requirements – Most organizations don’t take the time up-front to identify their business questions
- Lack of Best Practices – Many organizations don’t have people who know the ideal way to implement their business requirements due to lack of tool knowledge or because they haven’t been through many past analytics implementations
- Implementation Issues – Many organizations don’t leverage the most recent types of data layers and they re-implement every few years instead of continuously updating their implementation
- Poor Documentation – Many organizations have static, outdated implementation documentation which ultimately leads to data quality issues
When I review analytics implementations, I often find that organizations are only scratching the surface of what they could be doing. Even within a single organization, implementations differ across its various websites and apps.
As I stated in my last post, our industry needs to shift its thinking from custom implementations to implementations that are driven by proven best practices. Our industry has now been around for almost twenty years, but we have not found a way to fully leverage all that we have learned along the way. My father used to say to me: “It is noble to do hard work and climb a ladder, but before you do, make sure the ladder is up against the correct wall!” His point was that doing a lot of hard work is useless if you aren’t doing it in the right way.
In my next post, I’ll share what I think will be the next paradigm shift in digital analytics implementations…
[Spoiler alert: It’s Apollo, the world’s first analytics tag management system.]