Everything You’ve Always Wanted To Know About DTM Performance (But Were Afraid To Ask)

I’m pretty new to the analytics field. My background is in software development, building websites and apps for ad agencies on behalf of big brands. As a developer I was painstaking in my attention to the architecture and implementation of the sites I built. I used the right tools, the best practices, and the smartest patterns. My sites were lean, fast, and cleverly built. They were works of art. They were my beautiful babies.

In this context, it is not hard to see how “analytics implementation” can become dirty words the web artisan dreads to hear. You want me to scatter your inline scripts all over my beautiful markup? You want me to wedge some mystery JavaScript in the midst of my masterpiece? When my website is so lean and beautiful, why you gotta gum up the works? Get your grubby paws off my beautiful baby, baby.

Take A Little DTM To Ease Your Troubled Mind

Adobe Dynamic Tag Manager was created specifically to address the concerns of your most discriminating developer. With DTM, you don’t need to litter your business logic, interface code, or markup with extra event listeners or data collection calls. If your application code is broken up into discrete modules that each perform a single well-defined function and you want to keep it that way, I salute you, and DTM is here to help you accomplish and maintain your design goals. All you need to do is add a couple of script tags, and you’re done.
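For reference, the standard DTM installation really is just two small snippets. A sketch is below; the script URL is a placeholder, since the actual path is generated per web property in the DTM interface:

```html
<!-- In the <head>: the DTM embed script (the URL below is a placeholder;
     your actual path is generated for your property in the DTM interface) -->
<script src="//assets.adobedtm.com/your-generated-path/satelliteLib-xxxx.js"></script>

<!-- Just before </body>: tell DTM the page has finished parsing -->
<script type="text/javascript">_satellite.pageBottom();</script>
```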

Too Good To Be True?

Good web developers are a skeptical bunch. While this is one of our many estimable qualities, it is natural that our virtuous skepticism should lead us to wonder what black magic DTM is performing behind the curtain in order to provide us with this fantastically non-invasive analytics implementation process. Because DTM takes so much work out of the developer’s hands, it raises questions about how that work is being done. What corners are being cut in order to make such a sexy product? What about performance? What about best practices?

With these concerns in mind, I want to address some of the common questions and objections that sometimes arise from the concerned citizens of the web development community.

DTM is such a big JavaScript file! It’s going to increase my load times significantly.

This may be the most common objection we receive. The DTM production library weighs in at around 55KB. To those considering making the jump to DTM, this can feel like a lot of weight to drop on top of an existing page load.

This seems compelling: if you want to be faster, you gotta drop some weight, right? Count those calories. I’ve seen The Biggest Loser. I get it.

But this number (55KB) does not tell the whole story. First of all, DTM’s CDN uses gzip to compress assets on the fly. This means that your 55KB DTM JavaScript file comes down the pipe weighing something closer to 15KB. Not so scary, right?

To gain some context for this number, let’s start by visiting your website. In fact, we can start by visiting virtually any website. I’m going to go to Google, the famously sparse, prodigiously quick-loading search page. If we pull up our developer tools’ “Network” tab, what do we see?

Well, we see 11 requests of several different flavors, mostly images and JavaScript. Google.com’s JavaScript files, added together, weigh in at 380KB, which seems like a lot, but hey, Google gets a pass since those 380 kilobytes are powering the most popular web page in the world, right?

[Screenshot: Google.com’s network requests in the developer tools]

But what about those images?

[Screenshot: Google.com’s image requests]

Those images add up to 90KB. Let’s consider that number for a moment: 90 kilobytes seems like quite a lot for a page so visually sparse. Let’s take a look at the image load for a more visually complex website. The homepage for Tealium (analytics provider and website weight-loss advocate; the “Biggest Loser” of tag management providers) is pretty snappy looking. Let’s shuffle our way over yonder.

[Screenshot: Tealium.com’s image requests]

Whoo boy! 793KB of (gzipped) images! That’s a bunch. You could load DTM fifty times over and still not reach that kind of load.

The point here is not that websites use too many images–we all love pictures and they have made the internet nice to look at. The point is that conversations about the supposedly camel-back-breaking weight of a 15KB JavaScript file are happening in a context where no one bats an eye at >500KB of images. In other words, the kilobytes taken up by images are generally not subjected to the same scrutiny as JS kilobytes. Why is this? Probably because the value of images is immediate and self-explanatory, while the value of any given JavaScript is more abstract.

However, if 15KB of JavaScript helps you to efficiently and affordably accomplish your business objectives and provides you with valuable insight into how your customers are using your web properties, what are you going to do? Ditch the JavaScript, or use one less jpg?

Okay, maybe DTM is not so big, but still: it’s loaded at the top of the page. While it loads, it blocks everything else on the page from loading.

Since DTM needs to be able to load any kind of analytics tool, and some analytics tools need to be loaded at the top of the page, DTM also needs to load at the top of the page. Therefore, it’s true that DTM is “blocking.”

However, this blocking occurs only the first time DTM is loaded on your site. On every subsequent page view, DTM is loaded from the user’s browser cache, adding only a trivial delay to page load times (generally <50ms).

[Screenshot: DTM loading from the browser cache on a repeat page view]

But the size of DTM isn’t the only thing that matters. What about the speed and reliability of delivery?

When you’re trying to increase the speed of a page load, overall page weight is a major consideration, but there are other variables to consider.

To understand the lifecycle of an HTTP request, consider the following: if I want to carry a box from one side of the room to the other, there are a few things that will dictate how quickly this can happen. First, I have to walk over to the box from where I am standing, pick up the box, and then carry that box back to the point where I started. How long this takes is determined not only by how heavy the box is, but by how far I am from the box in the first place.

It works the same way with HTTP requests: the request travels from point A to point B, then returns to point A, this time carrying a heavy load (the response). Consequently, the geographic distance between the requesting computer and the responding server affects the time it takes to serve up content.
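A back-of-the-envelope model makes the point concrete. The numbers below are illustrative assumptions, not measurements, and the model ignores DNS, TLS, and TCP slow start; the takeaway is that for a small payload, round-trip latency dominates total transfer time:

```javascript
// Idealized single-request model: total time is round-trip latency
// plus payload size divided by bandwidth.
function transferMs(latencyMs, payloadKB, bandwidthKBps) {
  return latencyMs + (payloadKB / bandwidthKBps) * 1000;
}

// A 15KB gzipped file on a ~1MB/s connection:
console.log(transferMs(20, 15, 1024).toFixed(1) + ' ms from a nearby server');
console.log(transferMs(150, 15, 1024).toFixed(1) + ' ms from a distant server');
```

With so few kilobytes on the wire, moving the server closer to the user (cutting latency) saves far more time than shaving a few more kilobytes off the file, which is exactly the problem CDNs solve.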

To facilitate faster load times, content can be distributed to client computers via content delivery networks (CDNs): networks of servers positioned near population centers. When I request an image hosted on a CDN, the request is routed to and served by the geographically closest server.

DTM provides this functionality right out of the box, and it does this using best-in-industry CDN services. If, however, you wish to use a different CDN or even your own servers, DTM provides you with the flexibility to choose other options.

Why load so much JavaScript anyway? Why not just use a smallish JS library that sends all my analytics requests to a server that will process them there?

When building applications and services that involve an interaction between a client computer and a server where a core application is hosted, there are always lots of decisions to be made about which computer–the client or the host–does the heavy lifting. It is easy to imagine that DTM could have been built as a much smaller JavaScript library running on the client computer that interacted with a robust “middleman” server that processed data and then sent it on to analytics data collection services (like SiteCatalyst or Google Analytics). And in fact, there are several tag management systems that follow this pattern.

But there is a definite reason DTM chose early on not to opt for the man-in-the-middle approach that these other companies have jumped into: By putting a processing server between customers and analytics services such as SiteCatalyst or Google Analytics, these companies create a single point of failure for the collection of analytics data; if their service goes down, your data is gone. Since DTM is distributed and cached across all client computers, it is impossible for it to “go down” in this same sense.

By forgoing a middleman, DTM not only removes this single point of failure, but also becomes essentially transparent from the point of view of data collection services. These services wrote their libraries in JavaScript (see Google Analytics’ ga.js and SiteCatalyst’s s_code.js), and DTM is a JavaScript library that speaks their language and facilitates their use. DTM works so well with these libraries and services because it was built with the same philosophy and operates on the same plane–JavaScript running in a browser on the client computer.

DTM is going to put crazy amounts of event listeners all over my page! That will drag down the performance of my UI.

Although DTM can listen for virtually any kind of event occurring within the DOM, by default it does not do this by attaching a listener to every single element it wants to hear from. Rather, it uses a technique called “event delegation.” This approach relies on a principle of the browser’s DOM API we might call the When Children Make Noise, Their Parents Hear It principle: when an event fires on an element that has a parent (i.e., a “child” element), it does not fire on that child element and then call it a day. No–the event “bubbles” up to the element’s parent, then on to that element’s parent, and so on, until it reaches the topmost ancestor. This is where DTM listens. In this way, DTM “delegates” the task of listening to a single ancestor, then determines for itself how best to handle the events it receives.
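To make the bubbling idea concrete, here is a toy simulation in plain JavaScript. It is a stand-in for the browser’s DOM, not DTM’s actual code: each “element” knows its parent, a dispatched event walks up the ancestor chain, and a single listener at the root handles clicks for every descendant.

```javascript
// A minimal element: just a tag name, a parent pointer, and listeners.
function makeElement(tag, parent) {
  return { tag: tag, parent: parent, listeners: [] };
}

// Walk from the target up through its ancestors, firing listeners
// along the way -- this mirrors the browser's bubbling phase.
function dispatch(target, eventType) {
  var node = target;
  while (node) {
    node.listeners.forEach(function (fn) {
      fn({ type: eventType, target: target, currentTarget: node });
    });
    node = node.parent;
  }
}

// Build a tiny "page": body > ul > li.
var root = makeElement('body', null);
var list = makeElement('ul', root);
var item = makeElement('li', list);

// ONE delegated listener on the root, instead of one per element.
var seen = [];
root.listeners.push(function (event) {
  if (event.type === 'click') {
    seen.push(event.target.tag); // inspect where the event originated
  }
});

dispatch(item, 'click');
console.log(seen); // the root listener saw the click that began on the <li>
```

In a real page, the same pattern is a single call like `document.addEventListener('click', handler)`, with the handler inspecting `event.target` to decide whether the click is one worth recording.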

DTM understands me.

I want to be clear: DTM is not “perfect,” and any technical solution has its upsides and downsides.

That being said, the “upsides” of DTM were targeted and developed with the exacting concerns of people like me–web developers and software architects–in mind. As a result, DTM is a tool that I believe any marketing team and development team can be confident adding to their technical toolbelt.

If you have any questions about DTM’s architecture or technical details not covered here, please reach out to us–we would love to hear from you.
