
GA4 Guide Chapter 8: Validating Your GA4 Tracking Code

Sep 13, 2022

Now that you know how to implement your GA4 tracking, it is time to discuss validating your tracking. Think back on a time you attempted something brand new. How did it turn out? Was it perfect? Probably not. Were there some areas for improvement? Probably. Whether this is your first digital implementation or you’re a seasoned veteran, you will want to assume that the first time you validate your GA4 tracking, there may be an “i” left undotted, a “t” or two left uncrossed.

In this chapter, we’ll cover what tracking validation is, why you should always validate your tracking during an implementation, and, of course, how to validate your GA4 code. Following our recommended best practices will solidify your foundation and pay dividends throughout the implementation process and beyond. Neglecting these steps will incur the risk of poor data quality and costly re-work downstream.

There are four ways data can be sent into GA4:

  1. The Firebase SDK (for native apps)
  2. The Google tag (JavaScript for websites)
  3. A tag manager
  4. Measurement Protocol

This chapter will not cover the fourth method, Measurement Protocol, but you can apply many of the concepts discussed here to that method.
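For reference, here is what the second method, the Google tag (gtag.js), looks like when added to a site’s pages. This is the standard snippet with G-XXXXXXX as a placeholder measurement ID; your actual tag, or a GTM container snippet, will differ accordingly:

  <!-- Google tag (gtag.js); G-XXXXXXX is a placeholder measurement ID -->
  <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
  <script>
    window.dataLayer = window.dataLayer || [];
    function gtag(){dataLayer.push(arguments);}
    gtag('js', new Date());
    gtag('config', 'G-XXXXXXX');
  </script>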

WHAT IS TRACKING VALIDATION?

Tracking validation is a rigorous process for testing whether the tracking works as intended. As the saying goes, “You can’t build a great building on a weak foundation.” Your digital analytics implementation is the foundation for capturing valuable business insights, but how can you ensure that foundation is strong? Consider the questions below. If you cannot answer them, you cannot conclude that your foundation is strong or that your implementation is valid.

  • Regardless of whether you’re using gtag.js or GTM tags for web, it’s paramount to know whether your data variables are mapped correctly to the variables in the data layer. Is the trigger logic consistent?
  • On the GTM side, are your event tags configured with the correct event details and parameters, and are they set with the appropriate triggers?
  • Are the variables populating as expected in the data layer?
  • Are they being captured and sent to Google Analytics during the expected use case?
  • Are we seeing the data captured in Google Analytics?

Tracking Validation is your implementation’s Quality Assurance (QA). Without proper QA, you cannot trust your data. Establishing trust in the data is essential to a successful implementation that meets your organization’s business requirements.

There are three primary goals for tracking validation. The first goal is to reveal any issues within the implementation, which is fairly obvious. But to ensure the tracking code is working as intended, we must reveal and resolve all issues leading to unexpected outcomes. Recording these issues, their fixes, and the final resolved state is essential for establishing assurance in tracking quality. Providing this record of assurance is our second goal, and the third goal is to overcome any challenges to accomplishing the first and second goals, including technical and systemic challenges.

Note: Although there are several methods for tagging web applications, this chapter will mostly cover examples using GTM. Most, though not all, concepts translate to other methods, such as gtag.js.

PRIMARY GOAL: UNCOVER IMPLEMENTATION ISSUES

Step One: Beacon Creation
Creating the beacon involves the client-side browser executing a rule that pulls from an original data source (usually the data layer, a DOM element, or an SDK). Rules can be governed by the client-side tag management system (TMS), which applies to web only because native apps do not require a client-side TMS. Issues in the TMS rules can lead to tracking issues.

An example of an issue here would be an incorrectly created variable, such as a transaction variable that’s meant to capture the product purchase price at checkout as a numeric value. If that price data point is passed into the data layer as a string instead, the variable will not set the expected price value in the beacon, resulting in null values in GA.
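As a minimal sketch of the type mismatch described above (the event and key names are illustrative, not from a specific data layer spec):

  // Problematic: the price reaches the data layer as a string...
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({ event: 'purchase', value: '49.99' });

  // ...while the GTM variable and GA4 expect a number:
  window.dataLayer.push({ event: 'purchase', value: 49.99 });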

Other times, the combination of the TMS and the data source can lead to a wide array of potential issues. The first step to mitigating such issues is to pull data from the data layer.

The data layer is an industry term that is meant to describe a central place or mechanism that may be used to facilitate the communication of information (data) from one piece of website functionality to another. In application, it most often depicts a globally scoped JavaScript object or array [which is a digital repository holding captured user-browser interactions on any domain page] that is used to communicate information from a website application to a tag manager.

Stewart Schilling, Search Discovery Analytics Architect

Google’s standard method for the data layer is the Event Driven Data Layer (EDDL) approach. If you’ve only worked in the Google Stack, then you may not be aware that there are alternative methodologies for the data layer. Users of other platforms, such as Adobe, may be more familiar with other options.

When executed correctly, the data layer is the most robust source for continuous and accessible data. Other sources, such as DOM element scraping, fall short in this regard and require a closer relationship with the dev team and ongoing QA to maintain tracking. An example of an issue could be a dev page update that removes a class name your tracking depends on from a DOM element.
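As a hedged sketch of why DOM scraping is brittle (the class name, attribute, and event/key names below are hypothetical):

  // A scraped value depends on markup the dev team controls:
  var addToCartBtn = document.querySelector('.add-to-cart-btn'); // hypothetical class name
  var productName = addToCartBtn ? addToCartBtn.getAttribute('data-product') : null;

  // If a page update renames the class or drops the attribute, productName becomes null
  // and tracking silently degrades. A data layer push, by contrast, is an explicit
  // contract between the dev team and analytics:
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({ event: 'product_added', productName: productName });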

As the data layer is the primary focus, let’s briefly cover common issues arising from the configuration of your tracking with the data layer:

Event Missing: When you complete an action on the website, but no event is added to the data layer. Note that for link/CTA clicks, you’ll need to add JavaScript in the dev console to prevent the click from taking you to the next page so you can accurately check the data layer (one approach is sketched after the example below).

Example: You are on a product listing page (PLP), but there is no “Listing Viewed” event in the data layer.
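One common way to block that navigation while you inspect the data layer is a snippet like the following, run in the dev console before clicking the link/CTA under test (a sketch only; it covers standard anchor links, not JavaScript-driven navigation):

  // preventDefault() keeps the browser on the current page; the site's own click
  // handlers still run, so its dataLayer.push calls can be inspected afterwards.
  document.addEventListener('click', function (e) {
    var link = e.target.closest('a');
    if (link) {
      e.preventDefault();
      console.log('Navigation blocked for QA:', link.href);
    }
  }, true); // capture phase: runs before the site's own handlers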

Event Formatted Incorrectly: The data layer event is present, but the format or structure does not match the data layer specs, preventing the Google Analytics beacon from firing.

Example: You are on a product listing page (PLP), but Listing Viewed > listing > listingResults > item is not an array.

Value Missing/Incorrect: The data layer event and key:value pair exist, but the value(s) associated with one or more keys is either blank or does not match what you see on the site.

Example: After placing an order, the data layer captures the item price as the shipping cost.

Value Formatted Incorrectly: The data layer event and key:value pair are both present, but the value format or structure does not match the data layer specs, preventing the Google Analytics beacon from firing.

Example: You are on a product listing page (PLP) and see that Listing Viewed > listing > listingResults > resultsCount is a string when it should have been numeric.

Timing Issue: Data layer Events are firing in the wrong order, or the order in which they fire is preventing the Google Analytics beacon from being sent.

Example: You are on a product listing page (PLP) and the Listing Viewed event fires after the Page Loaded event, preventing all the product information from being sent on the beacon.

Wrong Event Fires: You complete a user interaction, but a data layer event for a different interaction is added to the data layer.

Example: You add a product to the cart, but the Product Removed event is added to the data layer, rather than Product Added.

User Action/Event Mismatch: When a data layer event does not fire at the correct point of a user interaction, as defined in the data layer specs.

Example: On a two-step form, the Form Started event does not fire until the user clicks “Next” on the first step.

In addition to data source issues with the data layer, other common issues stem from the TMS set-up or interaction with the host domain:

Multiple Properties Conflicts: Interaction data is written to local storage to be sent with the next page view, but that page view can occur on a page not currently covered by the GA property, so the data is never included on an analytics beacon.

Example: You interact with the advisor popup, but navigate to the homepage before completing the questions, and the interaction is not captured in the beacon sent on the homepage Page View.

GTM Issues – Front-end beacon rules: If you complete a user action and the Google Analytics beacon does not fire, and you have ruled out all the issues listed above, then there is likely an issue with the GTM configuration.

Example: Upon checking the variables, tags, and trigger configuration, you find the applicable trigger was not set up to specification, so the tag never fired.

Site Functionality: There is an issue on the website you are testing that prevents you from completing the user action(s) needed to complete a test case.

Example: Before accepting Consent Management, you are unable to complete an order.

Beacon Firing Inconsistently: When the Google Analytics beacon fires, the values are correct, but the beacon does not fire under all the conditions it should.

Example: The Product Removed event fires when a user clicks the trash can icon or the minus button, but not when they manually update the quantity and hit enter.

Step Two: Processing the Data
When an issue is detected and we can rule out the beacon-level causes above, we next investigate potential issues during the data processing step. Note: Perform this check for all events only after they pass beacon-level testing. If you’re using a server-side tag manager, check its configuration; otherwise, you can skip ahead and test whether Google Analytics is receiving the data. Check the Property (GA4) or View (GA Universal) filters, which frequently exclude data generated by certain IP addresses. Check the property configuration: is the data populating reports correctly? Confirm that parameters have been registered as custom dimensions and that the appropriate events have been marked as conversions.

SECONDARY GOAL: PROVIDE A RECORD OF ASSURANCE

With the primary goal understood, we can move to additional benefits from logging the testing to ensure trust in the data provided. No matter how shiny and lean our implementation is, if there is no organizational trust in the data provided, then there is no way clients will draw insights to drive business impact. Maintaining a record of QA is essential for building and maintaining trust in the data, and success starts with the record validating the implementation rollout.

Where should this record be?
Maintain the QA record in a designated testing document. This testing document should record pertinent details for all test cases, including the date of the test, the testing environment, test parameters, results, and any subsequent fixes put in place as a result of the test. Records should span each test case’s planning, executing, resolving, and closing.

The testing document can reside within a more comprehensive technical document, and it’s good practice to include the testing document within the Solution Design Reference (SDR) Document. This gives you the benefit of creating a centralized solution location where you can find all applicable business questions and all variables necessary to answer these questions.

Note: Variables include events, parameters, and user properties. Optionally, you can include instructions for the development team in the SDR document. In this case, the SDR becomes a true one-stop shop for all parties.

Is it essential to maintain good documentation?
Absolutely! There is no way to validate and convey that your tracking is doing what it is supposed to do if no one has written down the test results, along with the test instructions, somewhere. This is even more important when merging data between multiple streams. Recording results in the lean test doc described above streamlines this process. Later, we’ll discuss our recommended approach and a template for documenting test cases in GA4.

TERTIARY GOAL: OVERCOME CHALLENGES TO QA

While validating your GA4 code, you might face some challenges like these:

  • Ensuring complete regression testing is performed for each tag update. The term regression in testing does not pertain to a form of predictive statistics that models the relationship of independent and dependent variables using algebraic equations. In tracking QA, regression testing means testing each net-new or previously validated tag after a tag update. Do this to ensure that your tag update does not create any unexpected or unwanted consequences for your existing tracking. Whether you’re in the implementation roll-out, performing regular maintenance, or making downstream updates, you’ll need to ensure you’re performing comprehensive testing for your test case and across all domains, environments, and user experiences (if applicable).
  • Understanding the many different scenarios in which a given tag/event should be implemented & validated across a site. This is essential for accomplishing the previous item.
  • A great deal of metadata is required even to begin tagging or QA.
  • Tagging validation communication is often decentralized and inadequately detailed. This is where incorporating your testing document within the SDR helps centralize your contributors (analysts, the dev team, marketers, and other stakeholders) around one document for alignment and an efficient workstream. This incorporation helps with grasping overall tag implementation status, bottlenecks, and task ownership. Since it’s not always easy to know which variables are related to one another, having streamlined communication can help accelerate alignment and issue resolution.
  • Often, the tracking validator is unfamiliar with the analytics tools, including Google Analytics Checker or Google Analytics Debugger. We discuss these tools below.

HOW TO VALIDATE YOUR GA4 IMPLEMENTATION

Effective QA focuses on two key areas of an implementation: technical validation and analytical validation.

Technical Validation: During this phase, we validate that we have clearly mapped all events, parameters, and user properties to the testing conditions. We ensure that the beacon and the data in GA4 include everything specified in the QA document. Then we validate whether the parameters are set to the correct values.

Analytical Validation: This validation is not as technically involved but is paramount for ensuring our tracking supports the core business questions it was implemented for. Questions we consider include these:

  • Are we clearly mapping all events, parameters, and user properties to the business questions?
  • Are all business questions being answered by the mapped variables?
  • Will analysts be able to understand/interpret the data being passed easily?

Note: QA revolves around test cases. The Test Case document is a living, breathing document, and test cases can be created for a new or existing implementation. The Test Case document is not any of the following:

  • A replacement for the SDR (Solution Design Reference)
  • A template for audits
  • A change log
  • A business requirements document*
  • A data layer validation document*
  • A reporting validation document*

TECHNICAL VALIDATION STEP 1: INTRODUCING OUR RECOMMENDED TEST CASE TEMPLATE

When constructing a testing document for technical validation, it is good practice to start with the beacon. Beacons are the network requests (server calls) sent from the client, e.g., the user’s browser on the organization’s domain, to the analytics platform’s collection servers.

For an effective QA, test case documentation must be structured and clearly expose in full detail the pertinent testing criteria. This includes what technical features are to be tested, where they reside, and how they are tested (defining the case scenario). In addition, it must be clear who is doing the testing, the credentials that person needs, if any, the expected and returned results for each feature tested, and a clear space for pertinent notes/comments.

All this must be clear to the tester, whose background in the project can vary from a technical rock star with years of industry experience to a fresh recruit pulled in for emergency test support. Our best practices provide a set-up that, when followed, will make testers of all levels successful.

Of course, a suitable Test Case document can be structured in various ways. Search Discovery recommends the following structure for GA4 test cases. This document’s structure is geared primarily toward technical validation. You can find the link to our Test Case document template here.

Using the below screen grab of our Test Case example, let’s take a moment to review the document structure. We’ll go column by column, starting from the left.

Test Case Document

Scenario: This section can be as broad or as descriptive as needed, but a brief descriptive label of your test case resides here. In the example, page load is the site behavior we’ve tracked and now aim to test.

Region: This section can define a distinct geographical location where the target user audience may reside. In our example, Region relates to the country level, specifically the United States (US). Greater geographic levels can be specified (such as the EMEA region) or lower levels, such as cities. If not applicable, it can be left blank or labeled “worldwide.”

Environment: Does your test case only pertain to Desktop and Web properties, or does it pertain to Mobile Applications? Here you can inform your tester of how wide or limited the user audience is, by technological device.

Test URLs / Screen Name: This column holds the URL for all pages, a single page, or a subset of pages (typically site sections). We recommend the entry link directly to the page on which the test is to be performed. If the scenario is relevant to all pages, such as a Global Pageview event, we commonly specify “All Pages.”

This is acceptable as long as the home domain link appears in the header of the Test Case document along with any pertinent account details. Sometimes, you will test “ungated” sites that don’t require a login, but other times a test account is required and credentials for your tester should be placed in the first rows, as shown.

Test Instructions: Here you define, in as much detail as possible, the steps your tester will take to sufficiently test tracking for the event scenario. For experienced testers, you can include as much detail as necessary.

The next three columns relate to the GA4 tags added during migration/implementation:

  • Event Name: Include the name of the GA4 event, as shown in GTM.
  • Parameter Name: Include all pertinent parameters for the Event.
  • Expected Value: Include the expected value for each parameter.

The following two columns are only pertinent for GA4 migrations from GA3 (UA). If the GA4 implementation is net new, then no entries need to be made in this section.

Parameter Name: Include all pertinent UA parameters. These may be codified as shown above. For example, t stands for Hit Type, which is usually set to ‘pageview’ or ‘event,’ while cd1 stands for Custom Dimension 1, the 1st custom dimension in our former UA implementation.

Expected Value: Include the expected value for each UA parameter.

QA Status: This important column reports the status at the parameter level for each test scenario. Starting as a blank entry, various options can be recorded as you go through the QA. Typically, we are most interested in whether the test Passed or Failed. Using our template, these entries are automatically tallied via a VLOOKUP formula to report assessment progress in the header, as shown above. This streamlines communication and awareness of testing progress and any blockers.

Assigned To: Here we report the test case owner at the parameter level, typically the tester at first.

QA Note: Lastly, here is where the assigned team member can log comments and notes to be shared with the team. We recommend all entries include the assignee’s initials and date, followed by the note.

At a high level, using the Test Case document could look like this:

  1. A tester has completed a test scenario and found some parameters did not pass the expected values.
  2. The tester would mark ‘Fail’ for those parameters in ‘QA Status,’ add a description for the developer in ‘QA Notes,’ and assign the items to the developer in ‘Assigned To.’
  3. They would then ping the developer (the document will not auto-alert them), who will review the entry in the Test Case document.
  4. Once the developer has completed a fix, they will assign it back to the tester in ‘Assigned To’ and alert them.
  5. The tester will again follow the QA steps and close the loop if the test case fix was successful, marking the item ‘Pass’ in QA Status. If not, once again, the tester will mark ‘Fail’ and repeat the previous steps, noting the finding and assigning back to the developer for resolution.

This cycle will repeat until all test cases are passed and the implementation is marked complete.

TECHNICAL VALIDATION STEP 2: USE THE VALIDATION TOOLS

Let’s start with some fantastic news: You won’t need to use a proxy like Charles or Fiddler to validate GA4. For some more good news, if you’re familiar with popular testing tools in UA such as Google Analytics Debugger or GTM Preview Mode, you can still use them.

These are good tools for validating websites, but they are not helpful for mobile apps. If you use GTM’s Preview Mode, be sure to install the Tag Assistant Companion extension, which enhances the standard Preview Mode experience. No one wants a URL-restricted pop-up when they’re testing, am I right!?

Analysts can use GTM Preview Mode and/or Google Analytics Debugger to perform technical validation, but for mobile app support, additional installation is necessary.

GA4 introduces a new reporting tool called DebugView, which provides a robust, dedicated QA report that will significantly improve your testing experience across devices, especially mobile apps.

DebugView is a real-time report that allows you to isolate the events, parameters, and user properties that are flowing in from a specific device where Debug Mode has been enabled.

What is Debug Mode? Well, let’s just say it makes your testing life much easier. Without it, Google Analytics 4 will batch your mobile app events together and send them across the network in bundles (which is part of the reason you don’t need a proxy!). When Debug Mode is running, your data will be sent immediately as you run tests on your app. Also, data captured while Debug Mode is running will be filtered out of your other reports so that it does not artificially inflate your metrics. This eliminates the need for separate GA properties for production and staging environments.

To enable DebugView on each platform, you’ll need to take a platform-specific step for each case:

Websites: Install the Google Analytics Debugger Chrome Extension
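If you’re implementing with gtag.js directly, Google also documents enabling debug mode for a page by setting the debug_mode parameter in your config call. A minimal sketch with a placeholder measurement ID:

  // Sends hits from this page with debug mode enabled, so they appear in DebugView
  gtag('config', 'G-XXXXXXX', { debug_mode: true });
  // To turn it off, remove the parameter entirely; per Google's documentation,
  // setting it to false does not disable debug mode.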

Android & iOS Apps: No installations required. You’ll simply add a single line of code to enable or disable debug mode.

Android Apps

  • To enable: From a terminal (or the Android Studio terminal), run: $ adb shell setprop debug.firebase.analytics.app PACKAGE_NAME
  • To disable: From a terminal (or the Android Studio terminal), run: $ adb shell setprop debug.firebase.analytics.app .none

iOS Apps

  • To enable: In Xcode, add the command line argument -FIRDebugEnabled to your scheme
  • To disable: In Xcode, add the command line argument -FIRDebugDisabled to your scheme

For a more in-depth description, please visit this stellar article by our very own Ken Williams.

TECHNICAL VALIDATION - THE SIMPLIFIED TESTING PROCESS

Simplified Testing Process

Now that we’ve introduced the Test Case document and the tools to perform technical validation, we’ll briefly cover the core methodology of the testing process. The above schematic diagrams the general testing flow process. Using one of the recommended testing tools, a tester will validate that the expected parameter values are all present within the beacon under each of the specified conditions within the Test Case document. If so, our tester can mark “Pass” and move on to the next item and repeat testing until all cases are validated. If not, the tester should mark “Fail” and consider if there are errors lying in the data layer and/or within the tracking configuration.

Checking the data layer, the tester may find no value is set to the data layer variable or the variable is missing entirely. If the expected value is present for the applicable variable in the data layer, the tester should check the config in GTM. The tester may find the GTM variable does not map correctly to the data layer variable, or the trigger does not reference the GTM variable and/or lacks the appropriate firing conditions. Check that the tag is correctly configured to set the expected value. Narrowing this down provides valuable context for the developer to resolve.
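A quick, hedged way to perform that first data layer check is to run something like the following in the dev console (the event and key names follow the “Listing Viewed” example earlier in this chapter; your data layer spec may differ):

  // Has the expected event reached the data layer, and does it carry the expected values?
  (window.dataLayer || [])
    .filter(function (entry) { return entry.event === 'Listing Viewed'; })
    .forEach(function (entry) {
      console.log(entry.listing && entry.listing.listingResults);
    });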

In the previous section, we explored the common data layer issues you can face during implementation testing. Referencing those while testing will help identify most issues you will encounter. Once the issue is sufficiently explored and defined, provide sufficient details within the QA Notes of your Test Case document and assign the test case to the developer. Once resolved, retest the case by going through the testing process. Repeat these steps until all test cases are validated and you can rest assured that your implementation is ready to push live. Yahoo!

A BETTER WAY - ENTER APOLLO!

Shameless plug alert! After making it this far, I’d be remiss if I did not inform you that there is an even better way to execute this fundamental process. That way is a solution we offer at Search Discovery, Apollo.

Apollo is an Analytics Management System (AMS) that helps you streamline and improve your new or existing analytics implementation. Its primary validation feature bridges the gap between the technical validation and analytical validation domains.

You may have noticed that we have yet to revisit Analytical Validation and where it falls in testing. Well, that’s the thing: It typically falls out of scope during tracking testing. Consider that alignment of your implementation goals and testing goals with your desired business outcomes is the top priority during pre-implementation planning, during the implementation, during the validation phase, and beyond the production launch. Typically, alignment is at the forefront in the planning and post-launch phases; however, it tends to fly under the radar during the implementation and validation phases.

One reason for this is the decentralization of your implementation spec, your applicable business requirements, and your Test Case document. Analytical validation can be supported during the testing phase by including the Test Case document within the SDR hosting your GA4 developer requirements. These requirements should be mapped to the business requirements, which should also be in the document and readily available for reference during testing. When the requirements aren’t included (as they often aren’t), they’ll be overlooked once the test flow is underway.

Risk: This can result in a valid implementation that unfortunately is incongruent with the business requirements warranting the implementation.

Apollo’s robust production system requires mapping of developer requirements with business requirements in a manner that standardizes analytical validation throughout the implementation. This ensures that tags pass business compliance requirements.

In Apollo, you first define business requirements that are mapped directly to your tracking events. Within the UI, you can create test cases via a structured, intuitive flow: select the mapped event, the attributes you’re interested in, the expected values/success conditions, and all other pertinent details.

Apollo goes a step further by analyzing your entered test case details and generating a test code in the form of a site URL appended with a test query string. Loading this test URL in your browser will auto generate the testing conditions. This greatly simplifies the work for your tester. Finally, you can export a test case document straight from Apollo. This report is customized to your implementation and is dynamically configured by Apollo.


Apollo Demo

WHAT TO THINK ABOUT GOING FORWARD

YOU DID IT! NOW WHAT?

Now that you’ve completed the rewarding exercise of tracking validation, you can finally relax and grab a cold beer! Having proceeded through all the steps and successfully validated your tracking, you can rest assured all your GA4 implementation team’s hard work has paid off, and you are ready to go live. And for that, we can certainly throw our hats in the air!

COMMON ISSUES THAT ARISE

But one step remains: prepping for the post-launch phase. Post launch maintenance may seem daunting, but taking the time to develop a structured plan akin to pre-launch implementation will cover your bases. We can break your post-launch duties into three core tiers once you start ingesting production data.

  1. Monitor for internal tracking issues. This includes any of the common data layer issues mentioned in the previous section. Also, if you configured any tags to pull from non-data-layer sources, e.g., HTML elements, you’ll need to get comfortable checking in with the site developers about updates and regularly checking such tracking for disruptions.
    Interactions between events may also result in race condition issues, which occur when the rules governing an event are not met because of the order in which it fires relative to another event. For example, within an ecommerce checkout flow, an enter_promo event may fail to capture the entered promotion and fire in time, so the purchase_product event fires without the promotion in its promo parameter (see the sketch after this list).
  2. Monitor for external tracking issues, including bot/DDoS attacks and updates to platform structures, e.g., the Vimeo player updating its configuration for video play tracking.
  3. Test all deltas (post-launch tracking updates). Deltas are any changes made to your TMS, including the addition of new tags/triggers and the modification, pausing, or removal of existing tag structures. You’ll need to QA all additions and reductions to the tracking, but the good news is that the testing flow used in implementation is unchanged for validating deltas.
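Below is a hedged sketch of the race condition described in item 1 (the enter_promo and purchase_product event names follow the example above; the asynchronous promo validation and its timing are assumptions for illustration only):

  window.dataLayer = window.dataLayer || [];

  function validatePromo(code) {
    // Stand-in for a slow server-side validation call
    return new Promise(function (resolve) {
      setTimeout(function () { resolve({ code: code }); }, 500);
    });
  }

  var pendingPromo = null;

  function onApplyPromoClick(code) {
    validatePromo(code).then(function (promo) {
      pendingPromo = promo.code;
      window.dataLayer.push({ event: 'enter_promo', promo: promo.code });
    });
  }

  function onSubmitOrderClick() {
    // If the user submits before validatePromo resolves, this push happens first,
    // so purchase_product fires without the promo parameter populated.
    window.dataLayer.push({ event: 'purchase_product', promo: pendingPromo });
  }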

SOME RECOMMENDATIONS

For ease of maintenance during the post launch phase, we recommend the following:

  • A biweekly or monthly cadence for reviewing all deltas added to the implementation.
  • A quarterly or bi-annual review of all tracking.
  • Setting alerts in GA4. At a minimum, you’ll want to configure automated alerts for core traffic and metric drops (from potential tracking issues) and spikes (from potential bot traffic). You can also use specialized tools for automated technology governance such as ObservePoint for monitoring the health of your implementation.
  • Creating designated QA reports to be reviewed monthly and diagnostic QA reports to be reviewed when an alert is triggered.
