This post is authored by David Vallejo. At the current juncture, this blog is not configured to support multiple authors. I hope to remedy that in the future. ~~Yehoshua

Since Google Tag Manager was released, here at Analytics Ninja we have faced a number of problems when migrating a “hard-coded” Google Analytics implementation.

The most common problem relates to the actual moment of deployment, when the old Google Analytics code has to be removed from the site. This is not much of a problem if we have access to the site and can remove the hard-coded snippet ourselves at the same time we publish the new container. But let’s be real: this is not the usual scenario. We usually rely on another company or the client’s IT department to synchronize the deployment. This leaves us in a challenging situation: if they remove the code and we don’t publish the container right away, some data may get lost. If, on the other hand, they don’t remove the code and we publish the container, the hits will be sent twice. Or even worse, if we are migrating from Classic to Universal at the same time we’re moving to Google Tag Manager, we could end up messing up sessions, users, bounce rates, and any other metrics and dimensions.

This problem is aggravated in a multi-domain implementation where each domain is run by a different business group. If it was hard to synchronize with one team, imagine coordinating two or more business groups in different time zones, with everyone required to make changes at the same time.

Even if we are able to get everyone involved in the migration, there will always be a small time gap between the code removal and the container publication.

One solution is a piece of code that blocks all our new tags while the old code is still on the site. We just need to schedule a date for the migration, and with a simple macro and rule we’ll fire our new tags only on pages where the old code has already been removed. This way, even if it takes a long time to get everything sorted out, the GA data won’t be affected at all.

For this to work, we need a macro that reports the current Google Analytics status on the page. In the macro we’ll configure our UA property IDs (we’re using an array because we may have a dual-tracking implementation, or the page may have a third-party GA tracker we don’t want to interfere with). Then we’ll loop through all the trackers available on the page, read each one’s configured UA account, and return true if it matches any entry in our properties array. The following flow diagram shows the macro’s logic:

Google Tag Manager Migration Macro Flow Chart

The next step is to set up a new rule that will allow us to block our tags if the old trackers are still on the page.

Macro Code
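The macro code itself did not survive in this archive, so here is a minimal sketch of the logic described above, assuming classic ga.js exposes `_gat._getTrackers()` and Universal Analytics exposes `ga.getAll()`. The property IDs are placeholders, and the function is named only so it can be called directly; in Google Tag Manager it would be pasted as an anonymous function inside a Custom JavaScript macro.

```javascript
// Returns true if an old (hard-coded) GA tracker using one of our
// properties is still active on the page.
function oldGaTrackerPresent(ourProperties) {
  try {
    // Classic ga.js exposes window._gat with a _getTrackers() method.
    if (window._gat && window._gat._getTrackers) {
      var classic = window._gat._getTrackers();
      for (var i = 0; i < classic.length; i++) {
        if (ourProperties.indexOf(classic[i]._getAccount()) > -1) {
          return true; // an old hard-coded tracker is still on the page
        }
      }
    }
    // Universal Analytics exposes window.ga with a getAll() method.
    if (window.ga && window.ga.getAll) {
      var universal = window.ga.getAll();
      for (var j = 0; j < universal.length; j++) {
        if (ourProperties.indexOf(universal[j].get('trackingId')) > -1) {
          return true;
        }
      }
    }
  } catch (e) {
    // If the GA objects are missing or malformed, treat the old code as gone.
  }
  return false;
}
```

In GTM, the macro would hard-code the properties array (e.g. `['UA-12345-1']`) and return the boolean, which the blocking rule then checks.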

We then add a blocking rule to our Google Tag Manager tags so they are not executed if any previous tracker has been initialized on the page.

We’ll need to keep one more thing in mind. Because the GA/UA code is asynchronous, Google Tag Manager tags may fire before the old code gets executed, so we’ll need to delay our tags while we’re migrating. This can be done by setting the tags to fire on the DOM Ready event ( gtm.dom ), or better yet, on Window Load ( gtm.load ). This is not the ideal way to run an analytics implementation, as we normally want our analytics tags to fire as soon as possible, but we can change the firing rule back to gtm.js/All Pages once we’ve checked that the old code is gone from the pages.

Let us know what you think about this implementation method in the comments section below.

A better way to measure content engagement with Google Analytics

This post is inspired by a conversation that I had with my friend and colleague Simo Ahava at Superweek as well as a recent work request from a well-established Italian publisher. In short, the publisher was quite challenged by the fact that they had an 85% bounce rate, and that their time on site was so low. Their articles tend to get many hundreds, if not thousands, of Facebook likes, so “how could it be that users were spending so little time on site?!” Their average time on page was around the three-minute mark, so how could it be that average session duration was significantly lower?

Challenge 1: Google Analytics tracks time on page / on site by measuring the difference between the timestamps of hits. If the page is a bounce, no time will be recorded.

Challenge 2: Even if the page viewed is not the bounce/exit page (and thereby has a time greater than zero), GA doesn’t distinguish whether the browser window is in a hidden or visible tab when measuring time on page/site.

After a lengthy explanation to the client about the way Google Analytics tracks time on page (and, by extension, time on site), they were still stuck without a way to accurately measure content engagement. First of all, there are a number of ways to measure engagement besides time on page / site. Many posts have been written about this, and I urge readers to seek those out, since time metrics gain too much undue focus as it is. As things stood, since this publisher’s site was not configured with any event tracking (a scroll tracking module would be great), they were seeing many users come to their site, view one page, and then leave. Unfortunately for them, “out of the box GA” does not provide very good insight into how users are interacting with their content. “Are they even reading the content?”
Continue reading REAL Time On Page in Google Analytics
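As a sketch of how “real” time on page could be measured, the Page Visibility API can be used to accumulate only the time the tab is actually visible, and the total can be reported as a non-interaction GA event. This is an assumption-laden illustration, not the method from the full post; the event names are placeholders, and the accumulator is split out as a small object so the counting logic is plain to see.

```javascript
// Accumulator that counts elapsed time only while the tab was visible.
function VisibleTimer(startMs, startVisible) {
  this.visibleMs = 0;          // total visible milliseconds so far
  this.lastTick = startMs;     // timestamp of the previous tick
  this.wasVisible = startVisible;
}
VisibleTimer.prototype.tick = function (visibleNow, nowMs) {
  // Credit the elapsed interval only if the tab was visible during it.
  if (this.wasVisible) this.visibleMs += nowMs - this.lastTick;
  this.lastTick = nowMs;
  this.wasVisible = visibleNow;
};

// Browser wiring (guarded so the accumulator also runs outside a browser).
if (typeof document !== 'undefined') {
  var timer = new VisibleTimer(Date.now(), document.visibilityState === 'visible');
  var onTick = function () {
    timer.tick(document.visibilityState === 'visible', Date.now());
  };
  document.addEventListener('visibilitychange', onTick);
  // Report accumulated visible seconds on exit, without affecting bounce rate.
  window.addEventListener('beforeunload', function () {
    onTick();
    ga('send', 'event', 'Engagement', 'Visible Time',
       document.location.pathname, Math.round(timer.visibleMs / 1000),
       { nonInteraction: true });
  });
}
```

Note that `beforeunload` reporting is best-effort; hits fired during unload can be lost, which is one reason periodic heartbeat events are often preferred.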

From Data Layer to Dollars…

Some visitors are more profitable than others, and thoughtfully created Remarketing Lists can help businesses focus their ad spend on the most valuable visitors. This can improve revenue, reduce costs, or both.

The key is discovering the common characteristics that make some visitors more valuable than the average visitor, and then preferentially delivering ads (i.e. bidding higher) to users who have those characteristics. In other words, if you can segment your visitor base to identify which users have a higher potential value, you’ll be able to make smarter decisions with your advertising budget. Utilizing features of Google Analytics and Google Tag Manager provides the opportunity to do this.

One of my favorite features of the Google Analytics / DoubleClick integration is the ability to add users to AdWords remarketing lists with the click of a button. Here’s an example of how I might come up with a good remarketing list:

Let’s start with a curious question –> How long does it take users to convert on the site? To begin answering this question, the first thing I would do is apply a “converted” segment (in this case, a purchase) to the Session Duration report.

Right away I notice that a large percentage of users take over 10 minutes to make a purchase, and almost 13% require half an hour or more. While I very much like segmenting the Engagement reports, in this particular case I’m going to look at the User Timings report, as I believe the data visualization is more helpful there (you can expand the histogram).
Continue reading Advanced Remarketing with Google Analytics & Google Tag Manager

Universal Analytics

The big news last week (at least for folks like me) was that Universal Analytics finally came out of beta. Is it time for you to switch?

Short answer –> yes, soon.

What exactly is the big deal about Universal Analytics? Here is my current take on the product’s features:

UserID

One of the most touted feature improvements over Google Analytics Classic is the introduction of UserID. Google lists 4 benefits of using UserID.

More accurate user count

Analyze the signed-in experience

Access the User ID View and Cross Device Reports

Connect your acquisitions, engagement, and conversions.
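For readers who want to see what the feature looks like in practice, setting the User ID happens at tracker creation. This is a generic illustration, not code from this post: the property ID and user ID are placeholders, the identifier must come from your own authentication system (and must not be PII), and the `ga` command queue is passed in as a parameter only so the sketch can be exercised directly.

```javascript
// Create a Universal Analytics tracker with a User ID attached, then
// send the initial pageview. 'UA-XXXXX-Y' is a placeholder property.
function createUserIdTracker(ga, userId) {
  ga('create', 'UA-XXXXX-Y', 'auto', { userId: userId });
  ga('send', 'pageview');
}
```

On a live page you would call this with the global `ga` queue once the signed-in user's identifier is known.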

While I see the move toward a more person/customer-centric view by the GA team as a big step in the right direction, I think that at the current juncture the UserID reporting (and data model) falls flat. Full disclosure: I’ve only had access to the UserID reports for a few days, and I am acutely aware that the GA team is constantly innovating and improving their product at a dizzying pace. That means the only thing I can truly count on when it comes to GA is that the product will continue to improve (and hopefully not make this blog post completely irrelevant in the next 3 days).

So, why does UserID currently fall flat? Doesn’t the ability to connect all the dots sound like a marketer’s dream?

The following is the tale of some investigative work I did for a company which approached me to help them with their Google Analytics tracking for their online store. This company, like so many others, sells items on the Internet and wants to be able to properly attribute their sales to the correct channel using Google Analytics. Not surprisingly, in addition to having subdomains they were using a third party shopping cart and needed to have cross domain tracking configured. Pretty simple. Or so I thought…

As it turns out, the third party shopping cart they were using is called Shopify. As far as having an intuitive user interface that makes it easy for non-technical users to set up their store, I certainly see a lot of positives with Shopify. Unfortunately, Shopify also tried to make setting up Google Analytics “dumb-proof.” In the end, trying to figure out why I couldn’t properly set up a simple, working, basic GA tag left me “dumbfounded.” Shopify has a very simple interface where one simply needs to copy and paste their Google Analytics tracking code in order to get started.

However, as soon as you save the file, Shopify takes the Google Analytics code and rewrites it to match their own settings. In particular, they set _setDomainName to “none”, add _setAllowLinker (which would indeed be required for cross domain tracking), and switch to dc.js. (I am not sure if the folks at Shopify paid attention or not, but to use the DoubleClick cookie, users are supposed to
Continue reading Shopify Google Analytics Integration
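For contrast, a typical classic (ga.js) cross domain configuration looks something like the sketch below, where _setDomainName carries the site's own domain rather than "none". The property ID and domain are placeholders; this is a generic illustration of the commands the excerpt names, not the exact configuration from the full post.

```javascript
// Classic ga.js async command queue for a cross-domain setup.
// 'UA-XXXXX-Y' and 'example.com' are placeholders.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);
_gaq.push(['_setDomainName', 'example.com']); // the site's domain, not 'none'
_gaq.push(['_setAllowLinker', true]);         // required for cross-domain tracking
_gaq.push(['_trackPageview']);
```

Links to the third-party cart domain would then be decorated with `_link` (or `_getLinkerUrl`) so the campaign cookies travel across.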

A question that is commonly asked of analysts is “WHERE DID MY TRAFFIC GO?!” (Yes, even occasionally emailed in all caps.) Indeed, this is a question that I received from a publisher recently, though they were very polite and didn’t use all caps. For publishers especially, this question is directly tied to their bottom line, as advertising revenue is linked directly to pageviews (CPM models, etc.). So, I rolled up my sleeves and got ready to do a bit of analysis to see where their traffic went.

The following is simply a recounting of a bit of my process. The purpose of this post is to share some of my methods with a target audience of beginner to intermediate level analysts. This is a “how to” oriented post; nothing particularly new or groundbreaking here. Just some good old fashioned analysis of a common client question.

Leveraging Custom Dimensions and Custom Metrics to gain insights into Merchandise and Profitability.

I spend a bit too much time on Twitter. It’s not a terrible thing: in addition to being a forum that truly keeps me informed about what is happening in my industry (allowing me to stay on the cutting edge for my clients), Twitter is also a social outlet that keeps me from getting completely swallowed by work. That said, sometimes it is hard to wade through all the chatter to find the good stuff. One of the people out there who is almost always tweeting quality things is Kevin Hillstrom, @minethatdata. It just so happens that yesterday seemed to be a minor “@minethatdata appreciation day,” with some other industry peeps giving Kevin a well-deserved thumbs up.

One of the things that Kevin consistently wants others in the digital measurement industry to think about is merchandise and profit. A simple search on his timeline for July 2013 shows that he mentioned profit no less than 43 times and merchandise at least 28 times.
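One hypothetical way to bring merchandise and profit into GA along these lines is a Universal Analytics custom metric carrying per-transaction margin, paired with a custom dimension for product category. Everything here is an assumption for illustration: the slot indexes (`dimension1`, `metric1`) must match the property's Admin configuration, the names are invented, and the `ga` queue is passed in as a parameter purely so the sketch can be exercised.

```javascript
// Attach merchandise category and gross margin to a hit, then send it.
function tagProfit(ga, category, marginDollars) {
  ga('set', 'dimension1', category);    // e.g. 'Outerwear' -- index assumed
  ga('set', 'metric1', marginDollars);  // gross margin for the transaction
  ga('send', 'event', 'Merchandise', 'Margin Recorded', category,
     Math.round(marginDollars), { nonInteraction: true });
}
```

With margin flowing in as a metric, reports can be segmented by profitability rather than revenue alone.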

DudaMobile + Google Analytics = #Fail

Sometimes I find myself really bothered by third parties who claim to have Google Analytics integrations but don’t take the care to make sure they are done correctly. Indeed, one of those companies is DudaMobile.
image from the DudaMobile site

The strange thing about the DudaMobile situation is that they have some sort of official partnership with Google Analytics. Indeed, when you search for Google Analytics and DudaMobile, there are lots of articles about this relationship out there. I didn’t take much time to read into exactly how this partnership works, but my understanding is that it is different from the actual DudaMobile product.

DudaMobile makes it really easy for webmasters to create a mobile site which is hosted by DudaMobile. That’s a great thing. I love it. But in order to get a user to the mobile site, they provide webmasters with some JavaScript which does a 302 redirect. Oy vey. A 302… <shudder> SEO problems are bound to pop up, especially as webmasters now have duplicate content issues with a second site hosted on a DudaMobile domain. But I digress…
Continue reading DudaMobile Google Analytics FIX

Ever since returning from Superweek in beautiful Galyateto, Hungary, I’ve been thinking a lot about data and the utility of Google Analytics as a tool. Yes, I know, I spend a lot of time thinking about those things, but the conference was particularly inspiring in that regard. Google Analytics is no different from any other digital analytics tool insomuch as it is critical to understand what the reported values actually mean and how they get there in the first place. But that’s not enough. When we analyze data, we need it to be presented in a meaningful way. Data visualization is tremendously important in this regard, and I believe that one of the reasons Google Analytics has such great adoption and market penetration (besides the enticing $0.00 entry price point) is that the UI is crisp, FAST, and easy to use.

One catalyst for this post is a response to a post entitled “Are You Being Misled by Google Analytics?” While I am about to critique that post, I do want to point out that one of the ideas from Tien Nguyen (whom Chris mentions in his article as the source of the idea) is indeed insightful: namely, that without configuration, Google Analytics may not provide as much visibility into traffic sources as one needs. While I urge you to take a look at the article, I’ll briefly summarize the main idea here.
Continue reading The Importance of Clean and Meaningful Google Analytics data

As Justin explains, 5 of these hit types are used in calculating some form of engagement, thereby impacting time on page / time on site calculations as well as bounce rate. With regard to bounce rate in particular, an additional pageview, an event that hasn’t been set to non-interaction, or a social media share (that is configured to be tracked in GA) can all impact your bounce rate.
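To make the non-interaction point concrete, here is a hedged Universal Analytics sketch: an event sent with `nonInteraction: true` appears in reports but does not stop the session from counting as a bounce. The scroll-depth framing, the category/action names, and the `ga`-as-parameter shape are all assumptions for illustration.

```javascript
// Send a scroll-depth event that will NOT affect bounce rate.
function sendScrollDepthEvent(ga, pagePath, percent) {
  ga('send', 'event', 'Scroll Depth', percent + '%', pagePath, percent, {
    nonInteraction: true // without this, the event would cancel the bounce
  });
}
```

Omitting the `nonInteraction` field (or setting it to false) makes the event an engagement hit, which is exactly how untracked sites end up with misleadingly high, or tracked sites with misleadingly low, bounce rates.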