It’s Not Working: The Reality of Measuring Mobile Ad-Blocking and the Pace of Advancement of Desktop Blocking

“Measurement is the first step that leads to control and eventual improvement. If you can’t measure something you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.”

–H. James Harrington

There is a massive problem in the making to which little attention has been paid, and its effects could have major consequences for the digital media industry. Over the last year, working with the largest online publishers, we have consistently heard that they see negligible ad-blocking on mobile and, increasingly, that they are losing confidence in their desktop metrics.

This puzzled us, given the rapid growth of mobile ad-blocking technology available to consumers. So we decided to do our own research and discovered something rather interesting: mobile ad blockers, and a growing number of desktop ad blockers (plugins, dedicated browsers, and network-level blockers), are now fooling publishers into thinking no ad blocker is present.

As a result, publishers are making poor business decisions about ad-blocking because they mistakenly believe they have no ad-blocking problem on mobile and that blocking is declining on desktop. Consequently, they cannot communicate with their ad-blocked audience to ask them to disable, or take any other action (such as messaging, ad reinsertion, or protection of any kind), simply because they don’t know the user has an ad blocker. Or worse, the publisher asks a consumer to disable their blocker when the detection was a false positive.

These problems are reinforced by news stories based on ad-block data released by industry leaders like Adobe and PageFair. Their 2015 Ad Blocking Report describes the mobile ad-blocking threat as “negligible”, and their 2016 Advertising Demand Report does not make a single reference to a mobile ad-blocking problem, even though it states that:

“Ad-blocking statistics courtesy of PageFair, based on data through June 2016, including number of blockers installed across 21 countries on desktop and mobile devices” [Emphasis added].

If a publisher mistakenly thinks it does not have a mobile ad block problem, it may try to make up blocked desktop impressions on the backs of its mobile viewers, as suggested in this BuzzFeed News story. If your mobile ad block rate is higher than your desktop ad block rate, which our findings show may well be the case, then this strategy will fail at best; at worst, you will permanently lose some of your mobile audience, because the increased ad frequency on mobile creates a negative user experience. This is painful to acknowledge, since high ad frequency has been one of the causes of ad block adoption in the first place.

The publishing industry has no idea what its true mobile ad block rate is, and it’s getting harder to measure desktop

“You’ve got to be very careful if you don’t know where you are going, because you might not get there.”

–Yogi Berra

Because publishers have no way to directly measure their actual mobile ad block rate, and because it is becoming more difficult on desktop due to the pace of advancement of blocking technology, the industry is doing the next best thing: looking at ad blocker app-store numbers, downloads from Google Play and Apple iTunes, and then employing elaborate calculations, guesses, hunches, and assumptions to estimate how many mobile ad block users they truly have. While you can make some large-scale estimates this way, the methodology is flawed because it does not rely on actual user data identifying the true mobile ad block rate.
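To see why download-based estimates are so fragile, consider a toy calculation. Every number and parameter name below is an illustrative assumption of ours, not measured data or any report’s actual methodology:

```javascript
// Toy sensitivity check: estimating a mobile ad-block rate from app-store
// downloads requires stacking assumptions, and small changes to those
// assumptions swing the result widely. All numbers are illustrative.
function estimateBlockRate({ downloads, activeShare, visitShare, audience }) {
  // downloads   : total app-store downloads of a blocker
  // activeShare : assumed fraction of downloads still actively used
  // visitShare  : assumed fraction of active users who visit your site
  // audience    : your monthly mobile audience
  return (downloads * activeShare * visitShare) / audience;
}

const base = { downloads: 5_000_000, activeShare: 0.4, visitShare: 0.02, audience: 1_000_000 };
const low  = { ...base, activeShare: 0.2, visitShare: 0.01 };
const high = { ...base, activeShare: 0.6, visitShare: 0.04 };

// Plausible assumption ranges alone move the "estimated" rate several-fold.
console.log(estimateBlockRate(low), estimateBlockRate(base), estimateBlockRate(high));
```

With these hypothetical inputs, the estimate ranges from 1% to 12% of the audience without any change in the underlying download number, which is the core weakness of the approach.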

The Priori Data / PageFair report Adblocking Goes Mobile states that there are “twice as many mobile adblockers than desktop.” That is a big enough headline that it should have made it into the Adobe / PageFair 2016 annual report (same company, same year), yet there is no mention of mobile ad-blocking in the Adobe report, even though it claims to cover both desktop and mobile. The reason: they simply don’t have the data.

As the war wages on (covered in our technical white paper here: http://www.adtoniq.com/2016/05/20/the-ad-block-battle-theater/), little reporting has focused on how the technology on both sides (ad-blocking and anti-blocking) works and on the shortcomings of the various approaches, leaving a real gap in understanding of what is actually happening.

Our Argument: Publishers are Relying on Faulty and Inaccurate Data

The results of our tests are cause for concern. They indicate that PageFair, and, we believe, most other detection technology, whether homegrown, open source, or off the shelf, fails to accurately recognize many of the ad-blocking technologies available to consumers today, and does not detect or measure on mobile at all. Blockers either go undetected, and are therefore unreported and completely hidden, or are reported inaccurately as “not blocked”. This should alarm publishers relying on PageFair and similar technology to determine their ad-blocking rates and the data derived from them. More importantly, if a publisher does not accurately detect an ad blocker, it not only has bad data but also cannot take any action on that blocker, whatever its strategy might be. We further surmise that in all cases actual ad-blocking rates are likely higher than what is being reported.
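Most detection scripts rely on a “bait” element: they insert a div with ad-like class names and check whether the blocker collapsed it. A blocker that whitelists known bait, or that filters only at the network level, leaves the bait untouched and is misclassified. A minimal sketch of that decision logic (the class names and the simplified check are our own illustration, not PageFair’s actual code):

```javascript
// Sketch of the bait-element technique common detection scripts use.
// In a browser, the script inserts something like:
//   <div class="ad adsbox ad-banner" style="height:1px">&nbsp;</div>
// and then reads its offsetHeight. Here we model only the decision logic.
function classify(baitOffsetHeight) {
  // Cosmetic blockers collapse ad-like elements, so a height of 0
  // is taken to mean an ad blocker is present.
  return baitOffsetHeight === 0 ? "blocked" : "not blocked";
}

// A blocker that hides from detection, or that blocks only network
// requests for ad scripts, leaves the bait visible: the page still loses
// its ads, but the script reports "not blocked", a false negative.
```

This is why evasion is so effective: the detector is not measuring ads at all, only a proxy that blockers can choose not to touch.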

What we tested

We tested the core PageFair analytics feature by registering a new PageFair account and adding their JavaScript to a new WordPress website we created for this test, following their directions. We used ten different ad block solutions to request pages containing the PageFair script. We monitored PageFair’s traffic breakdown report, refreshing it once per minute, to determine whether the page views were seen at all, and whether they were correctly attributed as blocked or not blocked. We also examined the browser’s JavaScript console to see what was actually blocked by each ad blocker. Because we saw an unexpectedly high failure rate, we then had an independent tester replicate the tests to verify that our initial results were accurate. You should be able to replicate these test results yourself.

The Test Results

In our test, only two of the ten ad blockers were correctly identified by PageFair; the other eight either hid from PageFair entirely, running undetected, or were incorrectly attributed as non-blocked when they should have been reported as blocked.

We do not yet know the size of this inaccuracy, and we are investigating ways to measure it. In the meantime, one thing we can conclude is that a publisher’s actual ad block rate is higher than what they see in PageFair and other similar technologies, whether homegrown, open source, or off the shelf.
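If a publisher could estimate its detector’s sensitivity (the fraction of real blockers the detector actually catches), the reported rate could be corrected upward. A back-of-the-envelope sketch, with purely illustrative numbers:

```javascript
// If a detector catches only a fraction (sensitivity) of real blockers,
// the reported rate understates the true rate:
//   reported = true * sensitivity  =>  true = reported / sensitivity
function correctedBlockRate(reportedRate, sensitivity) {
  return reportedRate / sensitivity;
}

// Illustrative only: a reported 10% block rate from a detector that
// catches half of the blockers in the wild implies a true rate of 20%.
console.log(correctedBlockRate(0.10, 0.5)); // 0.2
```

The hard part, of course, is that sensitivity itself is unknown and changes as blockers update, which is exactly the measurement gap described above.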

Google Analytics itself is blocked by some ad blockers. If a publisher relies on an ad block detection script that sends its results to Google Analytics or a similar system, the publisher will be missing data to the extent that blockers block Google Analytics, and a growing number of ad blockers do so by default.
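This compounding effect is easy to quantify: users whose blocker also blocks the analytics beacon vanish from the report entirely, shrinking both the count of blocked users and the apparent total audience. A toy calculation (all numbers are assumed for illustration):

```javascript
// If detection results travel through an analytics beacon that some
// blockers also block, those users never appear in the data at all.
// All numbers below are illustrative assumptions.
function observedBlockRate(trueBlockRate, beaconBlockedShare) {
  // Blocker users whose blocker also blocks the beacon are invisible,
  // so both the numerator and the denominator shrink.
  const visibleBlocked = trueBlockRate * (1 - beaconBlockedShare);
  const visibleTotal = 1 - trueBlockRate * beaconBlockedShare;
  return visibleBlocked / visibleTotal;
}

// A true 30% block rate, where half of blockers also block the beacon,
// shows up as roughly 18% in the analytics report.
console.log(observedBlockRate(0.3, 0.5));
```

The bias always runs in one direction: the more aggressively blockers block analytics, the lower the reported rate, never the higher.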

Ultimately, our research points to a rather frightening trend: publishers are being lulled into a false sense of security. They are relying on technologies that do not accurately detect ad-blocking, and these inaccuracies are invisible to those who have implemented strategies to counter it. This will become increasingly problematic as ad-blocking continues to expand dramatically on mobile and grows more sophisticated on desktop. Publishers need to be aware of these shortcomings; the business implications are profound.