comScore's 'Validated' Measurements Aim To Show Full Value Of Ad Impressions

In the continuing argument over whether the best way to measure ad effectiveness is to look at how many people have seen an ad, or whether it's more important to determine how impactful a placement is among those who have seen it, comScore's relatively new Validated Campaign Essentials, or vCE, product is meant to provide an all-seeing eye on both questions and more.

vCE launched in January with 12 blue chip marketers, including Allstate, Chrysler, Discover, E*TRADE Financial, Ford, General Mills, Kellogg’s, Kimberly Clark, Kraft, and Sprint, and comScore has now released some early findings about in-view rates, audience targeting, and the amount of fraudulent placements experienced by these major advertisers.

Among the top findings, of the 18 campaigns analyzed in December 2011, 72 percent had at least some impressions that were delivered adjacent to objectionable content, a problem that continues to plague advertisers despite ever-more sophisticated ways of detecting these sorts of issues. "While this did not translate to a large number of impressions on an absolute basis (14,000 impressions across 980 domains), it is important to note that 92,000 people were exposed to these impressions," said Anne Hunter, Senior Vice President, Advertising Effectiveness, comScore, in an interview with AdExchanger.

All of the vCE Charter Study ad impressions were delivered in "iframes," HTML elements that embed content from another page within the current one and are commonly used to serve ads, Hunter noted.

The study showed that 31 percent of ads were not in-view -- in other words, they never had a chance of being seen. There was also great variation across sites where the campaigns ran, with in-view rates ranging from 7 percent to 100 percent on a given site. This huge range suggests that even major advertisers making premium buys do not necessarily get what they're paying for.
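The arithmetic behind those per-site and campaign-level rates is straightforward to sketch. A minimal illustration in Python -- the site names and impression counts below are invented for the example, not comScore's data:

```python
# Hypothetical served/in-view impression counts per site (invented numbers).
impressions = {
    "site_a": {"served": 10_000, "in_view": 700},     # a 7% in-view site
    "site_b": {"served": 5_000,  "in_view": 5_000},   # a 100% in-view site
    "site_c": {"served": 20_000, "in_view": 15_000},
}

def in_view_rate(served: int, in_view: int) -> float:
    """Fraction of served impressions that had a chance to be seen."""
    return in_view / served

# Per-site rates show the wide spread the study describes.
per_site = {site: in_view_rate(d["served"], d["in_view"])
            for site, d in impressions.items()}

# Campaign-level rate pools all impressions before dividing.
total_served = sum(d["served"] for d in impressions.values())
total_in_view = sum(d["in_view"] for d in impressions.values())
overall = total_in_view / total_served
```

The point of pooling before dividing is that a campaign-level figure (here about 59 percent in-view) can mask per-site extremes like the 7 percent and 100 percent sites above.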

"We were not looking at ad effectiveness, which would look at did the ad generate awareness or a conversion or an offline sale," Hunter said, noting the scope of what comScore was and wasn't interested in here. "What we're finding is, you can't accurately measure effectiveness if you're counting ads that weren't seen in the first place. You can't start with effectiveness -- the issue you need to start with is about ad delivery."

On a more positive note, campaigns that had very basic demo targeting objectives performed well with regard to hitting those targets. For example, those with an objective of reaching people in a particular broad age range did so with 70 percent of their impressions.

However, as additional demographic layers were added to the targeting criteria (e.g., income plus gender), accuracy rates of the ad delivery declined. Ultimately, the results also showed that 37 percent of all impressions were delivered to audiences with behavioral profiles that were relevant to the brand (i.e., consumers with demonstrated interests in categories such as food, auto, or sports). One campaign had 67 percent of its impressions viewed by the target behavioral segment.

comScore's focus on vCE as a service reflects the competitive pressures in the analytics space, which has witnessed a feverish pace of acquisitions, venture capital fundings, and alliances for the better part of the last four years. In part, the vCE tools are the result of a number of comScore acquisitions over that period, including last August's $22 million purchase of ad campaign effectiveness firm AdXPose.

At the same time, comScore has also worked internally to develop better metrics around "in-view" impressions, on top of its general audience ratings. "Because both buyers and sellers of media have not had these holistic tools to understand the relative value of the placements, they haven't had the ability to price them accordingly," Hunter said.

One example cited in the study of this misunderstanding about the "holistic" value of a placement is the positioning of ads "above the fold" of a webpage versus below the fold.

"There is a common misperception that ads delivered ‘above-the-fold’ are seen, while ads delivered ‘below-the-fold’ are not," Hunter said. "While the quality of in-view rates can vary from ‘above-the-fold’ versus ‘below-the-fold’ ad delivery, the vCE Charter Study results help to dispel some of these myths. Surprisingly, the findings demonstrate that some ads delivered ‘above-the-fold’ were not seen because users quickly scrolled past them before the ad had a chance to load, and alternatively many ads placed ‘below-the-fold’ delivered a high opportunity to be seen because they happened to be situated near content that people were engaged with. As such, the pricing should reflect that as a premium buy, but without the tools to clearly identify that, publishers are at a loss and advertisers may be missing an opportunity by not looking for those placements."

6 Comments

I see the value in reporting ads that were "viewable" vs. "not viewable"; however, I'm confused as to how comScore is validating that certain demographic and behavioral targets were indeed reached.

After reading through their vCE Charter Study I am scratching my head even further - beyond the data comScore collects (via a survey) of ~20-30k panel members per month, how are they measuring WHO actually saw the ad with any level of significance?

If I was an advertiser / agency subject to this study, I would take these results with more than a grain of salt - perhaps an entire salt shaker.

I'm happy to clarify for you how Validated Campaign Essentials measures demographic and behavioral data. This information in vCE is based on comScore's Unified Digital Measurement methodology. This is also the basis for our widely accepted media planning and reporting tools and provides consistent demographics from planning to campaign delivery. The demographic data used to measure campaigns is not from a survey, nor from 20-30K people.

If I'm not mistaken - Unified Digital Measurement is based on a mix between your panel and "server-side metrics". I understand how that is relevant for measuring quantitative KPIs (e.g., uniques, views, etc.) -

Where I scratch my head is how you measure demographics/behavior using that same panel. (and while I won't get into it, I'm well aware that you do not have demographics on all 2MM panelists).

I should have been more clear in my response - where I think the pieces fall apart for me is that you cannot reach any level of statistical significance measuring Display Media with such a small universe of "known users". Further - I'm not sure how/where that panel could be purported to "verify" another data providers targeting.

For example - if a 3rd party data provider has ~15MM cookies of users who are identified as people 18-24 years old, comScore might only have 200k panelists who fit the same description. Considering that all of comScore's demographic data is collected via survey (vs. other data providers who use multiple sources) - how can comScore claim that their demo data is in fact more accurate than someone else's?

I'm happy to discuss this with you in more detail over coffee. The information you are stating is still unfortunately willfully incorrect. The most qualified entity to evaluate our demographic methodology is the MRC, and that's where that evaluation is being done. If you have comments or questions on the vCE Charter Study whitepaper, I'm happy to discuss.