
Commentary

TV's Next Big Show Is Shaping Up To Be Media-Buying Workflow

At a time when the TV industry is focused on
pushing the sizzle of new shows and upfront ad sales, a group of top television data and technology executives met quasi-secretly to tackle a burgeoning problem more akin to watching paint dry:
workflow management.

The TV industry was the earliest innovator in electronically managing the workflow of advertising deals -- especially the exchange of data for verifying, posting and measuring their ROI and yield.

But the digital ad business has overshadowed it in recent years with major investments in so-called “ad ops” to reduce the friction advertisers and agencies face in managing their digital media buys.

Eleven years later, the marketplace has grown considerably more complex, thanks to the proliferation of new media options, sources of data and, importantly, systems for managing and processing advertising buys. They include programmatic, addressable TV, attribution and modeling.

Estimating that there currently are “31 different products from six companies” that advanced TV advertising buyers must vet and choose from, Mitch Oscar kicked the meeting off by suggesting it is time to devise standards and a common language for defining the data and processes used to manage such buys.

The meeting, which was held in USIM’s New York City offices, was attended mainly by suppliers of data, workflow management and TV advertising inventory -- representatives of Comcast, AT&T, Cablevision, DISH, Freewheel, Google, TiVo, ESPN, Fox, Turner and others. It focused primarily on the current state of complexity and what the TV industry needs to do to reduce it in order to remain competitive.

One key development contributing to the complexity: Many of the biggest TV players have recently created their own data management platforms. The goal
is to help advertisers and agencies target their audiences with the same precision they use in digital media buys, utilizing actual consumer segments instead of Nielsen’s flatter demographic
audience breaks.

In many cases, the data is very rich, granular and tied to valuable first-party data, albeit on a privacy-compliant basis. The problem is there is no unified way to compare the segments, much less the overlap in reach and frequency that might occur when trying to build TV advertising schedules across them.

In April, three of TV’s biggest purveyors -- Fox, Turner and Viacom -- joined together to form OpenAP, a platform that seeks to unify the language used by their audience-targeting systems, a good step in that direction.

But
some of the attendees at Oscar’s meeting indicated it doesn’t go far enough.

“We have to find a lingua franca that we can all use,” Oscar said. He noted the problem is bigger than simply unifying a common definition of audience segments. It also relates to the growing complexity of measuring ROI for the buy side and yield for the sell side of advanced TV advertising deals, which may include digital and OTT inventory as well.

“There’s a lot of black box in terms of the attribution model,” noted Walt Horstman,
who recently joined TiVo as senior vice president-general manager of analytics and advertising from Comcast’s AudiencExpress, which pioneered programmatic sales and yield management for local
cable TV advertising.

One of the problems with so-called black boxes, which produce outputs but don’t necessarily disclose how the data was processed, he implied, is that they can be used to optimize any outcome the user wants to see.

“It was clear that whoever was paying for the study was going to get the results they wanted,” he said,
referring to some of the attribution analyses he witnessed while working at AudiencExpress.

As complex as advanced TV’s new workflow ecosystem is getting, some data management processors are taking steps to make it easier to use. The first step, said Lorne Brown, president of SintecMedia, is simply figuring out how the data needs to be organized.

Brown began outlining a working model for that on a post-it board (photo below).

Brown said the initial goal is to conceive a method of “product unity” the
entire TV industry can accommodate. He added the industry is moving toward a “fluidity model” built around commonly defined parameters that individual companies can use to make their own
cases.

“This is an incredibly complex problem we’re trying to solve,” he said. “How do we connect linear systems to nonlinear systems and let the work
flow?”

Interesting report, Joe. I think that Mitch is providing a useful forum for discussing and sorting out these issues, though I'm surprised to see so little representation from the broadcast TV networks at this meeting.

I realize that it will be regarded as a detail -- perhaps a nagging one -- but in addition to definitions that cut across all platforms and ways to create common datasets that allow all sellers to be evaluated using the same "currencies," there is the fundamental question of the value of the data itself. There's lots of talk about granularity, by which is meant Big Data set-top-box and similar large-sample information. The problem with this is that all of the proposed manipulations, indexing and similar systems that have been suggested are based on so-called household audiences, not viewer data, and even though this is acknowledged, it is then ignored, as there seems to be no way to solve the problem.

Unfortunately, set usage ratings for TV shows, rather than viewer ratings, are heavily tilted toward the sellers' interests, as they create the illusion that many shows "target" upscale and younger audiences to a far greater extent than is true. Since these are the most wanted consumers for most products or services, advertisers will be misleading themselves if they think they are improving their "targeting," no matter how large the panels are and how finely their set usage ratings are broken down.

Great piece, Joe, and so glad to see Mitch leading the charge here. As someone who spends most of my time focused on TV's "product" issue, it's great to see someone as expert as Mitch driving a group to attack the "process" issues, which are certain to get exponentially more complex as the TV industry adopts more digital-like audience targeting and optimization.

Dave is correct. Mitch Oscar should be lauded for his efforts to bring order out of chaos.

I just wonder if "The Industry" sowed the seeds of this disordered situation long ago by allowing and forcing the seller side to foot the lion's share of the bill for media measurement and data management.

Can sellers be blamed for framing their offerings in the best light? No one was stopping the buyers of media from reframing the debate or footing the considerable operational costs associated with research quality so that "The System" worked in the clients' best interests, or the best interests of all concerned.

Agreed that Mitch's effort should receive broad support. As Ed noted, the low broadcast net representation might continue to be a concern. One of my worries involves the competitive landscape, made even more complex by vested interests and the tendency of some major TV players to be late to the data-dances. Digital platforms have problems but they also have thousands of data-staffers tackling measurement issues and much more fluid and responsive operational models. Just keeping up with Google and Facebook changes is a full-time job.

Would you agree that what's needed is a very fast-track approach? Not one that seeks parity with digital, but one agile enough to respond to what we know is coming down the big data pipe -- everything from IoT to AI.

Agility is better than parity, when we don't even know the utility and quality of what we know is coming down the big data pipe. If we have a "pipe," then we have liquidity. And haven't we learned there is such a thing in certain market economies as too much liquidity?

I am just as concerned about what we don't know is coming.

In many ways, we have a form of global warming in the media, marketing and measurement world.

The liquidity of data is not a good thing, especially when our data towers and fixed assets are under the water of feuding vested interests who can no longer see the light or breathe.

A "very fast-track approach" may require stopping first to set new goals that are truly imaginative, actually reachable and not self-destructive.