YouTube ran ads from hundreds of brands on extremist channels

Ads from over 300 companies and organizations -- including tech giants, major retailers, newspapers and government agencies -- ran on YouTube channels promoting white nationalists, Nazis, pedophilia, conspiracy theories and North Korean propaganda, a CNN investigation has found.

Companies such as Adidas, Amazon, Cisco, Facebook, Hershey, Hilton, LinkedIn, Mozilla, Netflix, Nordstrom and Under Armour may have unknowingly helped finance some of these channels via the advertisements they paid for on Google-owned YouTube.

US tax dollars may have gone to the channels, too. Ads from five US government agencies, such as the Department of Transportation and Centers for Disease Control, appeared on the channels.

Many of the companies that responded to CNN said they were unaware their ads had been placed on these channels and were investigating how they ended up there. (Fuller responses from these companies are collected at the bottom of this article.)

One of the companies, Under Armour, is pausing its advertising buy on the platform after CNN notified the company its ads appeared on a white nationalist YouTube channel called “Wife With A Purpose.”

“We have strong values-led guidelines in place and are working with YouTube to understand how this could have slipped through the guardrails. We take these matters very seriously and are working to rectify this immediately,” a spokesperson for Under Armour said.

This is not the first time YouTube has put major companies’ ads up against controversial or extremist content, despite settings meant to protect advertisers.

The incidents have raised questions about whether YouTube can adequately safeguard ads and brands’ integrity, or whether its automated systems mean that advertisers will always be at risk of such ad placements.

After past incidents, some companies temporarily paused ads on YouTube, but then resumed advertising later on.

“YouTube’s bottom line hasn’t been hit,” said Nicole Perrin, a senior analyst at eMarketer who covers advertising and marketing technology. “If brands want to make sure this stops, the only way for that to happen is for them to stop spending [on YouTube] until it’s fixed.”

Brands continue advertising on YouTube to reach its huge audience, especially younger generations. YouTube says it has over a billion users, and that every day those users watch a billion hours of video.

“We have partnered with our advertisers to make significant changes to how we approach monetization on YouTube with stricter policies, better controls and greater transparency,” a YouTube spokeswoman said in a statement.

“When we find that ads mistakenly ran against content that doesn’t comply with our policies, we immediately remove those ads. We know that even when videos meet our advertiser friendly guidelines, not all videos will be appropriate for all brands. But we are committed to working with our advertisers and getting this right,” she added.

However, the YouTube statement does not address why the issue keeps happening.

How YouTube places ads

Almost anyone can create a YouTube account and upload videos, but the company decides which content and channels it puts ads on.

YouTube channels with 1,000 subscribers and 4,000 watch hours in the last 12 months can apply to make money from ads. Monetized channels are given a portion of YouTube’s ad revenue from the ads running on their videos. Videos can have ads even if the channels that posted them are not monetized.

Earlier this year, YouTube restricted which channels can generate revenue from advertisements, part of its broader effort to “prevent potentially inappropriate videos from monetizing.”

Advertisers entrust YouTube to decide — and define — what content is sensitive or extremist and not appropriate for their ads.

Companies can set broad targets for ads based on demographics and user behavior. While they don’t necessarily know where the commercials will end up, companies can blacklist certain channels and use a “sensitive subject exclusion” filter that is designed to stop ads from appearing on specific channels or content.

Many of the companies CNN contacted said they used the sensitive subject exclusion filter and expected their ads would appear on content that was “brand safe,” but the ad placements still happened.

This isn’t the first time YouTube has faced backlash over the gap between its definition of brand-safe content and advertisers’ expectations.

In 2015, major companies’ ads appeared on ISIS videos. Last year, some advertisers pulled their ads after YouTube placed ads on videos that included hate speech and extremist content.

“YouTube has once again failed”

YouTube continues to place ads on channels run by the online and digital network InfoWars, which is notorious for promoting conspiracy theories, despite the backlash it got from advertisers in the wake of CNN’s reporting in March. CNN has since found that YouTube also put ads for Mozilla and 20th Century Fox Film on a Nazi YouTube channel.

The companies’ ads ran on Brian Ruhe’s Nazi channel. YouTube subsequently deleted the channel for violating its community guidelines against spreading hate speech. Before its deletion, ads ran frequently on the channel.

Ruhe — who, when contacted by CNN for comment, emphasized that he did not want to be referred to as a “neo-Nazi,” because he thinks of himself as a “real, genuine and sincere Nazi” — confirmed to CNN that his channel was monetized before its deletion.

“YouTube has once again failed to correctly filter channels out of our marketing buys,” a 20th Century Fox Film spokesperson told CNN.

Ads for Jewish and Zionist groups, such as Jerusalem’s Friends of Zion Museum and the Jewish National Fund — a non-profit organization that owns land and plants trees in Israel — ran on a Ruhe video titled “David Duke on Harvey Weinstein exposing Jewish domination. Black/White genetic differences.”

The Friends of Zion Museum told CNN that it will not be advertising on YouTube “until we can be certain that incidents such as these will not repeat, and until we will be guaranteed that earnings from our campaign will not be forwarded to any channel glorifying Nazism or the Nazi crimes.”

In one video on Ruhe’s channel, former KKK grand wizard David Duke discussed what he said are racial and genetic differences between whites and blacks. YouTube placed a co-branded ad from Nissan and Disney that promoted the Nissan Leaf and the release of the film, “A Wrinkle in Time.”

The co-branded ad featured Disney executive VP of production Tendo Nagenda discussing diversity in the film industry. Even though the video bore an offensive content warning, which disabled certain features on it, like sharing, ads were still placed on the video.

“We are shocked Nissan advertising has been served next to inappropriate and disturbing content by one of our partners. We employ safeguards to keep digital advertising from appearing next to offensive content and have agreements in place with our partners to strictly adhere to our brand safety guidelines,” a Nissan spokesperson, commenting on behalf of both Nissan and Disney, told CNN. “Effective immediately, we are freezing all of our advertising on YouTube until we resolve this issue.”

The Genius of Play — a campaign by the Toy Association promoting play in child development — has also pulled its ads from YouTube after CNN found that ads for the campaign were placed on Amos Yee’s channel, which he has used to promote pedophilia.

Yee gained notoriety in 2015 for his YouTube video that praised the death of Singapore’s first prime minister. After he was granted political asylum in the US in 2017, he began using his YouTube channel to promote decriminalizing pedophilia.

After CNN reached out to YouTube about this story, the platform removed ads from most of Yee’s videos. After that, Yee told CNN that YouTube had decided to entirely remove his channel’s ability to monetize.

Other ads appeared on several channels posting North Korean propaganda like Red Star TV.

Red Star TV says it is an “information project of the DPRK Solidarity Group in which truthful information about North Korea is translated and disseminated.” A representative for the group told CNN that it is officially recognized by North Korea and receives “information support” from the regime.

Advertisements potentially funded by US tax dollars also appeared on the channels promoting North Korean propaganda. They also ran on channels connected to InfoWars, Yee’s channel and far-right channels.

The Centers for Disease Control, Department of Transportation, Customs and Border Protection, Veterans Affairs and the US Coast Guard Academy had ads on the channels. The Coast Guard Academy, the Centers for Disease Control, DOT and VA all told CNN they are working to investigate why the ads were placed on the channels.

Meanwhile, ads from the Washington Post and New York Times appeared on far-right and conspiracy channels like Black Pigeon Speaks and some run by InfoWars. The companies are investigating how their ads appeared on the channels.

“It appears that YouTube did not follow its own protocols and categorize these videos properly,” the New York Times told CNN. The paper said its ads should only appear on a list of pre-approved sites.

If the channels are monetized — which InfoWars has previously claimed they are — the major newspapers could have unknowingly supported disinformation and conspiracy theories.

Ads also appeared on The Jimmy Dore Show channel, a far-left YouTube channel that peddles conspiracy theories, such as the idea that Syrian chemical weapons attacks are hoaxes.

Update: This article has been changed to note that Nissan was commenting on behalf of both itself and Disney.

What the companies said

Amazon said it has filters in place for advertising. Facebook said it was working with YouTube to address the situation.

Adidas: “We were unaware of this situation on YouTube, and we are working closely with Google on necessary steps to avoid any reoccurrences.”

Cisco: “Cisco reviews our advertising placements periodically, including when potential issues are brought to our attention. Our brand safety guidelines mandate that Cisco branded advertisements do not appear on sites with content that does not align with our corporate values.”

Hilton: “We pay careful attention to where, and how our brands are represented, to ensure alignment with our strong company values. We have guidelines, using filters and other controls, to ensure our advertising appears on sites that align with our brand values. I can confirm that we have moved quickly and that our advertisement is in the process of being removed from this channel. We are also working with the relevant parties to investigate this matter and ensure all appropriate follow up actions are taken.”

Hershey: “At Hershey, we deeply care about how and where our brands show up in advertising. We work very hard to target the right consumers and ensure our ads appear within the appropriate content. We were not aware that our ads were appearing within this specific programming on YouTube. All of our programmatic ads use filters to ensure that they are appearing in the right place and right time. Given our current control systems, our ad should not have run in this program or any other program with political or news views — even if it is rooted in comedy. Our internal media team and our media partners are looking into why our ad appeared within this YouTube channel. It was not our intent and we are taking immediate steps to remedy this situation.”

Jewish National Fund: “It’s very upsetting to learn. More should be done to protect those folks who are putting up ads or campaigns so they aren’t put in places where they don’t belong. We take this very seriously. We’re going to look into this. We’re going to first investigate it on our end.”

LinkedIn: “One of our ads ran on an unapproved YouTube placement. The ad in question has been taken down and we are working with YouTube to understand why this happened to prevent a repeat occurrence.”

Mozilla: “This newly revealed incident where our advertisements ran alongside clearly objectionable content that does not reflect our brand or our values is deeply concerning to us, especially given steps we’ve taken to prevent this from occurring. We take this new report very seriously and are investigating how and why this happened, and what we need to do to fix it.”

Netflix: “We employ numerous filters to avoid having our content appear on sites or videos that clearly don’t represent us or our values. While that works well most of the time, there are a small number of instances where it doesn’t and we are working closely with Google to close that gap further.”

Nordstrom: “We have parameters that guide where our ads appear online, and aim to prevent them from showing up near certain types of content. We’re looking into this situation.”