Chaos grows on YouTube: Pedophile searches frighten advertisers

A few days ago, YouTube Kids was the service at the center of controversy; now the original site, YouTube, is the focus of criticism. The reason: its integrated search engine, whose autocomplete suggested a particularly disturbing completion for one search term.

When users typed the words "how to have" into the English-language search box, one of the suggestions the search engine offered was "how to have s*x with kids". The algorithms that drive YouTube continue to show their dark side, and advertisers are beginning to abandon a platform they do not want associated with their brands.

Autocomplete plays tricks on Google again


That search suggestion no longer appears on YouTube, thanks to reports from users and from outlets such as BuzzFeed, which uncovered the problem. Company spokespeople said they were investigating to "determine what was behind the appearance of this autocomplete."

The autocomplete feature has been part of Google's services for years, and this very option, which tries to anticipate our needs, has also been at the center of problems, curious results, and numerous memes in the past.

The feature is designed to avoid surfacing sensitive data and to keep phrases from being completed with "offensive, harmful or inappropriate searches." Google has also explained that the suggestions presented change frequently, and that these predictions "are produced based on a number of factors, including the popularity and freshness of the search terms."

Precisely that dependence on the popularity of search terms may have been exploited to cause the problem on YouTube. Some point to a coordinated campaign to game the algorithm, which seems plausible given the use of the term "s*x" with that peculiar asterisk, hardly typical of a conventional search.

More disturbing videos

On top of this problem, and the huge catalog of disturbing videos plaguing YouTube Kids, other worrying discoveries have surfaced in recent days.

Other research, also by BuzzFeed, revealed a large number of videos depicting disturbing scenes with children in vulnerable situations: tied up with rope or tape, for example, or in simulated kidnappings they were not aware of.

In other videos, children were made to "play doctor" with an adult, often in grotesque scenes in which the children were the innocent victims of a simulation controlled by those who published the videos.

Many of these videos come from verified channels and have tens of millions of views. The situation is once again worrying, although YouTube has reacted by removing thousands of offending channels.

According to a spokesperson for the service, "in the last week we have tightened our policies so that we are closing the accounts of users who make inappropriate comments on videos featuring minors."

"As a result," he added, "we have closed hundreds of accounts, removed more than 150,000 videos, and disabled comments on more than 625,000 videos. In addition, we have removed ads from more than 2 million videos and 50,000 channels that were presented as family-friendly content." These statements were accompanied by an official blog post explaining how the company will reinforce this area, but the consequences of these problems are already appearing.

Google's business model, compromised


All these events have led some major brands to stop advertising on YouTube and Google, especially after discovering that their ads were appearing alongside content published by pedophiles or on channels with disturbing videos.

Among those brands, as The Guardian reported a few days ago, are HP, Lidl, Deutsche Bank, Adidas, Mars, and Cadbury. A Mars representative commented, for example: "We are shocked and dismayed to see that our ads have appeared alongside such exploitative and inappropriate content. We have made the decision to immediately suspend all our online advertising on YouTube and Google worldwide. Until we have confidence that there are adequate safeguards, we will not advertise on YouTube and Google."

Other companies abandoning Google's advertising platform have made similar statements, and the problem is not actually new: a few months ago, the appearance of ads from companies such as AT&T and Verizon on controversial videos prompted both to withdraw from Google's advertising platform as well.

This advertiser withdrawal has had a direct effect on the YouTuber community, whose income has shrunk. The service has become a livelihood for many users, who recently protested that it has turned into a space where low-quality content is common and advertising is poorly managed.

The algorithms Google uses in its services have proven tremendously useful, allowing the company to offer more options to many more people, and to do so faster. But that spectacular growth is increasingly cornered by an evident fact: these algorithms still need stronger control and oversight mechanisms to prevent their misuse, whether intentional or not.