Microsoft's Bing search engine reportedly had a child porn problem

A report from online safety consultancy AntiToxin, commissioned by TechCrunch, found that several image searches using terms referring to children and pornography turned up explicit photos of clearly underage children.

Microsoft told TechCrunch it has removed these illegal images and is working to block such search results in the future — though AntiToxin found that some searches still surfaced the illegal images in question, even after the report was published.

Other social platforms have also had issues with rampant child porn, notably including Facebook's WhatsApp.

Microsoft's Bing search engine appears to have had a problem with child pornography.

TechCrunch reports that several queries on Bing returned child pornography — even searches on seemingly innocent terms, suggested by Bing itself, related to the popular video chat platform Omegle.

According to a report from online safety consultancy AntiToxin, commissioned by TechCrunch, Bing search results included images that "any professional or novice can determine to be of underage boys and girls posing in partial/full nudity, as well as partaking in various sexual acts alone, or with others, including other minors, and/or adults."

It's important to note that searching for terms related to child pornography could itself constitute a crime. As such, Business Insider is not including in this story the specific search terms AntiToxin used in its report.

Microsoft did not respond to a request for comment from Business Insider. The company told TechCrunch it fixed the issue after the publication made it aware, removing the illegal content from the search results cited in AntiToxin's report. Still, AntiToxin found that some Bing queries continued to turn up illegal imagery even after the report was published, according to TechCrunch.

"Clearly these results were unacceptable under our standards and policies and we appreciate TechCrunch making us aware," Microsoft executive Jordi Ribas told TechCrunch in a statement. "We acted immediately to remove them, but we also want to prevent any other similar violations in the future. We're focused on learning from this so we can make any other improvements needed."

In its report, AntiToxin found that Bing not only turned up image results showing child porn, but also suggested related search terms that turned up more illegal photos. A search for "Omegle kids," referring to the online video chat platform for talking with strangers, gave suggestions for related queries that sometimes returned child porn.

This isn't the first time Bing has come under fire for the images that pop up in its search results when the SafeSearch feature is turned off. How-To Geek reported in October that Bing surfaced racist and offensive results and related search queries, though Microsoft quickly set to work cleaning them up.

The tech titans have been fighting to keep their platforms free of child pornography and other illegal imagery for a long time. Explicit images showing illegal activity have been a problem on Twitter's livestreaming platform Periscope and on Facebook. Most recently, WhatsApp came under fire for failing to monitor a number of groups that openly shared child porn.

To monitor illegal and explicit content online, social platforms often employ content moderators to police what's posted. Facebook employs thousands of content moderators, and Facebook-owned WhatsApp employs around 300. Microsoft declined to disclose to TechCrunch how many moderators work on Bing.