TechCrunch discovered apps available on Google Play that led users to child porn groups on the Facebook-owned messaging service WhatsApp, all of which reportedly made money through Google and Facebook’s advertising networks.

According to TechCrunch, “child porn sharing rings on WhatsApp were supported with ads run by Google and Facebook’s ad networks,” through other apps hosted on the Google Play Store that linked users to the child porn groups.

“In a video provided by AntiToxin[…] the app ‘Group Links For Whats by Lisa Studio’ that ran Google AdMob is shown displaying an interstitial ad for Q Link Wireless before providing WhatsApp group search results for ‘child.’ A group described as ‘Child nude FBI POLICE’ is surfaced, and when the invite link is clicked, it opens within WhatsApp to a group used for sharing child exploitation imagery,” TechCrunch reported. “Another video shows the app ‘Group Link For whatsapp by Video Status Zone’ that ran Google AdMob and Facebook Audience Network displaying a link to a WhatsApp group described as ‘only cp video.’ When tapped, the app first surfaces an interstitial ad for Amazon Photos before revealing a button for opening the group within WhatsApp.”

In a statement, Google claimed to have a “zero tolerance approach to child sexual abuse material,” and declared, “If we identify an app promoting this kind of material that our systems haven’t already blocked, we report it to the relevant authorities and remove it from our platform.”

In its own statement, Facebook, the owner of WhatsApp, declared, “WhatsApp does not provide a search function for people or groups – nor does WhatsApp encourage publication of invite links to private groups. WhatsApp regularly engages with Google and Apple to enforce their terms of service on apps that attempt to encourage abuse on WhatsApp.”

“Following the reports earlier this week, WhatsApp asked Google to remove all known group link sharing apps,” the company continued. “When apps are removed from Google Play store, they are also removed from Audience Network.”

TechCrunch previously reported on WhatsApp’s child pornography problem and claimed the platform was ill-equipped to address it, with Facebook failing to dedicate sufficient resources.

Facebook itself has also had problems with child pornography, and in July, a former Facebook moderator claimed her managers had instructed her not to suspend accounts that shared child porn if the accounts were more than 30 days old.

One example which the moderator repeatedly saw was a video of two young children “standing facing each other, wearing nothing below the waist, and touching each other,” which the moderator claimed “would go away and come back,” and “appear at multiple times of the day.”

“Each time the user location would be different. One day shared from Pakistan, another day the US,” she recalled. “If the user’s account was less than 30 days old we would deactivate the account as a fake account. If the account was older than 30 days we would simply remove the content and leave the account active.”