On its twentieth anniversary, the UK's Internet Watch Foundation, backed by Microsoft's PhotoDNA technology, is urging Web companies to use its list of digital fingerprints to help prevent the upload, sharing, and storage of child sexual abuse images online.

The IWF hash list of the underlying code associated with child abuse images was distributed to Google, Facebook, and Twitter in August 2015. It is compiled by analysts at the charity, who have the gruelling task of sifting through photos and videos showing children being sexually abused. Every eight minutes they identify a new webpage containing horrendous images.

To date, 125,583 hashes have been added to the list, more than 3,000 of which relate to the abuse of babies and toddlers.

IWF chief executive Susie Hargreaves said that the charity's analysts had always removed reported images. "But in the past it could be uploaded again, and again," she said.

"This was incredibly frustrating for us and dreadfully sad for those victims. Now our new technology allows us, and any company which uses the Image Hash List, to hunt out those abusive images, meaning Internet companies can completely stamp out copies, stop the sharing, and even stop the image being uploaded in the first place.

"This is a major breakthrough. Each and every one of these images is the painful record of a child being sexually abused. Their suffering is very real. These victims have the right to know someone is fighting this important battle."

The IWF revealed the latest figures to mark 20 years since it began fighting the circulation of child sexual abuse images online. Since 1996, it has taken down a quarter of a million URLs.

Ars asked the charity to explain how Microsoft's offering would be protected against the danger posed by making its hash list available in the cloud. IWF's technical projects officer Harriet Lester told us that Microsoft was hosting the "cloud solution" on its servers, and said that it takes "great care to ensure security and use threat mitigation practices. These are vital to the protection of services and data."

She added: "The hashes which are hosted on the cloud do not leave the cloud. When a company wants to compare an image on their services, they use an API to send the image up to the cloud and receive a yes/no reply if the image has matched to the IWF hashes.

"The hashes will be in Microsoft PhotoDNA format, we do not host any child sexual abuse images on the cloud, a PhotoDNA hash is irreversible."

As of today, the charity says that 0.2 percent of child sexual abuse images are hosted in the UK, compared with 18 percent in 1996.

In 2015, the IWF removed nearly 70,000 child sexual abuse images from the Web. In the same year, more than 2,800 individuals in Britain were prosecuted for offences involving indecent images of children, a 27 percent increase on 2014.