
This article was published on January 27, 2020

Report: Cloudflare provided CDN services to sites hosting exploitative child content

For the ongoing series, Code Word, we’re exploring if — and how — technology can protect individuals against sexual assault and harassment, and how it can help and support survivors.

Cloudflare, a cloud company that provides security and content distribution services, provided CDN (Content Delivery Network) services to sites that host Child Sexual Abuse Material (CSAM). A CDN is a group of geographically distributed servers that cache a site's data so that visitors can load it faster.

L1ght, an Israel-based company working to provide a safe online environment for children, shared a report with TNW containing details of the websites featuring harmful content. We’re not disclosing the names of these sites for safety reasons.

The firm claimed it reached out to Cloudflare in November, but has yet to receive a response. L1ght estimates the cloud company’s CDN hosts tens to hundreds of websites tied to child exploitation networks.

Experts from the Tel Aviv-based firm actively look for email and social media accounts that exchange exploitative material. The company’s algorithm then scours the internet for sites that might be hosting such content. L1ght’s AI has been trained to catch text, images, and video related to child exploitation.

[Read: Microsoft’s new message scanning tool can help identify sexual predators in chatrooms]

In a statement to TNW, Cloudflare said CSAM is repugnant and illegal, and the company doesn’t tolerate it. It has also taken action on reported domains: 

At least one of the domains outlined in L1ght’s report was not using Cloudflare (and had previously been terminated for some time) when the report was released. We have processed all the additional reports of CSAM in question and taken action, as appropriate. 

The company didn’t comment on how many websites it booted off its platform or when it took the action. It pointed us to a blog post published last December announcing the launch of a tool to catch CSAM.

In our investigation, we found that, of the four websites mentioned in L1ght’s report, three are still using Cloudflare’s services in some capacity, even if the company has booted them off its CDN. We ascertained this from WHOIS lookups for these sites.
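For readers curious how such a check works: domains behind Cloudflare typically list Cloudflare name servers in their public WHOIS records. The sketch below queries the .com WHOIS server directly and looks for that signature. It’s our own illustration using a placeholder domain, not L1ght’s tooling.

```python
# Minimal sketch (our illustration, not L1ght's tooling): query the
# .com WHOIS server over TCP port 43 and check whether the record
# lists Cloudflare name servers. "example.com" is a placeholder.
import socket

def whois_lookup(domain: str, server: str = "whois.verisign-grs.com") -> str:
    """Send a raw WHOIS query and return the server's plain-text reply."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode())
        reply = b""
        while chunk := sock.recv(4096):
            reply += chunk
    return reply.decode(errors="replace")

record = whois_lookup("example.com")
print("Cloudflare name servers found:", "cloudflare" in record.lower())
```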

Around the same time, the New York Times published a report pointing out that more than three popular CSAM websites had used Cloudflare’s protection services to obscure their web addresses for years. Cloudflare later removed these sites.

Cloudflare added that it has provided 5,208 reports to the National Center for Missing & Exploited Children (NCMEC) and removed 5,428 domains from its service to date.

The fact that L1ght believes there are more such sites on the CDN, and that Cloudflare has already removed thousands of domains from its service, suggests the problem is bigger than the company might imagine.

Hany Farid, a professor of computer science at the University of California, Berkeley, who helped Microsoft create PhotoDNA, a technology that helps detect and remove exploitative images of children, said it’s important for companies like Cloudflare to develop tools that detect and remove CSAM:

I think that it is important for CDNs like Cloudflare to implement these sorts of technologies (like PhotoDNA). At the same time, it is important that other internet infrastructure services start to take their hosting of sites that are primarily dedicated to sharing and distributing CSAM more seriously by removing these sites from their hosting services when they are made aware of their presence.

It took us almost 10 years to get the major social media companies to take this issue seriously. I hope that it doesn’t take another 10 years to get internet infrastructure companies to do the same.
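PhotoDNA itself is proprietary, and real deployments rely on perceptual hashes that survive resizing and re-encoding. The simplified sketch below shows only the general hash-list pattern Farid describes; the exact-match SHA-256 digest and empty blocklist are stand-ins, not Microsoft’s or Cloudflare’s actual implementation.

```python
# Simplified illustration of the hash-list pattern behind tools like
# PhotoDNA. Production systems use perceptual hashes robust to
# resizing and re-encoding; exact SHA-256 matching is a stand-in here.
import hashlib

# Placeholder blocklist; in practice, known-bad hashes come from
# organizations such as NCMEC, not a hard-coded set.
KNOWN_BAD_HASHES = set()

def is_flagged(image_bytes: bytes) -> bool:
    """Hash the uploaded image and check it against the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES
```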

Globally, major companies are joining the fight to remove CSAM from the web. In 2018, Facebook introduced a new machine learning-based technique to detect and quarantine such content on its platform. Last November, the company said it removed over 11.6 million abusive posts between July and September 2019. Earlier this month, Microsoft introduced a new free tool to scan online chatrooms and detect predators.
