Update at 6:30 PM IST: The treaty between the US and the UK to disclose information about individuals suspected of serious criminal acts will not require social media platforms to install backdoors or break end-to-end encryption, a further analysis of the CLOUD Act reveals.
While the initial reports from Bloomberg and The Times implied it would force tech companies to hand over encrypted messages, the CLOUD Act suggests otherwise.
Short for ‘Clarifying Lawful Overseas Use of Data Act,’ it requires providers of electronic communication services to “preserve, backup, or disclose the contents of a wire or electronic communication” pertaining to individuals, regardless of whether they’re within or outside of the US.
But it’s also encryption-neutral in that it “does not create any new authority for law enforcement to compel service providers to decrypt communications.”
This means that law enforcement agencies outside the US can obtain basic metadata such as IP addresses, phone numbers, timestamps of messages sent and received, contact lists, and profile photos.
However, they cannot get hold of encrypted messages and attachments from WhatsApp, although it’s possible to obtain Messenger and Instagram Direct chats given the lack of end-to-end encryption.
When reached for comment, Facebook had this to say:
We oppose government attempts to build backdoors because they would undermine the privacy and security of our users everywhere. Government policies like the Cloud Act allow for companies to provide available information when we receive valid legal requests and do not require companies to build back doors.
In a separate post on Hacker News, Will Cathcart, Head of WhatsApp, said he is not “aware of discussions that would force us to change our product.”
We’ve updated the headline to reflect that it will just include electronic communications, and not the contents of encrypted messages. The original story follows.
Social media platforms like Facebook and WhatsApp will be forced to share users’ ‘encrypted messages’ with British police as part of a new treaty between the US and the UK, according to multiple reports from Bloomberg and The Times.
The accord is expected to be signed next month, and will force social media firms to turn over information to assist investigations into individuals suspected of serious criminal offenses such as terrorism and pedophilia.
Note the emphasis on ‘encrypted messages’ because, as things stand, WhatsApp’s messages are encrypted end-to-end, which prevents other parties (including WhatsApp) from snooping on your data while it’s transferred from one device to another.
So, technically, even if the company wanted to share the contents of some incriminating messages with law enforcement, there’s no way to intercept and decrypt them into human-readable text and images.
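To make that concrete, here is a minimal, purely illustrative sketch of the idea — it uses a toy XOR cipher and invented names, not WhatsApp’s actual Signal-based protocol — showing why a provider that only relays end-to-end encrypted messages can disclose metadata but not content: the key lives solely on the two devices.

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher (XOR with a one-time key) standing in for real encryption."""
    return bytes(k ^ b for k, b in zip(key, data))

# The session key is negotiated between the two devices and never sent to the server.
plaintext = b"meet at noon"
key = secrets.token_bytes(len(plaintext))

# The sender's device encrypts before handing the message to the provider.
ciphertext = xor_cipher(key, plaintext)

# All the provider can see (and hence disclose) is metadata plus opaque ciphertext.
server_view = {"sender": "+1555-0100", "timestamp": 1569400000, "payload": ciphertext}

# Only the recipient's device, which holds the key, can recover the message.
recovered = xor_cipher(key, server_view["payload"])
```

The point of the sketch is the asymmetry: the `server_view` dictionary is everything the provider possesses, and without `key` the payload is unreadable.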
But the implications, if true, could compel the platform providers to radically alter the way they function.
The services currently have the capability to capture metadata, such as the profile details of the participants in a conversation and the timestamps of messages. But disclosing the actual content would challenge the very basis of how messages are sent from one person to another.
Former Facebook chief security officer Alex Stamos took to Twitter to dispute the reports, stating the treaty won’t involve breaking end-to-end encryption or installing backdoors. “This agreement would allow UK courts to issue requests equivalent to US courts, but it DOES NOT grant them access to anything a US court can’t get already,” he tweeted.
The development comes as social media services remain actively exploited platforms for misinformation, hate speech, and the spread of child abuse videos, and countries around the world are rethinking their relationship with encryption.
Indeed, a New York Times investigation last week detailed that tech companies reported over 45 million online photos and videos of children being sexually abused in 2018, more than double what they found the previous year.
Facebook Messenger alone was responsible for nearly 12 million of the 18.4 million worldwide reports of child sexual abuse material, the report disclosed.
WhatsApp, earlier this year, came under the scanner for its role as a vector for sharing sexual abuse videos in India after a study uncovered over 50 WhatsApp groups with hundreds of members that were used to circulate child sexual abuse content.
The encryption debate
But this has also brought into focus a fundamental divide underpinning these communication services: encryption. WhatsApp is end-to-end encrypted, but Messenger is not — although it plans to add the feature as part of its pivot to privacy following immense blowback from the Cambridge Analytica data disaster last year.
Data obtained through a public records request suggests Facebook’s plans to encrypt Messenger in the coming years will lead to vast numbers of images of child abuse going undetected. The data shows that WhatsApp, the company’s encrypted messaging app, submits only a small fraction of the reports Messenger does.
Ultimately, this is symptomatic of a wider problem ailing social media.
The tools built to connect people and offer anonymity and privacy have also turned out to be potent weapons, ripe for misuse and for spreading harmful content, not to mention giving perpetrators a digital safe space to mask their actions.
Last November, UK intelligence agency GCHQ suggested an approach in which service providers would “silently add a law enforcement participant to a group chat or call,” without notifying the other participants. “You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication,” the scheme outlined.
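In rough terms, the proposal targets how group messaging is commonly built: the sender wraps a per-message key once for each member, so adding a silent participant only requires the client to wrap the key for one extra recipient and hide that recipient from the member list. The following is a toy sketch of that idea (illustrative names and a stand-in XOR cipher, not any real protocol):

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy cipher standing in for real public-key wrapping of a message key."""
    return bytes(k ^ b for k, b in zip(key, data))

# Each group member has their own key; the sender wraps the message key for each.
member_keys = {"alice": secrets.token_bytes(16), "bob": secrets.token_bytes(16)}
message_key = secrets.token_bytes(16)
envelope = {name: xor_cipher(k, message_key) for name, k in member_keys.items()}

# The 'ghost' change: the client also wraps the key for a hidden law-enforcement
# participant, while the UI simply never lists them -- so everything is still
# "end-to-end encrypted", just with an extra 'end'.
ghost_key = secrets.token_bytes(16)
envelope["_ghost"] = xor_cipher(ghost_key, message_key)

# The ghost can now recover the message key like any other member.
unwrapped = xor_cipher(ghost_key, envelope["_ghost"])
```

The sketch also shows why vendors objected: once the client can silently add recipients, the entire trust model rests on the software honestly displaying the member list.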
But in May, Apple, Google, Microsoft, and WhatsApp pushed back against the proposal to add a “ghost” user, claiming “it would introduce potential unintentional vulnerabilities, and increase risks that communications systems could be abused or misused.”
Facebook, for its part, has been repeatedly railing against the idea of building backdoors, stating it would fundamentally undermine the privacy of its users.
Although technology companies should rightfully assist intelligence agencies with specific investigations, compelling them to install backdoors to allow access to encrypted communications — as a solution to what’s widely known as the Going Dark problem — is akin to locking your doors and leaving the keys under the doormat.
It not only weakens the existing security infrastructure, but also puts the privacy and safety of millions of law-abiding citizens at risk.
Balancing privacy and safety
But encryption-based privacy is fast proving to be a double-edged sword. In spelling out his privacy-focused vision for social networking earlier this year, Facebook CEO Mark Zuckerberg rightfully stressed the need for balancing both privacy and safety:
Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things. When billions of people use a service to connect, some of them are going to misuse it for truly terrible things like child exploitation, terrorism, and extortion.
While privacy, the right to be left alone, is a fundamental human right, there are no black-and-white solutions to a contentious problem with this much grey area to wade through.
The privacy vs. security battle isn’t just about picking sides. It’s one thing to defend encryption in the context of a personal discussion. But it’s an entirely different matter when it’s a conduit to share images of child abuse, or to plan violent acts. It’s the same problem that’s plagued Tor.
What’s more, legislation seeking law enforcement’s access to encrypted communications sidesteps the potential for government abuse of these tools.
The Cambridge Analytica case made for the perfect poster child for how data can be misused, leaving Facebook and arguably the entire tech industry badly bruised.
Although the fallout from the data flap prompted Facebook to change its ways, these questions about privacy have yielded no easy answers for companies, regulators, or consumers who want the internet to stay open and free, and also want control over their information.