The tug of war between privacy and security has come to the fore again.
This time, the US, the UK, and Australia are making a case against end-to-end encryption (E2EE), calling on Facebook to delay its plans to implement the privacy feature across its messaging apps until “there is no reduction to user safety and without including a means for lawful access to the content of communications to protect our citizens.”
A “means for lawful access to the content” effectively amounts to providing law enforcement with a backdoor intercept, a request Facebook has consistently opposed, citing security concerns.
The development, first reported by BuzzFeed News, comes as the US and the UK are expected to announce a reciprocal data-sharing agreement under the CLOUD Act that would make it easier for law enforcement to seek information from tech companies about the electronic communications of terrorists, extremists, and sexual predators.
“Security enhancements to the virtual world should not make us more vulnerable in the physical world,” the joint letter addressed to Facebook reads. “Companies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes. It also impedes law enforcement’s ability to investigate these and other serious crimes.”
WhatsApp’s E2EE is powered by the Signal Protocol, and the move could also impact other messaging services, including Apple’s iMessage and encrypted chat apps like Signal.
“When a door opens for the US, Australia, or Britain, it also opens for hackers around the world,” The American Civil Liberties Union said in a tweet criticizing the proposition. “Companies should resist these attempts to weaken encryption that reliably protects our sensitive data from identity thieves, credit card fraud, and human rights abusers.”
The “Going Dark” problem
That encryption could hamper law enforcement’s ability to fight criminal acts is widely known as the “Going Dark” problem. But any backdoor mechanism built into a service not only erodes the security of the internet infrastructure, it can also introduce new vulnerabilities that bad actors can weaponize for malicious purposes.
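To see why a backdoor is structurally at odds with E2EE, consider how end-to-end encryption works in principle: the two endpoints derive a shared key that the relaying server never learns, so the server can only see ciphertext. The sketch below is a toy illustration using a textbook Diffie-Hellman exchange and a hash-based keystream — it is emphatically not the Signal Protocol, and the parameters and helper names are invented for this example; real systems use vetted constructions.

```python
import hashlib
import secrets

# Toy prime-field Diffie-Hellman parameters (2**127 - 1 is a Mersenne prime).
# Illustrative only — far too weak for real use.
P = 2**127 - 1
G = 5

def keypair():
    """Generate a private exponent and the matching public value."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    """Both endpoints compute G**(a*b) mod P and hash it into a key."""
    secret = pow(other_pub, priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def keystream_xor(key, data):
    """Hash-based keystream cipher; the same call encrypts and decrypts."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob each generate a keypair and exchange only public values.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)
assert k_alice == k_bob  # both endpoints derive the same key

ciphertext = keystream_xor(k_alice, b"meet at noon")
# The relaying server forwards only `ciphertext`; without one of the private
# keys (or a mandated backdoor) it cannot recover the plaintext.
plaintext = keystream_xor(k_bob, ciphertext)
```

The point of the sketch is that any “lawful access” mechanism would have to break this symmetry — either by escrowing private keys or by inserting a third key into the exchange — and whatever machinery grants that access to one government is equally available to anyone who compromises it.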
It’s worth noting the National Center for Missing and Exploited Children (NCMEC) received more than 18 million reports of online child sexual abuse last year, with over 90 percent coming from Facebook alone.
In March, the tech titan famously announced plans to shift to a more privacy-focused approach to its services after years of playing fast and loose with personal information. To that effect, it’s unifying the backends of WhatsApp, Instagram, and Messenger with E2EE, laying bare the tensions between privacy and safety.
Should Facebook’s encryption proposals go through, there is a real possibility that vast numbers of images of child abuse would go undetected. It could also jeopardize law enforcement surveillance efforts and the company’s own content moderation endeavors.
On one hand, law enforcement will no longer have access to Instagram and Messenger chat records of perpetrators. On the other hand, the security measure would make viewing and tracking problematic posts a lot harder, rendering content moderation on messaging platforms all but impossible. As they say, you cannot police what you cannot see.
Users, however, can still flag inappropriate content, which can then be decrypted locally on their devices and sent for review. And E2EE offers no protection at all once law enforcement has physical access to the devices in question.
No to weakening encryption
Facebook, for its part, has been against building backdoors into its services. “We oppose government attempts to build backdoors because they would undermine the privacy and security of our users everywhere,” the social media firm told TNW earlier this week.
CEO Mark Zuckerberg, in a public livestream of the Menlo Park behemoth’s weekly internal Q&A session yesterday, acknowledged the pivot to encryption would reduce tools to fight child exploitation, and said the company is working on ways to limit adults’ interaction with minors.
This is far from the first time the encryption debate has pitted governments against tech companies, and it won’t be the last. A couple of years ago, Apple and the FBI argued over whether the iPhone maker should create a tool to unlock one of the San Bernardino shooters’ iPhones.
Although there’s no known solution that both protects the privacy of online exchanges and grants access to law enforcement, it’s amply clear that Facebook is trying its best to rehabilitate its tarnished brand after a string of privacy and security lapses. But if it agrees to the governments’ demands, it risks committing “the largest overnight violation of privacy in history.”