This article is part of Privacy in the Pandemic, a Future Tense series.
With so many of us working, teaching, and socializing online much more than usual due to the pandemic, strong encryption is more important than ever for ensuring commercial information security and protecting personal privacy. Zoom, whose video conferencing software has nearly replaced in-person meetings for many people, has felt this pressure directly over the past few months. At the start of the pandemic lockdowns, the company faced intense scrutiny as it surged in popularity and suffered from a series of privacy and security issues, from Zoom bombing to misleading advertising about its encryption. The company’s leadership scrambled to respond, going so far as to acquire an entire cryptography company.
Earlier this month, Zoom's CEO announced a plan to roll out end-to-end encryption. E2EE is the gold standard of messaging encryption: data stays scrambled in transit and can be decrypted only by the intended recipient. But Zoom initially planned to make it available only to paid corporate users, explicitly stating that the company didn't want to offer E2EE to free accounts "because we also want to work together with the FBI, with local law enforcement." The backlash was swift, and within two weeks Zoom's security team updated its E2EE plans to extend the option to unpaid users. It was a victory for privacy advocates.
But on Tuesday, a group of Republican senators introduced the Lawful Access to Encrypted Data Act, which would make Zoom’s plans illegal—and more broadly threaten privacy just as Americans are relying on their devices more than ever.
This bill would compel tech companies to build “lawful access” mechanisms into a range of encryption products, including E2EE. E2EE means that the company providing the messaging platform, such as WhatsApp, doesn’t ever see the unscrambled data as the message crosses its servers. It can’t turn over the decrypted data to law enforcement even if it wants to. Cryptographers argue that there’s no way to allow “lawful access” without putting all of the data at risk as it travels the internet.
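To make that architecture concrete, here is a minimal sketch of the general idea behind E2EE, not Zoom's or WhatsApp's actual protocol, using the open-source PyNaCl library purely as an illustration: each participant holds a private key that never leaves their device, and the relay server in the middle only ever handles ciphertext it cannot read.

```python
# Minimal E2EE sketch (illustration only, not any vendor's real protocol).
# Requires PyNaCl: pip install pynacl
from nacl.public import PrivateKey, Box

# Each participant generates a key pair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meeting notes: confidential")

# The platform's server only relays this ciphertext. Without Bob's private
# key it cannot recover the plaintext, so it has nothing useful to hand over.
receiving_box = Box(bob_private, alice_private.public_key)
plaintext = receiving_box.decrypt(ciphertext)
assert plaintext == b"meeting notes: confidential"
```

The point of the sketch is the last comment: any "lawful access" mechanism would have to break or bypass exactly this property, which is why cryptographers say it cannot be added without weakening the system for everyone.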
The new bill would also require law enforcement backdoors to encrypted data "at rest"—think a locked iPhone or protected hard drive. Apple currently doesn't have copies of iPhone decryption keys, so when the FBI demands it unlock a seized phone, it genuinely cannot comply, leaving the bureau to find another way into the phone. Although there has been controversy over the exact number, the FBI has been stymied by encrypted phones at least 1,000 times. Attorney General William Barr complained in October that "this debate has dragged on, and … our ability to protect the public from criminal threats is rapidly deteriorating." Proposals for regulating encryption have been floated since the 1990s, each time spurring loud objections from researchers and digital liberties groups.
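For the "at rest" case, the design at issue is one in which the decryption key is derived from the user's passcode on the device itself, so there is simply no copy for the vendor to hand over. A rough sketch of that pattern, using the Python cryptography library and a hypothetical passcode, and assuming nothing about Apple's actual Secure Enclave implementation:

```python
# Sketch of device encryption where the vendor never holds the key.
# Requires: pip install cryptography
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passcode: bytes, salt: bytes) -> bytes:
    """Stretch the user's passcode into a 32-byte key Fernet can use."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode))


salt = os.urandom(16)                      # stored on the device; not secret
key = derive_key(b"user-passcode", salt)   # exists only in device memory

locked_blob = Fernet(key).encrypt(b"photos, messages, contacts")

# Without the passcode, neither the vendor nor anyone else can re-derive
# the key, so the blob written to storage stays unreadable.
assert Fernet(key).decrypt(locked_blob) == b"photos, messages, contacts"
```

A "lawful access" requirement for data at rest would oblige the manufacturer to keep some additional way to recover that key, which is the backdoor researchers object to.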
Over the past year, the FBI has focused on the problem of encrypted data at rest, especially those seized phones. Seny Kamara, a cryptographer and associate professor of computer science at Brown University, told me that the resurgence of this debate over the past few years meant "people sort of assumed something was coming … the government had been making veiled threats about this for a while." But some researchers had hoped any new regulations would leave aside the question of accessing E2EE data in transit. A bill requiring lawful access only to devices wouldn't necessarily be more dangerous during a pandemic lockdown than at any other time, since accessing a locked device requires law enforcement to have physical custody of the phone or hard drive. But scooping up encrypted data in transit from anywhere on the internet? That's much more threatening now that so much day-to-day life is happening online.
Privacy advocates were skeptical that the federal government would be satisfied with just unlocking seized phones, though, and LAEDA's requirement of lawful access to any encrypted data proves them right. Riana Pfefferkorn, associate director of surveillance and cybersecurity at the Stanford Center for Internet and Society, wrote in her analysis of the bill that she "did not believe for a single moment that law enforcement or Congress would settle for only regulating encryption as to devices and not data in transit." Pfefferkorn told me she believes the current push to regulate messaging in addition to encrypted devices is at least in part a reaction to Facebook's 2019 announcement that it would add E2EE to all of its messaging products. The new bill also comes hot on the heels of the EARN IT bill, another proposal that critics say is quietly designed to kill strong consumer encryption, and LAEDA's ambitious scope may be intended to make EARN IT look reasonable by comparison.
Even if this bill doesn't end up passing, the uncertainty in the meantime might make companies like Zoom unwilling to push ahead with ambitious encryption plans, which could set privacy timelines back by months or even years. "It's a disruptive environment," Pfefferkorn said, referring to the continual pressure from law enforcement over encryption. She added that tech investors are following this debate closely. Even for a company that wants to work with law enforcement, uncertainty about what might be required for lawful access, and how to accomplish it, makes it difficult to allocate limited resources.
Companies like Zoom and Slack have also faced backlash as their products expanded from an enterprise model to consumer accounts. Employers have long had an expectation that workers would give up some amount of privacy while at work, allowing bosses to monitor behavior and performance. Communication products aimed at enterprise customers sometimes have surveillance features allowing employers to access corporate email accounts, read chats, or monitor attention on webinars. "Those features" were developed within a "labor employment law, HR context," said Pfefferkorn. As these tools have expanded directly to consumers, there has naturally been a backlash against those features as privacy invasions.
Many of those corporate surveillance features are compatible with the kind of "lawful access" LAEDA is asking for, and incompatible with E2EE. Some types of data mining, like what Google has been known to allow with Gmail, are also incompatible with E2EE. Pfefferkorn believes the government is using these types of corporate data collection as justification for law enforcement access: "The government will say, well, if corporate has access to this type of information, we should be able to get our hands on that too." Sometimes, law enforcement can even buy third-party data directly, circumventing warrants altogether.
Tech companies trying to plan their privacy strategy over the next few years will have to balance competing demands from "enterprise interests, government, and consumers," said Kamara. "It's difficult to sort of juggle." Tech companies and researchers also need to be thinking not just about whether they're protecting the privacy of the data they've collected, but also about a more basic question: "should they have the data in the first place?"
With many people working remotely for the foreseeable future, away from the prying eyes of bosses, more people might look askance at backdoors in their communication platforms, regardless of who the backdoor is intended for. Having conversations with colleagues overheard in the office is one thing, but the idea of someone spying through your work video chat into your private home feels very different.
And with massive protests ongoing against law enforcement violence and systemic racism, giving those backdoors to law enforcement is likely to be especially unpopular with consumers, particularly those from marginalized groups. Kamara pointed out that communities of color have historically borne the brunt of surveillance of all kinds. The new surveillance powers proposed in LAEDA would very likely also be applied disproportionately to Black people and other marginalized communities, many of whom are currently suffering disproportionately from the coronavirus pandemic.
With coronavirus cases rising in much of the United States—including the states of the three Republican sponsors of this bill—and many places still in various forms of lockdown, voters might consider whether trying to weaken online security is the best use of congressional energy.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.