This article is part of Privacy in the Pandemic, a Future Tense series.
In her 2019 book, The Age of Surveillance Capitalism, Shoshana Zuboff gave a name to what she defines as “a new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales.” More disturbingly, she argues that the tech giants that now dominate the global economy are no longer satisfied with using their surveillance capabilities to collect what she calls our “behavioral surplus” and predict our behavior, and will soon be able to shape and control it.
The theory-heavy 600-page tome by a Harvard Business School professor emerita hit a surprising nerve (and earned an eyebrow-raising endorsement from Barack Obama, not exactly a typical surveillance critic). Today, it couldn’t feel more timely. Thanks to coronavirus-era social isolation, more and more human interaction is moving online, mediated by data-hungry tech companies. What’s more, Google and Apple are working to develop a contact tracing app that public health departments may use to monitor the virus as stay-at-home orders lift, raising familiar privacy concerns and public skepticism.
I recently spoke with Zuboff by phone to find out what she makes of the current moment, and found her in a surprisingly upbeat mood about the post-COVID-19 future of privacy and democracy. The following interview has been condensed and edited for clarity.
Joshua Keating: So, I actually read Surveillance Capitalism shortly before all this started.
Shoshana Zuboff: Oh really?
Yeah, so you’ve been in my brain a little bit as all this has been going on. One part I keep thinking about is when you discuss how the emergence of these data collection–based business models of the big tech companies coincided with the emergence of the post-9/11 security state to produce what you call “surveillance capitalism.” So, my question for right now is: Do you think the coronavirus will have a similar catalyzing effect?
That’s what the tech companies are hoping for. But things are very different now than they were 20 years ago. Therefore, I don’t think that they’re going to get their wish. That’s my short answer.
What’s your long answer?
There’s no question that in 2001, we experienced trauma that changed our politics over the next 20 years. Before then, the main conversation in Washington was about comprehensive federal privacy legislation. How far should it go? How deep should it be? Then [after 9/11] it took only 12 to 24 hours for that entire conversation in Congress to shift from privacy concerns to total information awareness. It shifted from “How do we control these whippersnappers in Silicon Valley?” to “How do we harness them and give them free rein to develop these surveillance capabilities?”
Google in 2001 had a revenue line of about $86 million and now has a market capitalization that’s plus or minus $1 trillion. How does that happen in less than two decades? It happened because of this fundamentally illegitimate economic logic that begins with taking something that doesn’t belong to them, claiming it as proprietary, and making a fortune on it.
This spectacular growth would not have been possible had it not been for their freedom from law. The lack of law allowed them to develop systems that were engineered to keep populations of users in the dark.
We now have a digital economy dominated by an economic logic that essentially steals people’s private experience for translation into data for marketing, for manufacture, for sales. We thought we were driving into the digital century toward full-throttle democratization and free access to knowledge. Instead, we’ve ended up with a feudal societal pattern based on these tremendous asymmetries, and instead of our having access to all kinds of proprietary knowledge, it’s turned out that proprietary knowledge has complete access to us.
"This spectacular growth would not have been possible had it not been for their freedom from law."
So, are we in a similar situation today? Are the big tech companies going to use this to increase their reach and market share?
Well, already in the past weeks there have been stories about how the tech lobbyists are intentionally exploiting the pandemic panic. For example, one thing that the lobbyists have been trying to do is get California legislators to agree to put the brakes on implementing new privacy laws in California and push that back to 2021. They are trying again to re-create this situation of surveillance exceptionalism. There’s another report in which Google’s former CEO Eric Schmidt is quoted as saying that, thanks to the pandemic, people are going to be grateful to Big Tech.
They have a vested interest in portraying the pandemic as an exception, just as the 9/11 attacks were portrayed as an exception. So, all concerns about surveillance, about privacy, should be set aside in favor of these companies being able to expand their role and somehow ride to the rescue. This is the fairy tale that they’re spinning, and of course, when people feel scared, fairy tales are very soothing.
Here’s where I differ from the fairy tale: Our circumstances are fundamentally different than they were in 2001. In 2001, nobody had any idea what these companies were or what they were up to. Now, the same companies are the largest, most powerful corporations in the history of business, and they are indeed empires.
But if they are empires now, then why wouldn’t they be able to use this moment to generate even more growth? So much human activity has been forced from the physical world to the internet in the last two months. And we’re being told that our salvation will be these track and trace apps that will finally let us leave our houses and return to normal. So why isn’t this an enormous opportunity for them?
It is an enormous opportunity. There’s no question about that. The question is, what are we going to do about it?
Now we know that they don’t have our best interests at heart. That they’ve been using their knowledge, and their computational capabilities, and their cornering of the data science labor force not to solve the world’s problems, not to solve world hunger, not to solve planetary catastrophe, not to cure 100 different kinds of cancer. They’ve been using it to predict our behavior and sell those predictions to people who can benefit from knowing what we’re going to do.
Certainly, in the last two years there has been a sea change in public attitudes that hasn’t yet overwhelmed the system, but it could. We’re not necessarily locked into this deterministic narrative that too many pundits are hawking and the tech companies are salivating over—that post-COVID we’re going to have comprehensive biosurveillance of all of society. People are worried. People are asking questions.
Yes, sure, we’re all concerned about privacy. But how can we take steps to protect our privacy if we have to choose between surveillance and indefinite lockdown?
That’s the bottom line here, Josh: Is the digital century going to be compatible with democracy? Some people may think that it’s necessary to accept a Google-Apple contact tracing app. But most people regard the tracking as acceptable only in a very narrow range.
What I want to say to people is, if you feel mistrustful of this, if you feel fearful of this, cherish that fear because that fear is your own. It’s a signal of your own awareness of danger. There’s an uncontrolled power here that is trying to expand.
Do you think it’s possible to do wide-scale contact tracing responsibly?
People in the public health field don’t blink when they refer to their surveillance systems because for them it’s obvious: If you’ve got an infectious disease, you’re going to try to contain it and eliminate it. And to do that you have to know where it is. For them, the word surveillance has very different connotations.
Public health administrations have been doing contact tracing with varying and increasing degrees of sophistication and tools for over a century. And I haven’t seen many examples of the public being suspicious of, mistrustful of, rebelling against, or being afraid of the ministrations of the public health officials who have come to talk to them in the course of this contact tracing. Why is that? My answer is, first of all, because these are professionals who are bound to scientific and professional codes, norms, standards, statutes. Second of all, these are public sector operations. They have no interest other than to serve the public interest, and they ultimately exist under the rule of law and under democratic governments because they are public operations. There is no secondary game. There’s no larger amassing of private data for the purposes of manipulation and control. There’s nothing but solving the problem, getting the disease contained and getting it eliminated. That’s it.
So is there hope for those who are concerned about privacy and our rights to our own behavioral surplus?
This has to be grasped in a historical context. Right now, we are marching into the digital century naked. But this has happened before. Imagine we were standing on the precipice of the third decade of the 20th century and trying to predict the future. We would look around, we would see a court system that decided every economic dispute based on property rights of ownership. We’d be looking at a world in which the right to join a union or strike was not protected in law, where people were set upon and beaten up and sometimes murdered when they tried to do those things. It would be a world where factories could continue to employ child labor, where there was no balance of power between workers and employers, no social safety net for welfare, for health care.
More than likely, we would be predicting a future world in which a small group of families owned these massive monopolistic behemoths of industrial capitalism and a population of industrial serfs had few rights and were sentenced to labor without any protection.
In fact, that is not how the 20th century turned out. It didn’t turn out that way because that third decade ended up being a period of intensely fruitful institutional development, where all kinds of new institutions were finally invented along with the legislative and regulatory frameworks to support them, to make industrialization flip to democracy.
So the message that I’m dedicating myself to right now is that this next decade really is critical. You could say that COVID and this state of exception will set us back, but I have to confess I see it very differently. I see it as highlighting, in the starkest terms, our failings and our vulnerability. This is exactly the motivation we need to come together and understand that we as users are not just some anonymous, unwashed, passive audience. Users are the new political force, in the same way that workers and consumers were a political force in the 20th century. And from this political force will come the movements we need if democracy is going to survive in this century.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.