Saturday, October 26, 2019

Cory Doctorow on Reclaiming Technologies of Oppression


Illustration by Natalie Matthews-Ramo

Each month, Future Tense Fiction—a series of short stories from Future Tense and ASU’s Center for Science and the Imagination about how technology and science will change our lives—publishes a story on a theme. The theme for October-December 2019: artificial intelligence.

Real Names Policy

“We’ve got to call you something,” the supervisor said as he wrote “BILL 2892” on the name slate. That was the first day, and BILL 2892 discovered shortly thereafter that he was to be called 92, not Bill, because there were 2,891 other BILLs that had transited through the system, and several of them worked in the same building where 92 was assigned. (It was Building 34, and there were lots more just like it, in a row that stretched all the way down the portlands, each long structure elevated on identical stained pilings that were striped with a series of high-tide lines that crept upward in the weeks and months and years that came after the first day.)

Before he was 92, he’d had many names, and before that, he’d not had any. The other kids called him YOU or KID or ASSHOLE, and the grown-ups they begged from called them FUCK OFF. Then there were the police who called him SONNY and SHUT UP, and then the teachers who’d called him Muhammed or—if they were white—Moe. He didn’t see much of the teachers because almost all of the teaching was on screens, and once he could pass the proficiency test, he’d been taken to the supervisor (name-badge: Mr. Iskal Dungog) and he’d become 92.

Ninety-Two’s work in Building 34 was as an exceptions-catcher for a Re-Cog facial recognition product. All around the world, millions of people stepped in front of cameras and made a neutral face and waited: for their office door to unlock, for the lift to be summoned, for the gates at the airport to swing open. When the camera couldn’t figure out their face, it asked them to try again, and again, and again. Then it threw an exception, and 92, or someone else in Building 34, got a live view of the feed and tapped an icon: NO-FACE (for anything that wasn’t a face, like a picture of a face, or a balloon, or, one time, a pigeon); BAD SCENE (poor lighting, dirt on the lens); CRIME (once, a decapitated man’s head; once, an unconscious woman; once, a woman in terror, a hand in her hair); and OTHER (for suspected malfunctions).

Ninety-Two was expected to tap the correct icon within three seconds. The next image would be on his screen within 1.5 seconds. Picture. Tap. Picture. Tap. All around him, stretching down long ranks, other refugees, mostly kids like him, tap, tap, tapping. When his shift was over, his eyes would swim as he shifted his focus from his screen to the work-hall and joined the other kids who were headed to the chow line. He made a friend the first week, a SARAH (SARAH 8,433, and he called her 33), and she showed him around: the most reliable cafeteria food, how to avoid getting in debt at the company tuck-shop, where the bullies were and what they did to you and what to do about it. She was about his age, which he thought was 13, and she reminded him of the girls who’d been the best at smoothing over fights in his gang back home, the ones who’d been scarred or maimed before being sent to beg. Thirty-Three was blind in her left eye, a puckered scar around her always-closed eyelid. It made 92 feel better about the missing fingers on his left hand.

Dolphin in the Tuna Net

Long after she’d got her footage, long after she reached the safe house and her heart stopped hammering and she’d settled into the cool and four bars of signal from the femtocell perched on the windowsill, where it could get line-of-sight to the tower in town, Yolanda was still unable to calm herself.

There were different kinds of anxiety: the anxiety she’d felt when she was recording the people massing for their rush, clammy under the thermal blanket with its layer of retroreflective paint that would confound drones and cameras; she walked among the people, their faces shining, their few things on their backs, their children in their arms, the smell of too many bodies and too much fear. Then there was the anxiety she felt as she retreated to her perch, where she had her long lenses, each attached to its own phone, all recording as the rush formed up, the blankets rustling and rippling and then ripping as bodies burst forth, right into the gas, into the rubber bullets, into the armored bodies that raised their truncheons and swung and swung and swung, while the klaxons blared and the drones took to the sky.

There was the anxiety she felt when the skirmish ended and she trained her lenses on the bodies sprawled on the concrete, the toys and bags that had been dropped, the child holding her mother’s limp hand and wailing.

But now came a different kind of anxiety as she edited her footage down, mixing it and captioning it, being careful to blur the faces, but being even more careful to avoid any of the anti-extremism trigger-words: migration, violence, race, racism—words that white nationalists used, but also words that were impossible to avoid when discussing the victims of white nationalism. Advertisers hated them, and algorithms couldn’t tell the difference between people being racist and people complaining about racism. There were new euphemisms every week, and new blacklists, too. In theory, she could just hit publish and when the filter blocked her, she could get in line behind tens of millions of other people whose videos had been misclassified by the bots. But she didn’t want to wait 10 months for her video to be sprung from content jail; she wanted it to go viral now.

Then she preflighted her video, to flag potentially problematic images and sounds so she could remove them. But she didn’t use YouTube’s own preflighter, because everyone knew that YouTube’s didn’t just help you figure out how to stay between the lines that Google drew and redrew every day—it also put invisible strikes against your video before you formally submitted it. After all, if you had to cut and preflight and cut and preflight a dozen times to satisfy YouTube’s filters, that was a blazing red flag that someone was trying to figure out how to sneak something through.

So she used a private tool that tried to approximate the YouTube filter models. It was volunteer-maintained and it was down as often as it was up, but when it worked, it was an invaluable way to spot and remove pesky spots of blood or lip-read racial slurs visible through the Customs and Border Protection visors. With its aid, she was able to scrub or blur a couple hundred frames of footage. She knew that using happy-face emojis to mask the worst violence would uprank her video with YouTube’s censorware bots, but she couldn’t bear to superimpose the grinning yellow face over the agonized boy who screamed as his father took a rubber bullet to the face. She cut the sequence instead.

Finally, she hit publish and watched the progress bar as the uploader squeezed the bits through the femtocell and then through an anonymizer. Sometimes she got up to pace the safe house’s peeling floors, stressing as the algorithm weighed her video in the balance and pondered whether to make it public.

The green checkmark faded on her screen as the video went live, and she pounced on the keyboard, triumphantly hitting the bookmark for her channel, but instead of the thumbnails of her hard-won footage, surmounted with the latest video, there was an unhappy YouTube logo and a message: “Your channel has been suspended for violating our community standards. Click here to learn more about community standards on YouTube. Click here if you want to ask for human review of your suspension.”

Garbage In, Garbage Out

Dion could have been the kid: Back when he was 15, he did stupid things in the company of stupid friends, and for reasons he still struggled to understand, it was always the stupidest friend who got to set the agenda. The only reasons he didn’t end up behind bars were his willingness to endure the contempt of his “friends” when he bowed out of the stupidest shit, and his fleetness of foot when he allowed himself to be dragged into things.

This kid sitting in the back of Dion’s cruiser was barely 15, and the cannabis oil in his vape pen was even legal—just not for under-21s. He didn’t look stoned, either. He looked terrified as Dion tapped through the cruiser’s risk assessment screens. The system had linked the kid’s overt social media identity with a bunch of pseudonymous accounts, and some of them quoted rap lyrics full of thug nonsense about guns and crime. He’d also been in a fight in the school cafeteria the previous year.

All that added up to a seriously high risk-assessment score, which meant that Dion had no choice: He had to take the kid in. Looking at the kid in the rear-view broke Dion’s heart. This wasn’t what he’d joined the force for. He’d dreamed of being a good cop, a cop who fought injustice and helped kids like this one find their way.

He tapped the screen to tell the stationhouse that he was bringing in a juvenile and to get the paperwork started so there could be an easy transfer when they arrived, and then he put the car into gear.

The kid—Caleb—jolted when the car started moving, straining against the cuffs and the seat belt. “Come on, man,” Caleb said. “Come on. Be serious, you can’t be taking me in for that. Come on.”

Dion hardened his heart and pretended he hadn’t heard.

“Don’t you have any real crimes to stop, officer? Come on, this is so stupid.”

“Stupid was getting high on a street corner with your homies.”

“Homies? Who even talks like that? Listen, Officer Corny, I wasn’t doing anything that plenty of white kids around this city aren’t also doing tonight. Why roll up on us? Is that why you became a cop, to come down here to this neighborhood and take young people to jail?”

Dion wished he hadn’t said anything, but he had, and now the kid was needling him. Damn it.

“Come on, it’s a misdemeanor. Write me a ticket, for Christ’s sake.”

“You know a lot about the law.”

“I do know a lot about the law, as a matter of fact. Do you?”

“I know enough to know that if you get arrested, you should keep your mouth shut until you get with a lawyer, which is what I suggest you do.” The light over the screen at the front of the cruiser pulsed to let him know that it was recording, then it set up the flicker that told him that its aggression detector had tripped, meaning the computer thought Dion was about to lose his temper. He heaved a sigh and maneuvered into traffic. It was a hot night—they were all hot nights, now—and though it had been a week since the floods, there was still a lot of mud and debris clogging the gutters on the side of Northwest 62nd. A lot of the rest of Miami was clean and tourist-ready, but not Little Haiti.

Caleb kept working at him, while the light glowed and blinked and reminded Dion that everything he was doing was being ingested into its database. Traffic was crawling and a quick check ahead showed him why—a huge debris field in a low spot where the floodwaters had deposited all the clutter and smashed property that the surge had swept through the neighborhood. The side streets had less crap strewn all over them, but they also had more people who were trying to detour around Northwest 62nd.

He thought about turning on the siren and the lights, but there really wasn’t anywhere for the cars ahead of him to go, and besides—

That’s when his cruiser alerted him, flashing a message that a shotspotter on a light-post had heard the sounds of a fight breaking out. He looked at the infrared-illuminated footage, indistinct and blurred, then decided he had better check it out. He mounted the curb and put the cruiser in park. “Stay there,” he said to the kid and locked him in.

It wasn’t a fight yet, but it was about to become one. The sweaty, angry, middle-aged white guy said he’d been cut off by a young Latina woman who wouldn’t let him merge into her lane. He had already kicked her tire and now he was getting ready to kick her boyfriend, who had gotten out of the car and was looking ready to go for it.

Dion calmed down the boyfriend and muscled the white guy away, not in cuffs, but he did pat him down before putting him in the back of the cruiser with Caleb. The old white guy was breathing like a steam-train and slapping furiously at the mosquitoes that had found their way into the cruiser. His face was coursing with sweat, and he smelled like rage and privilege. Dion hated him on sight but resolved not to show it.

He scanned the man’s ID and prodded at the screen. The risk assessment score was very low: The guy had great credit, his license plate was scanned making a daily commute to and from work with incredible regularity, and he’d never been arrested. His social media was “political/conservative” but not “extremist/identitarian,” which was also a score-lowerer. So many of these old white Floridians scored high on the extremist/identitarian bar, mostly from reposting white nationalist memes.

Dion let the man out of the cruiser and gave him back his ID. “Sir, I suggest you control your temper. It’s a hot, difficult night out here, and there’s no call for blowing up like that. It could lead to trouble. Do I make myself clear?”

The guy didn’t bother to hide his contempt. “Yes, sir.” Sir, like a curse word.

When Dion got back into the car, Caleb was apoplectic. “What the actual fuck was that, man?”

Dion put the cruiser in gear and willed his shoulders to unclench.

“Come on, are you serious? That guy walks away and I’m going to jail? You bastards don’t even try to pretend anymore, do you?”

Dion knew the recorder was rolling, but he didn’t care. The kid was owed an explanation. He pulled out of traffic and back up onto the curb, put it in park. “Look, you’re going to the stationhouse because you have a high risk score. Your social media, the rap lyrics, the people in your friends list, they all put you into a higher risk category. That guy I let go? He’s got a low score. He goes, you stay.”

“That’s bullshit. It’s not illegal to listen to hip-hop. And doesn’t the Constitution say I got freedom of association?”

“Yeah, it does. But you have to understand: It’s a system. It’s using the arrest data from all the perps who went down for something serious, asking a computer to find what they had in common, and that’s what makes your score go up, because you do the same things they did.”

Caleb glared at him. “You know that’s bullshit, right? I mean, I’m not even out of high school and I know that’s bullshit. Everyone they arrest is black. They ask a computer: ‘What do all the people we arrest have in common?’ And the computer answers, ‘All the things black people have in common,’ and that gets used to arrest more black people. Meanwhile, since white folks don’t get arrested for bullshit like this, the computer tells you there’s no reason to arrest them for bullshit like this.”

He turned around and looked at the kid. “You know, I expect you’re right about that.”

Caleb’s mouth opened, shut, opened. “For real?”

“Yeah, sounds right to me.”

“So you’re gonna let me go?”

Dion tapped the recording light. “Can’t. System’s watching. I got a procedure to follow.”

“You’re fucking kidding me.”

Rather than replying, Dion put the cruiser into gear.

They drove in silence. Just before they got to the stationhouse, Dion surprised himself and said, “You know, all those cameras and algorithms were supposed to take the bias out of policing, but sometimes they just take the human judgment out of the picture.”

Caleb made a disgusted face as they entered the station. The desk sergeant was ready to relieve Dion of his burden. At the last minute, he dug in his wallet and passed the kid his business card. “Look, it is what it is, but this is how you can reach me, you need some help.” That was something that actually scored him performance points with the algorithm, the human touch.

The kid took the card in his cuffed hands, wrestled it into his jeans pocket. “Yeah, all right.”

Dion caught the desk sergeant’s eye as he handed Caleb over. “He’s been real cooperative,” he said. “Not going to give you any trouble.”

He didn’t think it was true, but maybe it would make a difference. Maybe not, though.

I/O Error

Ninety-Two’s screen was going crazy. A face would appear, but before he could make a judgment about it, it would disappear as the system decided his intervention wasn’t necessary. Flick, flick, flick, the faces flickered across his screen faster than he could make sense of them, and when he raised his head to look around, he saw that it wasn’t just him: Across Building 34, there were dozens of kids staring in bewilderment at their screens. The kids whose screens weren’t crazy flocked around the kids whose screens were. Soon, everything had ground to a halt. Supervisors charged into the room, sweaty-faced, flustered, bellowing, “BACK TO WORK BACK TO WORK,” moving down the lines, seizing the flickering tablets, staring open-jawed and dumbstruck at them. This was not how the screens were supposed to act.

Walking away from your assigned work spot was totally prohibited in Building 34. You could lose all leisure privileges for a week just for an unauthorized bathroom break. But the normal rules were clearly suspended. People were milling, racing back and forth, calling out to one another. Ninety-Two found 33 in a corner with a group of other SARAHs and she pointed to the faces flickering past her tablet.

The faces were mostly brown, but not all, and the people appeared excited and scared as they looked into the camera. One woman’s expression riveted him: Though she had a black eye and an ugly bruise on her cheek, her expression was pure triumph.

“She’s having a good day,” 33 said. “I wish I could be having that kind of day.” The face was gone. There was a new one in its place. Thirty-Three’s friends were passing around lychee fruits, another forbidden pleasure in the workroom, and it was all the sweeter for that. He dropped the shell on the ground amid many others.

Ciudad Juarez got too hot for Yolanda after she refused to stop escalating the fake copyright claims on her videos. That was Their new favorite tactic. (She didn’t know who They were, exactly, but it didn’t matter, because “They” were just members of the crimeware gig-economy, cartel enforcers getting some pocket money by making themselves available for hire by anyone in the world at the tap of an app. Who knew who hired them? Superpatriot militias, ICE private “blackest black SEO” contractors, or just COINTELPRO 3.0.) She’d post a video and They’d claim it infringed Their copyright and it would disappear. Each claim was made by a different obviously fake name, but the platforms didn’t care, and when They made the claim, They’d also upload it to the rights-databases that prevented her from reuploading it.

A human rights person with a Twitter blue-check put her in touch with someone on the human rights task force of YouTube’s User Safety and Privacy group, who finally reviewed all these takedowns and flagged her account so future takedown requests would trigger human review—before the censorbots pulled her videos offline. The next time she posted a video—a local militia terrorizing and robbing a group of Salvadorans—it absorbed dozens of bullshit copyright claims without being pulled offline.

The triumph she felt in that moment was short-lived. That night, she caught a vicious beating from a group of three masked men who broke down her door. It might have gone beyond a beating if it wasn’t for the neighbor who heard her cries and raised a group of locals who crowded into her house. She made arrangements to sleep somewhere else the next night—she had safe houses everywhere—and so it wasn’t until the following morning that she found out someone had firebombed her house.

She had been targeted before, but never quite like this. She couldn’t bear to spend another minute in Ciudad Juarez, so she caught a bus south, searching along the way for connections that would get her to a Caribbean port, maybe in Tabasco or Campeche—anywhere the refugee ships were massing.

She’d almost gone to cover the ships as they were massing up. The word had been going out for months that there was a flotilla assembling across the Caribbean ports, a new kind of run for the border, something never seen before. The organizers weren’t coyotes: They were a cooperative out of Chiapas, pirate cellular operators who’d been running their own networks to coordinate community organizing and balance out their own micro-grid for a decade, hooked into the tech resistance around the world.

But Yolanda had decided against it. She was comfortable in Ciudad Juarez, plugged into her own network of activists and media hackers and meme-slingers, to say nothing of the abuelas who were determined to fatten her up with delicious home cooking. Let someone else document the flotilla.

But with her home smoldering and her neighbors at risk with every moment she lingered, the flotilla was looking like a safer bet than Ciudad Juarez. The bus would take about three days, depending on the connection in Mexico City, where she had to suppress the urge to run and hide as she transited from the bustling Terminal Central del Norte to the sparse Terminal Central del Sur, feeling incredibly exposed and vulnerable. After buying her ticket, she walked to a nearby cafe and nursed cups of coffee for six hours until it was time to board, sporting a broad hat and shades she’d bought from a little market stall.

The next day, as the road unrolled before the bus, she used the wireless to monitor the situation in Campeche, messaging her friends and contacts to make sure she’d have a berth when she arrived. The whole thing was shrouded in mystery: where the boat was leaving from, where it was going, how it would evade the Coast Guard patrols and the cameras that ringed every beach and port on the U.S. coast. The day stretched into night and she veered between elation at the thought of the wild high-tech secret the Zapatistas had come up with and terror that she was handing herself over to deluded techno-idealists or, worse, human traffickers who’d dump her in the sea if things went sour.

But once she stepped off the bus in Campeche, all her anxiety melted away. The air was crackling with excitement, and everywhere she looked, she saw people—singles like her, moms with kids, old men, young men, families, carrying bags that were a little too big to be mere shopping or holiday luggage. They made eye contact with one another, nodded. Someone gave her an orange; she shared her pumpkin seeds. She leaned out over a sea-wall to look at the surf and her ridiculous broad-brimmed hat blew out to sea and she snatched for it, nearly overbalanced, recovered, and laughed at herself as her heart thundered in her chest.

They were eight hours at sea when the Zapatistas gathered them on the deck to show them their new hats. They came in a variety of styles—snappy little porkpies, baseball caps, broad-brimmed straw hats, even the ridiculous boaters that had come into vogue during the American voter-registration drives and now could be had on the cheap in markets around the world.

“These are the projectors,” the Zapatista said. She was middle-aged, with a weathered face and a loud voice. She pointed to a tiny black box underneath the brim of the Marlins cap. “It has a battery, a little camera, and it’s waterproof. It senses the shape of your face and projects ultraviolet shapes onto it that are designed to confuse facial recognition algorithms. If you connect to it by Bluetooth, you can program it to make the cameras recognize you as anyone in the world, but for today’s purposes, we’ve just designed them to make your face unrecognizable altogether. The systems will not be able to tell that you’re anyone.

“Once you land, you need to get away from the site as quickly as possible, head inland and away from the zones with the most cameras. After you’re a couple of kilometers inside, you can take the hats off. Even if your face is recorded then, it won’t be matched to a face that came ashore that morning.”

It was a thrilling idea, but even more thrilling were the demos, with a large projector screen behind the Zapatista displaying a camera’s-eye view of the faces of random people, the system locking onto their faces with ease until they donned the hats, then losing the plot, drawing and erasing rectangles around eyes and noses and mouths as the UV light confounded its ability to sense and identify key points of facial geometry. Then, just for fun, the Zapatista put the hat through its paces, showing how it could be used to convince the algorithm that a little girl was Winston Churchill, then that an old man was Frida Kahlo, and finally that a big bruiser of a dude was Keira Knightley. They burst into applause.

Dion was nowhere near the beach when the screen in his cruiser alerted him. They’d had multiple reports of a flotilla coming ashore—individuals in dinghies and zodiacs and canoes and even rafts of floaties. Normally, that was CBP or the Coast Guard’s department: The camera-grid ringing the beaches and ports would capture images of the people as they came ashore, and any that got away would trigger alerts as they moved inland from the sparser grid. It was efficient, cost-effective, and minimally invasive—and since they’d installed it a couple of years ago, the weekly flotilla rushes had dwindled to rare occurrences. Sometimes a whole season would go by without one.

But this was new: scrambling every unit, every cop, every agency. Not just in Miami, either. Word was that there were flotillas coming ashore all along the Gulf Coast: Texas, Florida, even Louisiana. Dion’s wife sent him a meme from a presser where the president had gone purple-faced and spitting, and that was nice to see, anyway.

Every cam on Dion’s cruiser was on high alert, face-recoging every person it passed. Dion watched as it IDed and then passed on a string of petty parole violators and other low-level junk. He kept his speed down so it could ID each person he passed.

Then he found himself looking at Caleb’s mug shot, pulled up as the cam recognized him. Dion looked around and sure enough, there he was, staring hard at Dion through the cruiser window as Dion looked up. And the woman the kid was talking with, hair under a pretty sun-bonnet, her back to him, carrying a suspiciously large backpack? Even as Dion was sizing her up, the cruiser was advising him to get out and let Re-Cog get a look at her face.

He pulled over and blatted the siren once as he got out, stretched, kept his eyes on the kid. Caleb narrowed his eyes back, hate staring, and started to whisper urgently to the woman, who turned around, saw his uniform, blanched, grabbed her backpack.

“Hello, officer,” the kid said, interposing himself between her and Dion.

“Nice to see you again.” He was wearing an ankle-cuff, prominent on his bare calf under his basketball shorts. “Who’s your friend?”

The kid crowded him, stepping in his way. “No one, just someone asking for directions. How’s things with you?”

Dion used his forearm to nudge Caleb out of the way, stepping right up to the woman. Latina, young, scared, defiant.

“Hello there,” he said, crouching down a little to get her face in better range of his bodycam. It chirped at him a couple times, error noises he’d never heard before. “Could you just step over here, please?” The diagnostic readout would be on the cruiser’s screen. He led her to the cruiser, set her in the back seat. He could smell the fear-sweat on her.

The kid looked like he was about ready to throw a punch. Dion didn’t want to take the kid to jail again. He really didn’t.

The cruiser’s screen told him that it was having an “unexpected error” and tasked him with re-photographing the woman. He took a couple more shots using the in-car cameras and tapping on her face, but the camera wasn’t having any of it. The woman’s expression went from frightened to unreadable. Amused? Confused?

The kid was staring at him. The woman, too. It was a hot day. He got out of the car, let the woman out.

“What’s your name?” he said.

“Yolanda,” she said.

“How do you know Caleb?” He jutted his chin toward the kid.

“He didn’t do anything, I promise. He saw I looked lost and asked for help. Please don’t make trouble for him. Please.”

“Well, Yolanda, I think you’d best be on your way.”

“Thank you,” she said. Her Spanish accent was faint but unmistakable.

As he put the cruiser back in gear, he caught the kid’s eye. The kid nodded just a little at him, and he nodded back. If the system didn’t recognize the woman, he didn’t have to do anything. That was the rule.

Dion always followed the rules.

Lonely at the Top

Randolph’s CFO wasn’t going for it.

“Look, I hear what you’re saying, Randy, and you’re right, it makes great engineering sense, but you’re going to get slaughtered in the market.” Asha had held down a Morgan Stanley trading desk for years before getting her own fund.

When GR Group had bought Re-Cog and moved Randolph from COO to CEO, they’d brought Asha in to be his CFO and help him keep the investors happy. They’d generally had a good relationship, though it bugged the hell out of Randolph when she pulled out this “You’re just an engineer, you don’t understand how markets work” thing.

“Asha, I hear you too, but the reality is that this is a major product challenge. We need to reengineer the whole thing, retool now that we know that this attack isn’t just a theoretical risk. Once one bad guy figures it out, the rest aren’t going to be far behind. Doing that is a multiyear project. The fact that the Street won’t like it has no impact on whether it needs doing.”

She shook her head. “The reality is that the algorithms know that there are a lot of positions out there that will not hang in if they think you’re going to sacrifice a couple of quarters’ growth, and as soon as the wire goes out explaining that we’re ‘retooling,’ they’re going to short you into the poorhouse. I know that the work needs doing, but it just can’t be done this way. If you try, you’ll find yourself trying to fund the work without any capital.”

“You’re saying that the most profitable course of action is off the table because of automated trading bots? Come on, Asha, you can just write the press release to—”

She cut him off. “No, that’s not how it works. You’re the boss of this company, Randy, but you’re not the boss of the markets. They’re ruled by bots, and I don’t necessarily think that’s a good thing, but it is what it is.”

It is what it is was a thing Asha said, and Randolph knew what it meant: We’re done here.

Later, he sat alone, staring at his screen and staring at photos of the faces that his software couldn’t recognize as faces. And for a moment there, he found himself actually seeing them, the expressions there, the fear and hope. Fear that the system they were in was remorseless, tireless, uncaring and unreasonable. Hope that the system’s imperfections might let them slip through. He realized that it was the same expression that he was wearing.

Read a response essay by Nettrice Gaskins, an artist-educator who collaborates with artificial intelligence.

This story and essay, and the accompanying art, are presented by AI Policy Futures, which investigates science-fiction narratives for policy insights about artificial intelligence. AI Policy Futures is a joint project of the Center for Science and the Imagination at Arizona State University and the Open Technology Institute at New America, and is supported by the William and Flora Hewlett Foundation and Google.

Most Recent in Future Tense Fiction

“The Arisen,” by Louisa Hall
“The Song Between Worlds,” by Indrapramit Das
“No Moon and Flat Calm,” by Elizabeth Bear
“Space Leek,” by Chen Qiufan
“Zero in Babel,” by E. Lily Yu
“What the Dead Man Said,” by Chinelo Onwualu
“Double Spiral,” by Marcy Kelly

And read 14 more Future Tense Fiction tales in our anthology Future Tense Fiction: Stories of Tomorrow, out now from Unnamed Press.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.



from Slate Magazine https://ift.tt/32PxYwu
