Saturday, August 29, 2020

Racism Cannot Be Reduced to Mere Computation

Photo illustration by Natalie Matthews-Ramo

A historian of technology and race responds to Tochi Onyebuchi’s “How to Pay Reparations.”

Tochi Onyebuchi’s “How to Pay Reparations” spoke to me. Its themes struck virtually every note of my twentysomething-year career. In 1998, I made my first digital footprint by signing an online petition in support of reparations for the Tulsa race riot. I endured countless run-ins with Oklahoma good ol’ boys while crisscrossing the state, working for candidates representing a perpetually losing political party. As an academic, I researched Black politicians and white racial resentment, and testified as an expert in federal court in cases of reverse redlining and housing discrimination. And as a historian of technology, I’ve chronicled—like Onyebuchi—the stories of hope and despair wrought by computing technology on Blackness and Black people, in the service of an ever-triumphant white racial order.

America has never seen an acceptable way forward on the question of reparations. If we even get past the “whys,” we get twisted and tangled up in the “how-tos.” But in “How to Pay Reparations,” REPAIR Project Team statistician Wendy Guan says that’s precisely what algorithms, A.I., and machine learning are good for: showing us how. And that is where the hope, the wonder, the fantasy comes from in Onyebuchi’s story. We learn that a team of data scientists, statisticians, politicians, and lawyers has finally developed a “reparations algorithm.”

Some view reparations as white America’s ticket to redemption from its original sin and Black America’s salvation from a 400-year legacy of slavery, Jim Crow, and structural racism. But no amount of computational power can save white America’s soul, or restore Black Americans’ long-foreclosed-upon and deferred dreams. The belief that it can is at best a delusion. At worst, it is another in a long line of sick and sadistic tortures designed to inflict Black pain and suffering by ginning up false hope in technologies that have always worked against us.

Throughout the story—particularly beginning when Mayor Bobby Caine said: “Imagine that. A white mayor. Spearheading a citywide reparations scheme?”—I was reminded of James Evans, the patriarch of the Black family on the ’70s sitcom Good Times. If Evans had walked in at the beginning of the show with a fat check, his ticket out of the ghetto, you’d best believe the family would be back to eating cold oatmeal in the same damned housing project by the end of the episode.

We live in a time when we conjure tools to surgically remove bias from A.I.-driven facial recognition systems by feeding them more representative training data—a time when we deploy A.I. models to optimize profit but also to weed out discrimination in mortgage lending decisions (though not so well). Why, then, do the resounding hope and tragic failure of Onyebuchi’s so-called reparations algorithm seem so inevitable? Why are A.I. and its allied technologies so ill-equipped to produce a solution for reparations and racism? The answer lies in Onyebuchi’s powerfully affective political docudrama.


Neither reparations nor racism can be reduced to mere computation. When building the algorithm, Wendy was told to “focus on the tangibles.” We can quantify and model things like historical income disparities. Education outcomes. Geolocations of valuable institutions like grocery stores, hospitals, or banks. But Redacted, Onyebuchi’s data scientist, said it best. When asked, “What did you feed the algorithm?” Redacted responded: “They expect you to say something like ‘racism. We fed the formula 400-plus years of racism.’ Like it’s that simple.”

Exactly. How do you measure the loss of an Anthony Crawford, lynched because he was an uppity Negro? How do you measure who Mrs. Mary Turner might have been, or what the baby in her belly might have grown up to do and be, were their lives not stolen by a lynch mob’s noose? How do you quantify being left behind? That’s what happened to the Negro former farmworker whom civil rights activist Roy Wilkins was thinking of when he wrote, “The computer is but one more signal that he has been kept at arm’s length while the rest of America pressed forward into the computer era.” All of this—the violence, the economic toll, the psychological toll—is nebulous and unquantifiable. We can no more account for it, much less model it, than we can the moral turpitude, or the social, economic, and political loss of what might have been possible.

Sure, we can imagine designing an algorithmic system to sustain the direction and process of reparations by dynamically “learning” from future data. The problem is, there is no way to arm that algorithm to learn and account for America’s racist past.

But what makes us think that A.I. will be mobilized to work toward anything but the detriment, rather than the interests, of Black people anyway? The threatening robot dogs that stalk Onyebuchi’s neighborhoods—even if they are intended for pandemic surveillance, the residents know they have other capabilities—already have analogues in today’s world. Some of them come in the form of facial recognition systems that police use to locate and arrest protesters, or algorithms that use stereotypes about Black athletes’ brain function to defraud them of compensation for head injuries. Add these to the now all-too-familiar credit scoring algorithms that deny Black and brown folks access to employment, recidivism algorithms that keep us locked up, and detainment algorithms that divide immigrant families, just to name a few? Nah. No hope in that.

The REPAIR Project Team devised its algorithm. Councilman Perkins made a plan and got it passed. And all an ambitious white mayor had to do was to step up and take the credit. And he couldn’t even do that, much less convince white people to do something they had never done before—“willingly and openly share in the economic bounty of this country.” The story’s ending, not to mention the criminal roll-up that scrolled through the credits, says it all. Change here requires a deliberate, revolutionary rebalancing of social, economic, and political power. To even think seriously about doing that requires an exercise of collective human will. And there is no algorithm for that.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.



from Slate Magazine https://ift.tt/32wX4RM
via IFTTT
