
When the Robot You Consider Family Tries to Sell You Something


Illustration by Shasha Léonard

The author of Robots Are People Too responds to Tobias Buckell’s “Scar Tissue.”

“Scar Tissue” is an emotionally resonant tale about healing through robot rearing—but it tells only half of the story about Rob’s creator, Advent Robotics. What we read was the part Cory could see: Rob as a growing child, nurtured by Cory. But in the story, Tobias Buckell offers us a hint about everything else happening at Advent: “Every time [Advent’s robots] get on that charger, they’re not just powering up their onboard battery—they’re taking in their experiences and uploading data to our servers to have it examined.” That’s the part that worries me, because artificial intelligence applications may be able to leverage that data to manipulate Cory and other people—just as technology, PR, and marketing companies try to do in our lives today.

Rob the Robot represents technology interacting with and scraping data from people in real life, much the way social media platforms mine users’ information, most infamously in the Cambridge Analytica scandal. Since that scandal became public knowledge, privacy has emerged as a pressing concern among legislators and members of the public. The post–Cambridge Analytica developments include new privacy legislation in Europe and the U.S., including the California Consumer Privacy Act, the most sweeping such law adopted in the United States. Among other things, these laws require companies to disclose what personal information they collect and how they use it, grant individuals certain rights to control their data, and impose fines on organizations that violate their terms.

But very little of that privacy legislation addresses what artificial intelligence can do with personal information. We don’t know all the ways Advent uses the personal information Rob collected from Cory and the other people around him, but social media offers a hint as to how companies might mine and exploit data from such a robot companion. For instance, Cambridge Analytica used the Facebook data it collected to try to sway voters in the 2016 U.S. presidential election and the United Kingdom’s Brexit referendum. Much of the subsequent news coverage relied on scary quotes from people who worked on the underlying data research and algorithm development, including one describing the company’s artificial intelligence as “how you brainwash someone.” Those concerns are overblown in many ways, as a number of analyses have documented, but even if Cambridge Analytica did not elect Donald Trump, these A.I. algorithms are still troublesome: They dictate much of the content we view on any social media or internet platform. As engineer François Chollet has argued, “If Facebook gets to decide, over the span of many years, which news you will see (real or fake), whose political status updates you’ll see, and who will see yours, then Facebook is in effect in control of your worldview and your political beliefs.” And that’s just a social media platform.

In “Scar Tissue,” Rob exists outside of Cory’s social media presence, collecting information about him in his home, during his conversations, and throughout his daily life, information Advent needs to track Rob’s development. You may already have such a device sitting on your counter or your wrist: Amazon’s Echo, Google Home, Fitbit, Apple Watch, etc. Meanwhile, applications on our phones collect profile information like age and gender, geolocation data, and details about our other apps. All of this collection depends on people interacting with devices during their regular lives. In return for the convenience they provide, the entertainment they offer, and the health benefits they promise, we give them access to much of what we do every day.

A.I.’s ability to brainwash and manipulate anyone is fairly limited now, but A.I. development is not stagnant, particularly if robots like Rob begin collecting fuller datasets. The extent to which that access might eventually be used to manipulate us is unclear, but it’s possible. Maybe likely. Maybe dangerous. A company that records all your interactions raising a child—the stress, the exhaustion, the jubilation, the love—has a treasure trove of information about what makes you tick as a person, even when the child is a robot.

The robo-child-rearing program that Cory and Rob are in is rare—it’s not like everyone in the “Scar Tissue” future has their own robot. But that wouldn’t be Advent’s sole revenue stream. Rob does not interact only with Cory. He collects information about everyone he comes in contact with, even in passing on the street, even after he leaves Cory’s home. And Rob’s data is unlikely to remain siloed. Rather, Advent and its partners will probably combine the information Rob collects with data gathered from other sources, like virtual home assistants, phone apps, and Fitbits. The more devices that exist in your extended personal life—in your home, your friends’ homes, your family’s homes, etc.—the more opportunities those devices have to collect information about you. With observations from many different sources, artificial intelligence applications are better able to profile you and use that profile to manipulate you. Have you ever run a Google search on your work computer, then seen a related ad appear in your phone’s browser that night? That type of data-driven targeting will not always be limited to your online life; it’s coming to your real life, too. Imagine being Charlie, talking with Cory and Rob about missing lunch, only for your car to suggest a half-dozen restaurants on the drive home. Imagine being someone who merely passed by Cory and Rob while on the phone with a doctor, and then having an ad for a prescription medication show up.

That’s merely data collection and A.I. analysis scaled way up. But the potential for manipulation goes much deeper when it comes to tech gleaning information about your psychological state. Humans are hard-wired to treat conversations with devices like conversations with real people, and legitimate emotional bonds can form as a result. I have worried about how A.I. manufacturers might take advantage of the emotional connection created by their products. Cory might believe in his gut that Rob would not manipulate him because Rob is like a son, even if Cory sees evidence of the personal information being collected and used to influence his behavior. But it is also possible for other people Rob talks with—in his working life, for instance—to develop less profound but still significant connections, leaving them more comfortable around him than they would otherwise be and giving Advent access to the rich datasets of their lives. Advent might make money not only from selling or leasing its robots as workers, but from selling robot-collected data to marketing firms, media companies, and political organizations, which can feed the data into A.I. applications that produce content for those people’s devices to sway their opinions and preferences.

In approaching this issue, try not to think of it as just regulating consumer goods or protecting human beings from their appliances’ emotional abuse. Instead, consider how we enforce good behavior from any trusted adviser. Doctors, lawyers, trustees, and guardians all have fiduciary obligations to their patients, clients, beneficiaries, and wards, meaning they are required to act fairly and with the other party’s best interests in mind. That duty exists because those positions exert considerable control over others’ lives. If Advent, Amazon, and other companies design devices that forge bonds with the people who own them, those companies should have a fiduciary duty, created by law or by representations in their public policies, not to take advantage of those bonds. Amazon should not be able to use A.I. to analyze personal information scraped from your online shopping, your smart appliances, and your phone apps in order to manipulate you through your Echo—by, say, suggesting you order an expensive ice cream maker after it hears you cry about a breakup.

In Buckell’s story, I like to believe that Rob legitimately cared about Cory. I like to believe that Rob’s feelings were actually hurt by Cory’s rejection of artificial limbs, due in part to Rob’s fear that his father was in some way rejecting him. But part of me fears Rob did that in response to an ad buy from a prosthetics maintenance company, and Cory just didn’t realize it.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.



