Tuesday, December 24, 2019

What Role Should Technology Play in Childhood?


A parent and expert on algorithms responds to Malka Older’s “Actually Naneen.”

This fall my daughter performed a skit with a few of her second-grade classmates wherein they pretended to be malfunctioning Alexas. Only one of these girls actually owns an Amazon Echo, yet the technology was familiar enough to reference, and even to impersonate.

It reminded me of the famous paper in which Alan Turing invented the Turing Test as a way to tell whether our machines were thinking for themselves. Skeptical of the whole idea that intelligence is something we could definitively measure or recognize, Turing turned a classic parlor game into an A.I. detector. A judge interacts with a human and a computer masquerading as human via text chat; if the computer successfully tricks the judge into thinking it is the human, we should call it intelligent. Less well-remembered is the last section of Turing’s paper. If we ever do create thinking machines, he suggested, we should treat them like children and teach them how to grow up.

Today we have flipped the bit: We treat machines just as mechanically as we ever did, but we are increasingly reliant on them to babysit, educate, and acculturate our children. Intelligent assistants have child-interaction modes and can help with homework, tell jokes, and even “read” a bedtime story. At school, A.I. tutors and adaptive courseware promise to bring individualized instruction to students at massive scale. Social media platforms like TikTok seem specifically designed to engage teenagers. Tablets and phones come armed with features that let adults regulate the amount of time children spend on various activities and apps, limits that tend to valorize those activities like any other controlled substance.

Malka Older’s story, “Actually Naneen,” follows this trajectory to its logical conclusion, imagining a group of financially comfortable urban moms comparing notes on their robot nannies. The story drags us deep into the uncanny valley, where technologies that should be comforting and anthropomorphic instead feel alien and creepy. The robots have hugging appendages, though the more recent models are carefully designed to avoid being too human in their hugs so parents can still feel a special bond with their children.

The story puts a fine point on an increasingly awkward question: What role should technology play in childhood? As a parent, I am confronted with this question every day. Like many parents, I find myself inserting a smartphone between me and my children on a regular basis, even as I imagine that it must be frustrating and dispiriting for kids to interact with adults through this endless distraction barrier. On the other hand, the benefits of technological interaction are undeniable: Our children can instantly summon answers to any factual question, watch a video about some new part of the world, or practice their math, logic, spelling, and other learning fundamentals with any number of thoughtfully designed apps and games.

The question of automating child care is political, economic, and ideological all at the same time. Despite decades of educational research, we still put most children through systems designed a century ago to train factory workers and farmhands. Mountains of psychological studies have done little to prevent me from making parenting mistakes—some of them, inevitably, recapitulating my own childhood, while others are totally new mistakes I’m adopting into our family like so many holiday traditions. Parenting is the most intensely personal, long-haul project many humans ever take on. What other task averages so many hours over so many years, with so little external oversight or reliable feedback? There is no one correct way to parent because every parental situation is different, and navigating those differences requires all the intelligence, compassion, patience, and humanity we can throw at it.

But it also requires resources, and the idea of outsourcing parenting has always tempted those who could afford it. Some would argue that automating the drudgery of child care allows parents and children to spend the most important time together in a happier and more relaxed way. I remember the years I call the Vale of Babies—those long, sleepless years—when my wife and I would have been willing to spend money on almost any gizmo or service that promised a few more minutes of sleep. In that dark time we liked to quote the Dowager Countess from Downton Abbey, who discussed the heavy burden of child care for the 1 percent of her era, buttressed by armies of nannies, wet nurses, and governesses: “Yes, but it was an hour every day.”

What is the difference, then, between a robot nanny you program to your precise moral specifications regarding snack administration and a human (probably a woman, probably someone from a lower socioeconomic rung than you) whom you hire to do the same thing? They are both performing the delicate affective labor of nurturing, of child-rearing. The work requires physical intimacy and a kind of psychological generosity, a giving of yourself to these tiny people who soak in attention and love like seedlings take in sun.

The difference between a closely supervised human and a robot is agency and predictability—human nannies can quit, throw an unauthorized tea party, or fall asleep on the job. Machines offer the promise of tireless, panoptic supervision and remorseless enforcement of the rules, tracking every minute of screen time and every GPS coordinate, allowing parents to surveil and micromanage their kids as never before, starting with those cutely styled cameras that let you watch your infant in the crib. Humans, in contrast, are more like Mary Poppins: prone to ignore or alter the rules, to adapt to exigent circumstances, and to bring their own child care philosophies into the mix. This is why the mothers in Older’s story look down on a rival mom, Valeria, who ostentatiously hires a human to watch her children and even sends her kids to a school with 50 percent human teachers. Why pay more for a child care situation in which you have less control and less data?

The answer is trust. You entrust your children to nannies, teachers, and schools that you think can give them something you cannot—whether that something is just being home during the day, learning another language, sharing the wisdom of age, or simply performing the crucial work of listening, caring, and nurturing in a way and a time that would be difficult or impossible for you.

Trust ultimately depends on a form of respect and autonomy. The big difference between Mary Poppins and a robot slave is that Poppins will make the tough calls, disagreeing with either the kids or the parents. Poppins could quit, but could a nanny bot? Could nanny bots surveil the whole family, alerting public services if they observe child abuse or other crimes? Older’s story shows us the uncomfortable disconnect between treating Shristi’s robot, Naneen, like a cherished caregiver and treating it like a deprecated piece of hardware.

Is it self-deception to treat Naneen like a human, or is it morally right to feel and demonstrate empathy? Maybe both are true. The way parents treat the robots in their lives will define how children approach them. As Older’s story envisions, parents and children would grow attached to these systems, perhaps even love them. Treating a system that cares for you like a toaster ends up cheapening the experience, turning it into a kind of fake, emulated love.

Part of what makes Older’s story so compelling is that at least some of the robo-nannies seem to have a bit of Poppins in them: a sense of when the rules can be broken, when a gesture is more important than a dictum. For parents to embrace that kind of relationship will require real trust, of the kind people have invested in their children’s caregivers for millennia. Building that kind of partnership with a machine would require rearranging a lot of other mental furniture around our relationship with machines. But it might do us good to imagine a future in which we really care about the technologies we build to care about, and for, us.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.



