Wednesday, November 20, 2019

Everyone Is Asking the Wrong Question About Google’s New Health Care Project


On Nov. 11, the world learned about Project Nightingale—Google’s secretive deal with Ascension, the largest nonprofit health system in the United States. Through this partnership, which began in 2018, Google gains access to the medical records of more than 50 million people in 21 states.

Since the Wall Street Journal first reported on it, journalists and regulators alike have been digging into the partnership. But they are asking the wrong questions about Project Nightingale. They have focused on whether Google is complying with health privacy laws, whether patients gave consent, and whether Google employees have access to medical records.

But corporations have long had access to millions of medical records, and patients are rarely informed. Even Google already had access to millions of patient records through relationships with more than a dozen health care partners, such as the Mayo Clinic, the University of Chicago, and the Cleveland Clinic. The difference between these arrangements and Google’s deal with Ascension is merely one of scale.

Regulators should instead be focusing on something else: exactly what Google plans to do with all this data. According to the company, it will use the information to enhance productivity and “support improvements in clinical quality and patient safety.” Google says it will not use Ascension’s data for other purposes. However, there is a distinction between the data itself and the knowledge Google gains from analyzing that data. This distinction gives the company wiggle room to export what it learns to other contexts. Google likely aims to mine Ascension’s data and discover new markers of health it can apply outside the health care system—across its full suite of products—to infer consumers’ medical conditions.

Patent documents filed in 2018 suggest that Google aspires to predict or identify health conditions in people who haven’t even visited a doctor, via something called emergent medical data, or EMD: health information inferred by artificial intelligence from mundane consumer behavior. Whenever we interact with technology, we leave behind digital traces of our behavior that serve as raw material for companies that mine EMD. Ascension’s trove of health data can help Google achieve that goal.

A recent landmark study involving Facebook demonstrates the power of EMD mining. The study analyzed the health records and social media posts of 999 Facebook users. The results were surprising. Posts containing religious language, such as the words God, Lord, and Jesus, were strong predictors of diabetes. Finding that connection would have been impossible without A.I. and access to health records.
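
To make the mechanics concrete, here is a minimal sketch, in Python with scikit-learn, of the general kind of model EMD mining relies on: posts paired with diagnosis labels drawn from linked medical records. Everything below, including the toy posts and labels, is invented purely for illustration; the actual study used far more data and more sophisticated methods.

    # Minimal sketch of EMD-style mining: predict a medical-record label
    # from the words in social media posts. All data here is synthetic,
    # invented purely for illustration.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Hypothetical training set: each post is paired with a label taken
    # from a linked medical record (1 = diagnosed, 0 = not diagnosed).
    posts = [
        "thank God for another blessed day",
        "praying the Lord watches over my family",
        "great run this morning, new personal best",
        "trying a new salad recipe for dinner tonight",
    ]
    diagnosed = [1, 1, 0, 0]

    # Bag-of-words features: one column per word in the vocabulary.
    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(posts)

    # Logistic regression learns which words correlate with the label.
    model = LogisticRegression().fit(X, diagnosed)

    # The highest-weighted words are the "markers" an EMD miner could
    # extract as knowledge and reuse outside the health care system.
    ranked = sorted(zip(model.coef_[0], vectorizer.get_feature_names_out()),
                    reverse=True)
    print(ranked[:3])

    # Someone who has never seen a doctor can now be scored from a post.
    new_post = vectorizer.transform(["bless the Lord for this day"])
    print(model.predict_proba(new_post)[:, 1])

Note where the distinction between data and knowledge lives in this sketch: the medical records stay put, but the learned word weights can travel anywhere.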

Through its partnership with Ascension, Google now has access to one of the largest medical databases in the world. It can train A.I. to comb through the data and identify words, phrases, and other variables that reflect the presence or early onset of disease. That may not sound bad if you assume Google will use what it learns to improve the health care system. However, Google will likely maintain its discoveries as trade secrets and export what it learns to other divisions of its parent company, Alphabet, which include Nest and Sidewalk Labs. Google’s recently announced acquisition of Fitbit for $2.1 billion expands its hardware portfolio and EMD mining potential.

Companies love EMD because it allows them to transform mundane nonmedical data into sensitive health information. Facebook mines it from users to infer whether they are suicidal. Google is patenting a smart home that mines EMD from occupants’ behavior to infer whether they are developing Alzheimer’s disease or substance use disorders. But the real money isn’t in health screening—it’s in using EMD for consumer profiling and marketing. Insurance companies collect it to assess risk and calculate insurance premiums. Advertisers use it to deliver behavioral ads tailored to people’s health conditions.

Having access to massive amounts of health data gives Google a clear advantage over competitors in the market for EMD. Consider Facebook, which makes health predictions based on user-generated content but lacks access to medical records. Before the Cambridge Analytica scandal, Facebook was in talks to secure records from Stanford Medical School. After the story broke, Facebook decided not to pursue the deal. Consequently, instead of training its suicide prediction algorithms using medical records, Facebook must use proxies for suicide to train its software. As a result, the accuracy of its health predictions suffers.
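
Why does training on proxies hurt? A proxy behaves like a noisy version of the true outcome, and label noise degrades a classifier. Here is a minimal sketch, again in Python with scikit-learn and entirely synthetic data; the 30 percent disagreement rate is an arbitrary assumption for illustration, not a figure about Facebook.

    # Sketch of why proxy labels hurt: train one model on true outcome
    # labels and one on noisy proxies, then compare held-out accuracy.
    # The data and noise rate are synthetic assumptions for illustration.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for "behavioral signals -> health outcome" data.
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # A proxy label agrees with the true outcome most of the time; here
    # an assumed 30 percent of training labels are flipped at random.
    rng = np.random.default_rng(0)
    flipped = rng.random(len(y_train)) < 0.30
    y_proxy = np.where(flipped, 1 - y_train, y_train)

    for name, labels in [("true labels ", y_train), ("proxy labels", y_proxy)]:
        model = LogisticRegression(max_iter=1000).fit(X_train, labels)
        # Both models are scored against the true outcomes; the
        # proxy-trained model generally comes out less accurate.
        print(name, round(model.score(X_test, y_test), 3))

Medical records remove exactly this label noise, which is why access to 50 million of them is such a competitive moat.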

In contrast, Google has access to medical records—50 million through the Ascension deal alone, which makes the study with 999 Facebook users seem minuscule by comparison. Combined with Google’s expertise in A.I., courtesy of Alphabet’s DeepMind subsidiary, the Ascension deal gives Google unrivaled power to find correlations between behavior and health conditions.

Google’s other subdivisions provide services including Gmail, YouTube, Google Search, the Android operating system, and Google Docs. Moreover, Google is part of a much larger company, Alphabet, which has its own subdivisions including Nest, Sidewalk Labs, and Project Wing. Each division and service is a data mining operation that collects and analyzes consumer information. Thus, Project Nightingale’s real danger is Google’s ability to leverage its cache of health data to build an unrivaled consumer health surveillance empire spanning numerous industries and technologies.

There are currently no laws to stop it. In fact, the Protecting Personal Health Data Act, a bill recently proposed by Sens. Amy Klobuchar and Lisa Murkowski, would actually create a safe harbor for products that mine EMD, including those that collect personal health data “derived solely from other information that is not personal health data.”

The people who were treated in Ascension’s hospitals and clinics may not have been warned that their information would be transferred to Google. That’s bad. But the fact that your health data ends up in the hands of large corporations is nothing new. The more dangerous threat is corporations’ ability to leverage A.I. and large medical databases to implement widespread consumer health surveillance.

Regulators from the Department of Health and Human Services are currently investigating Project Nightingale. They should press Google on how it plans to use the health inferences drawn from Ascension’s data. HHS must acknowledge that EMD is a dangerous new type of health information, and Congress should enact laws to regulate it.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.



from Slate Magazine https://ift.tt/2r6WBqy
