Google’s Project Nightingale

Google’s new health initiative, Project Nightingale, has Google storing medical records for Ascension, the nation’s second-largest hospital system. The Department of Health and Human Services has already started an investigation into whether the arrangement violates HIPAA regulations. But having a tech company help manage data is not unusual. The public concern centers on the possibility that Google could combine these health records with other data it has access to, and perhaps overstep.

One of the projects planned for Nightingale is an attempt to predict health conditions. Could ads for remedies be shown to individuals online based on their likelihood of developing specific conditions? And if so, would that be a violation of privacy?

Is Project Nightingale HIPAA-compliant?

Both Google and Ascension say yes.

Ascension said, “All work related to Ascension’s engagement with Google is HIPAA compliant and underpinned by a robust data security and protection effort and adherence to Ascension’s strict requirements for data handling.”

Google was even firmer: “All of Google’s work with Ascension adheres to industry-wide regulations (including HIPAA) regarding patient data, and come with strict guidance on data privacy, security and usage. We have a Business Associate Agreement (BAA) with Ascension, which governs access to Protected Health Information (PHI) for the purpose of helping providers support patient care. This is standard practice in healthcare, as patient data is frequently managed in electronic systems that nurses and doctors widely use to deliver patient care. To be clear: under this arrangement, Ascension’s data cannot be used for any other purpose than for providing these services we’re offering under the agreement, and patient data cannot and will not be combined with any Google consumer data.”

Google points out that it does a good deal of work for health care companies, and that its goal in Project Nightingale is benign. “We aim to provide tools that Ascension could use to support improvements in clinical quality and patient safety,” the company said. The Mayo Clinic and Kaiser Permanente are among Google’s other high-profile health care clients.

Lawmakers’ questions

Senator Elizabeth Warren and fellow Senators Richard Blumenthal and Bill Cassidy wrote a letter to Google asking point-blank whether the data would be used for advertising and whether Ascension patients could opt out of having Google handle their records. They asked “how such a vast amount of private, personal health data was surreptitiously collected, and how Google plans to use it.”

The use of “surreptitiously,” when in fact both Google and Ascension had previously announced their partnership, suggests that lawmakers are extending their existing concerns about Google to Project Nightingale. The letter states that the senators’ “substantial concerns” reflect “prior privacy violations.”

In fact, the government is watching other Google partnerships in the healthcare space, too. 

House Antitrust Subcommittee Chair David N. Cicilline released a statement after Google announced its acquisition of Fitbit, another potential source of large amounts of health data.

“Google’s dominance is currently under investigation by Congress, the Justice Department, and 50 U.S. states and territories,” Cicilline said. “Google’s proposed acquisition of Fitbit would also give the company deep insights into Americans’ most sensitive information—such as their health and location data—threatening to further entrench its market power online. This proposed transaction is a major test of antitrust enforcers’ will and ability to enforce the law and halt anti-competitive concentrations of economic power. It deserves an immediate and thorough investigation.”

Warren’s letter also mentioned the Fitbit deal.

While the letter mentions patient privacy, the fear that Google could use health-related data to “build artificial intelligence systems” appears to be the central issue.

Emergent medical data

It has long been known that behavioral data can give companies insight into consumers’ likely future needs. Target famously sent new-mom offers to customers who started buying unscented lotion; that purchase, along with a handful of others, turned out to be a sign of pregnancy.
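To make the mechanism concrete, here is a minimal, hypothetical sketch of how a retailer or ad platform might infer a sensitive condition from ordinary purchase behavior. The feature names, simulated data, and model choice are illustrative assumptions only; nothing here reflects Target’s or Google’s actual systems.

```python
# Hypothetical illustration of "emergent medical data": a classifier trained
# on everyday purchase counts learns to flag shoppers likely to be pregnant.
# All features and labels below are simulated for the sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Simulated ground-truth label and purchase-frequency features.
pregnant = rng.integers(0, 2, size=n)        # 0 or 1 (simulated)
lotion   = rng.poisson(1 + 2 * pregnant)     # unscented lotion purchases
vitamins = rng.poisson(0.5 + 1.5 * pregnant) # vitamin supplement purchases
cotton   = rng.poisson(1 + pregnant)         # cotton-ball purchases
X = np.column_stack([lotion, vitamins, cotton])

# Fit a simple model on behavior alone -- no medical records involved.
model = LogisticRegression().fit(X, pregnant)

# Score a new shopper's basket: 3 lotions, 2 vitamin buys, 2 cotton packs.
print(model.predict_proba([[3, 2, 2]])[0, 1])  # estimated probability
```

The point of the sketch is that none of the inputs are medical data; the health inference emerges purely from behavioral signals, which is what puts this kind of prediction outside HIPAA’s traditional scope.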

Expectant mothers might welcome diaper coupons, and emergent medical data (EMD) has also been used to predict flu outbreaks and support other beneficial efforts. Could Google, with access to far larger stores of information than other companies have, make big strides in using AI and EMD to identify advertising opportunities?

If so, would that be a bad thing? Could creepy ads be offset by public health tools that help identify people who might have diabetes or high blood pressure without knowing it?

Google could be the only entity besides the government with enough information to create tools for those good, bad, or indifferent outcomes, and the government probably lacks the required skills. That could give Google a great deal of power: the power to decide which outcomes are good or bad, to make money from the tools that enable them, and to control who gets to use those tools.

HIPAA didn’t foresee Nightingale. 

