All Your Data Is Health Data

By Charlie Warzel, Aug. 13, 2019

This article is part of a limited-run newsletter. You can sign up here.

Here’s a terrifying sentence: Hackers are “becoming increasingly interested in the susceptibility of health data.”

At least that’s the takeaway from researchers at the University of Southern California’s Center for Body Computing. They were at the Black Hat hacker conference in Las Vegas recently, where programmers set up a fake hospital environment and invited medical tech companies to bring their devices for a live stress test. “There was a lot of talk about the ease of insurance fraud and blackmail with some of this legacy software that is very hard and frustrating to update,” Dr. Mona Sobhani, who is the head of research for the Center for Body Computing, told me.

I initially got on the phone with Sobhani to get a sense of how our medical devices might be compromised. But the discussion quickly veered into different territory. She argued that our focus on medical data from devices like connected pacemakers misses a bigger security risk: that information isn’t nearly as vulnerable as the data coming from far less secure sources. As she put it, “all our data is health data.”

What Sobhani is talking about is a relatively new field called digital phenotyping, a term coined at the Harvard T.H. Chan School of Public Health. It means taking information from our digital behaviors — on websites, via our phones — and using it to gain insight into potential health issues. “People don’t realize that small data points monitored continuously can be very predictive of behaviors and health and that’s why I’m worried,” Sobhani said. She noted research that suggests early Parkinson’s motor issues might be more accurately detected by typing patterns on keyboards. And she mentioned an initial study in which language in social media posts and Facebook likes accurately predicted depressive episodes.
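To make the keystroke-timing idea concrete, here is a toy Python sketch of the kind of signal digital phenotyping works with: summarizing the gaps between keypresses. The timestamps are invented for illustration and none of this reflects the actual methods of the research Sobhani cited.

```python
from statistics import mean, stdev

def keystroke_features(timestamps):
    """Return the mean and standard deviation of the gaps
    between consecutive keypress times (in seconds)."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return mean(gaps), stdev(gaps)

# Simulated keypress timestamps, in seconds (made up for demonstration)
presses = [0.00, 0.12, 0.25, 0.41, 0.52, 0.80, 0.93]
avg_gap, jitter = keystroke_features(presses)
print(f"mean gap: {avg_gap:.3f}s, jitter: {jitter:.3f}s")
```

The point is only that two trivial numbers, average typing speed and its variability, already form a behavioral fingerprint that could be tracked over time.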

Social media companies and marketers use this kind of metadata as a way to predict our behaviors all the time; for example, using shopping behaviors, companies often try to gauge whether someone is pregnant. It only makes sense that as technology encroaches deeper into our lives and data analysis gets better, more brazen researchers and organizations would try to glean insights. In such a world, Sobhani argues, even some of our most trivial data — the way our eyes move in a video clip — could be thought of as health data. “As a researcher, I think about the Amazon Echo and if our group had continuous data from voice. We’d learn so much — and I don’t just mean from the emotional content but from the language analysis and things like pauses in your speech.”

Her argument resonated. Not long after I spoke with Sobhani, I visited my doctor for a routine checkup and was struck by how much of the visit revolved around me telling her about the mundane details of my life: How was work and home life? Was I stressed? How often was I exercising? Was I eating well? And so on. Because it was a well visit, the bulk of the checkup was the doctor collecting a bunch of metadata that would help her interpret the small amount of raw data (my height, weight, blood pressure and temperature) she gathered.

Naturally, Sobhani’s fear goes back to the Big Tech companies — Google, Facebook, Amazon and others — that have amassed untold stores of this information. And because their devices are not classified as medical devices by the Food and Drug Administration, they’re not subject to the Health Insurance Portability and Accountability Act. “With health data, Hipaa grants you rights to have an easy copy of it,” she said. “And if tech companies are mining my data for health insights, why don’t I have access to it?”

It’s a complicated question. Google, for one, does have health initiatives. And just this year the company was sued, along with the University of Chicago, over the sharing of medical records without stripping out sensitive information. And companies that make wearable devices that track heart rate, temperature and activity all process information that yields health insights.

At the end of our call, Sobhani suggested a potential maneuver: use Hipaa as a way to rein in the tech giants’ use of our data. “If we were ever to rule that all the data they collect on us is, ultimately, health data and that we have a right to it, then they would need to hand it over by law.” In Washington, there have been some signs of movement in the health data sphere; a bill introduced this June by Senators Amy Klobuchar and Lisa Murkowski aims to protect data from fitness trackers, which isn’t classified as health data and therefore isn’t protected by Hipaa. Still, Sobhani admits that the Hipaa classification is, at present, an unlikely solution — one that would face enormous pushback from tech companies. There are, after all, significant costs and administrative headaches involved in Hipaa compliance that tech companies would more than bristle at.

But it’s a deeply interesting frame for the question of our personal information. If the data we constantly shed is truly one of the best ways to understand our bodies — if it’s really that insightful — then why shouldn’t we treat it the same way we treat an EKG?

From the Archives: “Rules on Privacy of Patient Data Stir Hot Debate”

In keeping with our medical theme, this week’s pick is a 1999 article by Robert Pear. It details the disagreement involving the government, doctors and the health care industry over who is responsible when a patient’s medical records are invaded.

What I appreciate about this piece is how it is yet another reminder of how privacy protections — when not thought through — can often backfire.

But doctors said the rules were inadequate and could actually erode some protections that patients now have. Several physician groups said they had been invited to today’s White House ceremony, but stayed away because in many cases the proposal does not require that patients give consent before their records are shared, only that they be notified.

Another interesting bit is the discussion of putting a dollar value on our privacy — similar to the conversations going on about data dividends.

The Administration estimated that compliance would cost the health care industry $3.8 billion over five years, but the insurance industry says the costs could be 10 times that amount. With both sides saying they support the goals of the standards, a central point in the debate will be: What is privacy worth in dollars and cents?

Tip of the Week: Stay Safe on Public Wi-Fi

A survey popped in my inbox this week with the headline, “77 percent of Americans Blindly Access Public Wi-Fi.” It was commissioned by a cybersecurity company called BullGuard; according to its polling, out of 2,000 people, a vast majority don’t bat an eye before logging onto public, no-password-required Wi-Fi networks. And even though roughly 55 percent said they knew their data was less secure on open Wi-Fi, a non-trivial percentage of respondents used credit cards (26 percent) or logged onto their online bank account (32 percent).

I get it. Sometimes we’re weak and we need an internet connection even if it’s not the most responsible thing to do (N.B.: it is definitely not the most responsible thing to do and you should really avoid it if you can!). Rather than scold, here are a few things you can do to stay safe.

First, if you can use a virtual private network, do that. Our V.P.N. explainer is here. Second, try to use only encrypted “https” sites. The Google Chrome browser will alert you if you’re not on an https site.
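For readers who like to automate the “https only” habit, a link’s scheme can be screened before it's opened. This is a minimal Python sketch using only the standard library; the URLs are just examples, and checking the scheme says nothing about whether the site’s certificate is valid.

```python
from urllib.parse import urlparse

def is_https(url):
    """True only when the link itself uses the encrypted https scheme."""
    return urlparse(url).scheme == "https"

print(is_https("https://www.nytimes.com/"))  # encrypted
print(is_https("http://example.com/"))       # unencrypted: avoid on open Wi-Fi
```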

Wired’s guide to public Wi-Fi suggests turning off AirDrop or network sharing so that nobody can gain access to your files:

On a PC, that means going to Network and Sharing Center, then Change advanced sharing settings, then Turn off file and printer sharing. For Macs, go to System Preferences, then Sharing, and unselect everything. Then head to Finder, click on AirDrop, and select Allow me to be discovered by: No One. For iOS, just find AirDrop in the Control Center and turn it off.

Norton also has a guide to using public Wi-Fi with the helpful suggestion that you should monitor your Bluetooth connectivity:

Bluetooth in the home is an amazing feature on many smart devices. However, leaving Bluetooth on while in public places can pose a huge risk to your cybersecurity. Bluetooth connectivity allows various devices to communicate with each other, and a hacker can look for open Bluetooth signals to gain access to your devices.

If you must use public Wi-Fi, don’t go to sites or do things that might expose especially sensitive information. That means banking or e-commerce. If it seems risky, it definitely is. Err on the side of caution.


What I’m Reading:

My favorite piece of the week is this Drew Harwell article on facial recognition at summer camps.

If you’re looking for an excellent example of why location data is so sensitive, look no further than this article on a group sex app that leaked user data.

A fun piece on trying to hide in a world where everyone’s surveilling you.


Source: https://www.nytimes.com/2019/08/13/opinion/health-data.html
