Our phones and wearable devices come with no shortage of apps to monitor health. We can track sleep, heart rate, fertility and so much more — and we are. In a 2018 survey by Rock Health, a seed fund supporting digital health startups, roughly a quarter of participants each reported using health-tracking apps and wearables.
That’s great news for scientists: In the aggregate, all that user-generated data can offer unprecedented health insights. “Big data holds the cures for many diseases,” says Moira Schieke, a physician, expert on patient privacy and data security, and the founder and CEO of Cubismi, a digital healthcare startup platform.
Schieke says that patients are eager to participate in app-based health studies; it’s a low-effort way to do some good. Still, it’s important for users to be mindful of what information they’re sharing, who they’re sharing it with and where it’s stored.
“It’s an interesting dichotomy, because that data you’re sharing through any consumer product creates some degree of risk for you personally,” says Daniel Farris, a lawyer focusing on technology, privacy and data security. “Yet from a research standpoint, having a larger dataset is extraordinarily valuable in identifying trends and patterns.”
While the only fail-safe security measure is to avoid apps altogether, there are ways to participate in mobile health studies without sacrificing your privacy. We spoke to some health-data security experts to get a handle on the privacy concerns you should understand, and the precautions you should take, before you donate your health data to science.
Know when health information is legally protected
HIPAA, the Health Insurance Portability and Accountability Act of 1996, safeguards the privacy of protected health information. Don’t assume any personal information that a health app might collect and share falls into this category, Farris warns.
Protected health information includes any information generated by your healthcare provider or health insurance company that relates to health status, provision of healthcare or payment for healthcare and can be linked back to you. The people and organizations legally obligated to protect health information are called “covered entities.” Doctors, insurers and pharmacists are covered entities. For the most part, app developers are not.
“A pulse reading or a blood pressure reading [that can be identified as mine] is protected health data when I’m at my doctor’s office,” Farris says. “When I’m wearing my Fitbit, it’s consumer data.”
It makes sense to assume that any data shared with or generated by consumer-grade apps or devices isn’t legally protected. So if you monitor your heart rate with a smartwatch for personal use, that information can legally be shared with third parties unless the terms and conditions explicitly say it won’t be.
“It’s a fundamental misconception,” Farris says, “that the health information going into wearables is subject from a regulatory standpoint to the same limitations on use, the same limitations on sharing, the same marketing restrictions or the same obligations that it be protected from a privacy standpoint.”
Know what you’re agreeing to
When you click the “I accept” button to take part in app-based research, what are you saying yes to? You might be handing over your data for a specific study, making it available for use by a research institution, donating it to the public domain or signing up to do more than just share data.
Apple, for instance, facilitates studies on user data in a few different ways: For the Apple Heart Study, heart-rate info collected from Apple Watches is being sent to Stanford Medicine to identify irregular heart rhythms for the development of heart disease treatments. Apple has also created two platforms that make it easier for users to share their health data with third parties: There’s HealthKit, which is software that lets users share health data captured by their phones with external apps, and ResearchKit, an open-source app framework that helps scientists at universities and hospitals, such as NYU Langone Medical Center, access iPhone and Apple Watch users for their studies. (Users choose which studies to enroll in and what information they share.)
In collaboration with Columbia University, the therapy app Talkspace enrolled users in a study comparing text-message-based psychotherapy to the traditional in-person model. Participation required users to report on their Talkspace experience over a period of 12 to 16 weeks.
Be realistic about security
Wendy Nilsen, a psychologist serving as deputy division director of the National Science Foundation’s Division of Engineering Education and Centers, believes patients need to face reality when it comes to their devices: “People are overwhelmingly trusting of their computers and phones in ways that they shouldn’t be.” Our trust in phones may be of particular concern: One 2012 study reported that, in a nationwide survey of 1,200 American households, 78 percent of respondents thought their phones were more secure repositories of information than other devices, including their home computers.
Apps may share information with third parties, increasing the chances for a potential breach. Users of mobile health apps, Nilsen says, should always be aware of the risk that their data could be exposed. Hackers could also use stolen health information to file false insurance and prescription claims.
Read the privacy statements
To increase privacy literacy, Nilsen encourages users to actually read apps’ terms of service — they’re not written to be readable, so brace yourself. Keep an eye out for privacy red flags, such as if the app needs access to your contacts and photos and which third parties, if any, your data will be shared with. “Fitbit would say they will not share data with third parties without your consent,” Nilsen says. “If they wanted to share it with a hospital, they have to tell you.”
Of course, what those third parties do with your info is up to their specific terms of service. Cambridge Analytica, the political consulting firm that worked for Trump’s 2016 presidential campaign, infamously obtained 87 million Facebook users’ personal data from a third-party researcher who told Facebook he’d be using it for academic research. In reality, Cambridge Analytica sought to use the data to create hyper-targeted ads aimed at voters.
When combing through paragraphs of legalese, you’ll want to look out for the words “de-identified” and “aggregated.” In the Apple Heart Study, the data is shared without any identifying information attached to it and analyzed in bulk. “Stanford is not able to track that back to you at all,” Farris says.
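The difference between those two terms is easy to see in miniature. The sketch below is purely illustrative (it is not how the Apple Heart Study actually processes data, and the field names and records are made up): de-identification strips the fields that could link a record back to a person, and aggregation reduces the remaining rows to a group-level statistic, so a researcher never sees any individual’s reading.

```python
# Illustrative sketch of de-identification and aggregation.
# All field names and records here are hypothetical.
from statistics import mean

# Raw records as a wearable might export them (made-up data).
records = [
    {"name": "A. User", "email": "a@example.com", "age": 34, "resting_hr": 61},
    {"name": "B. User", "email": "b@example.com", "age": 37, "resting_hr": 58},
    {"name": "C. User", "email": "c@example.com", "age": 35, "resting_hr": 64},
]

# Direct identifiers to remove before the data leaves the device.
IDENTIFIERS = {"name", "email"}

def de_identify(record):
    """Drop fields that could link the record back to a person."""
    return {k: v for k, v in record.items() if k not in IDENTIFIERS}

de_identified = [de_identify(r) for r in records]

# Aggregation: researchers receive only group-level statistics,
# not individual rows.
avg_resting_hr = mean(r["resting_hr"] for r in de_identified)
print(f"Average resting heart rate: {avg_resting_hr:.1f} bpm")
```

In this toy version, the researcher ends up with a single average rather than three named heart-rate readings — which is why a privacy policy promising both de-identification and aggregation offers meaningfully more protection than one promising neither.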
If there are facets of health apps’ privacy policies that make you uncomfortable — say, if an app needs access to your photos for any reason — then experts advise against using it altogether and finding a different app that fits your security needs.
Be mindful of the information you’re sharing
When Leslie Heyer was developing the period and fertility tracking app Dot, she wanted to make sure that users didn’t need to submit unnecessary information. Users only need to enter their period start date for the app to work. All data is then stored on the user’s phone. “Unless [an app] needs certain information, it shouldn’t bother collecting it,” Heyer says.
Georgetown University’s Institute for Reproductive Health recently utilized Dot in a study to determine its efficacy. Upon downloading Dot, users were able to opt into the study. The opt-in agreement explained that their information would be sent to Georgetown and kept secure according to federal privacy law. “In that case, [the study data is] handled separately from how data is handled in the app,” Heyer says. Dot never had access to study participants’ data, since Georgetown collected more information than Dot typically requires.
Do your homework
Experts agree that you should only participate in reputable mobile-health studies. Learn about the companies behind apps and devices, read user reviews and make sure you understand the process by which your data becomes part of research. Here are questions Heyer suggests asking yourself when vetting a mobile health study:
- What’s the app or device’s business model? Is this a product that makes money by collecting and selling user data?
- Is it a study backed by a trusted research and health institution with a good track record?
- Has the app teamed up with researchers before? If so, were the results published, and how was the study received?
Even if you’re confident that an app won’t sell your information to outside parties, and the research institution is of the highest caliber, it’s still important to remember that no entity is immune from security breaches. Every digital platform, no matter how prestigious, is susceptible to hacks.
You may inadvertently expose your mobile health data by not password-protecting both your phone and downloaded apps. A friend could accidentally tap into your health app information and find out details you’d rather not share, from caloric intake to mental health updates. Schieke recommends taking full advantage of password functionality on all of your devices.
This advice might seem obvious, but plenty of people don’t take advantage of their phones’ most basic security mechanisms. A 2017 Pew report found that 28 percent of American smartphone owners don’t lock their screens or use other security features to access their phones.
Evaluate your expectations
Some people have no problem talking about health matters on social media, while others would prefer to keep that info private. Evaluate what you’re comfortable with sharing, and for which causes. Would you risk your weight, heart rate or cholesterol becoming public information for the sake of advancing disease management? Or do you only want to participate in research if total anonymity is guaranteed?
“What people really need to do is think about what their expectation of privacy is,” Farris says, “and what information they really feel is important to remain private, and make the best decisions around those.”