Why privacy should be among the first considerations of a health care app developer

Given all the complexities app developers already need to worry about–user experience, piquing doctors’ and patients’ interest, performance, accommodating multiple devices–do they have time to worry about patient privacy too? The Health Privacy Summit, held June 5 and 6 in Washington, DC, explained why they should; in fact, respect for privacy may do more to promote an app than any other feature.

The headlines over the past week should be enough to persuade you that you don’t want to be seen as one of the creeps. It takes more time and digging around, though, to learn what patients really want and how to write an app that fulfills their expectations.

Certainly, Fair Information Practices and proper security are a place to start, and below I’ll list a few things developers need to keep in mind. But overriding all these technical details are questions of business model. Can you make money without treating patients as so many assets to sell?

Basics to remember

I hope you already hold the trust of patients and doctors in enough reverence to take standard precautions to protect data. In principle, when it comes to privacy, health care is not so different from other domains, but extra attention is required. Here is a list–by no means comprehensive–of things you need to keep in mind with your application:

  • Allow patients to easily download all their data. It is also good form to give them visualizations and other access to aggregated data that could turn up useful trends. For instance, if you analyze which activities are associated with higher or lower glucose levels, you should let patients know what you know. (A minimal export sketch appears after this list.)
  • Let patients know whom you’re giving their data to. For instance, if an app is offered by a clinic, a patient might expect the data to be shared with his or her doctor, but would probably be surprised to find that it is also going to the marketing team or even to an outside partner. And patients have indicated that they don’t like it, even if their data is (supposedly) anonymized. So respect the patient’s “reasonable expectation of privacy” and tell them exactly where the data goes. (This is a major issue in health care; view the Data Map to see how far health data travels.)
  • If you do give patient data to other organizations (even in anonymized form), allow patients to selectively opt out of revealing particular items. They may be fine reporting that they have back pain, but want to hide an STD. This is called data segmentation, and it adds a good deal of complexity to data exchange–but it will make your patients much more comfortable with your app. Data segmentation requires an interface to let patients select what to share, extra attributes in each data field to tag that field with the patient’s choice, and a protocol for requesting and sending particular fields (see the segmentation sketch after this list). If you don’t do this, it is much more likely that patients will refuse to share any data–or just delete your app.
  • Encrypt data everywhere: on the device where it is collected or downloaded, on your servers, and in transit. This may have a performance impact, but it is critical to protecting privacy (see the encryption sketch after this list).
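
To make the first item concrete, here is a minimal sketch of a “download all my data” endpoint. It assumes Flask; the URL scheme, the in-memory RECORDS store, and the record fields are my own illustrations, not anything prescribed at the Summit.

```python
# Minimal sketch of a patient data-export endpoint (assumes Flask).
# The URL scheme and the in-memory RECORDS store are illustrative.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for the app's real datastore.
RECORDS = {
    "patient-42": [
        {"date": "2013-06-01", "type": "glucose", "value": 105},
        {"date": "2013-06-02", "type": "glucose", "value": 98},
    ],
}

@app.route("/patients/<patient_id>/export")
def export_all_data(patient_id):
    # A real endpoint would authenticate the patient first.
    # Hand the patient everything we hold, in a machine-readable form.
    return jsonify({"patient_id": patient_id,
                    "records": RECORDS.get(patient_id, [])})

if __name__ == "__main__":
    app.run()
```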
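
Data segmentation can be modeled as a per-record sharing flag that the patient controls, with a filter applied before anything leaves the device. The categories and field names below are assumptions for illustration, not a standard schema.

```python
# Sketch of data segmentation: each record carries the patient's
# sharing choice, and only opted-in records are ever transmitted.
from dataclasses import dataclass

@dataclass
class Record:
    category: str     # e.g. "back_pain", "std_status" (illustrative)
    value: str
    shareable: bool   # the patient's per-item choice

def records_to_share(records):
    """Filter applied before any exchange with outside parties."""
    return [r for r in records if r.shareable]

history = [
    Record("back_pain", "chronic, mild", shareable=True),
    Record("std_status", "positive", shareable=False),  # patient opted out
]

print([r.category for r in records_to_share(history)])  # ['back_pain']
```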
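
And here is a sketch of encryption at rest, using the Python cryptography package’s Fernet recipe (authenticated symmetric encryption). Transit would additionally rely on TLS, and real key management (a platform keystore, for example) is beyond the scope of this sketch.

```python
# Sketch of encrypting a record at rest with the "cryptography" package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, keep this in a secure keystore
fernet = Fernet(key)

record = b'{"date": "2013-06-01", "type": "glucose", "value": 105}'
ciphertext = fernet.encrypt(record)           # this is what should hit disk
assert fernet.decrypt(ciphertext) == record   # readable only with the key
```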

These activities are so basic that health providers should check for them as part of their risk analysis before recommending an app to their clinicians or patients.

The parts that require patient interaction, such as explaining what you’ll do with the data and giving them the chance to opt in or opt out, place a burden on your user interface. Hopefully, you can find clever ways to minimize the cognitive burden on patients and guide them through the process quickly.

For instance, you could start with no login and a minimal privacy notice, establishing a default of not sharing any data, then go back to patients after a big chunk of data has built up and they have invested time in your app. At that point, they may be amenable to seeing why you want to send their data to other parties and how it will benefit them. Because ultimately it does have to benefit them; otherwise, using their data is unethical. (We’ll return to this question when discussing business models.)
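
A minimal sketch of that deferred-consent flow, assuming a hypothetical ask_patient prompt and a record-count threshold that stands in for “a big chunk of data”:

```python
# Sketch of deferred consent: default to sharing nothing, then prompt
# once the patient is invested. Threshold and names are illustrative.
PROMPT_AFTER_N_RECORDS = 50

class ConsentState:
    def __init__(self):
        self.sharing_enabled = False   # default: share nothing
        self.prompted = False

    def maybe_prompt(self, record_count, ask_patient):
        # Ask only once, and only after real engagement with the app.
        if not self.prompted and record_count >= PROMPT_AFTER_N_RECORDS:
            self.prompted = True
            self.sharing_enabled = ask_patient(
                "May we share your de-identified data? Here is how "
                "that benefits you..."
            )

state = ConsentState()
state.maybe_prompt(record_count=72, ask_patient=lambda msg: False)
assert state.sharing_enabled is False   # nothing moves unless the patient agrees
```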

A note on HIPAA

Of all the acronyms in a health care field heavily laden with capital letters, HIPAA (the law governing data sharing in health care) must be the most feared. I am not a lawyer, and I trust that few lawyers could do much better than I at explaining how HIPAA covers health app developers, because your field has grown up in a kind of parallel universe to law-making. I suspect that your app is not covered by HIPAA unless you market it for a specific medical treatment, and that even if you do, HIPAA probably applies to the medical professionals who employ the app rather than to you.

But I have not researched this question much, first because an authoritative answer would probably take up three articles the size of this one, and second because I don’t give a damn. HIPAA is just a law (and is bolstered by a number of state laws). The point of helping patients is to do the right thing, not to meet the letter of the law.

Business plans and patient relationships

Most apps are connected to databases and collect patient data. This is partly because developers are coming to realize they will never recoup the costs of coding and marketing an app by charging $1.99 per download, so they make it a window into a larger service. But there is a more profound reason for the longitudinal use of data: you can’t learn much about an individual by looking just at her symptoms or medical history. The data becomes powerful when it is combined with large sets of information from other sources and hooked up with clinical research.

For both these reasons, it becomes logical to sell or otherwise share the data with researchers or marketers. The public seems to have become accustomed to this revenue model on social networks and other online services–they grumble and joke about the exploitation of their personal data, but most of them just go along with it. But health data produces a very different emotional reaction, and you need to be very careful how you handle it.

Psychology researcher Kelly Caine presented provocative research at the Health Privacy Summit about patient attitudes toward sharing their data. Once data moves outside the immediate health providers they trust, people’s willingness to share drops precipitously. Interviews showed, however, that they want to support research and are willing to release their data to researchers–but only when asked.

Individuals probably have even less tolerance for you selling their data to marketers, but they may well be persuaded that it’s in their best interest if you can explain that they may hear about treatments or support systems that help them.

I believe, ultimately, that apps have to justify themselves by producing better outcomes for patients and reducing costs for providers. If you can do this, you can find someone to pay for development. Selling or sharing data may be a viable way to increase revenue, but if developers do it secretly and without respect for patients, a backlash will develop.

When patient and consumer advocates talk about “privacy by design,” they refer to the architecture of your products. But privacy must also enter into the earlier decisions you make about your market and the way your business works. Developers in the health care field are already realizing that tiny details may cause their app to be rejected: something that interrupts a provider’s workflow, or that requires the user to push an extra button or two. Trust is part of this equation too, and therefore so is sensitivity to patients’ feelings. Leon Rodriguez, Director of the Office for Civil Rights in the Department of Health and Human Services, made a comment at the Summit that sums up the proper attitude: “If the dignity and the autonomy of the patient are the fulcrum of discussion, you’ll come out on the right side.”

Andy Oram is an editor at the technical publisher and information provider O’Reilly Media, specializing currently in open source, programming, and health IT. Andy has published articles since 1995 on social and political issues related to computing, media, and digital networking. His personal Web site is http://www.praxagora.com/andyo. His email address is andyo@oreilly.com. He is on the planning committee for the Health Privacy Summit.