A Deep Dive Into the Privacy and Security Risks for Health, Wellness and Medical Apps

The Food and Drug Administration (FDA), on February 9, issued "Mobile Medical Applications: Guidance for Industry and Food and Drug Administration Staff," in which it solidified its decision not to actively regulate health and wellness apps, along with a host of medical-related apps. Companies breathed a collective sigh of relief, and various commentators have even pronounced the area "deregulated." Nothing could be further from the truth, and the notion has lulled many into a false sense that no regulator is watching.

Enter the Federal Trade Commission (FTC).

The FTC has formidable technical expertise, a stable of enforcement attorneys and an avowed focus on the health and wellness segment. Device and pharmaceutical companies, however, are accustomed to a regulatory focus on product safety (the FDA's concern) rather than on unfair and deceptive practices in online privacy and security (the FTC's primary focus for health, wellness and medical-related apps).

Indeed, the blind spot for online privacy issues can be seen at the Department of Health and Human Services (HHS) itself, which has had to quickly patch privacy and security issues in sensitive apps such as its own "HIV Services" app on iOS and Android. The HHS blind spot, however, is precisely the FTC's focal point. Had the HIV Services app and its corresponding web portal been private offerings, they would likely have been prime candidates for a 20-year consent decree.

The risk with health and wellness apps is that companies often do not realize how much data their apps collect and share with third parties such as advertising entities, analytics companies, social networks and hosted solutions. The FTC has already shown its willingness to bring down the enforcement hammer, as with the Android flashlight app that shared a persistent device ID and geolocation with a third-party advertiser.

Switch over to a health-related app, and imagine that the app transmitted data allowing third parties to fairly infer that the end-user was pregnant, taking chemotherapy medication, being treated for AIDS, recovering from alcoholism or the like. Does anyone really believe the FTC will turn a blind eye to THAT? Presumably the plaintiffs’ class-action bar is watching as well.

What can be done to reduce these risks?

The best way to reduce these risks is through education, awareness and risk-mitigation strategies that address information sharing and collection at the technical level.

The vast majority of health-related apps are neither regulated by the FDA nor covered by HIPAA, and these apps can present the perfect storm of privacy exposure: they collect highly sensitive data while relying on third parties that are not sensitized to the appropriate handling of such data.

Examples of third-party services that routinely get baked into health-related apps, but that may lack appropriate privacy or security safeguards, include push-notification services, advertising SDKs (software development kits), analytics packages, social network-sharing SDKs and third-party email services. Many of the entities behind these services maintain their own proprietary back-end databases that can be used to reidentify end-users who believe themselves to be anonymous within the app. An "anonymous" device ID transmitted to a party whose back-end database contains that same ID alongside the end-user's name, address, email or phone number is not anonymous at all. I wrote about one such example last month on this very blog.
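To make the reidentification point concrete, here is a minimal Python sketch. The device ID, payload and back-end table below are invented for illustration; they are not drawn from any real service.

```python
# Hypothetical sketch: why an "anonymous" device ID stops being anonymous once
# it reaches a party that already holds that same ID alongside real PII.

# What the app transmits: no name, no email -- "anonymous" on its face.
app_payload = {
    "device_id": "a1b2c3d4-ffff-4e5e-9abc-001122334455",
    "event": "medication_reminder_viewed",
}

# What the third party's back-end already knows from its other apps and sources.
backend_db = {
    "a1b2c3d4-ffff-4e5e-9abc-001122334455": {
        "name": "John Smith",
        "email": "jsmith@example.com",
        "phone": "555-0142",
    },
}

# A single dictionary lookup joins the "anonymous" event to a real identity.
identity = backend_db.get(app_payload["device_id"])
if identity:
    print(f"Re-identified: {identity['name']} -> {app_payload['event']}")
```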

In other instances, third-party services may be configured to collect unsalted hashes of data, such as a phone number or email address, that can easily be reversed through look-up tables. Still other services, especially those of the hosted-solution variety, may have access to highly detailed, personally identifiable information about end-users. This data is often collected and stored unencrypted and without any meaningful overlay of legal restrictions on its disclosure or use. Not only can such data potentially be shared for profit, in ways sure to draw FTC fire aimed at the app publisher, but it also represents a significant exposure in the event of a data breach.
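The look-up-table risk is easy to demonstrate. In the Python sketch below, the phone number is invented and the search space is deliberately tiny, but the principle scales: low-entropy values such as phone numbers can be recovered from unsalted hashes by simple enumeration.

```python
# Hypothetical sketch: recovering a phone number from its unsalted SHA-256 hash.
# Phone numbers span a small, enumerable space, so an attacker can precompute a
# look-up table once and reuse it; hashing without a salt adds little protection.
import hashlib

def sha256_hex(s: str) -> str:
    return hashlib.sha256(s.encode()).hexdigest()

# The value a third-party SDK might collect instead of the raw number.
observed_hash = sha256_hex("5550142")  # pretend we only ever saw the hash

# Toy look-up table over a 7-digit number space (a real table would cover
# all plausible numbers, which is still entirely feasible to enumerate).
lookup = {sha256_hex(str(n)): str(n) for n in range(5_500_000, 5_600_000)}

print("Recovered:", lookup.get(observed_hash))  # -> 5550142
```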

Finally, the app developer/publisher's own mode of data collection can raise privacy issues. An app with no third-party network traffic that nevertheless transmits in plaintext over HTTP that "John Smith just reminded himself to take his hepatitis medication" makes it possible for anyone within range of the end-user's WiFi signal to sniff that data. And when the end-user accesses the app over an employer or school network, those organizations have ready access to the data as well.
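For illustration, here is roughly what such a request looks like on the wire. The host name and payload are hypothetical; the point is that every byte is readable to an observer, whereas the same request sent over HTTPS would expose only the destination host.

```python
# Hypothetical sketch: the bytes a plaintext HTTP request puts on the wire.
# Anyone on the same WiFi network, or the employer/school operating the
# network, can read this verbatim. Host and payload are invented.

body = "user=John+Smith&note=take+hepatitis+medication"

request_on_the_wire = (
    "POST /reminders HTTP/1.1\r\n"
    "Host: api.example-health-app.com\r\n"  # hypothetical endpoint
    "Content-Type: application/x-www-form-urlencoded\r\n"
    f"Content-Length: {len(body)}\r\n"
    "\r\n"
    + body
)
print(request_on_the_wire)

# Mitigation: transport encryption. Sent over TLS (HTTPS), an observer sees
# only the destination host (via DNS/SNI), not the path or the payload.
```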

Any compliance program involving health and wellness apps, to be effective, must address data collection at a technical level.
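In practice, that technical-level review can begin with something as simple as routing a test device through an intercepting proxy and logging every third-party host the app contacts. Below is a minimal sketch of a mitmproxy addon along those lines; the first-party host set and keyword list are assumptions you would tailor to the app under review.

```python
# Hypothetical sketch: a mitmproxy addon that flags third-party traffic.
# Run with:  mitmdump -s audit_addon.py
# then route the test device's traffic through the proxy.

FIRST_PARTY_HOSTS = {"api.example-health-app.com"}  # assumed first-party host
SENSITIVE_KEYWORDS = ("device_id", "idfa", "lat", "lon", "email", "phone")

def request(flow):
    """Called by mitmproxy for every outbound HTTP(S) request."""
    host = flow.request.pretty_host
    if host in FIRST_PARTY_HOSTS:
        return
    haystack = (flow.request.pretty_url + " "
                + (flow.request.get_text(strict=False) or "")).lower()
    hits = [k for k in SENSITIVE_KEYWORDS if k in haystack]
    flag = f"  <-- possible sensitive fields: {hits}" if hits else ""
    print(f"third party: {host}{flag}")
```

A run against a real app typically surfaces the advertising, analytics and push-notification endpoints discussed above, which is exactly the inventory a compliance review needs.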

If this is a concern for you or your organization, AdvaMed will host a workshop on April 22 addressing these very issues. I'll be joined by my colleagues Shannon Salimone and Michael Gaba, as well as FTC Commissioner Julie Brill; Merck Chief Privacy Officer Hilary Wandall, CIPP/E, CIPP/US, CIPM; Princeton computer scientist Joe Calandrino; and Princeton electrical engineer Jeff Dwoskin.
