FTC’s Brill sees consumer consent as key for health, finance apps


Credit: Jeff Roberts/Gigaom

When normal people use a new app, they don’t wade through hidden service terms. Many just click “OK” and hope for the best. This might be fine for a game of Candy Crush, but it can be risky in the case of apps that monitor things like your bank account or heartbeat.

On March 18, you can find out why from FTC Commissioner Julie Brill, a leading authority on privacy in the age of apps, and one of our guests at Structure Data in New York City.

Brill told me in Washington last week that her agency is concerned about gaps in existing privacy law, especially in how data is stored and sold.

“When it comes to hospitals, insurers and doctors, we have a law that’s well known and well used [i.e. HIPAA],” she said. “Outside of that, when it comes to health tech and wearables, there’s a lot of deeply sensitive information that can be analyzed.”

Brill pointed to an FTC study of 12 health and fitness apps released last spring. It showed how such apps can cause personal health data, normally confined to the closed loops of the medical community, to trickle out to analytics and advertising companies. Here’s an FTC slide that illustrates the point:

[FTC slide: screenshot from the agency’s study of health and fitness apps]

A similar information sprawl can occur with financial apps, which many consumers use to track spending or obtain rewards.

The result, Brill said, is that data gathered for one purpose, such as counting steps or tracking spending, can get used for another without the consumer’s knowledge. In a worst-case scenario, the data could become a means for insurance companies or employers to discriminate against those who have experienced health or financial trouble.

One way to prevent this, she said, involves improving the consent and transparency process for apps that deal in sensitive data, such as those that collect health or financial information, or precise geo-locations. In these cases, Brill sees a potential solution in encouraging app makers to obtain affirmative consent if they want to use a consumer’s data out of context.

“So if the consumer downloads an app to monitor some of her vital statistics, and the health information is being used to provide that information to the consumer herself – to monitor her weight or blood pressure – that is part of the context that the consumer understands when she downloads the app, and affirmative consent is not needed,” Brill stated in a follow-up email. “However, if the company is going to share this health information with third parties, like advertising networks, data brokers or data analysts, then that is a collection and use that is outside the context of the relationship that the consumer understood, and affirmative consent should be required.”

The challenge, of course, is for the Federal Trade Commission to find a way to improve privacy protection without subjecting vibrant parts of the economy to pointless or burdensome regulations. Brill said she’s aware of this and, in any event, formal rules or laws (including a Privacy Act like that proposed last week by President Obama) may be a long time coming.

“I believe industry can do lots before any legislation happens,” she said. “Legislation will take a long time, and this industry is taking off — so if industry can do best practices, it will allow appropriate business practices to flourish.”

To hear more about how (and if) the FTC can find a practical way to protect consumers, come join us on March 18-19 at Structure Data, where you’ll meet other leaders of the data economy, including executives from Google, Twitter, BuzzFeed and Amazon.

An earlier version of this story misspelled HIPAA as HIPPA. It has since been corrected.

1 Comment

Stephen Wilson

Good grief, the framing of privacy problems and responses is so biased. On the one hand, this article poses the real privacy problem quite cleanly: it’s the sharing of “health information [with] advertising networks, data brokers or data analysts”. But in the very next paragraph we get the classic paranoid fear of “subjecting vibrant parts of the economy to pointless or burdensome [#privacy] regulations”. It’s so shrill! Why not some reasonable middle ground? Why shouldn’t the much vaunted innovation of the American economy continue with a little more decorum, where patients are involved in permissioning the re-use of health information about them? Wouldn’t the economic marvel get even better if the citizens knew what was going on with their data and their lives? You know, with the transparency that economics actually assumes?! So then the data would be even better and the demand for information services even higher? Privacy is not the enemy of business or of innovation.
