How can the government stop the internet of perceived risks?



The headline du jour for the Federal Trade Commission’s report on the Internet of Things is that the agency recommends Congress come up with some sort of technology-neutral law to regulate data privacy. But the more interesting discussion inside the report is how data has become a catalyst for innovation. Right now, the catchphrase among the tech set is that software is eating the world — but that is the old paradigm, as it were.

Inside the 71 pages of the FTC report we see hints of the next shift, the one where your next innovation isn’t built around how fast you can code or iterate, but on your data streams and the quality and defensibility of your algorithms. In that world, the threat of legislation around data privacy, and even some of the more reasonable consumer opt-in or disclosure suggestions around user data found in the FTC report, looks like an existential threat.

The report, based on a hearing the FTC held in November 2013 and subsequent comments, suggested several topics related to data privacy and security tied to the internet of things, including:

  • Asking Congress to make some technology-neutral laws governing user data privacy
  • Suggesting ways device makers could help users manage their privacy, such as a privacy dashboard
  • Suggesting device makers limit the amount of time they keep user data
  • Asking device makers or sites to limit the data they collect and to de-identify what they do collect (a rough sketch of what that could look like follows this list)
  • Providing some type of user opt-in and understanding of how data would be used, outside of long documents that no one reads
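
To make the de-identification and data-retention suggestions above slightly more concrete, here is a minimal sketch in Python, assuming a hypothetical stream of device readings keyed by a device ID. The field names, the salted-hash approach and the 30-day window are my own illustrations, not anything the FTC prescribes.

```python
import hashlib
import os
from datetime import datetime, timedelta, timezone

# Hypothetical illustration of two of the FTC's suggestions: de-identify
# records and limit how long raw data is kept. All field names are invented.

SALT = os.urandom(16)           # rotated per batch so hashes can't be joined across datasets
RETENTION = timedelta(days=30)  # example retention window, not a recommendation

def de_identify(record: dict) -> dict:
    """Replace the device identifier with a salted hash and keep only coarse location."""
    pseudonym = hashlib.sha256(SALT + record["device_id"].encode()).hexdigest()
    return {
        "device": pseudonym,
        "metric": record["metric"],
        "value": record["value"],
        "region": record.get("region"),   # coarse location only, no exact coordinates
        "timestamp": record["timestamp"],
    }

def purge_expired(records: list, now: datetime) -> list:
    """Discard records older than the retention window."""
    cutoff = now - RETENTION
    return [r for r in records if r["timestamp"] >= cutoff]

if __name__ == "__main__":
    raw = [{
        "device_id": "thermostat-1234",
        "metric": "temperature_f",
        "value": 68.5,
        "region": "Austin, TX",
        "timestamp": datetime.now(timezone.utc),
    }]
    cleaned = purge_expired([de_identify(r) for r in raw], datetime.now(timezone.utc))
    print(cleaned)
```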

We’ll discuss these ideas in greater depth and the response to them at our Structure Data event in March when U.S. Federal Trade Commissioner Julie Brill gets onstage to speak about protecting privacy in an age of unbounded data.

But it’s in the report’s detailed discussions that this idea of data as a catalyst for innovation and as a competitive differentiator arises. Many of the requests for dumping user data quickly or limiting the ways data could be used were met with worries about how that would stymie innovation. Even trying to tell consumers how their data could be used, so companies could offer an opt-in, was dismissed as impossible because companies don’t yet know what they want to do with it.

They want it — all of it — because they understand that having access to it could be a catalyst for future products and business.

A few examples

A sample of Activity Recognition working in the Dropcam office.

One example is Dropcam, the connected-camera company owned by Nest, which used the video streaming in from its users’ cameras to create a computer vision algorithm that can recognize activity in a scene.

A more mundane example comes from a conversation I had last week with an entrepreneur, who had an idea for a startup that would offer sizing recommendations across different retailers based on crowdsourced data about what other people bought at similar stores. Basically, if you bought a size 8 blouse from The Gap and size 10 shirts from Ann Taylor, the startup would then tell other people who bought size 8 shirts at The Gap that they would probably wear a size 10 at Ann Taylor.

It’s a nice idea, but it never got off the ground because the founders couldn’t find anyone who wanted to give them the starter dataset to prove out the concept. No data, no business.
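
As a rough illustration of how that crowdsourced sizing idea could work, here is a minimal sketch, assuming a hypothetical purchase history of (shopper, retailer, size) records. The data and the simple majority-vote mapping are my own invention for illustration, not the startup’s actual method.

```python
from collections import Counter, defaultdict

# Hypothetical purchase records: (shopper_id, retailer, size).
# Entirely invented data to illustrate the crowdsourced sizing idea.
purchases = [
    ("alice", "Gap", 8), ("alice", "Ann Taylor", 10),
    ("bob",   "Gap", 8), ("bob",   "Ann Taylor", 10),
    ("carol", "Gap", 6), ("carol", "Ann Taylor", 8),
]

def size_mapping(purchases, from_store, to_store):
    """For each size at from_store, return the size most often bought at
    to_store by the same shoppers."""
    by_shopper = defaultdict(dict)
    for shopper, store, size in purchases:
        by_shopper[shopper][store] = size

    votes = defaultdict(Counter)
    for stores in by_shopper.values():
        if from_store in stores and to_store in stores:
            votes[stores[from_store]][stores[to_store]] += 1

    return {size: counts.most_common(1)[0][0] for size, counts in votes.items()}

# A shopper who wears a size 8 at The Gap most likely wears a 10 at Ann Taylor.
print(size_mapping(purchases, "Gap", "Ann Taylor"))  # {8: 10, 6: 8}
```

The point of the sketch is the dependency it exposes: without a starter dataset of real purchases, the mapping is empty, which is exactly why the idea never got off the ground.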

And because the internet of things is basically a digitization of every analog element in your home and everything you do — literally every step you take — the amount of data being thrown off and correlated to a person and place is astounding. It’s also data that no one has really had access to before. As the FTC report makes very clear, much of this kind of data has historically been protected by privacy laws, but those laws don’t appear to cover data collected by device makers.

Untested waters

This means we’re getting into some gray areas that stand to benefit businesses and, yes, even consumers in the form of energy savings and public health. But it also stands to cause some very real harms when companies start looking at the information they have and start discriminating against people based on the data gleaned from our trackers, our homes or our connected cars.

If a company can look at my Facebook page and decide to fire me based on some photos a friend tagged me in, what’s to stop it from pulling some of my Fitbit data, through a partnership with my company-provided health insurer, and firing me for being a functioning alcoholic? Or maybe I’m a closet vegan working at Tyson.

The FTC report makes a final compelling point on all this, which is that even if none of this happens — although I am sure that it will in certain cases — the perception that it could happen will be enough to turn some people away from adopting some of these technologies. And given the societal and personal benefits that responsible data analysis can bring, that would be unfortunate. Using data to help manage our resources and allocate our time, for example, could free us up to be more human and less stressed than we are now.

The hard part will be convincing the industry to make the internet of things a great experience, perhaps by teaching it to see the public as human beings rather than just consumers.
