Hacked pacemakers. Connected homes that show when your kid is at home and what they are watching. Connected cars that can lie about your location or speed, or be controlled by hackers. Throw in consumers and companies that don't know what they are doing and you have what might be the perfect ingredients to build a surveillance state or kill the concept of personal privacy.
That, so far, is the takeaway from the morning hours of the Federal Trade Commission's internet of things workshop being held on Tuesday. For those following the space, the workshop trotted out the now-cliched examples of a connected coffeepot, connected refrigerators and connected vineyards, but offered very little in the way of new approaches to dealing with or regulating the potential security gaps and loss of privacy that the internet of things can cause.
And even Vint Cerf, the man credited as one of the fathers of the internet and now a VP at Google, couldn't tell the FTC how it should approach regulation, saying, "If someone asked me if we should write a regulation for this, I would not know what to say," before adding that while regulation might be helpful, an awful lot of the problems with regard to privacy are a response to our own behavior. He noted that oftentimes we don't think about the potential hazards of our actions.
“Before we run off to write regulations, I think we better need to understand what the risk factors are,” he added.
Few speakers were willing to propose any kind of regulatory framework — with the exception of Craig Heffner, a vulnerability researcher at Tactical Network Solutions who tries to break embedded systems. In a direct presentation, he called for some kind of monetary incentive for companies to build highly secure products.
Heffner said that for years people have talked about educating the consumer as if that were a solution to security issues around technology, but as he rightly pointed out, consumers don't understand the technology and they shouldn't have to. The companies offering the products should bear that burden, which means security must be designed not only into the product but also into all aspects of the ecosystem serving the product.
He gave an example of a connected device company that wanted to offer a cloud storage service on one of its products, and so set up the code to drop the data into a few trusted domains. Yet the company forgot to buy one of the trusted domains, which Heffner then purchased. So it's feasible that the data gathered by that device could be stored with him.
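To make the flaw concrete, here is a minimal, hypothetical sketch of the kind of firmware logic Heffner described — all names and domains below are invented for illustration, not taken from the actual product. The device treats a hardcoded list of upload domains as equally trustworthy, so if the vendor never registers one of them, whoever buys that domain receives the data.

```python
# Hypothetical sketch of the vulnerability pattern: a device that uploads
# data to any entry in a hardcoded "trusted" list. Domains are invented.

TRUSTED_UPLOAD_DOMAINS = [
    "storage.example-vendor.com",  # registered and controlled by the vendor
    "backup.example-vendor.net",   # never registered -- anyone could buy it
]

def pick_upload_host(attempt: int) -> str:
    """Return the domain the device will send its data to.

    On failover the device simply moves to the next list entry, trusting
    it implicitly -- so control of any one domain means control of the
    data sent to it.
    """
    return TRUSTED_UPLOAD_DOMAINS[attempt % len(TRUSTED_UPLOAD_DOMAINS)]

if __name__ == "__main__":
    # A failover to the second "trusted" host silently hands the data
    # to whoever registered backup.example-vendor.net.
    print(pick_upload_host(0))
    print(pick_upload_host(1))
```

The lesson Heffner drew is that the trust boundary extends beyond the device itself: every domain, server and service in the product's ecosystem has to be secured, because the consumer cannot be expected to audit any of it.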
So far, the FTC seems willing to hold companies responsible for lax security practices that result in a loss of consumer privacy, as the agency’s ruling in the TRENDnet case shows. TRENDnet makes a connected camera that was “hacked” and shared video footage of people’s homes online. The FTC noted that the company should have done more to protect such sensitive data as home video footage, which implies that it is thinking about classifying more types of data as sensitive when it comes to connected devices in people’s homes.
But fines are punitive, discretionary and retroactive. Given the types of data that are out there and the potential for harm, the elephant in the room is how to build out an ecosystem that protects consumer data and privacy in a connected world. And that's where the known unknowns multiply.
For example, Carolyn Nguyen, director at Microsoft's Technology Policy Group, gave a presentation on how to contemplate a framework for customer data and privacy that delved into the challenges of securing customer data in relation to context. Context might be based on the data itself, how it was gathered or what it is used for, among other options. She basically summed things up as, "It would be great if we could understand what the questions are before we jump to formulate the answer."
Brigitte Acoca, consumer policy analyst and lawyer at the OECD, who was attending the workshop, expressed concerns about the same thing, noting that consumers are unaware not only of the data they are giving up, but also what can be done with that data. When I asked via email what this means for policy makers, she wrote that she was unsure, but thinks talking about the issue and making some quick decisions are essential. She wrote:
It was quite interesting to hear the different panelists saying that not only consumers do not understand the technology that is available to them but likewise the same goes to those who have or are developing it. As a result, it appears that neither consumers nor companies are (if at all) aware of the privacy and security risks they are being exposed to. And it’s not just the technology that consumer do not understand, they are also not well-informed about who, in this complex landscape, can collect their data or use it, for what purposes, and with whom it may be shared.
So while I’m still waiting for interesting tidbits, such as this nugget on the number of users and the amount of their data from the CTO of SmartThings, I think we’re stuck covering the same ground when it comes to this debate. I’d really like to see us move the conversation forward. Hopefully after a day spent in discussion, and months of comments, the FTC will be ready to do just that.