Why the internet of things gives us a second chance to define digital trust and privacy

As we connect more things to the web and share more data, there hasn’t been a coherent effort to build trust and privacy into the internet of things. This is why a conversation I had with ARM’s CTO Mike Muller last month was so refreshing. He made a distinction between trust, privacy and security and explained what our inability to address trust and privacy might mean for the internet of things.

It’s a topic we’ll also explore with his boss, ARM CEO Simon Segars, at our Mobilize event in October in San Francisco. We’ll also have a security and regulatory panel, moderated by Adrian Turner of Mocana, discussing these issues. But first, let’s lay the groundwork, starting with trust and privacy.

Data is powerful, so who can you trust with it?

“We should think about trust as who has access to your data and what they can do with it,” Muller said, in explaining the best way to approach security and the internet of things. “For example, I’ll know where you bought something, when you bought it, how often and who did you tweet about it.” A simple transaction can gather a chain of data that can now be linked across a variety of locations and platforms.

“When you put the long tail of lots of bits of information and big data analytics associated with today’s applications we can discern a lot. And people are not thinking it through. … I think it’s the responsibility of the industry that, as people connect, to make them socially aware of what’s happening with their data and the methods that are in place to make connections between disparate sets of data. In the web that didn’t happen, and the sense of lost privacy proliferated and it’s all out there. People are trying to claw that back and implement privacy after the fact.”
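Muller’s point about making connections between disparate sets of data can be illustrated with a toy sketch. The data, field names and join key below are all invented for illustration; real data brokers use far more sophisticated matching, but the principle is the same:

```python
# Hypothetical sketch: two individually mundane data sets, joined on a
# shared identifier, reveal more together than either does alone.
purchases = [
    {"email": "alice@example.com", "item": "fitness tracker", "date": "2013-08-01"},
]
posts = [
    {"email": "alice@example.com", "text": "Loving my new tracker!", "date": "2013-08-01"},
]

# A simple join on the shared identifier links a private purchase record
# to a public post, producing a profile neither data set contained alone.
linked = [
    {"who": p["email"], "bought": p["item"], "said": t["text"]}
    for p in purchases
    for t in posts
    if p["email"] == t["email"]
]
print(linked)
```

The join itself is trivial; the privacy question is who holds both data sets and whether the user ever agreed to the linkage.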

And what troubles Muller is that today, there’s nothing that supports trust and privacy in the infrastructure associated with the internet of things. And of course, less focus on privacy can also impact security, making it easier to see when someone is vulnerable or upping the amount of damage someone can do.

But instead of a nuanced discussion of the topic, we tend to use security, trust and privacy interchangeably and wring our hands about the dangers of connectivity. That’s got to stop, so let’s break these concepts apart and open a discussion about how to tackle each problem.

Defining trust

Trust is the easiest to define and the hardest to implement. It relies on both transparency and an effort to behave consistently. So if you announce that you are changing your terms of service to broadcast data that was once kept private, after promising users you would never do that, you are being transparent but inconsistent.

That’s the equivalent of a friend telling you they are going to screw you over. It’s transparent, but it violates trust nonetheless, because screwing people over is inconsistent with being a friend.

When it comes to connected devices and apps, trust is probably most easily gained by explaining what you do with people’s data: what you share and with whom. It might also extend to promises about interoperability and supporting different platforms. Implicitly trust with connected devices also means you will respect people’s privacy and follow the best security practices. So let’s get to those.

Defining privacy

Privacy is more a construct of place as opposed to something associated with a specific device. So a connected camera on a public street is different from a connected camera inside your home. It’s easy to say that people shouldn’t be able to just grab a feed from inside your home — either from a malicious hack or the government (or a business) doing a random data scrape. But when it comes to newer connected devices like wearables it gets even more murky: Consider that something like a smart meter can share information about the user to someone who knows what to look for.
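The smart-meter example is worth making concrete. The readings and the baseline threshold below are invented, but the sketch shows how even coarse hourly usage data lets someone who knows what to look for infer when a home is occupied:

```python
# Hypothetical hourly smart-meter readings in kWh, one value per hour.
hourly_kwh = [0.2, 0.2, 0.3, 0.2, 1.8, 2.1, 0.3, 0.2, 0.2, 1.5, 2.4, 2.2]

# Assumed idle draw (fridge, standby devices); anything above it
# suggests someone is home and actively using appliances.
BASELINE = 0.5

occupied_hours = [hour for hour, kwh in enumerate(hourly_kwh) if kwh > BASELINE]
print(occupied_hours)  # hours with above-baseline usage
```

The analysis is trivial, which is the point: the privacy risk comes from the data being collected and shared at all, not from any sophistication in the attacker.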

So when thinking about the internet of things and privacy, it’s probably useful to start with thinking about the data the device generates. Muller offered a good analogy, comparing the connected individual to a music publisher.

“You have to think of yourself as Universal Music,” he said. “Your data is your catalog and that catalog is valuable in a variety of formats.” Of course, that implies the user is in control of their data, which is true for some places but not all. It also means the user needs to take responsibility for understanding what happens with their data and the function and implications of the devices they bring inside their houses.

Privacy isn’t a given, and it must be protected

There’s also the problem of your privacy being violated outright, as was the case with Google sniffing out Wi-Fi passwords or London’s Renew system grabbing the MAC addresses off the phones of passersby in order to show them a more relevant ad.

Protecting privacy when everything is connected will require laws that punish violations of people’s privacy and draw lines that companies and governments can’t step over, but it will also require vigilance by users. To get this right, users should read the agreements they click through when they connect a device, and companies should make those agreements, especially the parts about data sharing, transparent in a way that inspires trust.

Governments and companies need to think about updating laws for a connected age and set criteria for how different types of data are transported and shared. Health data might still need HIPAA-level regulation, but looser standards could prevail for connected thermostats.

Either way, we need to stop freaking out about the dangers of connected devices and start having productive discussions about implementing trust and security before the internet of things goes the way of the web: wonderful, free and a total wild west when it comes to privacy.
