Cerf’s dual identity made for an interesting performance at the Guardian’s Activate 2013 conference in London on Tuesday, where he faced many questions about privacy and surveillance. He initially appeared to dodge them — understandably, given Google’s sensitive position in the middle of the current fuss — but they kept on coming and he eventually gave a clear opinion.
The questioner wanted to know whether Cerf and his DARPA colleagues had considered the surveillance potential of stored information when they adopted the concept of state, and whether the protection of privacy required a technological or institutional solution.
Here’s what Cerf said. It is, I think, worth quoting at length:
“It has to be institutional; it also has to do with social conventions that we adopt. The reason there isn’t a technological solution is that the ability to infer information from partial information is extremely powerful — you can take information which appears to be anonymous and (extrapolate identity). It has to be a set of conventions that we adopt, either a legal framework or social conventions.
“Technology is racing ahead so quickly and we are so eager to embrace it with our mobiles and everything else that we don’t fully appreciate the side effects. When we put photos on the web and other people tag them, we create (problems) for people who just happen to be in the image. They get caught… we learned this with Street View.
“There are a lot of things that we do every day that we think are innocent… but there are cascades of things that happen. I don’t think we’ve figured out what the right intuitive set of social conventions should be in order to protect privacy. We’re going to have to learn by making mistakes.
“This can’t be just a national issue because the internet is everywhere. The consequence of that is it causes us to confront head-on this problem of global issues, of frameworks, legal frameworks, social conventions and the like.”
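Cerf’s point about inferring identity from partial information is worth unpacking. Latanya Sweeney famously showed that a large share of Americans can be uniquely identified by ZIP code, birth date and sex alone — so a dataset with the names stripped out can often be re-identified simply by joining it against a public record on those quasi-identifiers. Here is a minimal sketch of that linkage attack; all the data and field names are invented for illustration:

```python
# Toy re-identification: join an "anonymized" dataset with a public one
# on quasi-identifiers (ZIP code, date of birth, sex). Invented data.

anonymous_medical = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "heart disease"},
    {"zip": "90210", "dob": "1980-01-15", "sex": "M", "diagnosis": "asthma"},
]

public_voter_roll = [
    {"name": "J. Doe",   "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "A. Smith", "zip": "90210", "dob": "1980-01-15", "sex": "M"},
]

def reidentify(anon_rows, public_rows):
    """Link rows that share the same (zip, dob, sex) triple."""
    index = {(p["zip"], p["dob"], p["sex"]): p["name"] for p in public_rows}
    matches = []
    for row in anon_rows:
        key = (row["zip"], row["dob"], row["sex"])
        if key in index:
            # The "anonymous" diagnosis is now attached to a name.
            matches.append((index[key], row["diagnosis"]))
    return matches
```

No single field here identifies anyone; it is the combination, joined across datasets, that does — which is why Cerf argues there is no purely technological fix.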
Wise words, even if they don’t point to any easy solutions.
I was particularly struck by Cerf’s point about invading the privacy of others. With facial recognition technology speeding ahead, it’s very easy to override someone else’s desire for privacy or anonymity without even realizing it. It’s one thing for a person to agree to the privacy trade-off of Facebook or Google+ for themselves, and quite another to drag nearby people into that decision.
That said, I’m not entirely sure that technological measures can’t play a part. Cerf’s chief inquisitor at Tuesday’s event was media guru Jeff Jarvis, who asked (prior to the state question) whether snooping by the NSA and other intelligence agencies might put people off the cloud, and whether users might demand more encryption to protect their online activities.
Cerf noted that Google encrypts data as it travels into and out of its systems, but argued against the use of encryption within. “I’m not sure that encrypting everything inside the system would be a smart move, if it prevents us from offering the services that enable all the things you get for free,” he said, adding that it “would not be a good thing from a business point of view” or for the user either.
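Cerf’s business argument has a technical root: features like search, spam filtering and ad targeting operate on plaintext, so a provider that holds only ciphertext encrypted under a user-held key cannot offer them. The sketch below illustrates the trade-off with a throwaway XOR "cipher" — deliberately not real cryptography, and the messages and keyword are invented:

```python
# Toy sketch: why encrypting everything "inside the system" disables
# server-side features. The cipher is illustrative only, NOT real crypto.
import hashlib
import itertools

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR against a keystream derived from repeated hashing.
    # XOR is symmetric, so the same function also decrypts.
    stream = itertools.chain.from_iterable(
        hashlib.sha256(key + bytes([i])).digest() for i in range(256)
    )
    return bytes(p ^ k for p, k in zip(plaintext, stream))

inbox = [b"lunch on tuesday?", b"your flight is confirmed"]

# Plaintext store: the provider can match a query (and target ads on it).
hits_plain = [m for m in inbox if b"flight" in m]

# Encrypted store, key held by the user: the same query finds nothing,
# and so would any indexing, filtering or ad-matching pass.
key = b"user-held-key"
encrypted = [toy_encrypt(key, m) for m in inbox]
hits_cipher = [c for c in encrypted if b"flight" in c]
```

Searchable and homomorphic encryption schemes aim to narrow exactly this gap, but at a cost in performance and functionality — which is the trade-off Cerf is pointing at.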
If someone can figure out new business models that deliver services while at least partially protecting users from prying eyes — probably business models that aren’t built around targeted advertising, as Google’s is — then technological solutions to the privacy problem may indeed play a significant role.
It is certainly the case that fragments of data can be more easily glued together than ever before, but perhaps that doesn’t preclude a new breed of service providers from doing more in mitigation. Motivation may be the key.