Summary:

This year’s CES is a frustrating affair — so many cool new context-aware toys to play with, and so little reassurance from the manufacturers that their use will stay secure or private.

Hear no evil, speak no evil, see no evil (Toshogu)

I am really frustrated right now. I look at the slew of awesome announcements coming in from the Consumer Electronics Show in Las Vegas, and I keep thinking the same thing: “Nope, because surveillance.” Damn you, Snowden!

A case in point: as we reported back in September, Intel will this year stick its “perceptual computing” technology, now named RealSense, into PCs. The likes of Lenovo, Dell and Acer will be releasing laptops with a new kind of webcam that scans what it sees in 3D and enables a new stage in the evolution of gesture-based human-machine interaction, allowing the manipulation of virtual objects and even the registering of user emotions.

Whoop whoop, future here we come… except here’s my webcam today:

Webcam covered with tape

I wish that piece of electrical tape denoted unreasonable paranoia, but it doesn’t. Like so many devices these days, webcams contain their own microcontrollers, which can be hacked into without ever bothering the computer’s primary chipset and, in some cases, without even turning on the webcam light. (In case you’re wondering, by the way, yes, I still use video-chatting applications; I just dug out an old external webcam that I can unplug when not in use, to make sure that off means off.)

I don’t want intelligence agents or any other hackers watching me through a good old 2D camera, so why would I want them to be able to tap into a camera that’s sensitive enough to read my heartbeat through near-imperceptible variations in my skin color? Why would I want a system that not only sees but is able to catalog my surroundings, yet doesn’t promise to keep that information secure?

The problem with perceptual or contextual computing at this point in time is that I can’t be sure who is able to access the products of that perception, or use that contextual knowledge against me in some way. I have no intrinsic problem with the idea of feeding stimuli to a developing artificial intelligence – which is loosely what’s happening here and with systems such as Google Now – but as the purchaser of that new ultrabook or omniscient smart TV or Android smartphone, I want that process to be on my own terms and solely for my own benefit. And right now, that’s not guaranteed in the slightest.

Garmin vivofit

Ditto all these new wearables being shown off at CES, from companies such as LG, Garmin, Sony, and Intel itself with its biometric earbuds. Fine, cover me from head to toe in sensors to help me get fitter and healthier, as long as no-one else can possibly access that data, neither by knocking on the service provider’s door nor by hacking into their systems.

I love tech. I love progress. Scientific breakthroughs make me go wobbly at the knees. But right now these marvellous innovations make me think first and foremost about the risks of surveillance and malicious hacking. And I know I’m not the only one.

So, tech companies, I beg you, please take my security and privacy more seriously, and tell me what you’re doing to make that happen. That sticker on the laptop box that will tell me it comes equipped with super-duper next-gen gesture control capabilities? Have it also tell me how secure the camera is. That sensor-packed wristband? Sell it to me on the basis that it won’t only help me stay healthy, but that you’ve also locked down both the product itself and the paths through which my personal data flows.

Security and privacy can no longer be afterthoughts or nice-to-haves — difficult as they are to implement in this age of embedded systems. We the consumers now know the dark flipside to these innovations, and that, manufacturers and app providers, is your problem.

After all, there’s no point in me buying into perceptual computing if I feel I have to stick a blindfold on it.

  1. How are they going to tell you how secure it is?

    The best they could ever do is list all the exploits they have been hardened against (and how). They can never just say ‘it’s secure’. It’s a tough, tough problem.

    That said – tough problems are there to be solved.

    1. Yup, it won’t happen overnight, that’s for sure.

  2. Solutions to this problem require strong, OS-independent isolation of devices, enforced in hardware via an IOMMU. Some examples:

    http://qubes-os.org
    http://genode.org
    http://www.ghs.com/products/mobile_devices.html
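
    If you’re curious what that looks like in practice, here’s a rough sketch (nothing Qubes-specific, just plain Linux with the IOMMU switched on, e.g. booted with intel_iommu=on or amd_iommu=on) that lists the IOMMU groups the hardware exposes. Devices in the same group can only be isolated together, so this tells you whether something like a webcam’s host controller could be handed off to its own sandboxed VM:

      # Rough sketch: print the IOMMU groups a Linux kernel exposes via sysfs.
      # Assumes the IOMMU is enabled; otherwise /sys/kernel/iommu_groups is
      # missing or empty and we just report that.
      from pathlib import Path

      root = Path("/sys/kernel/iommu_groups")
      groups = sorted(root.iterdir(), key=lambda p: int(p.name)) if root.is_dir() else []

      if not groups:
          print("No IOMMU groups found: IOMMU disabled or unsupported.")
      for group in groups:
          # Each entry under 'devices' is a symlink named after a PCI address.
          devices = sorted(dev.name for dev in (group / "devices").iterdir())
          print(f"group {group.name}: {', '.join(devices)}")

    Projects like Qubes build on exactly that hardware capability to hand individual devices to separate, untrusted VMs.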

    1. There’s something in that, for sure.

  3. Aw, s’cute! You actually think that if the companies tell you your privacy is secure it is? Bless your heart.

    1. No, I think it would make a real difference if they had to give promises like that. It would take them into different legal territory in many places around the world, for one thing, and we could all play a fun new game of “what didn’t they promise?” Also, there’s an awful lot these companies could be doing that they’re not, because it costs them money. Just look at the rash of encryption-in-transit that arose after the Snowden revelations.

  4. No doubt security and privacy are ongoing concerns. However, it’s still fun to see how collaborative innovation, open standards and the like are driving the rapid evolution of highly connected devices.

    Peter Fretty

