Is it safe to buy that new gadget? Why trust is perceptual computing’s biggest problem

[Image: the "hear no evil, speak no evil, see no evil" monkeys at the Toshogu shrine]

I am really frustrated right now. I look at the slew of awesome announcements coming in from the Consumer Electronics Show in Las Vegas, and I keep thinking the same thing: “Nope, because surveillance.” Damn you, Snowden!

A case in point: as we reported back in September, Intel will this year stick its “perceptual computing” technology, now named RealSense, into PCs. The likes of Lenovo, Dell and Acer will be releasing laptops with a new kind of webcam that scans what it sees in 3D and enables a new stage in the evolution of gesture-based human-machine interaction, allowing the manipulation of virtual objects and even the registering of user emotions.

Whoop whoop, future here we come… except here’s my webcam today:

[Image: my webcam, covered with a piece of electrical tape]

I wish that piece of electrical tape denoted unreasonable paranoia, but it doesn't. Like so many things these days, webcams contain their own micro-controllers, which can be hacked without even bothering the computer's primary chipset and, in this case, without turning on the webcam's indicator light. (In case you're wondering, by the way, yes, I still use video-chatting applications. I just dug out an old external webcam that I can unplug when not in use, to make sure that off means off.)

I don’t want intelligence agents or any other hackers watching me through a good old 2D camera, so why would I want them to be able to tap into a camera that’s sensitive enough to read my heartbeat through near-imperceptible variations in my skin color? Why would I want a system that not only sees but is able to catalog my surroundings, yet doesn’t promise to keep that information secure?

The problem with perceptual or contextual computing at this point in time is that I can’t be sure who is able to access the products of that perception, or use that contextual knowledge against me in some way. I have no intrinsic problem with the idea of feeding stimuli to a developing artificial intelligence – which is loosely what’s happening here and with systems such as Google Now – but as the purchaser of that new ultrabook or omniscient smart TV or Android smartphone, I want that process to be on my own terms and solely for my own benefit. And right now, that’s not guaranteed in the slightest.

[Image: the Garmin vivofit fitness band]

Ditto all the new wearables being shown off at CES, from companies such as LG, Garmin, Sony, and Intel itself with its biometric earbuds. Fine, cover me from head to toe in sensors to help me get fitter and healthier, as long as no one else can possibly access that data, whether by knocking on the service provider's door or by hacking into its systems.

I love tech. I love progress. Scientific breakthroughs make me go wobbly at the knees. But right now these marvellous innovations make me think first and foremost about the risks of surveillance and malicious hacking. And I know I’m not the only one.

So, tech companies, I beg you, please take my security and privacy more seriously, and tell me what you’re doing to make that happen. That sticker on the laptop box that will tell me it comes equipped with super-duper next-gen gesture control capabilities? Have it also tell me how secure the camera is. That sensor-packed wristband? Sell it to me on the basis that it won’t only help me stay healthy, but that you’ve also locked down both the product itself and the paths through which my personal data flows.

Security and privacy can no longer be afterthoughts or nice-to-haves — difficult as they are to implement in this age of embedded systems. We the consumers now know the dark flipside to these innovations, and that, manufacturers and app providers, is your problem.

After all, there’s no point in me buying into perceptual computing if I feel I have to stick a blindfold on it.

