AI is coming to IoT, and not all the brains will be in the cloud

Smart devices, appliances and the internet of things are dominating International CES this week, but we’re probably just getting a small taste of what’s to come — not only in quantity, but also in capabilities. As consumers get used to buying so-called smart devices, they’re eventually going to expect them to actually be smart. They might even expect them to be smart all the time.

So far, that expectation has been hard to meet for devices and apps trying to perform feats of artificial intelligence such as computer vision. The status quo of offloading processing to the cloud, which is the preferred method of web companies like Google, Microsoft and Baidu (for Android speech recognition, for example), works well enough computationally but can lag in latency and reliability.
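
To make that tradeoff concrete, here's a minimal sketch of the offload pattern in Python. The endpoint, payload format and response shape are all hypothetical stand-ins, not any vendor's actual API:

    # Illustrative sketch of the cloud-offload pattern; the endpoint and
    # payload format below are hypothetical, not a real vendor API.
    import json
    import urllib.request

    def recognize_speech_in_cloud(audio_bytes, timeout_seconds=2.0):
        """Ship raw audio to a remote service and wait for the transcript.

        Every call pays a network round trip, and it fails outright when
        the device is offline: the latency/reliability tradeoff above.
        """
        request = urllib.request.Request(
            "https://speech.example.com/v1/recognize",  # hypothetical endpoint
            data=audio_bytes,
            headers={"Content-Type": "application/octet-stream"},
        )
        with urllib.request.urlopen(request, timeout=timeout_seconds) as response:
            return json.load(response)["transcript"]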

Running those types of algorithms locally hasn't historically been feasible because they're often computationally intensive (especially during training), and low-power smartphone and embedded processors haven't been up to the task. But the times, they are a-changing.
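
Some quick back-of-the-envelope arithmetic shows why. The layer dimensions below are made up but typical of a mid-sized convolutional network:

    # Multiply-accumulates for a single convolutional layer (made-up but
    # typical dimensions), to show why these workloads strain mobile chips.
    height, width = 112, 112              # feature-map size
    in_channels, out_channels = 64, 128   # input/output channels
    kernel = 3                            # 3x3 filters

    macs = height * width * in_channels * out_channels * kernel * kernel
    print(f"{macs / 1e9:.1f} billion multiply-accumulates")  # ~0.9 billion

And a full network stacks dozens of layers like that, potentially for every frame of video, which is why a teraflop-class mobile chip changes the math.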

[Image: Nvidia's Tegra X1 mobile GPU]

Take, for example, the new mobile GPU, called the Tegra X1, that Nvidia announced over the weekend. Its teraflop of computing performance is impressive, but less so than the uses the company envisions for it. It's the foundation of the company's new DRIVE PX automotive computer (pictured above), which Nvidia claims will let cars spot available parking spaces, park themselves and pick up their drivers like a valet, and distinguish among the various types of vehicles they might encounter on the road.

These capabilities “draw heavily on recent developments in computer vision and deep learning,” according to an Nvidia press release.

Indeed, Nvidia spotted a potential goldmine in the machine learning space a while ago as research teams began setting record after record in computer-vision competitions by training deep learning networks on GPU-powered systems. It has been releasing development kits and software libraries ever since to make it as easy as possible to embed its GPUs and program deep learning systems that can run on them.

There’s a decent-enough business selling GPUs to the webscale companies driving deep learning research (Baidu actually claims to have the biggest and best GPU-powered deep learning infrastructure around), but that’s nothing compared with the potential of being able to put a GPU in every car, smartphone and robot manufactured over the next decade.

[Image: Nvidia's DRIVE PX automotive computer]

Nvidia is not the only company hoping to capitalize on the smart-device gold rush. IBM has built a low-power neurosynaptic (i.e., modeled after the brain) chip called SyNAPSE that’s designed specifically for machine learning tasks such as object recognition and that consumes less than a tenth of a watt of power. Qualcomm has built a similar learning chip called Zeroth that it hopes to embed within the next generation of devices. (The folks responsible for building both will be speaking at our Structure Data conference this March in New York.)

A startup called TeraDeep says it's working on deep learning algorithms that can run on traditional ARM and other mobile processor platforms. I've seen other demos of deep learning algorithms running on smartphones; one was created by Jetpac co-founder Pete Warden (whose company was acquired by Google in August) and the other was an early version of technology from a stealth-mode startup called Perceptio (whose founders Re/code profiled in this piece). TeraDeep, however, hopes to take things a step further by releasing a line of deep learning modules that can be embedded directly into other connected devices, as well.

[Image: TeraDeep]
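
For a sense of what on-device inference boils down to, here is a toy forward pass of a single convolution plus ReLU in pure NumPy, inference only, no training. It's an illustrative sketch of the general technique, not TeraDeep's actual implementation:

    # Toy forward pass of one convolution + ReLU in pure NumPy: the kind
    # of inference-only workload an embedded module runs on-device.
    # Illustrative sketch only; not TeraDeep's implementation.
    import numpy as np

    def conv2d_relu(image, filters):
        """Naive valid convolution: image (H, W, Cin), filters (K, K, Cin, Cout)."""
        k, _, _, c_out = filters.shape
        h_out = image.shape[0] - k + 1
        w_out = image.shape[1] - k + 1
        out = np.zeros((h_out, w_out, c_out))
        for y in range(h_out):
            for x in range(w_out):
                patch = image[y:y + k, x:x + k, :]   # local receptive field
                out[y, x] = np.tensordot(patch, filters, axes=3)
        return np.maximum(out, 0.0)                  # ReLU nonlinearity

    # Random weights stand in for a trained model shipped with the device.
    rng = np.random.default_rng(0)
    features = conv2d_relu(rng.standard_normal((32, 32, 3)),
                           rng.standard_normal((3, 3, 3, 8)))
    print(features.shape)  # (30, 30, 8)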

Among the benefits that companies such as Google hope to derive from quantum computing is the ability to develop quantum machine learning algorithms that can run on mobile phones — and presumably other connected devices — while consuming very little power.

Don’t get me wrong, though: cloud computing will still play a big role for consumers as AI makes its way further into the internet of things. The cloud will still process data for applications that analyze aggregate user data, and it will still provide the computing brains for stuff that’s too small and cheap to justify any meaningful type of chip. But soon, it seems, we at least won’t have to do without all the smarts of our devices just because we’re without an internet connection.
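
In code, that hybrid might look something like the sketch below, where run_cloud_model and run_local_model are hypothetical stand-ins for a big server-side model and a smaller embedded one:

    # Sketch of the hybrid pattern described above: prefer the cloud when
    # it's reachable, fall back to the on-device model when it isn't.
    # run_cloud_model and run_local_model are hypothetical stand-ins.

    def classify(image_bytes, run_cloud_model, run_local_model):
        """Return a label, degrading gracefully when the network is down."""
        try:
            # The large, frequently retrained model lives server-side.
            return run_cloud_model(image_bytes, timeout=1.0)
        except OSError:
            # Covers timeouts and connection failures: the smaller
            # embedded model answers instead.
            return run_local_model(image_bytes)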
