
What comes after Siri? A web that talks back


Siri may be the hottest personal assistant since I Dream of Jeannie, but Apple’s (s aapl) artificial intelligence is only the tip of the iceberg as we combine ubiquitous connectivity, sensor networks, big data and new methods of AI and programming into a truly connected network. Instead of connecting people to people, as Facebook or even some of the cooler services do, the web of the future will connect machines to machines and connect those machines back to people.

What Om calls the “alive web” is still people talking to other people, but the next generation of the web is far more interesting: it’s when machines start talking to each other and then to people. The emergence of the Internet of things is well documented, but to get there we’ll need several advances in technology, from low-power, cheap sensors to better ways of programming computers so that they can understand data from several million end points.

The Internet of things is here

For example, Ford (s f) is not just connecting its entertainment systems to the web, it’s also connecting its engines. Another roadside example is the myriad traffic mapping services that gather data from cars on the road to deliver a real-time look at traffic. Such examples could end up profoundly changing the way we get around, either through self-driving cars, more efficient ride sharing or even better public transportation.

Future generations of Fords could talk to the cloud.

Want to get away from the automotive examples? Consider what the alignment of big data and connected sensors could mean for healthcare. The simplest example might be epidemiological mapping of diseases, but more advanced applications might involve insulin pumps that monitor blood sugar and react instantly, or that prompt you to order certain foods once you enter a store or restaurant. Think this is crazy? There are already 23,000 mobile health apps available for iOS and Android devices, according to Happtique, an app store, and the Consumer Electronics Association is creating a medical and healthcare summit designed to hash out the issues surrounding gadget-delivered healthcare.

So what else do we need to make the web talk back?

These are emerging examples of what machines will be able to tell us once they are wired into the web, but to really make sense of them, we need to address some holes in the system. First, there’s connectivity, which we’re rapidly making ubiquitous. From smart-grid components connected to cellular networks to the emergence of White Spaces broadband, which will allow for connectivity on lower-value networks, the connectivity element is largely in place, although costs can still be prohibitive.

Another area where we’re making rapid progress is sensors, which can track everything from inventory to environmental factors. Earlier this week, I reported on a company trying to make cheap, thin-film sensors that might be attached to objects for a few pennies. Already, RFID and GPS tags track high-value items such as fleet trucks or beloved pets; adding more items just requires cheaper sensors.

But now it gets hard. Ericsson estimates that 50 billion devices will be connected to the web by 2020, which seems reasonable. But if we want those items to give us information, that’s a lot of data to sift through and correlate. And if we want those machines to react without human intervention, then we’re looking at an entirely new style of programming, as well as better AI.

Smarter machines require better AI and different programming

Piecing together the puzzle of better AI.

On the AI side, Siri is huge. Voice recognition has come a long way, but what’s essential here is the level of understanding that Siri offers once it has recognized your voice. Siri has a sense of what words mean, and it represents the real promise of the semantic web in something useful rather than gimmicky and complicated (sorry, Twine).

Improvements in AI will allow machines to parse the data from billions of sensors and notify people to take action only when needed. At a less-complicated level, it’s akin to only getting a warning on your heart-rate monitor when your heart skips a beat, as opposed to hearing every blip. In a decade, it may involve weather data getting sent to a computer that then ships appropriate inventory to stores in time for a freak cold spell.
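The “only when needed” idea in the heart-rate example can be sketched in a few lines. This is a toy illustration of the principle, not any vendor’s algorithm; the function name and threshold are my own. It treats a beat-to-beat interval far above the typical rhythm (a skipped beat shows up as roughly a doubled interval) as the only event worth surfacing to a human:

```python
from statistics import median

def skipped_beats(intervals_ms, factor=1.5):
    """Return indices of beat-to-beat intervals worth alerting on.

    Rather than streaming every blip, we compare each interval to the
    median rhythm and flag only the ones more than `factor` times it.
    """
    typical = median(intervals_ms)
    return [i for i, iv in enumerate(intervals_ms) if iv > factor * typical]

# A steady ~800 ms rhythm with one skipped beat (a doubled interval):
readings = [800, 810, 790, 805, 1600, 795, 800, 812, 798, 801]
print(skipped_beats(readings))  # -> [4]
```

The median makes the baseline robust: one anomalous reading barely moves it, so the “normal” blips stay silent while the skipped beat gets through.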

In addition to AI, the Internet of things will need new ways to process and store information. Its intelligence will have to be spread across multiple endpoints like a mesh, rather than centralized in one big brain. The smarter our computers get, the harder they are to program in a way that lets them make efficient use of all their compute resources. This is why projects such as IBM’s neurosynaptic chips, or HP’s efforts to create a new style of chip for processing big data, are so important.
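One way to picture intelligence spread across endpoints rather than centralized is edge filtering: each node summarizes its own sensor stream locally and escalates only the readings that break from its baseline, so the central service never sees the raw firehose. A minimal sketch, with the class name, window size and threshold my own invention rather than anything from IBM’s or HP’s projects:

```python
class EdgeNode:
    """Toy endpoint that keeps intelligence local: it tracks a rolling
    baseline of its own readings and forwards only anomalies."""

    def __init__(self, window=20, factor=2.0):
        self.window = window    # how many recent readings to remember
        self.factor = factor    # how far from baseline counts as anomalous
        self.recent = []

    def ingest(self, value):
        """Return the reading if it should be escalated upstream, else None."""
        if len(self.recent) >= self.window // 2:
            baseline = sum(self.recent) / len(self.recent)
            if abs(value - baseline) > self.factor * max(baseline, 1e-9):
                return value  # escalate to the central service
        self.recent.append(value)
        if len(self.recent) > self.window:
            self.recent.pop(0)  # slide the window forward
        return None  # handled locally, nothing sent upstream
```

Note that escalated readings are deliberately kept out of the rolling window, so an anomaly doesn’t contaminate the node’s idea of normal.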

Pulling it all together

We’re getting to a point where connectivity allows us to interact with each other in real time, but the real boon of a connected web is not just connecting people; it’s extending our capabilities with machines that can track states, detect anomalies and then advise humans how to react. It’s not exactly the Singularity, but it is a necessary evolution if we’re to take the terabytes of data we’re generating and separate the monumental from the mundane.

9 Responses to “What comes after Siri? A web that talks back”

  1. We must continue to watch Star Trek to get an idea of where the future is going :)

    Talking AI computers? Yup, a thing from the 60s. Putting an interface on bots and linking them to APIs? Yup, old news, but never delivered like this. That’s the uncanny beauty of Apple so far: they know how to deliver a working piece of technology that not only fits the needs of many but anticipates them, unlike other companies that create needs. But then again, this is just my view.

  2. Siri, to me, sounds like something straight out of fiction movies and books; so does this “talking Web.” Nevertheless, it’s not a far-fetched idea either. I just don’t think it’s going to see its complete fruition anytime soon. Sure, we have apps, but they’re not as close to AI programming as we’d like them to be.

  3. > Think this is crazy? There are already 23,000 mobile health apps available

    Stacey, come on, you should know better :-) That’s an illogical way to defend your hypothesis (you provide no data on how many of those 23,000 apps are dormant, deprecated, in use, etc.), but I’m glad you also wrote about the non-triviality of what’s required to pull off such complex systems. At the end of the day, it is people who write the code that talks to machines and eventually back to people, and so long as humans are designing, architecting and writing code, we’ll have errors and unknown unknowns (think: Nassim Taleb’s Black Swans).

  4. E = mc^2: do we need a simulation of the Universe to arrive at the equation which describes it? Or do we need to simulate at the synapse level (which already excludes BAC) to get the equation of awareness? Is machine awareness needed to break the programming barrier? All problem-solving animals seem to have different degrees of self, visual to cognitive [magpies to elephants]. Does Siri have any form of self?

  5. It seems to me a bit odd that tech-savvy folks are so enamored with Siri. We’ve had chatbots understanding sentences for quite some time now. We’ve had voice recognition for quite some time now. You stick the voice recognition in front of the chatbot, allow the chatbot to make API calls in addition to speaking back, and you have Siri. Why wasn’t it done before now? It was! That’s why I’m having a hard time understanding this love affair. It’s not as if it has the level of understanding of Watson. It simply breaks down phrases and seems to miss as many as your average chatbot.

    I understand riding the news wave but I still thought I’d see better than this out of tech blogs.

    • It’s not just the AI, but how Siri makes expensive AI accessible for normal people to ask it dumb questions. It’s not like I can access Watson today, but I can ask Siri for the nearest Italian restaurant. Apple may not invent everything it implements, but it does popularize them by making them easy to use.

      • Julian Bourne

        Stacey, The accessibility is a great point. I have been working for 4 years on some of the artificial intelligence discussed in your article. Beyond Siri, the AI will get even easier to use.

  6. Lindsworth Horatio Deer

    Stacey Higginbotham, my dear, you just stepped over the red line in the sand!!!! Gigaom readers, get your pitchforks ready, the machines are ALREADY organized and taking over the world.

    but sand it is. AI is inevitable.

    Just a little miffed, personally, that AI was not “invented” in 2005 AD…so it could fulfill the prophecy in the movie Johnny Mnemonic (1995) about the “Artificial Intelligence laws of 2006 AD.”