The Internet of things is supposed to connect every aspect of our lives, from our homes and cars to the objects we wear and the goods we consume. It’s even connecting ice machines. But one thing the Internet of things lacks is a unifying standard.

Devices will be connected by different radio technologies: Wi-Fi, Bluetooth, ZigBee and a host of 2G and mobile broadband cellular technologies. There’s really no way to ensure your “thing” will connect to the network or networks available at any given time.

The mobile industry is trying to rectify the problem, at least as it pertains to cellular machine-to-machine (M2M) technologies. The bigger issue of fragmentation among bands and technologies isn’t going to get worked out anytime soon: You’re not going to connect a GSM wristwatch to a CDMA or Wi-Fi network. But often you can’t connect that GSM wristwatch to a GSM network either. Roaming among networks that use the same technology requires not only a business arrangement with each carrier but also a common protocol.

A group of global wireless-standards bodies is trying to tackle that problem. The European Telecommunications Standards Institute (ETSI), the Telecommunications Industry Association (TIA) and the Alliance for Telecommunications Industry Solutions (ATIS) in the U.S. are working with their counterparts in Japan, Korea and China to develop a common “service layer” that can be embedded in every M2M device, making those devices compatible with M2M application servers hosted by any global operator.
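
To get a concrete feel for what that would mean, here’s a rough sketch in Python — with invented endpoint names and payload fields, not anything taken from the ETSI/TIA/ATIS specifications — of how a device speaking a common service layer could register with whichever operator’s M2M application server it can reach, without changing its own software:

```python
import json

# Hypothetical illustration only: the operator URLs and payload fields below
# are made up for this sketch, not drawn from any published M2M standard.

OPERATOR_SERVERS = {
    "att": "https://m2m.example-att.net/service-layer/register",
    "rogers": "https://m2m.example-rogers.ca/service-layer/register",
    "telefonica": "https://m2m.example-telefonica.es/service-layer/register",
}

def build_registration(device_id: str, capabilities: list[str]) -> str:
    """Build the same service-layer registration payload for any operator."""
    return json.dumps({
        "deviceId": device_id,
        "serviceLayerVersion": "1.0",   # the common layer every device embeds
        "capabilities": capabilities,
    })

def register(device_id: str, operator: str) -> tuple[str, str]:
    """Return the (url, payload) pair the device would send.

    Because the payload format is the same everywhere, only the URL of the
    operator's application server changes; the device software does not.
    """
    payload = build_registration(device_id, ["telemetry", "firmware-update"])
    return OPERATOR_SERVERS[operator], payload

if __name__ == "__main__":
    for op in OPERATOR_SERVERS:
        url, payload = register("smart-meter-0042", op)
        print(op, "->", url, payload)
```

The point is simply that only the server address differs from operator to operator; the registration logic embedded in the device stays the same.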

At the end of the day, that means many of the devices in our Internet of things suddenly become untethered from specific networks. That wristwatch could work on AT&T’s GSM network as well as T-Mobile’s, and then connect to Rogers Communications’ GSM towers when you fly into Toronto. Shipping containers embedded with M2M modules could connect to whatever network is available at any port of call. The same wireless smart meter could be deployed in Kansas City or in Marrakesh without having to completely reconfigure its software.

Many things, many internets

Roaming among networks is possible today. The problem is those arrangements tend to be ad hoc deals put together by M2M service aggregators like Kore Telematics, which sort out all the underlying carrier deals and manage each network’s various protocols. Some operators have started taking matters into their own hands.

On Tuesday seven major operators — KPN, NTT DoCoMo, Rogers, SingTel, Telefónica, Telstra and Vimpelcom — formed an alliance to create a common M2M management platform allowing for the “delivery of a global product with a single SIM, eliminating roaming costs in the countries of participating operators.”

It sounds like a standard, but it’s not quite the same thing. All seven providers use the same M2M management platform supplied by Jasper Wireless, so they are able to bridge their differences through a common vendor. Still, the effort is admirable and could lead to the creation of ad hoc interoperability among a large section of the world’s carriers. Jasper has many other customers besides those seven, including AT&T and America Movil.

A good example of a problem such cooperation could solve is the Kindle’s international predicament. Even though Amazon sells the Kindle all over the world, it has one “home” network, AT&T. That means any Kindle user either living in or traveling to another country has to pay international download (read: roaming) fees to buy a new book or access a periodical subscription.

It’s fairly ridiculous that a multinational retailer like Amazon can’t support its flagship device internationally without resorting to such single-carrier arrangements. But if Amazon were to broker a deal with this new alliance, the Kindle would find itself at home on whichever of these seven networks it wandered onto.

There’s a possibility the industry will coalesce around a single proprietary technology such as Jasper’s, creating an ad hoc standard much like we see developing in the public-cloud space around Amazon Web Services. But an ad hoc standard isn’t a standard.

What we need is for the industry to get together and sort out a way to make every M2M device carrier- and network-agnostic. A gadget maker should be able to build a device that connects to the Internet of things without a specific carrier, a specific management platform or a specific application server in mind. The business deals with individual carriers would still need to be sorted out, but first we need to remove the technology barriers. Otherwise we won’t wind up with a single Internet of things but instead many internets, each with its own separate set of things.
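
As a thought experiment — with every class and method name below invented purely for illustration — a device built this way would treat carriers and radios as interchangeable transports and simply use whichever one happens to be reachable:

```python
# Illustrative sketch only: a device written against an abstract transport
# rather than a specific carrier, management platform or application server.

from abc import ABC, abstractmethod

class Transport(ABC):
    """Anything that can carry the device's data: Wi-Fi, GSM, CDMA and so on."""

    @abstractmethod
    def available(self) -> bool: ...

    @abstractmethod
    def send(self, data: bytes) -> None: ...

class WifiTransport(Transport):
    def available(self) -> bool:
        return False  # pretend there is no Wi-Fi at this port of call

    def send(self, data: bytes) -> None:
        print("sent over Wi-Fi:", data)

class CellularTransport(Transport):
    def __init__(self, carrier: str):
        self.carrier = carrier

    def available(self) -> bool:
        return True   # pretend we can roam onto this carrier

    def send(self, data: bytes) -> None:
        print(f"sent over {self.carrier}:", data)

def send_reading(transports: list[Transport], data: bytes) -> None:
    """Use whichever network is reachable; the device doesn't care which."""
    for transport in transports:
        if transport.available():
            transport.send(data)
            return
    raise RuntimeError("no network available")

if __name__ == "__main__":
    send_reading([WifiTransport(), CellularTransport("Rogers")], b"meter=1234 kWh")
```

The business deals still matter, but nothing in the device’s own logic ties it to one network.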

Feature image courtesy of Shutterstock user alexmillos; Standard photo courtesy of Shutterstock user almagami

Comments

  1. Have a look at the OECD study “Machine to Machine: Connecting Billions of Devices” for a simple solution. The document was also discussed on Gigaom. The simple solution is to give car manufacturers, Amazon and others access to their own SIM cards independent of mobile operators. This allows roaming, national roaming and innovation while maintaining competition.

  2. Reblogged this on #Hashtag – Thoughts on Law, Technology, Internet, and Social Media and commented:
    Why we need a standard for the Internet of Things

    The Internet of Things is supposed to connect every aspect of our lives from our homes and cars to the objects we wear and the goods we consume. It’s even connecting ice machines. But one thing the Internet of Things lacks is a unifying standard.

    Devices will be connected by different radio technologies: Wi-Fi, Bluetooth, ZigBee, and a host of 2G and mobile broadband cellular technologies. There’s really no way of assuring your ‘thing’ will connect to the network or networks available at any given time.

  3. If anyone figures this out, I’d place bets on technologists in Japan more so than Silicon Valley (the Valley is too hyper-focused on social networks).

  4. thanethomson Friday, July 13, 2012

    It doesn’t really matter if we don’t have a single set of standards. Applications will drive interoperability efforts between heterogeneous networks/systems. It’d probably make it cheaper and easier to have a single set of standards, but who’s going to have the final say on those standards, and what are the other hidden agendas behind their decisions? What sorts of freedoms will we be giving up by having a single set of standards?

    I’d actually advocate having multiple standards and then just figuring out how to get them to work together.

  5. What about security?

  6. We have to consider which elements need standardising. Our devices (such as http://www.powelectrics.co.uk/metron2/) will quite happily be deployed in different continents with either the same SIM or a different one. I think the bigger question is the protocol the device uses to communicate the data. The world of telecoms may not appreciate the challenges we hardware/device developers face when it comes to power management, reprogramming, remote firmware updates etc. The needs of the protocol will change from application to application.

    Dave

