
Summary:

M2M is huge, powering everything from point-of-sale machines and ER devices to much of the Big Data revolution. But all of that is in danger, says John Horn of RACO Wireless, if we don’t patch two major holes.


It goes by many names these days: machine-to-machine communications; the Internet of Things; the Industrial Internet; even just “wireless sensors.” In short, M2M is the technology that lets machines automatically exchange key information with each other and, ultimately, relay it to humans.

With seemingly endless applications that range from mere entertainment (say, streaming video to the backseat of your car) to matters of life and death (like the ability for doctors to remotely monitor blood glucose levels), M2M technology is already rapidly changing the way that we live for the better. The M2M revolution is threatened, however, by two crucial and already pressing challenges: unnecessary complexity and the impending sunset of 2G wireless.

The rise of M2M is revolutionizing the way countless industries work: shipping companies now track high-dollar assets from country to country; auto dealers use GPS tracking and automated collection technology to assess customer loans; doctors and caregivers monitor elderly patients from a distance and know instantly if they need assistance; bagged-ice machine vendors are alerted when supply is low.

M2M is also a key enabler of Big Data, as an unprecedented amount of information is already being collected from automated sensors: inside cars, in traffic light cameras, in new automated parking meters, in energy meters and so on. In fact, a whopping 2.14 billion M2M devices will have the ability to “talk to each other” by 2021, up from an estimated 100.4 million M2M device connections in 2011, according to the research firm Analysys Mason Limited. That represents an astounding compound annual growth rate (CAGR) of 36 percent over 10 years. And through it all, M2M is creating greater operational efficiencies, more productivity, lower costs and fewer headaches for companies across all industries.
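That growth rate is easy to sanity-check. Here is a quick back-of-the-envelope calculation in Python, using only the 2011 and 2021 connection counts cited above (nothing else is assumed):

```python
# Sanity-check the 36 percent CAGR implied by Analysys Mason's figures.
connections_2011 = 100.4e6   # estimated M2M connections in 2011
connections_2021 = 2.14e9    # forecast M2M connections in 2021
years = 10

# CAGR = (end / start) ** (1 / years) - 1
cagr = (connections_2021 / connections_2011) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # prints roughly 35.8%, i.e. ~36 percent
```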

All of these growth predictions fail to account for the crucial challenges mentioned above, however, and those challenges could become a costly and time-consuming wrench in the works for companies if they aren’t addressed. Some estimates suggest they could cut the 2.14 billion device forecast in half, or worse.

Complexity is a profit killer

Right now, large enterprise M2M projects still often take up to three years to complete, and typically require at least a $25 million investment, according to the research firm Maravedis-Rethink. It’s a big reason that M2M adoption for many businesses is still in a semi-holding pattern, with few committed to building it into their business plans.

That can’t continue if M2M solutions are really going to improve our everyday lives, help businesses strengthen their bottom lines and give decision makers better information. There is no doubt that the benefits await, but there will be some wrinkles to iron out along the way.

M2M solutions must be made easier to deploy. We’re talking days or weeks here, even hours, not years. Solution providers need the ability to get thousands of M2M devices up and running at once, crucially, using existing, standardized technology. They need the ability to customize rate plans and to see in real time how their customers are actually using their applications. This is possible. More to the point, enterprises that get their M2M applications up and running quickly are seeing amazing returns. No longer do enterprises have to sit on the sidelines as the process unfolds while they continue running their business with the same deficiencies their solution is intended to address. Typically, they see up to a 40 percent return on their investment in the first year alone.

But every time there’s a problem with that M2M application or the enterprise IT department has to focus on something like making the wireless connection work, that ROI is reduced. And at some point, if deploying an M2M application distracts from a company’s core business rather than enhancing it, then the ROI is no longer worth the effort.
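To make that erosion concrete, here is a toy model of how support incidents eat into a first-year return. Every figure in it is a hypothetical placeholder, not a number from this article:

```python
# Toy model: support burden eroding the first-year ROI of an M2M rollout.
# All figures are hypothetical placeholders, chosen only for illustration.

investment = 1_000_000.0       # assumed up-front cost of the deployment (USD)
gross_benefit = 1_400_000.0    # assumed first-year benefit, a 40% gross ROI
cost_per_incident = 20_000.0   # assumed IT cost of each connectivity problem

for incidents in (0, 5, 10, 20):
    net_benefit = gross_benefit - incidents * cost_per_incident
    roi = (net_benefit - investment) / investment
    print(f"{incidents:2d} incidents -> first-year ROI of {roi:.1%}")

# 0 incidents yields the 40% return; 20 incidents drives it to 0%.
# The connectivity headaches, not the application itself, decide the payoff.
```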

Sunsetting 2G could slow some M2M applications

Unfortunately, complexity isn’t the only thing holding back the growth of M2M right now. In fact, the very future of some M2M applications is being challenged by the mobile industry’s migration to 3G and 4G networks. In the process, many carriers are simply shutting down their existing 2G networks, stranding customers whose M2M applications rely on them. I recently heard of a small boutique in the Midwest that relied on a 2G network to process credit-card payments. Without notice, its 2G cellular service was shut off, its point-of-sale device suddenly stopped working, and a vulnerable small business was left scrambling for options.

Complicating matters, many M2M applications simply don’t use enough data to justify transitioning them to the wider pipes and more costly devices associated with 3G and 4G networks. So while upgrading to a significantly more expensive 3G- or 4G-compatible device may be an option in many cases, the low data consumption of these applications comes nowhere close to justifying the cost, needlessly taking a bite out of a business’s ROI.
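As a rough illustration of why the upgrade math fails, consider a fleet of low-data devices. All of the prices below are hypothetical placeholders, not figures from the article:

```python
# Toy model: the cost of moving a low-data M2M fleet to 3G/4G hardware.
# Every number is a hypothetical placeholder used only for illustration.

fleet_size = 10_000       # devices deployed
module_cost_2g = 10.00    # assumed per-device cost of a 2G modem (USD)
module_cost_3g = 45.00    # assumed per-device cost of a 3G/4G modem (USD)
monthly_data_mb = 0.5     # a sensor sending a few kilobytes per day

extra_hardware_cost = fleet_size * (module_cost_3g - module_cost_2g)
print(f"Extra hardware spend for the fleet: ${extra_hardware_cost:,.0f}")

# 0.5 MB per month is far below what even a 2G link can carry, so the
# extra $350,000 buys bandwidth headroom the application can never use.
```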

These shifts force customers to be very deliberate in how they plan their M2M strategies. As some carriers are forced to move away from 2G because of spectrum constraints or other long-term strategies, other carriers remain committed to supporting their 2G networks.

The bottom line is that for M2M to reach its full potential, application providers need easy-to-implement M2M solutions. They also need some assurance that those solutions will still be supported as networks continue to evolve. Give enterprises and potential M2M application developers those two things, and M2M will get there.

And we’ll continue to see a revolution in the way business gets done.

John Horn is president of RACO Wireless.

Photo courtesy of Shutterstock.

Comments

  1. Some very good points made here – nice article John. The current and burgeoning crop of M2M solutions has one thing in common, whether we’re talking about 2G, 3G or even (perhaps especially) 4G: the unrealistic overhead of a legacy cellular infrastructure that has limited capacity, is hugely expensive in spectrum costs, is overly complex and expensive in chipset costs, and consumes excessive power.

    I’m declaring an interest here – I represent the Weightless SIG, where version 0.9 of the Weightless Specification has just been announced. Operating in white space spectrum and designed from the ground up to support machine communications, and ONLY machine communications, it addresses all of the issues that currently dog the M2M industry. A copy of the Weightless Specification is available at weightless.org.

    I apologise if the above sounds like an advertisement! I want to stimulate debate on a realistic and optimised machine communications alternative to conventional approaches.

    Al Woolhouse, Weightless SIG

    1. Agreed that M2M is limited by cellular infrastructure. Even WiFi has limitations, especially in terms of power. For many applications, a cost-effective short-range infrastructure is all that’s required. That’s the approach we’ve taken with reelyActive. That infrastructure can then use cellular, WiFi or Ethernet to bridge the simplest of devices to the cloud.

      Jeffrey Dungen, CEO, reelyActive
      http://reelyactive.com

  2. I’m not sure I agree that it takes that long to deploy. I’ve seen FedEx and others deploy M2M solutions in months and not years. We wrote up our experience while wondering why Microsoft missed the opportunity:

    http://successfulworkplace.com/2012/11/05/microsoft-missed-out-on-the-internet-of-things-are-you-ready/

    When you mention M2M solutions, I feel that application providers are already serving up M2M in the form of on-premise and Cloud solutions. A system ready to manage ubiquitous data (as opposed to Big Data) already has the ability to bring in many types of data, including sensor data.

    Many of these solutions use wifi rather than 2G, but I can understand that the shutdown of that network type would put a damper on things. I just don’t see that M2M is at risk at this point, or that application providers aren’t servicing the market.

    1. Chris, thanks for the comment; I agree with a lot of what you pointed out. No question there are enterprises out there finding their way to deployment in months rather than years, but I also think days or hours is where the industry needs to get to, and I believe we are on our way. Also, solutions like wifi for fixed modules will be an option for many, and a significant piece of the puzzle, but for many solutions a cellular connection will offer both superior reliability and mobility.

      1. Thanks, Landon. Not that I disagreed much with the piece; it just seemed one-sided on the 2G part.

  3. Why not transition to a satellite provider of M2M like Orbcomm? They are launching a new generation of satellites that will dramatically improve latency and remain backward compatible with the old generation.

    Some applications will still require near-instant communication, like OnStar from GM, but for the vast majority of applications the latency Orbcomm offers will be good enough.

  4. Joshua Goldbard Saturday, November 10, 2012

    Sunsetting 2G networks isn’t going to slow down the Internet of Things.

    It’s articles like this that force carrier networks to support things like X.25, ATM and Frame Relay long after they should be taken off the market.

    The problem is that most M2M solutions are designed in a way that is proprietary and only applicable to one situation. M2M services that leverage public APIs are far more successful and long-lasting.

    Mashups are the future of technology, and the classic examples of M2M applications just won’t work anymore. If you want great machine data exchange, build an API (see the sketch below). That’s how we approached the problem at 2600hz, along with community input on what our users wanted in a platform.

    Disclaimer: I’m the community manager for http://2600hz.com
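    As a minimal sketch of that idea (purely hypothetical, and not 2600hz’s actual platform): a device POSTs JSON readings to a small public HTTP endpoint, and any other service can read them back.

    ```python
    # Hypothetical, minimal public M2M API: devices POST JSON readings,
    # any consumer GETs them back. Illustrative only; not 2600hz's API.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    READINGS = []  # in-memory store; a real service would persist this

    class M2MApi(BaseHTTPRequestHandler):
        def do_POST(self):  # a device pushes one JSON reading
            length = int(self.headers.get("Content-Length", 0))
            READINGS.append(json.loads(self.rfile.read(length)))
            self.send_response(201)
            self.end_headers()

        def do_GET(self):   # any consumer pulls the collected readings
            body = json.dumps(READINGS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("", 8080), M2MApi).serve_forever()
    ```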

  5. Please edit the last sentence of the third paragraph.

  6. All very valid comments, and thank you for sharing. That said, this entire discussion around 2G, 3G, LTE, Wi-Fi and satellite ultimately leads to a typical “I wouldn’t start from here” response.

    There are just too many significant, show-stopping problems with all of those foundations for successful M2M adoption going forward. All are too expensive, for a start. As soon as you start talking about a terminal market estimated at anywhere between 24 billion and 50 billion connections by 2020, cost becomes absolutely critical. Both spectrum (cellular) and chipset costs are too high, and economies of scale can only push them down so far. Terminals with sophisticated intelligence built in cost money and consume power. It’s impossible to forecast with any certainty what proportion of projected M2M applications will use battery-powered terminals, but it is likely to be significant, and putting the heavy-lifting processing at that end of the link is not practical: processors eat batteries.

    Cellular networks also have finite capacity; adding another 50 billion connections is not practical. And cellular and Wi-Fi protocols were developed for a different application, one they are particularly well suited to, but that wasn’t the low-bandwidth, small-packet, variable-latency profile that suits most M2M applications. It is entirely the wrong vehicle for the job.

    If you were to sit down with a blank sheet of paper, you absolutely would not design an M2M application around a legacy architecture like cellular. You might very well come up with a standard that is optimised in every way: spectrum cost (free-of-charge TVWS spectrum), chipset cost (processing at the base-station end of the link, in the cloud), power consumption (a lightweight protocol and data-packet architecture) and signal propagation (TVWS provides significantly better range, in-building penetration and terrain tolerance than 900MHz+ spectrum). Reminding you of my interest, the Weightless SIG has taken the blank-sheet-of-paper approach.

  7. 3 years and $25 million is the threshold for dooming the Internet of Things? Wow. Either the author hasn’t been around the world of technology very long (in which case I would strongly encourage an academic study of the history of computing, the challenges previously faced, adoption rates, amounts spent adjusted for inflation, etc.), or he is a sensationalist (in which case I would ask for less information pollution, please, because 3 years and $25 million as a barrier is a joke). Come on Gigaom, you guys have better stuff than this.

    1. I second GetReal.

      In my opinion, nothing can stop the *Internet of Things now EXCEPT a complete breakdown of the system.
      (* or call it whatever at this point, because we will hear many titles for it in 2013; industrial-age thinkers and pundits thrive on defining and demystifying the obvious)

      We will communicate, machines will converse with us and with each other, the nervous system will continue to expand, and so on. Don’t assume technology is a constant.

    2. Nicholas Paredes Monday, November 12, 2012

      It’s adverstuff… Sitting on one’s hands will simply hand the space to AWS or Google, so you can imagine how that will play with VCs looking for opportunities.

      The cellular infrastructure in the early 2000s was messed up! Today is like a mecca of intelligent decision-making by comparison.

  8. M2M is already changing the way we live. In architecture and building science, lighting controls are being developed that detect your iPhone nearby and adjust light levels to your preferences. This is exciting, and it will happen. The question is, will we waste billions of dollars and tons of natural resources developing these technologies in an unregulated free market in which (initially) only the wealthy have access to them?

  9. Also, will we go back and upgrade failing technologies instead of throwing them away and starting with new development? Politics, people.

  10. About the loss of 2G: I understand that sensor nodes communicate over an RF link to a gateway that then sends data to a central server. So if the gateway only supports 2G, it could be replaced with a 3G gateway, or updated with a 3G dongle, which means all the sensors can still be used, as the sketch below illustrates.
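    A rough sketch of that separation of concerns (the names and classes here are hypothetical, for illustration only): the sensor-facing side of the gateway never changes; only the WAN uplink module is swapped when 2G goes away.

    ```python
    # Hypothetical sketch: swap the gateway's WAN uplink (2G -> 3G) while
    # the sensor-facing side and forwarding logic stay untouched.
    from typing import Protocol

    class Uplink(Protocol):
        def send(self, payload: bytes) -> None: ...

    class TwoGModem:
        def send(self, payload: bytes) -> None:
            print(f"2G uplink: sent {len(payload)} bytes")

    class ThreeGModem:
        def send(self, payload: bytes) -> None:
            print(f"3G uplink: sent {len(payload)} bytes")

    class Gateway:
        """Collects local RF sensor readings and forwards them upstream."""
        def __init__(self, uplink: Uplink) -> None:
            self.uplink = uplink

        def forward(self, sensor_reading: bytes) -> None:
            self.uplink.send(sensor_reading)

    # When the 2G network sunsets, only the uplink module is replaced:
    gateway = Gateway(TwoGModem())
    gateway.forward(b"temp=21.5")
    gateway = Gateway(ThreeGModem())  # sensors and gateway logic unchanged
    gateway.forward(b"temp=21.5")
    ```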

