Meet Node-RED, an IBM project that fulfills the internet of things’ missing link


If you play around with enough connected devices or hang out with enough people thinking about what it means to have 200 connected gizmos in your home, eventually you get to a pretty big elephant in the room: How the heck are you going to connect all this stuff? To a hub? To the internet? To each other?

It’s one thing to program your lights/thermostat/whatever to go to a specific setting when you hit a button/lock your door/exit your home’s Wi-Fi network, but it’s quite another to have a truly intuitive and contextual experience in a connected home if you have to program it manually using IFTTT or a series of apps. Imagine if instead of popping a couple of Hue light bulbs into your bedroom lamp, you brought home 30 or 40 for your entire home. That’s a lot of adding and setting preferences.

Organic programming: Just let it go

If you take this out of the residential setting and into a factory or office, the problem is magnified and even more daunting because of the administrative tasks and permissions required. Luckily, several people are thinking about this problem. Mike Kuniavsky, a principal in the innovation services group at PARC, first introduced me to this concept back in February and will likely touch on it at our Mobilize conference next month. He likens it to a more organic way of programming.

The basic idea is to program the internet of things much like you play a Sims-style video game — you set things up to perform in a way you think will work and then see what happens. Instead of programming an action, you’re programming behaviors and trends in a device or class of devices. Then you put them together, give them a direction and they figure out how to get there.

Over at IBM, a few engineers are building something that might help implement such systems. It’s called Node-RED, and it’s a way to interject a layer of behaviors for devices using a visual interface. It’s built on top of Node.js and is available on GitHub.
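To make that concrete: Node-RED stores a flow as JSON, an array of node objects wired together, and the visual editor is essentially a front end for building that array. Here is a hypothetical three-node flow sketched as a JavaScript object (the IDs, topic names and function body are invented for illustration, not taken from a real flow):

```javascript
// A hypothetical Node-RED-style flow: read an MQTT topic, transform the
// payload in a function node, and publish the result to another topic.
const flow = [
  { id: "in1",  type: "mqtt in",  topic: "home/bedroom/light", wires: [["fn1"]] },
  { id: "fn1",  type: "function",
    // Function nodes hold a snippet of JavaScript that rewrites msg.payload.
    func: "msg.payload = { brightness: Number(msg.payload) }; return msg;",
    wires: [["out1"]] },
  { id: "out1", type: "mqtt out", topic: "home/bedroom/light/set", wires: [] }
];

// Walk the wires to list the order messages travel through the flow.
function path(flow, startId) {
  const byId = Object.fromEntries(flow.map(n => [n.id, n]));
  const order = [];
  let node = byId[startId];
  while (node) {
    order.push(node.type);
    const next = node.wires[0] && node.wires[0][0];
    node = next ? byId[next] : null;
  }
  return order;
}

console.log(path(flow, "in1").join(" -> ")); // mqtt in -> function -> mqtt out
```

The point of the format is that the wiring, not the per-device code, carries most of the program’s meaning — which is what makes a visual editor a natural fit.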

Node-RED and how it came to be

The idea behind the Node-RED effort came from playing around with connected devices and the work it took to make them interoperate. The engineers behind the code — Nicholas O’Leary, Dave Conway-Jones and Andy Stanford-Clark — are also working with IBM’s MQTT messaging protocol. But with Node-RED they aren’t focused on how devices talk to each other, but on how they work together.

“The first version of node-RED was all about MQTT and how can we move messages between different topics and do it in a really lightweight way,” said O’Leary. But eventually it became more about a way to tell devices what you’d like them to do, as opposed to having to tell each of them how to do it, added Conway-Jones.


One of the key elements is the visual interface, which helps programmers see how things relate and interact. You can drag new items into a community of related nodes or change the characteristics from the tool. If you imagine a network of connected machines in a factory this is a far more efficient way to think about programming sensors than dealing with each one individually.

For example, a temperature sensor will likely need different parameters depending on the machine it monitors. But instead of manually inputting the parameters for each sensor, you could have the machine it’s associated with tell the sensor its parameters and have the sensor respond accordingly. All the programmer has to do is move the sensor and describe its function to the community of nodes associated with that machine; Node-RED and the characteristics associated with the machine do the rest.
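The machine-tells-the-sensor pattern above can be sketched in a few lines — the machine name, limits and API here are all invented for illustration, not part of Node-RED:

```javascript
// The machine node carries its own operating limits.
const machine = { name: "press-7", limits: { minC: 10, maxC: 85 } };

// A sensor ships with no configuration; it inherits limits on attachment.
function makeSensor() {
  return {
    limits: null,
    attach(m) { this.limits = m.limits; },        // "move it into the group"
    check(tempC) {                                 // apply the inherited limits
      if (!this.limits) throw new Error("sensor not attached");
      return tempC >= this.limits.minC && tempC <= this.limits.maxC
        ? "ok" : "alarm";
    }
  };
}

const sensor = makeSensor();
sensor.attach(machine);
console.log(sensor.check(70)); // ok
console.log(sensor.check(95)); // alarm
```

Swapping the sensor onto a different machine changes its behavior without touching the sensor’s own code — which is the whole point of programming the group rather than the device.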

Applying this organic programming model elsewhere

This is a profound shift in the way of thinking about development, but it’s not just something we’ll see in the internet of things. As enterprise software becomes less about individual applications and more about a collection of federated services called up via an API, this way of thinking and these types of visual interfaces will become more common.

Temboo’s Twyla representation.

For example, Temboo, a company that makes a cloud service that helps link APIs for big companies, also has a cool visual interface and a method of generating this type of organic programming. Temboo calls it Twyla, and the programs it writes using this are called choreos — short for choreography. It has over 2,000 choreos and adds a few more weekly.

Temboo is also the service that connects the Arduino Yun to hundreds of APIs. This lets the relatively dumb controller link to the internet and borrow its smarts from the cloud. Devices hooked into Temboo’s cloud can run a few lines of code that essentially tell the cloud-based servers what action to perform. In this way, a dumb microcontroller can perform more complicated tasks without getting weighed down with code.
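The thin-device pattern is simple to sketch: the device holds only a tiny request descriptor, and the heavy logic lives server-side. The choreo name, fields and stubbed result below are hypothetical, not Temboo’s actual wire format:

```javascript
// Server side: the real logic, which the device never sees.
const cloudChoreos = {
  "Weather.GetTemperature": ({ city }) => ({ city, tempC: 21 }) // stubbed result
};

// Device side: everything a constrained board must hold is a few lines
// that name an action and its inputs.
function deviceRequest(choreo, inputs) {
  return JSON.stringify({ choreo, inputs });
}

// Cloud side: parse the descriptor and run the named choreo.
function cloudRun(request) {
  const { choreo, inputs } = JSON.parse(request);
  return cloudChoreos[choreo](inputs);
}

const result = cloudRun(deviceRequest("Weather.GetTemperature", { city: "Austin" }));
console.log(result.tempC); // 21
```

Because the device only serializes a name and some inputs, upgrading what the action actually does is a server-side change — no firmware update required.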

The common thread behind Node-RED and Temboo is that as we’re connecting thousands of devices and building interconnected services we will be forced to think differently: about how to deploy dumb devices and about how to program thousands of devices while keeping the resulting program flexible enough for change.


Muhammad Khatree

This is a bit like artificial life, where related objects are placed into a common space where they can interact, but they are not told specifically what to do. Rather, they are given general guidelines/parameters to follow, and from this their natural group behaviour emerges. For example, this one is about the flocking behaviour of birds, but instead of telling each bird to “flock” with the rest, they each have been told “don’t fly too close to other birds, fly in the general direction of other birds, stay close to nearby birds” … and as you’ll see, the flocking behaviour emerges.

Samuel A. Falvo II

How is this different from something like LabVIEW? Is it merely a price issue?

Stacey Higginbotham

Hmm, not exactly sure, although NI is getting into this management space for connected sensors. LabVIEW is pricey, but I’ll have to hope someone familiar with both projects can explain.
