
Summary:

A Huntsville, Ala., company is moving from the machine-to-machine world into cloud platforms and big data. Here’s how it did it and how it thinks its work could actually end up saving lives.

Every year, nearly 2 million people contract infections while staying at the hospital, and inadequate handwashing is a big cause. It turns out we might be able to help fix the handwashing problem with help from intelligent sensors, and a Huntsville, Ala.-based sensor-network company called Synapse Wireless is working on just such a system for its hospital-industry customers. However, the trick to pulling it off isn’t just having the right sensors, it’s also having the right tools in place behind the scenes, on the servers that act as the sensors’ brains and tell them what to do.

Essentially, Synapse’s system is like a real-time monitor, reminding people to wash their hands when there’s not a malpractice lawyer hanging around to do it for them. A very simple implementation might go like this: When a nurse enters a room, a sensor on the name badge will send that information to a server, which will relay it to a sensor on the soap dispenser. If the nurse doesn’t wash his or her hands within, say, 30 seconds, the soap dispenser tells the server, which then sends an alert to the nurse’s badge.
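That server-side logic can be sketched as a small state machine: a room entry starts a countdown, a soap-dispense event clears it, and anything still pending past the window triggers an alert. The sketch below is purely illustrative, assuming a 30-second window; the class and method names (`HygieneMonitor`, `room_entry`, and so on) are hypothetical, not Synapse’s actual API.

```python
from dataclasses import dataclass, field

ALERT_WINDOW_SECONDS = 30  # assumed window; the article says "say, 30 seconds"

@dataclass
class HygieneMonitor:
    # maps badge id -> timestamp of a room entry awaiting a handwash event
    pending: dict = field(default_factory=dict)
    alerts: list = field(default_factory=list)

    def room_entry(self, badge_id: str, ts: float) -> None:
        # Badge sensor reports a room entry; start the countdown.
        self.pending[badge_id] = ts

    def soap_dispensed(self, badge_id: str, ts: float) -> None:
        # Dispenser reports a wash; clear the pending entry.
        entered = self.pending.pop(badge_id, None)
        if entered is not None and ts - entered > ALERT_WINDOW_SECONDS:
            self.alerts.append(badge_id)  # washed, but after the window

    def tick(self, now: float) -> None:
        # Server-side sweep: alert any badge still pending past the window.
        for badge_id, entered in list(self.pending.items()):
            if now - entered > ALERT_WINDOW_SECONDS:
                self.alerts.append(badge_id)
                del self.pending[badge_id]

monitor = HygieneMonitor()
monitor.room_entry("nurse-42", ts=0.0)
monitor.tick(now=10.0)                    # within the window: no alert
monitor.soap_dispensed("nurse-42", ts=12.0)

monitor.room_entry("nurse-7", ts=100.0)
monitor.tick(now=140.0)                   # 40 seconds elapsed: alert fires
print(monitor.alerts)                     # ['nurse-7']
```

In a real deployment the timestamps would come from sensor messages and the sweep would run continuously, but the event-in, timer, alert-out shape is the same.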

“We’re actually triggering events in real time,” Synapse’s cloud platform lead Bryan Stone explained to me.

It sounds simple enough, but it’s not so easy to make these alerts happen in real time and to build a platform that can handle this for many customers instead of trying to manage a bunch of individual implementations.

And the system can’t just delete those events and any metadata once they’ve generated alerts. If there’s one thing everyone should have figured out by now, it’s that — assuming you know how to analyze it — there’s gold in the mountains of data we generate. Maybe a hospital needs to investigate the cause of a fatal hospital-acquired infection, or maybe an administrator wants to study patterns of handwashing to determine what other factors might play into how closely its staff adhere to guidelines.

To the cloud … and the big data systems

So, Stone and his team turned to the web for answers. Now, Synapse, which previously worked in embedded processors and server applications, is building out a cloud platform that’s designed to handle data processing and analysis however and whenever it needs to happen, across a variety of applications. Stone said the handwashing use case is just one of about 50 it has identified in health care alone.

“We’re building a platform not for one industry,” he explained. “We’re building a platform that’s about data processing on a single system.”

Like Synapse’s, the Netflix data architecture is about online, nearline and offline processing.

Stone said the company started pretty much from scratch so it didn’t need to bolt new tools onto old infrastructure or re-engineer a stream-processing engine like Storm to work with some old-school database system. What the team ended up building actually looks a lot like the platforms of the web companies that created many of the technologies Synapse is using. There’s Storm for stream/real-time processing, Hadoop for batch/offline processing, Cassandra for a low-latency data store and Kafka for a distributed message queue. There are various proprietary products integrated, as well, including Pentaho’s software for transforming Storm data and sending it to analytic software in real time.
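The pattern behind that stack — every event lands on a durable log for batch/offline processing while a streaming path reacts to it immediately — can be illustrated with a minimal in-memory sketch. This is an assumption-laden stand-in, not Synapse’s or Netflix’s actual code: the `EventPipeline` class and event types here are invented, with a plain list standing in for the Kafka-to-Hadoop path and a simple conditional standing in for Storm-style stream processing.

```python
import json

class EventPipeline:
    def __init__(self):
        self.batch_log = []  # stand-in for the Kafka -> Hadoop offline path
        self.alerts = []     # stand-in for real-time (Storm-style) output

    def publish(self, event: dict) -> None:
        # Every event is durably logged for later batch analysis...
        self.batch_log.append(json.dumps(event))
        # ...and simultaneously evaluated on the streaming path.
        if event.get("type") == "wash_timeout":
            self.alerts.append(event["badge_id"])

    def offline_report(self) -> dict:
        # Batch job: count events by type over the full retained history,
        # the kind of analysis a hospital administrator might run later.
        counts = {}
        for line in self.batch_log:
            etype = json.loads(line)["type"]
            counts[etype] = counts.get(etype, 0) + 1
        return counts

pipeline = EventPipeline()
pipeline.publish({"type": "room_entry", "badge_id": "nurse-42"})
pipeline.publish({"type": "wash_timeout", "badge_id": "nurse-42"})
print(pipeline.alerts)            # ['nurse-42']
print(pipeline.offline_report())  # {'room_entry': 1, 'wash_timeout': 1}
```

The point of the split is that the real-time path stays fast because it does only what an alert needs, while the log keeps everything — which is why, as noted above, the system can’t just delete events once alerts have fired.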

“There are amazing possibilities the more you take advantage of technologies in all these areas,” Stone said.

Doing it all with local talent

The Synapse platform also proves you can build a platform like this without a bunch of Silicon Valley-trained engineers — a point that should be driven home during our Structure Data conference later this month in New York. Yes, the Googles, Airbnbs and LinkedIns of the world will be represented, but there will also be folks from companies such as Ford, MetLife and McLaren talking about how they built advanced data architectures far removed from the West Coast.

Stone himself came from a biotech company across the street in Huntsville, where he had taught himself how to build a genetics platform on Amazon Web Services but hadn’t worked much with core Hadoop technologies such as MapReduce. The rest of the team fits into roughly the same boat — folks who were familiar with similar technologies but have what Stone calls a “hobbyist attitude.”

That’s partially by design and partially because there aren’t a lot of NoSQL, big data or distributed systems engineers running around Alabama. But, Stone said, his team understood the general paradigms of building distributed systems and was able to research the best tools for the job. “With the right mindset and passion,” he said, “… we were able to come in with that mindset of creating something new that pushes the boundaries.”

Feature image courtesy of Shutterstock user Andrey_Popov.

  1. Reblogged this on BenchTime and commented:
    While healthcare is the primary example in this post, the concept and vision for what we can do with IoT data that we generate is where the opportunities are. As IT and Enterprise Architects, this is where we have the opportunity to shine and evolve by understanding the processes and building the architectures to support them. Creative problem solving through technology at its best. These are exciting times for those of us in healthcare technology architecture!

  2. I agree with BenchTime wholeheartedly. It seems the opportunities and possibilities are genuinely endless in this space. It’s an exciting time all the way around. I just hope that the data can flow freely and remain accessible on multiple levels rather than being hoarded by the biggest players.

