Boundary, the real-time network monitoring service, has expanded its product portfolio to include alerts for clients when their online services behave abnormally. Boundary has already touted its ability to see cloud outages a few hours before they happen, based on the network data it gathers from customers hosted on Amazon's (s amzn) or Microsoft's (s msft) Azure clouds, but this new service targets individual customers.
The idea is that Boundary has analyzed the performance and habits of a customer's network, so when that pattern changes, Boundary can send an alert. The company is ingesting 350 Mbps of inbound networking traffic from its 550 customers, which it then processes in real time. Boundary has its own custom-built, real-time data processing engine that takes this influx of second-by-second data and combines it into dozens or hundreds of terabits per second of traffic for analysis. A Boundary spokesman notes that this current rate represents a fraction of the startup's total processing capacity.
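The pattern-change alerting described here can be sketched in a few lines: learn a baseline from recent samples of a metric (say, flows per second), then flag samples that deviate far from it. This is a deliberately simplified stand-in for whatever detection Boundary actually runs, which the article doesn't detail; the threshold and metric are illustrative assumptions.

```python
import statistics

def is_anomalous(history, current, k=3.0):
    """Flag a metric sample that deviates more than k standard
    deviations from its learned baseline -- a simplified stand-in
    for the pattern-change detection described above."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # guard against flat baselines
    return abs(current - mean) > k * stdev

# Hypothetical flows-per-second samples from recent windows:
baseline = [980, 1010, 1005, 995, 1002, 990]
print(is_anomalous(baseline, 1004))  # → False (within normal variation)
print(is_anomalous(baseline, 2500))  # → True  (spike: send an alert)
```

At Boundary's scale this check would run continuously against hundreds of thousands of per-second event streams, which is why a purpose-built streaming engine matters more than the detection math itself.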
Indeed, both the product launch and the data flows are topics I've discussed with Boundary CEO Gary Read (pictured) in the past. We've outlined the merits of real-time network monitoring that can track multiple data points every second, and discussed what that means for the amount of data Boundary will have to process. With 550 customers, Boundary is analyzing 600,000 events every second, and the company is adding about 10 to 12 paid customers each month.
To analyze this data, servers monitored by Boundary report to its SaaS platform, sending up information about the network and application flows they see passing through. So the 600,000 events per second refer to the network flows that Boundary observes each second for its customers inside their networks, and as those customers communicate with their own users. As Boundary grows to observe more flows from data centers to end users and through mobile networks, it gains a broader picture of the health of cloud providers and the Internet as a whole. It's somewhat similar to what startup DeepField Networks is proposing, and is an essential step in creating accurate monitoring for federated applications built in cloud environments.
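The agent-to-SaaS reporting loop described above amounts to each monitored host summarizing the flows it saw in the last interval and shipping them to a collector. The sketch below builds such a per-interval payload; the field names and JSON shape are my own assumptions for illustration, not Boundary's actual wire format.

```python
import json
import time

def flow_report(flows, host_id):
    """Build the kind of per-interval summary an on-host agent might
    ship to a monitoring SaaS. `flows` is a list of
    (src, dst, protocol, bytes) tuples observed in the last interval.
    The payload shape here is illustrative, not Boundary's format."""
    return json.dumps({
        "host": host_id,
        "ts": int(time.time()),
        "flows": [
            {"src": src, "dst": dst, "proto": proto, "bytes": nbytes}
            for src, dst, proto, nbytes in flows
        ],
    })

payload = flow_report(
    [("10.0.0.5", "93.184.216.34", "tcp/443", 48213)], "web-01"
)
# An agent would POST a payload like this to the service's collector
# every second; the collector side aggregates across all customers.
print(payload)
```

Each such report is one or more of the "events" in the 600,000-per-second figure, which is why the collector side, not the agents, is where the hard engineering lives.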