Low-latency networks aren’t just for Wall Street anymore


When it comes to broadband, we spend a lot of time talking about how much data we can transfer in a set amount of time (measured in Mbps). But what about latency? Applications from video chat to gaming need networks where packets reach their destination quickly, which requires a different kind of network speed.

Plus, if we’re going to stream everything from video to music alongside our latency-sensitive apps, we’re going to have to add some metrics to our definition of broadband and some intelligence to our home networks. The question is how carriers can respond to both the threat and the opportunity this development offers, and what other companies might play in this space.

What is latency and when does it matter?

Currently, consumers think about their broadband connections in terms of how fast they are. But delivering a 30 Mbps connection means only that the pipe is fat enough to deliver 30 megabits every second. That matters greatly when you’re trying to download a fat file, but for real-time applications a different kind of speed matters. Latency measures how quickly packets get from the originating server to your device, and it is measured in milliseconds.
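To make the distinction concrete, here is a minimal back-of-the-envelope sketch. The file sizes and the 25ms round-trip figure are illustrative assumptions, not measurements from the article:

```python
# Throughput tells you how big the pipe is; latency tells you how long
# one round trip takes. A rough comparison of the two effects:

MBPS = 30          # advertised downstream throughput (from the article)
LATENCY_MS = 25    # an assumed "good" home-network round trip

def transfer_time_ms(size_bytes, mbps=MBPS, latency_ms=LATENCY_MS):
    """Rough time to fetch one object: one round trip plus time on the wire."""
    serialization_ms = (size_bytes * 8) / (mbps * 1_000_000) * 1000
    return latency_ms + serialization_ms

# A 700 MB movie download is dominated by throughput (~3.3 minutes)...
movie_ms = transfer_time_ms(700 * 1024 * 1024)

# ...while a 2 KB game or video-chat packet is dominated by latency:
# nearly all of its ~25.5ms is the round trip, not the transfer.
packet_ms = transfer_time_ms(2 * 1024)
```

For the big file, latency is noise; for the small packet, it is almost the entire wait, which is why a fatter pipe alone does nothing for gaming or video chat.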

The average home network ideally has a latency of less than 25ms, but everything from distance to the number of gateways between your device and the originating server to network congestion can affect latency on your home network. And the effects are noticeable in more of the applications we use today. Gaming has always been a popular example, since waiting even a few extra milliseconds to dodge a bullet or throw a punch can mean the difference between progressing to the next level and starting over.


But now most people are engaged in video chats on both home and mobile networks, and nothing introduces a user to the concept of latency better than paying for a 30 Mbps connection and still dealing with pixelated or choppy video chats. Latency also adds to the wait time when downloading objects and scripts from a web site, making load times feel sluggish. A StrangeLoop survey this month found that latency at the desktop was between 65ms and 145ms. An entire industry has been built around front-end optimization to help reduce such performance issues, and many of those startups have already been acquired, as in Akamai’s acquisition of Blaze.io.

On mobile networks it’s worse, even with next-generation LTE networks coming online. Michael Thelander, the president of Signals Research, said Verizon and AT&T had network latencies of greater than 40ms, about 20-30ms longer than some of their European peers. Between video chat on mobile devices and our general impatience when using those networks (say, trying to find a restaurant while walking toward it), improving latency on mobile is a real issue too.

“The importance of latency can’t be overlooked for applications, such as social media and web browsing, that involve small transfers of data. As part of my testing I basically proved that higher throughput becomes irrelevant at a certain point…. However, latency can never be too low, and in that regard many of the US operators have their work cut out for them,” he said via email.

Who ya gonna call?

The issue of latency is known in the industry — the FCC has even asked whether the traditional Mbps is the best metric to stick with, or whether latency and other elements should matter too. But as consumers spend more time in real-time applications, latency will become a more noticeable problem.

“You can’t exactly call up a carrier and ask them to improve your latency,” said Sean McCann of Killer Technologies, a Qualcomm Atheros company that makes latency-reducing network cards for gamers. Short of putting in new fiber or dedicating network resources, this isn’t an easy problem, he added. As a side note, banks and other firms engaged in high-frequency trading DO pay a lot for dedicated low-latency networks. They also place their IT gear next to the trading floors to reduce the latency caused by distance.

But your average consumer doesn’t have or need that level of dedicated bandwidth, and couldn’t afford it anyway. That doesn’t mean we should keep paying for fatter pipes in the home and still put up with bad gaming or video experiences. Simple fixes like easier home-network management software will help. Killer Technologies is working on the problem, although McCann couldn’t go into much detail about it. Alcatel-Lucent also sells carriers a service that can be used to help manage the home network.

Such software can help consumers prioritize network traffic in their own homes, and eventually it could proactively inspect traffic and give some of it more room on the pipe. A software update downloading to your computer can wait, for example, while packets from Netflix have to hit the box in a hurry. Right now it’s a free-for-all on most networks, because nothing tells your router that the software update can cool its heels for a millisecond while the Netflix packet needs to reach your Wii right away, lest your stream get downgraded to standard definition.
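None of these products disclose their internals, but the core idea of class-based prioritization can be sketched with a simple priority queue. The traffic classes and packet labels below are hypothetical, chosen only to mirror the Netflix-versus-software-update example:

```python
import heapq
from itertools import count

# Hypothetical traffic classes: lower number = higher priority.
PRIORITY = {"video": 0, "voip": 0, "web": 1, "update": 2}

class HomeQosQueue:
    """Toy scheduler: latency-sensitive packets jump ahead of bulk traffic."""

    def __init__(self):
        self._heap = []
        self._seq = count()  # FIFO tie-breaker within a traffic class

    def enqueue(self, traffic_class, packet):
        heapq.heappush(self._heap,
                       (PRIORITY[traffic_class], next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

q = HomeQosQueue()
q.enqueue("update", "os-patch-chunk-1")   # bulk: arrived first, can wait
q.enqueue("video", "netflix-frame-42")    # latency-sensitive
q.enqueue("web", "hulu-thumbnail")

# The video frame is dequeued first even though the update arrived earlier.
```

A real router would do this per-packet at line rate (and often deeper in the stack, e.g. via DSCP marking), but the ordering decision is the same: class first, arrival order second.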

The question is how consumers will get this type of software. Will you visit a carrier web site to prioritize your traffic as it hits your ISP’s modem? Will consumers buy a box that plugs into the router with a simple UI that lets them press “video” to give their Hulu streams a speed boost? Perhaps router makers will just bake something into their products and let people manage it from a web site.

Either way, devices in the home are becoming more numerous and chattier while the services and applications we’re consuming are more sensitive to packet delays. You don’t have to be a network engineer to understand that something has to be done.
