Streaming HD video may be clogging up the last mile in homes, but in an enterprise setting, it’s not Vin Diesel flicks that are the problem — it’s larger and more important data being stored in the cloud. Medical records containing radiographic scans or genomic data for cancer research are transferred from corporate offices and university connections over the long-haul network. These records can comprise terabytes of data, which need to travel to cloud storage vendors. Each terabyte contains the equivalent of 100 HD movies at 10 GB each. This massive data migration could drive the deployment of faster broadband networks that will benefit everyone.
Enterprise last-mile networks generally use faster, dedicated connections than those in our homes. The most common corporate link to the outside world, a T-1 line, offers speeds of 1.5 Mbps and can move at most about 15 GB of information per day. According to Geoff Tudor, founder and senior VP of business development and product strategy at cloud storage company Nirvanix, if one assumes a corporate employee generates 3-5 MB of data per day, then once you get over 300 employees sending their files to offsite cloud storage for backup, the T-1 is tapped out. Even over one of the fastest telecommunications options, an OC-48 line running at about 2.5 Gbps, it still takes about an hour to send 1 TB of data.
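The figures above are easy to check with back-of-envelope arithmetic. Here's a minimal sketch in Python, treating the T-1 as a flat 1.5 Mbps and ignoring protocol overhead (which is why the raw number comes out a bit above the ~15 GB/day quoted):

```python
def max_daily_transfer_gb(link_mbps: float) -> float:
    """Approximate maximum data (in GB) a link can move in 24 hours."""
    bits_per_day = link_mbps * 1e6 * 86_400  # link rate * seconds per day
    return bits_per_day / 8 / 1e9            # bits -> bytes -> GB

def transfer_time_hours(data_tb: float, link_gbps: float) -> float:
    """Hours needed to move data_tb terabytes over a link_gbps link."""
    bits = data_tb * 8e12                    # TB -> bits
    return bits / (link_gbps * 1e9) / 3600   # seconds -> hours

print(max_daily_transfer_gb(1.5))   # T-1: 16.2 GB/day before overhead
print(transfer_time_hours(1, 2.5))  # OC-48: ~0.89 hours (~53 minutes) per TB
```

At 3-5 MB per employee per day, 300 employees generate roughly 1-1.5 GB of backup traffic, which on its own is a modest share of a T-1's daily ceiling; the line is "tapped out" in practice because backups compete with all of the office's other traffic on the same pipe.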
After it gets through the last mile, data travels over the long-haul networks crisscrossing the country, which are currently being upgraded from 10 Gbps to 40 Gbps. Even the slower long-haul networks can be 4 to 60 times faster than last-mile connections, but that's still not fast enough for the far more demanding data sets of scientific computing. Jay R. Boisseau, director of the Texas Advanced Computing Center (home to the Ranger supercomputer), is worried that high-performance computing, which deals with petabytes of data, will be left in the slow lane as providers upgrade their long-haul networks with an eye toward the less demanding consumer and enterprise bandwidth needs. When I asked about the move from 10 Gbps to 40 Gbps on long-haul networks, Boisseau scoffed, "Great, now it will take me one day instead of four to move my data sets."
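Boisseau's quip checks out: quadrupling the link rate cuts transfer time by exactly a factor of four, no more. A quick sketch (the ~432 TB data-set size is inferred from his "four days at 10 Gbps" remark, not a figure given here):

```python
def transfer_days(data_tb: float, link_gbps: float) -> float:
    """Days needed to move data_tb terabytes over a link_gbps link."""
    return data_tb * 8e12 / (link_gbps * 1e9) / 86_400

# A data set that takes four days at 10 Gbps works out to ~432 TB:
dataset_tb = 4 * 86_400 * 10e9 / 8e12

print(transfer_days(dataset_tb, 10))  # → 4.0 (days at today's 10 Gbps)
print(transfer_days(dataset_tb, 40))  # → 1.0 (days after the 40 Gbps upgrade)
```

A full petabyte over even the upgraded 40 Gbps backbone would still take more than two days, which is why the HPC community sees the consumer-driven upgrade cycle as inadequate.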
But enterprise adoption of all things cloud may have a silver lining for Boisseau and the HPC set, as enterprises start sending their own terabytes of data to cloud storage providers. Recently Nirvanix won a contract to store 240 TB of NASA moon imagery data, and Tudor thinks that's just the beginning of a trend toward terabyte and even petabyte data transfers. Nirvanix has a 1 Gbps connection from its data centers to the web, which Tudor says is kept pretty full even before NASA's bits and bytes start coming in. The company is now in talks with carriers to provide cloud storage and build out bandwidth to meet the coming demands of sending fat files over the Net.
Getting carriers involved would do more than put the bandwidth providers in the game: companies such as AT&T, Level 3 and Verizon have the trust of corporate customers when it comes to storing data securely and reliably, and that trust will help enterprises embrace clouds for data storage. Verizon is even trying to get a company to build super-fast long-haul network equipment to boost bandwidth. For enterprises, the next issue is price. Tudor estimates that with corporate data growing at 30-60 percent a year, storage is going to become too expensive for enterprises to keep in house. If that happens, bandwidth providers will suddenly have the demand, and a customer willing to pay, for lightning-fast pipes. When this happens, science, storage clouds and even your own web-surfing experience could benefit.
This article also appeared on BusinessWeek.com.