
Streaming HD video may be clogging up the last mile in homes, but in an enterprise setting, it’s not Vin Diesel flicks that are the problem — it’s larger and more important data being stored in the cloud.  Medical records containing radiographic scans or genomic data for cancer research are transferred from corporate offices and university connections over the long-haul network. These records can comprise terabytes of data, which need to travel to cloud storage vendors. Each terabyte contains the equivalent of 100 HD movies at 10 GB each. This massive data migration could drive the deployment of faster broadband networks that will benefit everyone.

Enterprise last-mile networks generally involve faster, dedicated connections than those in our homes. The common corporate link to the outside world, a T-1 line, offers speeds of 1.5 Mbps and maxes out at about 16 GB of information per day. According to Geoff Tudor, founder and senior VP of business development and product strategy at cloud storage company Nirvanix, if one assumes a corporate employee generates 3-5 MB of data per day, once you get over 300 employees sending their files to offsite cloud storage for backup, the T-1 is tapped out. Even over one of the fastest telecommunications options, an OC-48 line with speeds of about 2.5 Gbps, it would still take about an hour to send 1 TB of data.
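
A quick back-of-the-envelope sketch in Python reproduces those figures (this assumes nominal line rates, full utilization and decimal units, so 1 GB = 1e9 bytes; real-world throughput would be lower once protocol overhead is factored in):

```python
# Back-of-the-envelope math for the last-mile links cited above.
# Assumptions: nominal line rates, 100% utilization, decimal units.

SECONDS_PER_DAY = 24 * 60 * 60

def daily_capacity_gb(rate_mbps: float) -> float:
    """Gigabytes a link can move in 24 hours at full utilization."""
    bits_per_day = rate_mbps * 1e6 * SECONDS_PER_DAY
    return bits_per_day / 8 / 1e9

def transfer_hours(data_tb: float, rate_gbps: float) -> float:
    """Hours to push a given number of terabytes over a link."""
    bits = data_tb * 1e12 * 8
    return bits / (rate_gbps * 1e9) / 3600

print(f"T-1 (1.5 Mbps): {daily_capacity_gb(1.5):.1f} GB/day")         # ~16.2
print(f"OC-48 (2.5 Gbps), 1 TB: {transfer_hours(1, 2.5):.1f} hours")  # ~0.9
```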

After it gets through the last mile, data travels over the long-haul networks crisscrossing the country, which are currently being upgraded from 10 Gbps to 40 Gbps. Even at 10 Gbps, a long-haul link is four times faster than an OC-48 and thousands of times faster than a T-1, but that's still not fast enough for the even more demanding data sets of scientific computing. Jay R. Boisseau, director of the Texas Advanced Computing Center (home to the Ranger supercomputer), is worried that high-performance computing, which deals with petabytes of data, will be left in the slow lane as providers upgrade their long-haul networks with an eye toward less demanding consumer and enterprise bandwidth needs. When I asked about the move from 10 Gbps to 40 Gbps on long-haul networks, Boisseau scoffed, "Great, now it will take me one day instead of four to move my data sets."
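
Scale the same arithmetic up to HPC-sized data sets and Boisseau's frustration is easy to reproduce (again a rough sketch assuming ideal utilization, decimal units and no protocol overhead):

```python
# Rough sketch of long-haul transfer times at the 10 Gbps and 40 Gbps
# rates discussed above. Assumptions: ideal utilization, 1 PB = 1e15 bytes.

SECONDS_PER_DAY = 24 * 60 * 60

def transfer_days(data_bytes: float, rate_gbps: float) -> float:
    """Days to move a data set over a long-haul link."""
    return data_bytes * 8 / (rate_gbps * 1e9) / SECONDS_PER_DAY

for rate_gbps in (10, 40):
    print(f"1 PB at {rate_gbps} Gbps: "
          f"{transfer_days(1e15, rate_gbps):.1f} days")
# 1 PB at 10 Gbps: 9.3 days; at 40 Gbps: 2.3 days, i.e. the same 4x
# speedup behind Boisseau's "one day instead of four."
```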

But enterprise adoption of all things cloud may have a silver lining for Boisseau and the HPC set, as enterprises start sending their own terabytes of data to cloud storage providers. Recently Nirvanix won a contract to store 240 TB of NASA moon imagery data, and Tudor thinks that's just the beginning of a trend toward terabyte and even petabyte data transfers. Nirvanix has a 1 Gbps connection from its data centers to the web, which Tudor says is kept pretty full, even before NASA's bits and bytes start coming in. The company is now in talks with carriers to provide cloud storage and build out bandwidth to address the network demands that sending fat files over the Net will create.

Getting carriers involved would do more than put the bandwidth providers' networks behind cloud storage; companies such as AT&T, Level 3 or Verizon also have the trust of corporate customers when it comes to storing data securely and reliably, and that trust will help enterprises embrace clouds for data storage. Verizon is even trying to get a company to build super-fast long-haul network equipment to boost bandwidth. For enterprises, the next issue is price. Tudor estimates that with corporate data growing at 30-60 percent a year, storage is going to become too expensive for enterprises to keep in house. If that happens, bandwidth providers will suddenly have the demand, and a customer who's willing to pay, for lightning-fast pipes. When that happens, science, storage clouds and even your own web-surfing experience could benefit.

This article also appeared on BusinessWeek.com.


  1. 1TB disk drive via FedEx will move the data in 24hr for $20. Why clog up everyone’s backbone?

  2. Is T-1 still the gold standard for business Internet, like it was in the ’90s? Please say it’s not so!

    1. sad but true…. unless folks like Talari convince businesses that Business DSL w/ 99.x% availability (<$100) is far better than a T-1 @ 99.xxx% availability ($375 and up around here), this is a big impediment to cloud-based storage + computing. the ‘network is a computer’ with pitifully slow lanes at the moment.

  3. Stacey,

    “Information growth rate will always exceed our ability to consume and transport that information.”

    Specific to enterprises and large organizations, let’s break this data explosion and the relevant cloud storage discussion into two buckets.

    #1 – Information that is deemed important for the organization/enterprise but not consumed on a real-time basis, e.g. NASA moon images.

    #2 – Information that is deemed important and consumed on a real-time basis, e.g. “project status on the latest green energy building.”

    On #1 – Upload speed is less of an issue, and I can see where the Nirvanixes of the world have an opportunity; the challenge there is keeping the information secure once it gets to the cloud. No cloud storage vendor offers an SLA guarantee, and they won’t because they don’t own the network pipes. Cloud security is a big issue here; also, most cloud storage architectures are not designed for petabyte storage.

    On #2 – Upload speed is a real issue; when enterprises want access to the information on demand, they should get it. Security, again, is an open question here.

    I sense that enterprises will find a hybrid premise-cloud model is the best approach. Why hybrid? Fast LAN access to real-time information and better control. We may see the creation of hybrid private clouds that enterprises can control but don’t have to own and operate.

    1. Stacey Higginbotham (in reply to RajR) Friday, April 3, 2009

      Raj, I agree with you. As for security, I think that getting carriers involved here will be key.

      1. Please no! They should just focus on moving the bits around the fastest at the lowest price possible. Carriers are incapable of delivering any VAS (value-added services).

  4. The GigaOM Weekly Recap: Saturday, April 4, 2009

    [...] Online storage will boost bandwidth demand. [...]

  5. Internet Marketing, Strategy & Technology Links – Apr 6, 2009 « Sazbean Monday, April 6, 2009

    [...] Cloud Storage Could Mean Fat Pipes For All (GigaOM) [...]

  6. Two Geeks and a Blog :: Geek News :: Quick Hits: Mar. 29 – Apr. 4 Tuesday, April 7, 2009

    [...] Cloud storage could mean fat pipes for all – GigaOM [...]

  7. Internet Will Eventually Be Remote Controlled « Jason Kinner Wednesday, April 8, 2009

    [...] why I’ve been intrigued by some of the stories I’ve been reading lately. GigaOm covers “fat pipes” in the cloud, which is already happening. From my work in the social technology space, I already know that one [...]

  8. Nirvanix Adds $5M to Keep Data in the Cloud Wednesday, April 15, 2009

    [...] VP of business development and product strategy with Nirvanix, outlined earlier this month in an article about bandwidth needs associated with storage in the cloud, it’s becoming much more expensive for companies to own their own storage facilities, [...]

  9. Storage Effect » Networks and storage: which is the chicken and which is the egg? Thursday, April 23, 2009

    [...] Networks drive storage growth.  The fatter the pipes, the bigger the content users can create and consume – from streamed HD movies to 240 TB of moon imagery data. [...]

  10. NetEx Targets Data Transfer Tech to the Cloud Friday, April 24, 2009

    [...] a moment too soon in bringing this technology to the cloud, where transfers of large data sets are becoming more and more prevalent. Backing up primary storage in the cloud can be a great option for protecting data, but it isn’t [...]
