
Summary:

The National Security Agency will have plenty of room to store information at its Utah data center — but not necessarily a yottabyte’s worth. Speculation continued this week with the release of site plans.

Forbes got hold of site plans for the National Security Agency data center going up outside Salt Lake City and asked people to speculate on the facility’s storage capacity. The resulting article, published Wednesday, is a fascinating read, although it repeats previously reported storage figures that don’t pan out, at least not yet.

The article throws out various figures. Brewster Kahle, founder of the Internet Archive, looked over the site plans and thought the place could hold 12 exabytes. Paul Vixie, founder and president of the Internet Systems Consortium, estimated a 3-exabyte capacity. Both numbers are substantially lower than those reported previously, which went into zettabyte and even yottabyte territory. But at least for now, there might not actually be a need to store data in the yottabyte range.
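
For a sense of how such back-of-envelope estimates get made, here is a minimal sketch in Python; the floor-space, rack and drive figures below are purely illustrative assumptions, not numbers taken from the site plans or from Kahle’s or Vixie’s math.

```python
# Rough, illustrative capacity estimate for a large data center.
# All inputs are hypothetical assumptions, not figures from the NSA site plans.

DATA_HALL_SQFT = 100_000   # assumed usable data-hall floor space
SQFT_PER_RACK = 30         # assumed footprint per rack, including aisle space
DRIVES_PER_RACK = 600      # assumed high-density storage rack
TB_PER_DRIVE = 4           # typical enterprise drive capacity in 2013

racks = DATA_HALL_SQFT // SQFT_PER_RACK
total_tb = racks * DRIVES_PER_RACK * TB_PER_DRIVE
total_eb = total_tb / 1_000_000  # 1 exabyte = 1,000,000 TB in decimal units

print(f"{racks:,} racks -> {total_eb:.1f} exabytes of raw capacity")
# With these assumptions: 3,333 racks -> 8.0 exabytes, i.e. exabyte scale,
# nowhere near a zettabyte (1,000 EB), let alone a yottabyte (1,000,000 EB).
```

Swap in different rack densities or drive sizes and the answer moves around, which is exactly why Kahle and Vixie land on different figures from the same plans.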

Want to store a yottabyte? Expect to pay trillions

An article on the Utah data center that ran in Wired last year suggested that the Pentagon wants to make the U.S. intelligence community’s Global Information Grid capable of dealing with yottabytes, attributing the claim to a 2007 report from the Department of Defense. But that doesn’t necessarily apply to any one NSA data center; it’s a target for the Department of Defense as a whole. A 2008 report supports this reading: “The target GIG supports capacities exceeding exabytes … and possibly yottabytes … of data.”

On top of that, the defense contractor Mitre showed in a 2008 report that it’s unlikely data input for the entire Department of Defense would hit a yottabyte even by 2015.

What’s more, it could cost trillions of dollars to store a yottabyte of data, perhaps upward of $50 trillion. A calculation in 2009 put the cost at $100 trillion just for hard drives. For the sake of context, U.S. gross domestic product last year was roughly $16 trillion.
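
To show how the math lands in that neighborhood, here is a rough sketch; the drive capacity and price are assumed 2013-era values, not figures from the 2009 calculation.

```python
# Back-of-envelope cost of storing one yottabyte on hard drives.
# Drive capacity and price are assumed values, purely for illustration.

YOTTABYTE_TB = 1_000_000_000_000  # 1 YB = 10^24 bytes = 10^12 TB (decimal units)
TB_PER_DRIVE = 4                  # assumed enterprise drive capacity
COST_PER_DRIVE = 400              # assumed price in U.S. dollars

drives_needed = YOTTABYTE_TB // TB_PER_DRIVE
cost = drives_needed * COST_PER_DRIVE

print(f"{drives_needed:,} drives, about ${cost / 1e12:,.0f} trillion")
# -> 250,000,000,000 drives, about $100 trillion -- and that is for the drives
# alone, before power, racks, networking, redundancy or replacement cycles.
```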

The Utah data center could play a key role in the operation of the NSA’s PRISM program once it opens at the end of September. The facility has drawn far more attention since PRISM was first revealed earlier this year, and it has since become a sort of symbol for the fears some companies have that data they store in certain clouds could get sucked up into the NSA’s hands. My colleague David Meyer will be digging into this subject with Jason Hoffman, the chief technology officer at Joyent, and other luminaries at our Structure:Europe conference in London on Sept. 18-19.

Will that be cold, flashy, or software-defined?

Setting aside PRISM and the yottabyte-or-not-yottabyte question, it’s interesting to wonder what the NSA might be installing in the Utah data center. The federal government is pushing to consolidate its data centers and make them more efficient. The Forbes article rightly cites Moore’s Law as a driver of advances in computing, but that’s not the only area of rapid development in IT infrastructure. Networking is undergoing a huge shift, and storage is changing fast, too, which could be an opportunity to drive considerable efficiencies.
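
To illustrate why those curves matter for a facility meant to run for years, here is a toy projection; the starting capacity and doubling period are assumptions for illustration only, and storage density grows on its own curve rather than tracking Moore’s Law exactly.

```python
# Toy projection of per-drive capacity under an assumed exponential growth rate.
# The starting point and doubling period are illustrative, not vendor roadmaps.

START_TB = 4          # assumed typical enterprise drive in 2013
DOUBLING_YEARS = 2.5  # assumed capacity doubling period

for years_out in (0, 5, 10):
    capacity = START_TB * 2 ** (years_out / DOUBLING_YEARS)
    print(f"{2013 + years_out}: ~{capacity:.0f} TB per drive")
# -> 2013: ~4 TB, 2018: ~16 TB, 2023: ~64 TB under these assumptions, which is
# why a long-lived facility gets planned around growth, not today's drives.
```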

Software-defined storage has been getting a lot of investor money and is letting users squeeze the most out of the capacity they already have. Flash storage is getting cheaper, allowing for faster responses to queries on enormous databases. And cold storage is another option the NSA could be putting in place at the new data center, spending the bare minimum on long-term retention of data that might later be useful for validating a hypothesis drawn from more real-time information flowing in. Amazon Web Services and Facebook have been making strides with cold storage, and the startup SageCloud just got funding last month to keep going after it.
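
To make the trade-off concrete, here is a minimal sketch of the kind of tiering policy such systems apply, routing data between a flash tier and a cold tier based on how recently it was touched; the thresholds, dataset names and per-gigabyte costs are hypothetical, not anything the NSA, AWS, Facebook or SageCloud has disclosed.

```python
# Minimal sketch of a storage-tiering policy: keep recently accessed data on
# fast (expensive) flash, push rarely touched data to cheap cold storage.
# Thresholds and per-GB costs are hypothetical illustration values.

from dataclasses import dataclass

FLASH_COST_PER_GB = 1.00  # assumed $/GB for an all-flash tier
COLD_COST_PER_GB = 0.01   # assumed $/GB for archival "cold" storage

@dataclass
class DataSet:
    name: str
    size_gb: int
    days_since_last_access: int

def choose_tier(ds: DataSet) -> str:
    """Route hot data to flash; anything untouched for 30+ days goes cold."""
    return "flash" if ds.days_since_last_access < 30 else "cold"

datasets = [
    DataSet("live-query-index", 2_000, 0),
    DataSet("2011-archive", 500_000, 700),
]

for ds in datasets:
    tier = choose_tier(ds)
    cost = ds.size_gb * (FLASH_COST_PER_GB if tier == "flash" else COLD_COST_PER_GB)
    print(f"{ds.name}: {tier} tier, ~${cost:,.0f}")
# Keeping the 500 TB archive on flash would cost ~$500,000 here versus ~$5,000
# cold, which is the economic argument for cold storage at long-retention sites.
```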

Presumably, NSA technology buyers have heard of these technologies — and they might have already decided to bring in all-flash arrays from Pure Storage, for example, given In-Q-Tel’s investment in the company. Such a decision might mean relying less on storage at other NSA data centers, or preparing for a day when an NSA data center would need to store an amount closer to a yottabyte.

So while straight-up storage estimates are fun to make, the discussion could use more nuance about how exactly the NSA will store its data in Utah.

Feature image courtesy of Shutterstock user kubais.

This story was updated on Friday to remove a link to a parody site.

  1. Seagate is rumored to have 6TB enterprise drives next year, Hitachi should have 7TB this year.
    In theory they could easily get close to 20TB per drive with 5.25-inch drives, and maybe double that in just a few years. But it would be rather difficult to make that many drives when the industry’s capacity can maybe reach 200 million drives per quarter.
    That aside, it’s a waste of money to build that center. If the NSA doesn’t get shut down, they’ll destroy the country, so let’s hope it does get dissolved.

  2. Our GDP last year was 14.991 trillion USD.

  3. Our GDP was nowhere near $62 trillion; you may have been thinking of world GDP, although that was closer to $70 trillion last year.

  4. The Password Bot, Friday, August 23, 2013

    And here are some calculations showing just how silly their yottabytes claim is: http://xato.net/privacy/dear-nsa-meant-yottabytes/

  5. Quoting xkcd: saying it can store “between an exabyte and a yottabyte of data” is the same as saying “this thing has a length which is between a millimeter and a kilometer.”

