
Summary:

The concept of webscale computing gets a lot of attention thanks to the impressive scope of operations at companies like Google and Facebook, but one could argue there’s an even more-impressive tier of IT infrastructure: bank-scale. Here’s what scalability means at Citigroup.

[Feature image: CERN tape library]

The concept of webscale computing gets a lot of attention thanks to the impressive scope of operations at companies like Google and Facebook, but one could argue there’s an even more-impressive tier of IT infrastructure: bank-scale. It’s not because banks are necessarily operating larger infrastructures than Google, or that they’re unveiling amazing new homemade innovations for dealing with massive systems, but, rather, because banks are dealing with real money. Lots of it.

[Image: Yobie Benjamin]

At the Under the Radar conference this morning in Mountain View, Calif., Citigroup Global CTO Yobie Benjamin took the stage to talk about what operating at bank-scale really means. It’s a topic best illustrated by the list of stats below and by the following quote from Benjamin: “We have to store all our data, forever.”

Here’s what’s up at Citi:

  • Exabytes of data. Benjamin said the bank is seeking analytics technologies, especially, that can handle petabyte scale today, but that will be able to tackle a data store that will soon hit the exabyte range. If you’re selling anything at terabyte-scale, Citi isn’t interested.
  • There’s no such thing as too much speed when it comes to high-frequency trading. Period.
  • Ten regional processing centers, which are the size of a few football fields apiece. If you have software that can monitor that much gear across that many locations, Benjamin will take your call.
  • 500 milliseconds per banking transaction is average, but too slow. Benjamin wants transactions in the microsecond and nanosecond range.
  • 99.999 percent availability? Try 99.9999999 percent. (See the quick downtime math after this list.)
  • Citi repels 40,000 attacks on its system every day.
  • Citi operates a 100-country network, which means it has to perform (and meet regulatory requirements) globally.
  • $12.5 trillion. That’s the amount of customer money for which Benjamin’s half of Citi is responsible. About a quadrillion dollars’ worth of transactions flows through his system every year.

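For a sense of what that availability figure implies, here is a rough back-of-the-envelope sketch (my arithmetic, not Benjamin’s) converting “nines” into an annual downtime budget:

    # How much downtime per year a given availability target actually allows.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600  # roughly 31.6 million seconds

    def allowed_downtime_seconds(nines: int) -> float:
        """Annual downtime budget for an availability of N nines (5 -> 99.999%)."""
        unavailability = 10 ** -nines
        return SECONDS_PER_YEAR * unavailability

    for nines in (5, 9):
        print(f"{nines} nines -> {allowed_downtime_seconds(nines):.4f} seconds per year")

    # 5 nines -> about 315.6 seconds per year (a little over five minutes)
    # 9 nines -> about 0.0316 seconds per year (roughly 32 milliseconds)

In other words, moving from five nines to nine nines shrinks the yearly downtime allowance from a few minutes to a few hundredths of a second.
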
The last point is probably the most important, as everything Citi does is part of an effort to keep customers happy. If their data isn’t secure or their transactions don’t process immediately, customers will take their business and their money elsewhere. That pretty much explains Benjamin’s stance on public cloud computing: “If you have a public cloud, don’t talk to us.”

Feature image courtesy of Flickr user gruntzooki.


  1. Yobie is not anchored in reality when he states that “We have to store all our data, forever.” Legal and regulatory mandates dictate this, NOT Yobie’s or Citi’s requirements.

    1. I’m not sure I get your point. Does it matter who dictates how long data must be stored?

  2. Thomas Kejser Monday, April 30, 2012

    Not sure I am interpreting Bill correctly. But it could be argued that some legal requirements tell you when you MUST delete data. For example, many telcos operate under policies that require data to be kept for a max of 7 years.

  3. Thomas Kejser Monday, April 30, 2012

    Another thing that is very odd is the quote on 9 “nines”. What exactly does it mean to have an uptime that requires you to measure any “downtime” in the microsecond range? Even granting that you COULD measure this accurately, it also raises the question of whether any provider of electricity would ever be able to meet such a demand.
