
Summary:

Financial institutions have a lot of data, as in multiple petabytes, so storing that data for use in new products and for regulatory compliance will move to the public cloud.

Ron Bodkin, Think Big Analytics, and Ann Neidenbach, The NASDAQ OMX Group (photo: Albert Chau)

The financial services industry needs to store a lot of data, and the folks at NASDAQ want to use it to help build new products. But they also have to keep an eye on costs. Ann Neidenbach, an SVP at the NASDAQ OMX Group, explained how the exchange made the decision to use the Amazon public cloud for storage Thursday at the Structure:Data 2013 conference in New York.

“The economics of the clouds are just phenomenal,” said Neidenbach. She and Ron Bodkin, the founder and CEO of Think Big Analytics, noted that exchanges have regulatory requirements to keep petabytes of old trading data, and that adds up. She said the exchange has 10 petabytes of trade data and another 3.5 petabytes that is just backups. “We’re not even talking about email or even instant messaging,” said Neidenbach. “It’s a tremendous amount and that’s why we went into this partnership with AWS.

“We have to leverage the public cloud where storage is so much cheaper per gigabyte,” she added.
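To put “cheaper per gigabyte” in perspective, here is a minimal back-of-envelope sketch in Python. The 13.5 petabytes come from the figures Neidenbach cited in the session; the per-gigabyte prices are illustrative assumptions for comparison only, not NASDAQ’s negotiated rates:

```python
# Back-of-envelope cloud storage cost estimate.
# Data volumes are from the session (10 PB of trade data plus
# 3.5 PB of backups); prices are hypothetical, not actual rates.

PB_TO_GB = 1_000_000  # 1 petabyte = 1,000,000 GB (decimal units)

trade_data_pb = 10.0
backup_data_pb = 3.5
total_gb = (trade_data_pb + backup_data_pb) * PB_TO_GB

# Assumed monthly prices per GB: standard object storage versus
# a cold/archive tier (illustrative figures only).
tiers = {
    "standard": 0.03,  # $/GB-month, assumed
    "archive": 0.01,   # $/GB-month, assumed
}

for label, price in tiers.items():
    monthly = total_gb * price
    print(f"{label:>8}: ${monthly:,.0f}/month (${monthly * 12:,.0f}/year)")
```

Even at these made-up rates, the spread between tiers runs into millions of dollars a year at petabyte scale, which is the kind of per-gigabyte economics Neidenbach was pointing at.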

The exchange did have to spend time making sure the public cloud was secure enough, not just for its own data but for its customers’ data as well, and that is a journey that is just beginning. But Neidenbach certainly implied that, given the cost of storage in the public cloud and the amount of data financial institutions must keep, the journey and its eventual public cloud destination were inevitable.

And having all this data stored in one place might yield new products. Neidenbach expects that combining trade data with social media could open up new opportunities for trading and research. You had the feeling that the sky might be the limit.

Check out the rest of our Structure:Data 2013 live coverage here, and a video embed of the session follows below:


A transcription of the video follows on the next page

  1. Kazuya Mishima, Friday, March 22, 2013

    With many utilities facing the task of storing petabytes of smart meter data for as long as seven years in order to satisfy regulatory requirements, the ability to house and leverage the massive load of data accumulating from the smart grid is a significant IT challenge… http://bit.ly/YHCpQp

