Amazon’s new data warehousing service takes aim at “old guard” IT giants


Updated: Well, now we know what one of Amazon’s “unbelievable” new services will be. It’s Redshift, a data warehousing service now in preview, which aims to siphon business from Oracle (Redshift, get it?), IBM and Teradata. The move shows that Amazon Web Services (AWS) hasn’t finished building higher-level services that compete not only with old-school IT vendors but with some of Amazon’s own software partners.

Redshift will cost roughly one-tenth of what old-school data warehouses do, Andy Jassy, SVP of Amazon Web Services, said Wednesday morning in his AWS re:Invent keynote.

Data warehouses are too expensive and too hard to scale for big companies, and just plain too expensive for small businesses, Jassy said. For that reason, AWS wanted to build a service that is easy to get started with, is self-service and can scale up and down as needed.

The company tested out Redshift on Amazon.com’s giant retail business and found that it worked out to about $32,000 a year versus “millions” spent on its old-school data warehouse. Traditional data warehouse applications can cost $19,000 to $25,000 per TB per year, compared with roughly $1,000 per TB per year for Redshift. Update: Redshift builds on technology licensed from ParAccel.
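For a rough sense of scale, here is a back-of-envelope sketch in Python using the per-terabyte prices quoted above; the 100 TB warehouse size is an arbitrary illustration, not a figure from Amazon.

# Back-of-envelope comparison using the per-TB annual prices quoted above.
# The 100 TB warehouse size is an illustrative assumption, not an Amazon figure.
warehouse_tb = 100

traditional_low = 19_000 * warehouse_tb    # $19,000 per TB per year
traditional_high = 25_000 * warehouse_tb   # $25,000 per TB per year
redshift = 1_000 * warehouse_tb            # roughly $1,000 per TB per year

print(f"Traditional warehouse: ${traditional_low:,} - ${traditional_high:,} per year")
print(f"Redshift (quoted):     ${redshift:,} per year")

At those quoted prices, the gap is even wider than the one-tenth headline claim, though actual costs will depend on workload and configuration.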

Of course the proof is in the pudding. Customers can sign up for the preview now.

Amazon execs are hell-bent on showing that their services are ready for prime time, even for mission-critical applications. But last week, two IT execs, one from Diebold and another from a big U.S. bank, both told me that no one in their companies, not even developers, can use AWS at all for compliance reasons. So there’s still some work to do there.
