Summary:

Teradata is trying to steal some thunder in the in-memory analytics space with a new technology called Intelligent Memory that places hot data in RAM while dispersing the rest across solid-state drives and disk.

Data analytics veteran Teradata will not let the new era of data-analysis architectures pass it by without a fight. It has already built products to address massive data volumes and Hadoop, and on Wednesday it announced an in-memory database technology to answer the industry’s latest call.

Speed is the driving factor behind the in-memory analytics push, which spans everyone from classic Teradata rivals like SAP and Oracle to startups such as MemSQL. Estimates vary as to the exact speed difference between data access in RAM versus hard disk, but Teradata is claiming RAM is 3,000 times faster. The speed difference between RAM and solid-state drives or flash memory is smaller, although still significant.

Of course, cost also comes into play, as speed and cost tend to go hand in hand when it comes to storage media. That’s one reason Teradata says its new technology, called Intelligent Memory, doesn’t operate fully in-memory like some competitive offerings do. Rather, it places only the “hottest” data in memory for super-fast analysis and spreads the rest between solid-state drives and disk within a Teradata environment.
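To make the general idea concrete, here is a minimal sketch of temperature-based placement: track how often each data block is accessed and assign the hottest blocks to RAM, the next-warmest to SSD, and everything else to disk. The tier capacities, class names, and ranking logic below are illustrative assumptions, not Teradata’s actual implementation.

```python
from collections import Counter

# Hypothetical tier capacities (number of blocks each tier can hold).
# Illustrative values only -- not Teradata's configuration.
TIER_CAPACITY = {"ram": 100, "ssd": 1_000}  # anything beyond these falls to disk


class TieredPlacement:
    """Assigns data blocks to storage tiers by access frequency ('temperature')."""

    def __init__(self) -> None:
        self.access_counts: Counter = Counter()

    def record_access(self, block_id: str) -> None:
        """Bump a block's temperature each time it is read."""
        self.access_counts[block_id] += 1

    def plan(self) -> dict:
        """Return a block -> tier mapping: hottest in RAM, then SSD, then disk."""
        ranked = [block for block, _ in self.access_counts.most_common()]
        placement = {}
        for i, block in enumerate(ranked):
            if i < TIER_CAPACITY["ram"]:
                placement[block] = "ram"
            elif i < TIER_CAPACITY["ram"] + TIER_CAPACITY["ssd"]:
                placement[block] = "ssd"
            else:
                placement[block] = "disk"
        return placement


# Example: after recording accesses, the most frequently read blocks land in RAM.
tiers = TieredPlacement()
for block in ["sales_2013"] * 50 + ["sales_2012"] * 5 + ["sales_2005"]:
    tiers.record_access(block)
print(tiers.plan())  # all three fit in RAM here; with more blocks, cooler ones spill to SSD/disk
```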


This concept of intelligent data placement has been around a while in the storage space (it’s part of EMC’s new ViPR software-defined storage platform, too), but the advent of big data and abundant flash has given it new life. Many companies want a tiered system in which they pay more for fast access to their important or hot data, while saving some cash on lower-performance storage for older, less-accessed data. Facebook is really pushing the envelope here with its cold storage initiative — something VP of Engineering Jay Parikh will likely discuss at our Structure conference June 19 and 20 in San Francisco.

In analytics, though, RAM, not flash, is the fastest medium out there. Whether someone goes all-RAM or takes a tiered approach like the one Teradata is pushing probably depends on how much performance they need across how much data, as well as how much they’re willing to pay. But if you’re doing interactive analytics in the next decade, it’s almost certain to be in-memory to some degree.

Feature image courtesy of Shutterstock user Hellen Sergeyeva.
