Taking In-Memory OLTP Mainstream

A confluence of major changes in computer hardware is disrupting a three-decade equilibrium in the design of SQL database management systems (DBMSs). Memory is no longer precious compared to disk, and CPUs can now scale *up* much farther without requiring a cluster of servers. By optimizing a database to run in memory and exploit all 64 virtual processor cores on a commodity server without lock contention, the industry has reached a tipping point. It is now possible to accelerate many OLTP applications by 10-100x without having to partition data across a cluster or migrate applications to specialty databases.

Join Gigaom Research and our sponsor Microsoft for “Taking In-Memory OLTP Mainstream,” a free analyst webinar on Tuesday, May 6, 2014, at 10 a.m. PT.

What Will Be Discussed

  • The technology changes that have come together to make in-memory OLTP possible now

  • The trade-offs between the approaches taken by software vendors to deliver this technology

  • Example applications that showcase the sweet spot of each approach

  • The likely medium-term evolution of in-memory OLTP technology within the broader data management marketplace

Who Should Attend

  • Enterprise architects

  • Data analysts

  • Business analysts

  • IT decision makers

  • IT managers & directors