Investors continue to bet on databases that can handle large volumes and wide varieties of data. The latest proof point: Deep Information Sciences said Tuesday that it has raised $10 million in Series A funding to fuel what it calls a high-performing transactional and analytic database. The round's backers include Stage 1 Ventures, Robert Davoli and angel investors.
Based in Portsmouth, N.H., Deep rejects the usual SQL, NewSQL, NoSQL, columnar, streaming and in-memory terminology, preferring the term “general-purpose database.” The product handles structured and unstructured data and claims to keep latency low for writes, reads and queries, and it aims to make efficient use of all available processor cores, whether on premises or in the cloud.
One customer, Global Relief Technologies, used DeepDB to speed up updates to its database with data from employees who log information on their tablets. A process that once took more than a day now takes 17 minutes, according to a Deep spokeswoman.
Other companies offer databases that mix transactional and analytic capability, including SAP, with its HANA database, and JustOneDB. Deep, which has two commercial customers, answers the differentiation question by claiming DeepDB performed better in tests across many use cases. In one test, it reportedly blew through 1.72 million transactions per second, compared with 32,000 per second for MySQL using the InnoDB storage engine.
The <a href="http://gigaom.com/2013/01/08/idc-says-big-data-will-be-24b-market-in-2016-i-say-its-bigger/">willingness to spend on big data</a> has set the stage for a large pool of database providers, and many claim they have unique products. At the end of the day, it could be that enterprises will want multiple types of databases for multiple purposes. If that’s the case, Deep will need to add customers and use cases demonstrating that DeepDB can beat existing options in the transactional market as well as the hot analytic space.