Key Criteria for Evaluating Data Warehouses v2.0

An Evaluation Guide for Technology Decision Makers

Table of Contents

  1. Summary
  2. Data Warehouse Primer
  3. Report Methodology
  4. Decision Criteria Analysis
  5. Evaluation Metrics
  6. Analyst’s Take
  7. About Andrew Brust

Summary

This report aims to give readers a solid understanding of data warehouse technology and provide the tools they need to evaluate the key points of differentiation among product offerings.

Data warehouses offer businesses a storage and access solution that enables them to perform complex analytics and to gain insights that drive business value. Over time, these products have matured, with a continual shift towards cloud-first and cloud-native approaches.

Within this mature and established sector, a wide range of factors should be considered when evaluating whether a particular product meets your business needs. This report breaks down those factors by outlining what to expect from any data warehouse product, the features that differentiate offerings from leading players, and the technologies and trends that are currently emerging. It also draws conclusions about which trends will become standard across products in the medium to long term.

Beginning with a primer on data warehouses, readers will gain an understanding of how the vast majority of vendors working within the sector have embraced technologies such as massively parallel processing, columnar storage, and vector processing to drive gains in performance and scalability that translate to real business value for their customers.
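The performance benefit of columnar storage mentioned above can be sketched in a few lines of Python. This is a toy illustration only, not any vendor's implementation: the table, field names, and values are invented, and real warehouses add compression, encoding, and vectorized execution on top of the basic layout shown here.

```python
# Toy comparison of row-oriented vs. column-oriented layouts.
# In a row layout, an aggregate over one field still touches every record;
# in a column layout, the same aggregate reads only the one column it needs.

# Hypothetical sales table, stored row by row.
rows = [
    {"order_id": i, "region": "EMEA" if i % 2 else "AMER", "amount": float(i)}
    for i in range(1000)
]

# Row-oriented scan: every full record is visited to read one field.
total_row = sum(r["amount"] for r in rows)

# Column-oriented layout: each field is stored contiguously.
columns = {
    "order_id": [r["order_id"] for r in rows],
    "region":   [r["region"] for r in rows],
    "amount":   [r["amount"] for r in rows],
}

# The aggregate now scans a single contiguous column.
total_col = sum(columns["amount"])

assert total_row == total_col
```

The contiguous column is also what makes vector processing possible: the engine can apply one operation to a whole run of values at once instead of dispatching per record.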

The report demonstrates how vendors are pushing the sector forward through the development of business-user-friendly interfaces, democratizing the data analytics field by enabling nontechnical staff to pull insights from systems in ways that would previously have been available only to data scientists and analysts.

Read the full report to discover:

  • How vendors are optimizing for concurrency within their products, driven by a cloud-first approach that replaces the traditional on-prem deployment.
  • The developments in access control and user roles that are enabling companies to maintain secure access to their data for their increasingly remote workforces.
  • Why integration bottlenecks between data lake and data warehouse systems can be overcome through new approaches to interoperability.
  • What the key evaluation metrics are for analyzing the suitability of any data warehouse product for the business requirements of your company.

Analyst Andrew Brust is a seasoned industry veteran. He has a strong background in software development and has tracked the field of big data and analytics since its inception, making him a sought-after author and speaker as well as a respected authority on database technology.

As a founder of Visual Studio Live!, one of the nation’s longest-running developer conferences, Andrew remains at the leading edge of understanding developer requirements and the ways in which those needs are rapidly evolving.

Having held CTO, research director, and market strategist positions at companies spanning multiple industries, Andrew understands the challenges facing companies in the 21st century. His insights help organizations of all sizes build functional, performant data solutions that remove barriers to data analysis, reduce costs, and drive business value in a manner that is relevant, credible, and empathetic.

Full report available to GigaOm Subscribers.
