This GigaOm Research Reprint Expires Aug 21, 2024

Digital Analytics and Measurement Tools Evaluation v1.0

Product Comparison: Google Analytics 4 and Snowplow

1. Introduction

Digital analytics covers a growing set of use cases as more and more of our lives are mediated by digital platforms. This includes:

  • Marketing Analytics: Measuring the return on marketing spend, especially from digital channels.
  • Product Analytics: Helping product teams understand the impact of their product developments on revenue, customer lifetime value, conversion rates, retention, and churn.
  • Merchandising Analytics: Relevant for retailers that want to optimize their online offer by understanding the performance of different stock keeping units (SKUs).

As digital analytics has become more sophisticated, there has been a move to performing more analytics in the warehouse. Google has supported and driven this trend with the native BigQuery integration with GA4. Snowplow has done something similar. It is a warehouse-first analytics tool, delivering all the data into the data warehouse (i.e., BigQuery, Snowflake, Databricks, Redshift) in near real time.

With these differences in mind, we performed a field test to assess how Snowplow compares to GA4 for a retail organization with a mandate for digital analytics. To make the test as fair as possible, we used both Google Analytics and Snowplow in vanilla e-commerce implementations, meaning that for both solutions we used the out-of-the-box e-commerce events.

Both tools support the definition of custom events, but to make the comparison like-for-like, we stuck to the implementation that a retailer is most likely to deploy. We utilized the Snowplow out-of-the-box e-commerce accelerator. Snowplow Accelerators are recipes/templates that enable Snowplow users to execute specific use cases rapidly. Snowplow E-commerce Accelerators allow online retailers to get started with Snowplow quickly, delivering data to power a wide range of e-commerce analytics out-of-the-box. The accelerators provide a standard way to set up e-commerce tracking (including tracking product views, add-to baskets, and transactions), and data models that optimize delivery of the data for analytics and AI.

Google Analytics has standard out-of-the-box e-commerce events (schemas), which are comparable to those that are part of the Snowplow accelerator. The Snowplow accelerator also includes dbt models that process the data in the data warehouse to make it AI and BI ready; Google Analytics lacks an equivalent. The dbt models are only half of what the accelerator provides, however. The other half is the event schemas, which Google does have.

2. Executive Summary

The primary goal of this field test was to compare running different sorts of analyses that an online retailer would like to perform, in-warehouse, using data from Google Analytics vs. data from Snowplow. We utilized our field experience to establish a baseline of metrics that a typical enterprise would need to effectively develop marketing and advertising campaigns, determine the effectiveness of campaigns, and ultimately make the data-driven decisions that will help drive increased revenue, higher customer lifetime value, and better return on advertising spend. The costs associated with using the products were also evaluated.

Snowplow offers an e-commerce tracking package for iOS and Android, and its dbt packages already support it. With GA4, tracking works the same across both web and mobile.

The data produced by Snowplow (out-of-the-box, in BigQuery) is significantly easier to work with than that produced by GA4. A big reason for this is that Snowplow ships with an e-commerce accelerator. Like GA4, it provides a set of out-of-the-box e-commerce events that the end user can instrument. Unlike GA4, it also includes a dbt package that generates a set of derived tables in BigQuery optimized for analytics, making the Snowplow data much easier to use and less costly to query.

The accelerator's dbt data models provide a data mart-like experience for analyzing web analytics data, whereas GA4 has a much more complex data structure. In our findings, Snowplow significantly outperformed GA4:

  • Snowplow performed 6.9x faster than GA4 in our queries.
  • Snowplow reduced the amount of query data scanned by more than 99%.
  • Snowplow queries, after three years of cumulative data growth, proved more than 800 times less expensive than GA4.

Our testing also revealed that Snowplow E-commerce Accelerator is significantly more user-friendly.

3. Products

Google Analytics 4

Google Analytics 4 is the latest version of Google’s web analytics platform, allowing businesses to gain deeper insights into their website performance. As the industry standard web analytics platform, employed by 94% of the top 1 million websites globally, it offers powerful tracking, segmentation, and reporting capabilities to help organizations make data-driven decisions.

Benefits of the platform include automated event tracking and enhanced machine learning capabilities such as conversion modeling, in which ML algorithms account for gaps such as users not consenting to data collection. The GA4 account can also be cross-linked with other services, such as Google Ads, and with the more enterprise-focused Google Marketing Platform (formerly DoubleClick), which contains DV360, Search Ads 360, and Campaign Manager 360 for more comprehensive cross-platform reporting. Further features like customizable streaming reports allow users to monitor data in real time, and more powerful insights, such as user cohort analysis and prediction models, present a range of opportunities for users to further analyze and act on their data.

Along with the many features that GA4 provides, a few pain points also exist. GA4 can be difficult to learn and use. While the tool has great features for drilling down into data, it can take considerable time to become familiar with its intricacies, making it a challenge for new users. A steep learning curve also exists even for advanced users of the previous version of Google Analytics, Universal Analytics. Due to the difficult-to-use UI, many users turn to GA4's data in BigQuery to gain certain insights.

Our testing showed that GA4 lacks straightforward ways to apply business logic to its data in BigQuery. Users are more or less stuck with the clunky web reporting UI and raw event data in BigQuery, with no easy way to build applications on top of the data it provides. It can be done, but it takes time and effort, and Google doesn't currently offer an official way of doing so. GA4 also offers significantly less functionality than Universal Analytics, as it shifts analysis to BigQuery instead.

Overall, GA4 offers a wide range of useful features to gain user insights, but the challenge of learning the tool is a notable drawback that must be considered when planning to use it.

Snowplow

Snowplow is a powerful next-generation behavioral data platform used by brands to understand their customers in detail. It has been developed with a warehouse-first approach, as opposed to a vendor-defined user interface, and also provides solution accelerators for typical use cases, such as next best action, personalization, and others. Using the platform, data can be transformed, loaded, and tested with efficient data pipelines and systems.

Snowplow is a platform for creating behavioral data from websites, mobile applications, and other sources to power analytics and AI. Like Google Analytics, it enables organizations to deliver data directly into BigQuery or alternatives like Snowflake, Databricks, and Redshift. Unlike Google Analytics, it does not have its own UI for performing analysis. With Snowplow, end users are expected to perform analytics in the warehouse and in-stream (for real-time applications). Also unlike Google Analytics, Snowplow:

  • Supports multiple data warehouse destinations like Snowflake, Databricks, or Redshift.
  • Streams the entire data set into the data warehouse in real-time with SLAs.
  • Gives end users the ability to process their data end-to-end in their own cloud accounts, which provides compliance advantages.
  • Ships with accelerators and dbt packages that provide derived copies of the underlying data optimized for BI and AI.

Snowplow also offers reusable SQL code to make creating complex queries and data-driven applications easier. The software integrates with popular cloud data warehouses like Snowflake and BigQuery. This provides powerful data storage and analytics capabilities for businesses of all types, leveraging existing investments into the data stack. With these functions and features, Snowplow enables businesses to generate insights quickly from their raw data and make more informed business decisions.

As we found in this study, Snowplow is an incredibly powerful tool for data engineering and analytics that can offer huge value for businesses. Its e-commerce accelerators and reusable SQL code make it much easier and less expensive to extract complex insights from raw data and integrate with cloud data warehouses.

4. Test Setup

We wanted to test the case of a retail/e-commerce company that wants to perform digital analytics in the warehouse to find out if it is easier, more cost-effective, and better overall with Google Analytics or with Snowplow. To do that, we:

  1. Started with a completely vanilla e-commerce data set for both GA4 and Snowplow.
  2. Identified a broad set of questions and analyses an e-commerce company would want to perform. Some of these analyses were relatively simple; others were more sophisticated.
  3. Compared running the analyses in BigQuery between the Snowplow data and the GA4 data.

Objectives

The field test was designed to demonstrate a solution's ability to address pain points felt by users of conventional, unaugmented web analytics data and architectures. Our field test and time-to-value study showed how easily these problems can be mitigated, while highlighting the differences between the two competitors in reducing these pains.

The test simulated the development of an analytics platform for a retail e-commerce website. We used those results to estimate the complexity of building queries, which has downstream impacts on the time to first insight. In addition, the field test also sought to measure the difference in performance when executing equivalent analytics queries against the out-of-the-box GA4 data model versus the out-of-the-box dbt models built by the Snowplow E-commerce Accelerator. Finally, we sought to understand any differences in the analytics infrastructure costs using the two different platforms and how those costs might compare considering typical usage patterns over a three-year period.

Data

The field test data used was sample data provided by both platforms. It was designed to show indicative usage for e-commerce datasets generated by both tools while allowing end users to explore and understand how the data is formatted and structured and how to extract business value from that data. The datasets used were:

  • GA4 obfuscated sample e-commerce event data obtained from the BigQuery public data sets.
  • Snowplow JavaScript Tracker v3 sample event data obtained from Snowplow.

In their raw form, the Snowplow JavaScript Tracker and GA4 data consist of many similar data elements, including page visits, timestamps, obfuscated user information, geolocation, e-commerce, and other tracking elements. These data sets did not include identical data or represent identical web events. The Snowplow e-commerce dbt models are used in conjunction with their JavaScript Tracker, just as Google Analytics users would have data in its natively supported format. Normally, we would not use non-identical data sets for rigorous, precise benchmarking, but for the objectives of our study, the fact that the data sets are indeed different is important in differentiating their ease of use. Thus, we sought to use data sets of a similar date range across a similar number of sessions, as shown in Table 1. The GA4 data set available in BigQuery public data is larger than the Snowplow data set provided to us, so we randomly selected GA4 partition tables in a 30-day date range that contained an equivalent number of sessions as the Snowplow data.

Table 1. Data Set Characteristics


Google Analytics 4 Snowplow E-commerce
User Session Count 29,015 29,092
Date Range 30 Days 30 Days
Source: GigaOm 2023

Setup

The setup process for our field test was fairly straightforward. For GA4, we simply used the obfuscated tables that came preloaded in BigQuery public data sets. For Snowplow, we used the E-commerce Quickstart documentation available on their website. This documentation provides a step-by-step process for dbt installation, package installation, loading the data into BigQuery, and running the Accelerator scripts to build the models.

Use Cases

The field test methodology consists of use cases designed to simulate a representative set of web analytics e-commerce data needs and uses. Tables 2 through 5 describe the use cases we sought to address. Two Standard SQL BigQuery queries were written for each use case–one for GA4 events and the other against the Snowplow derived tables. The Snowplow dbt models process the data and produce a set of derived tables in BigQuery that are ready for analysis; these are queried, rather than the raw events table, to obtain equivalent results. The result sets could then be used for business intelligence dashboards, drilldowns for data analysts, machine learning training sets for data scientists, and other uses.

We used our field expertise to establish a baseline of metrics that an enterprise would need to effectively develop marketing and advertising campaigns, determine the efficacy of campaigns, and ultimately make data-driven decisions for an e-commerce initiative. Table 2 describes the metrics.

Table 2. KPIs

Name Description Deliverable
Conversion Rate The number of people who made a purchase out of the total number of people who accessed the website Dashboard KPI
Average Order Value The average size of an order in dollars (US) Dashboard KPI
Customer Lifetime Value The total revenue from the top 10 customers throughout their lifetime Dashboard KPI
Cart Abandonment Rate When a customer adds a product to their online shopping cart but doesn’t complete the entire checkout process Dashboard KPI
Returning Customer Rate The percentage of customers who have made more than one purchase from the site Dashboard KPI
Single Event Visit The number of people who viewed one product on the website and then left without taking any action Dashboard KPI 
Average Product Views by Purchaser Type The average number of product page views per user, by purchaser type (purchasers vs. non-purchasers) Dashboard KPI
Source: GigaOm 2023
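As a minimal sketch of how three of the Table 2 KPIs are derived, the following Python example computes conversion rate, average order value, and cart abandonment rate from a handful of hypothetical session records. The record layout and field values are purely illustrative; they are not the actual GA4 or Snowplow column names.

```python
# Hypothetical session records: (user_id, added_to_cart, purchased, order_value_usd)
sessions = [
    ("u1", True,  True,  120.00),
    ("u2", True,  False,   0.00),
    ("u3", False, False,   0.00),
    ("u4", True,  True,   80.00),
]

total_sessions = len(sessions)
purchases = [s for s in sessions if s[2]]
carts = [s for s in sessions if s[1]]

# Conversion Rate: purchasing sessions out of all sessions
conversion_rate = len(purchases) / total_sessions

# Average Order Value: mean order size (USD) across purchases
average_order_value = sum(s[3] for s in purchases) / len(purchases)

# Cart Abandonment Rate: add-to-cart sessions that never completed checkout
cart_abandonment_rate = sum(1 for s in carts if not s[2]) / len(carts)

print(f"{conversion_rate:.2f} {average_order_value:.2f} {cart_abandonment_rate:.2f}")
# → 0.50 100.00 0.33
```

In practice, both platforms express these same aggregations as SQL over session- or event-level tables; the logic is identical, only the source schema differs.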

Some common drilldowns accompany the KPIs, providing an extra layer of information and value. Table 3 shows these.

Table 3. Drilldowns

Name Description Deliverable
Top Products Most popular products by view and purchase Drillable result set
Top Pages Most popular pages by view and time spent Drillable result set
Abandoned Products The products most often removed from carts Drillable result set
Product Mix The products purchased by customers who purchased a certain product Drillable result set with dropdown parameter
Source: GigaOm 2023

Tracking the journey of a visitor–starting from their initial awareness of a brand, to when they consider the brand as an option, to when they actually become buyers or subscribers–can provide powerful insight into a brand’s marketing effectiveness. By analyzing how, when, and where visitors enter and move through the journey, it is possible to identify strengths and weaknesses in each stage of the funnel and optimize efforts to improve conversions. Visitor journeys, awareness, consideration, and conversion analytics are immensely useful and interesting in the conversion funnel. (See Table 4)

Table 4. Conversion Funnel

Name Description Deliverable
Visitor Journey Feature usage, visit length, page views, and outcomes Result set for a drillable ribbon chart
Awareness Visitors by unique vs. returning, visit length Result set for a drillable funnel chart
Consideration Conversion rate for visitors to key pages  Result set for a drillable funnel chart
Conversion Journey Compare the journey of people who convert vs. those who do not Result set for a drillable funnel chart
Source: GigaOm 2023

By leveraging machine learning and advanced analysis to identify correlations, companies can increase retention, sales, and customer satisfaction. Upsell model training is essential for increasing top-line revenue by recommending additional products and services that customers may be interested in. Related product model training helps to improve customers’ shopping experience by suggesting items that may be of interest based on their current selections. Market basket analysis enables businesses to improve understanding of trends in customer purchases to better design products and services that will attract customers. Ultimately, these three techniques, shown in Table 5, are critical for businesses to gain a better understanding of their customers and use that knowledge to make data-driven decisions.

Table 5. Machine Learning/Advanced Analysis

Name Description Deliverable
Upsell Model Training Provide a data set for training an upsell recommender model Training data set
Related Product Model Training Provide a data set for training a related product recommender model with a list of product interactions Training data set
Market Basket Analysis Provide a data set to uncover patterns between products or items frequently purchased or viewed together by website visitors Training data set
Source: GigaOm 2023

Measurements

The field test captured a number of measurements to discover insights and differences between GA4 and Snowplow with its e-commerce solution accelerator. These measurements are detailed in Table 6 below.

Table 6. Measurements

Metric Description Measurement
Query Complexity Complexity of the queries required to realize value from the data The number of “words” in the complete SQL query
Query Performance How long it takes for BigQuery to execute the query Query execution time in seconds
Data Scanned  The amount of data scanned by BigQuery for each query–this has cost implications The number of bytes scanned
Source: GigaOm 2023

The query complexity measurement is based on the number of alphanumeric “words” contained in the query, i.e., a word count. A query word can also contain dashes and underscores. The words are delimited by spaces, equal signs, commas, and other SQL syntax characters. Periods, when used as part of a table object name, were ignored and counted as one word. For example, the query “select event_timestamp from dataset.events;” would count as four words. We also attempted to be fair with the treatment of column and subquery aliases. While complexity can be subjective, we assert that longer queries take longer to write and validate, much like lines of code as a measure of complexity in application development.
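The counting rule above can be approximated with a small tokenizer. The following Python sketch is our reading of the method described, not the exact script used in the test: words are runs of alphanumerics (plus dashes and underscores), and a dotted table name such as dataset.events counts as a single word.

```python
import re

def query_word_count(sql: str) -> int:
    """Approximate the complexity metric: count alphanumeric "words"
    (which may contain dashes and underscores), treating a dotted table
    name like dataset.events as one word. Spaces, equal signs, commas,
    and other SQL syntax characters delimit words."""
    tokens = re.findall(r"[A-Za-z0-9_\-]+(?:\.[A-Za-z0-9_\-]+)*", sql)
    return len(tokens)

print(query_word_count("select event_timestamp from dataset.events;"))  # → 4
```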

Query performance is also very important to web analytics users. Users would not want dashboards to take tens of seconds to populate or refresh. Query execution time and data volume are positively correlated, so as data volumes increase, the average query time also increases (though not necessarily linearly). Thus, query performance is a key user experience and scalability metric.

The queries we composed for our test cases do not constitute a formal, industry-vetted benchmark like the TPC-x benchmarks in the database performance testing world. However, they offer some insight if we find the performance between the two web analytics data sets to be significantly different.

Efficiently organized and streamlined data, coupled with optimally written SQL queries, goes a long way toward achieving satisfactory performance. To measure this performance, we executed every query we composed and recorded the execution time. Note that this is the time BigQuery reports in its query history interface, so there is no bias in how query time is measured.

The amount of data scanned is an important metric for both performance and cost. In terms of performance, a large table scan can significantly increase execution time. In terms of dollar cost, making long-running table scans execute faster can require more compute resources. In the case of cloud data warehouse platforms such as Google BigQuery or Azure Synapse Serverless, performant SQL and reporting tables optimized for your BI, dashboarding, and reporting needs become crucial.

5. Findings

Ease of Use

Our ease of use testing revealed that working with the Snowplow data was significantly easier, primarily because it ships with an e-commerce accelerator that includes dbt models that produce derived tables optimized for analytics. The accelerator dbt data models provide a data mart-type experience for analysis of web analytics, while the organizational structure of GA4 was much more complex.

GA4, when stored in BigQuery, has a number of nested record format data elements. For example, the event parameters (event_params) object stores campaign-level and contextual event parameters as well as any user-defined event parameters. The event_params RECORD is also repeated for each key associated with an event. Additionally, some key-value pairs contain different data types, so there is a separate nested column for, say, integers versus strings. Thus, to extract commonly used attributes, like page title or URL location, a user must write their SQL query like this:

    select
      (select value.string_value from unnest(event_params) where key = 'page_title') as page_title,
      (select value.string_value from unnest(event_params) where key = 'page_location') as page_location
    from
      `bigquery-public-data.ga4_obfuscated_sample_ecommerce.events_*`;

Compare this syntax to the straightforward design of the Snowplow models:

    select
      page_title,
      page_url
    from `atomic.events`;

As you can see, Snowplow's design allows for shorter, easier-to-write queries, which may be closer to what the user expects. Table 7 shows the total number of words, by category, across the 20 queries we composed for our use cases.

Table 7. Total Words per Category

Category Google Analytics 4 Snowplow eCommerce
KPI 165 114
Drilldown 204 116
Conversion Funnel 329 168
Machine Learning 66 46
Grand Total 764 444
Source: GigaOm 2023

The number of words needed to form a query affects its complexity and determines how long it takes to write. A complex query with more words may also require more careful thought and consideration, not only increasing the time required to write and maintain it but impacting how quickly the query can be executed and how accurately it will return the desired results.

By our measurements, queries written against Snowplow and its e-commerce solution accelerator were overall 42% less complex than GA4 queries. (Figure 1)

Figure 1. Complexity of Queries as Defined by Word Count (lower is better)

Performance

We executed the 20 queries we composed and validated in serial fashion and captured the individual and overall query execution times (Figure 2). Executing all 20 queries against the GA4 data set took 111 seconds, while queries written against Snowplow and its e-commerce solution accelerator took only 16 seconds to execute with the same BigQuery On-demand Analysis service. This resulted in Snowplow queries running 6.9 times faster than GA4. This could have critical user experience implications; the dashboard/KPI queries alone on GA4 data took 25 seconds to run versus only four seconds on the Snowplow data.

Figure 2. Execution Time Across 20 Queries (lower is better)

Data Scanned

The difference between Snowplow and GA4 is also apparent in the amount of data scanned per query. Table 8 shows the average amount of data scanned by query for each category in terms of megabytes (MB):

Table 8. Data Scanned by Query (in MB)

Category Google Analytics 4 Snowplow eCommerce
Conversion Funnel 259.8 0.09
Drilldown 234.1 0.84
KPI 38.0 0.01
Machine Learning 274.9 0.30
Average 170 MB 0.26 MB
Source: GigaOm 2023

Snowplow queries scanned less than 1% of the data scanned by GA4 queries. The efficient design of the Snowplow data models is primarily responsible for this difference.
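The "less than 1%" claim follows directly from the Table 8 averages, as this quick check shows:

```python
# Data-scanned ratio from the Table 8 averages (MB per query).
ga4_avg_mb = 170.0
snowplow_avg_mb = 0.26

ratio = snowplow_avg_mb / ga4_avg_mb
print(f"Snowplow scans {ratio:.2%} of the data GA4 scans per query")
# → Snowplow scans 0.15% of the data GA4 scans per query
```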

Scenario: Total Cost of Ownership

With this data, we were able to build a calculator and simulate what the total cost of ownership (TCO) might be given a number of assumptions. This is not a perfect model, and real-world results will differ; however, it provides important insight into overall costs, given the differences in the amount of data scanned using Snowplow versus GA4.

In our scenario, we estimated a three-year period for our TCO calculation. We are also using the BigQuery on-demand analysis rate of $5 per terabyte (TB) scanned. Note that this calculator would produce a much different result if the database-as-a-service in use charged by the hour. Per-data-scanned fees are specific to products like BigQuery, Azure Synapse Analytics Serverless, and Amazon Web Services Redshift Spectrum.

Our scenario is also based on a number of user behavior assumptions that go into the formula:

  • There are 10 active web analytics users in our organization, including manager, analyst, and data scientist personas.
  • Users execute queries 10 hours per day, 250 days per year.
  • The number of query executions per user varies by query category:
    • KPI/dashboard queries: once per minute per user
    • Drilldown queries: once every 10 minutes per user
    • Conversion funnel queries: three times per hour per user
    • Machine learning queries: once per hour per user

Additionally, we also made assumptions regarding data growth over time:

  • The data for both platforms would grow daily by the average size of a single day's worth of data produced by GA4 or by the Snowplow JavaScript Tracker over the three-year period. Thus, we assumed linear data growth.
  • We also included the query execution cost of the daily build of the Snowplow dbt models.
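The assumptions above can be sketched as a small cost model. The following Python example is our reconstruction, not the exact calculator used for the study: it combines the query frequencies listed above with the GA4 per-query scan sizes from Table 8 and the $5/TB on-demand rate, converting MB to TB with the binary factor 1024².

```python
USERS = 10            # active web analytics users
HOURS_PER_DAY = 10    # hours of query activity per day
RATE_PER_TB = 5.0     # BigQuery on-demand rate, USD per TB scanned

# (queries per user per hour, average MB scanned per query for GA4, Table 8)
workload = {
    "KPI":               (60, 38.0),   # once per minute
    "Drilldown":         (6, 234.1),   # once every 10 minutes
    "Conversion Funnel": (3, 259.8),   # three times per hour
    "Machine Learning":  (1, 274.9),   # once per hour
}

def daily_cost(queries_per_hour: float, mb_per_query: float) -> float:
    """USD cost of one day of this query category across all users."""
    queries_per_day = queries_per_hour * HOURS_PER_DAY * USERS
    tb_scanned = queries_per_day * mb_per_query / 1024**2
    return tb_scanned * RATE_PER_TB

total = sum(daily_cost(q, mb) for q, mb in workload.values())
print(f"GA4 Day 1 cost: ${total:.2f}")  # → GA4 Day 1 cost: $2.26
```

Under these assumptions, the model reproduces the GA4 Day 1 total shown in Table 9; scaling the per-query scan sizes linearly with data growth yields the Day 1,092 figures.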

Given all these assumptions, the following tables reveal the results of daily use of these platforms over time for three years. Table 9 shows on-demand costs after Day 1, while Table 10 shows costs after three years.

Table 9. On-Demand Costs: Day 1 (USD)

Category Google Analytics 4 Snowplow eCommerce
Conversion Funnel $0.37 $0.0001
Drilldown $0.67 $0.0024
KPI $1.09 $0.0003
Machine Learning $0.13 $0.0001
Total $2.26 $0.0030
Source: GigaOm 2023

Table 10. On-Demand Costs: Day 1,092 (USD)

Category Google Analytics 4 Snowplow eCommerce
Conversion Funnel $13.17 $0.0039
Drilldown $23.73 $0.0746
KPI $38.49 $0.0087
Machine Learning $4.64 $0.0045
Total $80.03 $0.0917
Source: GigaOm 2023

As you can see, using the BigQuery on-demand rate, the cost of running queries on Day 1 of the three years is minuscule. However, by the end of three years, the daily cost of running the queries on GA4 is $80, compared to less than $0.10 for Snowplow, making Snowplow more than 800 times less expensive than GA4. Over time, as shown in Table 11, these costs add up.

Table 11. Total Cost of Ownership


Google Analytics 4 Snowplow eCommerce
3-Year Cost of Daily Builds N/A $1.0004
3-Year Total Cost of Queries $44,927.47 $52.66
Source: GigaOm 2023

One may argue that you should not use BigQuery on-demand for GA4 analytics and should instead pre-purchase compute slots for the best price. However, this point is nullified once you consider that the minimum BigQuery slot commitment, 100 slots paid in advance for three years, ends up costing $94,608. Therefore, BigQuery on-demand would still be more economical for GA4, given our scenario.

Every scenario is different, and your situation may vary greatly. However, if you use BigQuery On-demand for your web analytics, Snowplow Accelerators can save significant money over time.

6. Conclusion

The results of our study reveal that Snowplow is significantly more user-friendly, better performing, and less expensive than GA4. A big reason for this is that Snowplow ships with an e-commerce accelerator that includes dbt models that create derived tables optimized for analytics. For digital analytics, the accelerator's dbt data models offer a data mart-like experience, whereas GA4 has a significantly more complicated data structure.

Compared to GA4 queries, Snowplow queries were 42% less complex, according to our measurements.

For our queries, Snowplow was 6.9 times faster than GA4. Important user experience implications may result from this. Using the same BigQuery On-demand Analysis service, running our 20 queries against the GA4 data set took 111 seconds versus only 16 seconds for queries written against Snowplow and its e-commerce solution accelerator.

Snowplow queries scan over 99 percent less data than GA4 queries. According to our analysis, after three years' worth of data growth, the cost of running queries on GA4 was more than 800 times higher than the cost of running them on Snowplow. The efficient design of the Snowplow data models is primarily responsible for this difference.

Snowplow’s ability to handle large volumes of data with ease makes it a more reliable option for organizations dealing with massive amounts of data. Moreover, Snowplow provides granular-level data that enables businesses to gain deeper insights into their customers’ behavior and preferences, which can be used to improve their products and services.

In our comparison, Snowplow is the clear choice for performing the web analytics needed to stay competitive in today's market. Its features allow businesses to monitor their data more effectively than ever before, giving them an edge.

 

7. About Snowplow

Snowplow generates, governs, and models high-quality, granular behavioral data, ready for use in AI, ML, and Advanced Analytics applications. When integrated with other tools from the modern data stack, Snowplow can power a wide variety of advanced use cases, allowing organizations to drive significant business value with behavioral data. Data products built on Snowplow include the composable CDP, first-party digital analytics, and ML-powered churn reduction for subscription businesses.

For more information on Snowplow, visit our website.

8. About William McKnight

William McKnight is a former Fortune 50 technology executive and database engineer. An Ernst & Young Entrepreneur of the Year finalist and frequent best practices judge, he helps enterprise clients with action plans, architectures, strategies, and technology tools to manage information.

Currently, William is an analyst for GigaOm Research who takes corporate information and turns it into a bottom-line-enhancing asset. He has worked with Dong Energy, France Telecom, Pfizer, Samba Bank, ScotiaBank, Teva Pharmaceuticals, and Verizon, among many others. William focuses on delivering business value and solving business problems utilizing proven approaches in information management.

9. About Jake Dolezal

Jake Dolezal is a contributing analyst at GigaOm. He has two decades of experience in the information management field, with expertise in analytics, data warehousing, master data management, data governance, business intelligence, statistics, data modeling and integration, and visualization. Jake has solved technical problems across a broad range of industries, including healthcare, education, government, manufacturing, engineering, hospitality, and restaurants. He has a doctorate in information management from Syracuse University.

10. About GigaOm

GigaOm provides technical, operational, and business advice for IT’s strategic digital enterprise and business initiatives. Enterprise business leaders, CIOs, and technology organizations partner with GigaOm for practical, actionable, strategic, and visionary advice for modernizing and transforming their business. GigaOm’s advice empowers enterprises to successfully compete in an increasingly complicated business atmosphere that requires a solid understanding of constantly changing customer demands.

GigaOm works directly with enterprises both inside and outside of the IT organization to apply proven research and methodologies designed to avoid pitfalls and roadblocks while balancing risk and innovation. Research methodologies include but are not limited to adoption and benchmarking surveys, use cases, interviews, ROI/TCO, market landscapes, strategic trends, and technical benchmarks. Our analysts possess 20+ years of experience advising a spectrum of clients from early adopters to mainstream enterprises.

GigaOm’s perspective is that of the unbiased enterprise practitioner. Through this perspective, GigaOm connects with engaged and loyal subscribers on a deep and meaningful level.

11. Copyright

© Knowingly, Inc. 2023 "Digital Analytics and Measurement Tools Evaluation" is a trademark of Knowingly, Inc. For permission to reproduce this report, please contact sales@gigaom.com.