Key Criteria for Evaluating Edge Infrastructure

Table of Contents

  1. Summary
  2. Key Criteria Report Methodology
  3. Considerations About the Evaluation Criteria
  4. Table Stakes
  5. Key Criteria
  6. Key Criteria: Impact Analysis
  7. Near-Term Game-Changing Technology
  8. Conclusion

1. Summary

The edge market has been growing by leaps and bounds, even though many people remain uncertain about what the edge is and which technology components enable edge applications. With the rollout of 5G wireless technology and the proliferation of Internet of Things (IoT) devices, the market for edge enablement technology is projected to grow dramatically. Industries and municipalities have taken notice of potential edge use cases and are starting to explore possible deployment models. Edge applications need a place to run, and that place will be some iteration of the infrastructure edge.

Understanding the opportunity and building an edge computing strategy today will help technology leaders improve several processes and increase competitiveness. Edge computing is a fundamental component of modern infrastructure because it allows data to be processed and consumed where it is created, while sending only the necessary information to the network core and cloud. Edge infrastructure has specific characteristics that enable it to respond to unique requirements while creating an improved experience for customers. Power consumption, space utilization, ease of use, reliability, and more are all crucial aspects that determine the success or failure of the overall strategy.

Thinking about edge infrastructure as an extension of the existing infrastructure rather than as a separate entity will increase the speed of new application development while improving day-to-day operation. Total cost of ownership (TCO) and overall simplicity should always remain among the top priorities for edge computing deployments.

Opportunities at the Edge

There are hundreds, if not thousands, of edge use cases that provide an opportunity to improve efficiency, enhance customer experiences, reduce waste, and curtail shrinkage. The following are a few sample use cases from different industries.

Manufacturing

A well-known manufacturing company is using IoT sensors in its assembly line to augment the existing sensors built into the original equipment. The new IoT sensors measure vibrations of the equipment at hundreds of different points. Data from those sensors is merged with information from the original equipment sensors to build a model of machine health. Anomalies can be detected with greater accuracy, which leads to less downtime and higher production line efficiency. Sensor data is sent to an infrastructure edge device for analysis and model building and then to the cloud for long-term storage, aggregate analysis, and modeling across multiple locations.
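
The edge-side analysis described above can be illustrated with a minimal sketch. It flags anomalous vibration readings using a simple z-score against a rolling baseline; a real deployment would use a model trained per machine, and the window size, threshold, and signal values below are invented for illustration.

```python
import statistics

WINDOW = 50       # number of recent readings kept as the rolling baseline
THRESHOLD = 3.0   # z-score above which a reading is considered anomalous

def detect_anomalies(readings, window=WINDOW, threshold=THRESHOLD):
    """Return indices of readings that deviate sharply from the rolling
    baseline formed by the preceding `window` samples."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.fmean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev > 0 and abs(readings[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# A steady periodic signal with one vibration spike injected at index 60.
signal = [1.0 + 0.01 * (i % 5) for i in range(100)]
signal[60] = 5.0
print(detect_anomalies(signal))  # the spike at index 60 is flagged
```

Running this kind of lightweight detection at the infrastructure edge keeps the high-volume raw sensor stream local, so only anomalies and aggregates need to travel to the cloud.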

Shipping and Logistics

Shipping companies can encounter challenges with the proper arrangement, loading, and contents of pallets for shipping goods. One solution put into practice merges machine vision technology from smart cameras with data from the enterprise resource planning (ERP) system to track the packing of pallets in real time. Workers receive immediate feedback on their packing process, and the company sees reduced shrinkage, improved packing efficiency, and fewer shipping anomalies. An infrastructure edge server integrates the machine vision output with ERP data and aggregates the data for analysis.
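
The integration step can be sketched in a few lines. This hypothetical example reconciles items detected by a smart camera against the pallet manifest from the ERP system; the SKU names and data shapes are invented for illustration.

```python
from collections import Counter

def reconcile(manifest, detected):
    """Compare ERP-expected item counts with camera-detected counts and
    return per-SKU discrepancies (positive = missing, negative = extra)."""
    expected = Counter(manifest)
    seen = Counter(detected)
    return {sku: expected[sku] - seen[sku]
            for sku in expected | seen
            if expected[sku] != seen[sku]}

manifest = ["SKU-1"] * 4 + ["SKU-2"] * 2              # what the ERP expects
detected = ["SKU-1"] * 3 + ["SKU-2"] * 2 + ["SKU-9"]  # what the camera saw
print(reconcile(manifest, detected))  # {'SKU-1': 1, 'SKU-9': -1}
```

Feedback like this can be surfaced to workers immediately because the reconciliation runs on the edge server next to the camera, not in a distant cloud region.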

Gaming

Hatch, a subsidiary of gaming company Rovio, has been leveraging edge computing provided in micro-data centers by Packet. Hatch uses specialized hardware in Packet’s edge locations to run local multiplayer games in real time, providing an enhanced experience for consumers on lower-end Android devices that may lack the memory or CPU to run higher-end games. By streaming the game from a local edge point in sub-millisecond proximity to the device, gamers enjoy a rich gaming experience on inexpensive hardware. This edge deployment model has given Hatch a competitive advantage in emerging markets across the globe.

Defining the Edge

Edge is an immature market and has been defined in different, often conflicting ways. To provide a level of consistency, organizations like the Linux Foundation Edge (LF Edge) alliance have attempted to establish common terminology with their Open Glossary of Edge Computing. They have also produced an edge landscape, available on GitHub, that helps define common categories for edge solutions. We will be using many of the definitions laid out by the LF Edge alliance; the most relevant solutions for this report are those in the infrastructure, platform, and real estate categories. For the purposes of this report, we define the edge as encompassing the technology stack that sits outside of cloud providers, on-premises data centers, and enterprise offices.

The edge can be thought of as layers, with each layer defined by its compute capacity, power restrictions, and network connectivity. The device edge is the outermost layer, and it includes end-user devices and sensors located in the ‘last mile’ of the network. The next layer – and the focus of this report – is the infrastructure edge, connected to the device edge by an access network. The infrastructure edge typically sits in close proximity to the device edge, but has the characteristics of a traditionally hosted data center. The infrastructure edge may, in turn, be interconnected across many sites using an aggregation edge layer of networking, or through a core network to a public cloud provider or regional data center.

The infrastructure edge represents a sweet spot for many edge applications. While it would be nice to run applications on the device edge, in many cases that approach has limitations: it can lack sufficient computational power, data storage, or electrical power. Devices at the device edge may have upgrade cycles of 10 to 15 years in the industrial space, making them a static platform for application development. It is also possible to run edge applications on a public cloud provider or a traditional on-premises data center, but in that case there may be too much network latency or insufficient bandwidth to the device edge for an application to be effective. The infrastructure edge sits in the middle, with sufficient network bandwidth and lower latency than an upstream solution. It also offers greater computational power, more data storage, and dependable electrical power. While infrastructure edge equipment will not be as easy to upgrade as a traditional data center, it is still simpler to upgrade than thousands or millions of device edge endpoints.
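
The placement trade-off described above can be made concrete with a toy model: pick the closest tier that satisfies an application's latency and compute requirements. The latency and compute figures below are illustrative placeholders, not measured values.

```python
TIERS = [
    # (name, round-trip latency in ms to the device edge, relative compute)
    ("device edge", 1, 1),
    ("infrastructure edge", 10, 50),
    ("cloud / core data center", 80, 1000),
]

def place_workload(max_latency_ms, min_compute):
    """Return the first (closest) tier meeting both constraints, or None."""
    for name, latency, compute in TIERS:
        if latency <= max_latency_ms and compute >= min_compute:
            return name
    return None

# A real-time analytics job needing <=20 ms round trips and more compute
# than a device can offer lands on the infrastructure edge.
print(place_workload(max_latency_ms=20, min_compute=10))
```

In this toy model, workloads that tolerate high latency fall through to the cloud, while latency-sensitive workloads too heavy for devices settle on the infrastructure edge, which is exactly the "sweet spot" argument.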

This report will take a look at the infrastructure edge solutions that exist in the market through the lens of key criteria and metrics. Since the market is still so immature, the various solutions differ in significant ways regarding their approach, capabilities, and integrations. By establishing a concrete set of categories and a standard set of criteria, readers will be able to better determine whether those solutions meet the requirements of their use case.

The key criteria in the report have an impact on the following metrics:

  • Ease of Deployment and Management
  • Scalability
  • System Lifespan
  • Solution and Partner Ecosystem
  • TCO
  • Performance
  • Efficiency

These metrics help an organization measure how appropriate a solution is for its use case.
