What Makes Something Edge Native?

Simplistic definitions characterize the edge as simply moving workloads closer to end users to reduce the network latency associated with centralized clouds. That is an important component, but reducing network latency is only a third of the picture. What makes a service edge native is reducing all three sources of latency: compute, network, and storage.

Why Is Edge in Demand?

Imagine that your cloud region is close to your end users, so network latency is already under 20 milliseconds. Given the sizable footprint of cloud data centers worldwide, edge services would then be needed only for remote locations, and you would expect demand to be low. But that’s not the case; we see a lot of demand for edge solutions, mainly because of the compute and storage components of latency.

Compute latency is the time it takes to process a request, from provisioning a compute instance to returning the result. Storage latency is the time required to retrieve the relevant data.
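As a rough illustration, the end-to-end latency of a request can be thought of as the sum of these three components. The TypeScript sketch below uses purely illustrative numbers; they are assumptions for the example, not measurements from any platform.

```typescript
// Illustrative latency budget for a single request; the numbers are placeholders.
const latencyBudgetMs = {
  network: 15, // round trip between the user and the point of presence
  compute: 5,  // starting the execution environment and running the handler
  storage: 10, // retrieving the data the handler needs
};

// End-to-end latency is roughly the sum of the three components, which is why
// optimizing only the network leaves most of the budget untouched.
const totalMs = Object.values(latencyBudgetMs).reduce((sum, ms) => sum + ms, 0);
console.log(`end-to-end request latency ≈ ${totalMs} ms`);
```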

The Need for Speed

To reduce compute-related latency, edge service providers offer edge-native execution environments based on technologies such as WebAssembly, a binary instruction format designed as a portable compilation target for programming languages. WebAssembly offers cold starts of under one millisecond, which is particularly important when traffic varies widely, because services can scale out without impacting performance.
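To see why those cold starts can be so short, the minimal Node.js/TypeScript sketch below times the instantiation of a precompiled WebAssembly module. The ./add.wasm file and its exported add function are hypothetical placeholders for the example; any small module would behave similarly.

```typescript
import { readFileSync } from "node:fs";

async function main() {
  // Hypothetical module: ./add.wasm is assumed to export an add(a, b) function.
  const bytes = readFileSync("./add.wasm");

  // Compilation can happen once, ahead of time; instantiation is the
  // per-request "cold start" cost that edge runtimes keep very small.
  const module = await WebAssembly.compile(bytes);

  const start = performance.now();
  const instance = await WebAssembly.instantiate(module);
  const elapsedMs = performance.now() - start;

  const add = instance.exports.add as (a: number, b: number) => number;
  console.log(`instantiated in ${elapsedMs.toFixed(3)} ms, add(2, 3) = ${add(2, 3)}`);
}

main();
```

Because instantiating a compiled module is so cheap, a platform can spin up a fresh, isolated instance per request or per traffic spike rather than keeping warm servers on standby.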

To reduce storage latency, edge solutions use key-value stores, which offer very fast reads and writes because the database looks up a single key and returns its associated value rather than scanning tables or evaluating complex queries.
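To make that access pattern concrete, here is a minimal TypeScript sketch. The EdgeKV interface and the session-store usage are hypothetical, loosely modeled on the key-value bindings edge platforms expose rather than any specific product’s API.

```typescript
// Hypothetical shape of a key-value binding exposed by an edge runtime.
interface EdgeKV {
  get(key: string): Promise<string | null>;
  put(key: string, value: string, options?: { expirationTtl?: number }): Promise<void>;
}

// A single-key read: no joins and no query planning, just look up the key
// and return the value replicated close to the user.
async function getSession(kv: EdgeKV, sessionId: string): Promise<string | null> {
  return kv.get(`session:${sessionId}`);
}

async function saveSession(kv: EdgeKV, sessionId: string, data: object): Promise<void> {
  // Values are opaque strings or bytes; any structure lives in the application.
  await kv.put(`session:${sessionId}`, JSON.stringify(data), { expirationTtl: 3600 });
}
```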

Reducing network latency is not a simple endeavor either, and it can take multiple forms, which give us the terms “far edge” and “near edge.” The far edge hosts compute instances with third-party infrastructure-as-a-service providers, delivering network latency of under 20 milliseconds. The near edge consists of compute instances deployed on premises and managed by the customer, for negligible network latency.

Is this speed necessary? We’ve seen multiple reports showing that an n-second increase in page load time leads to an X% decrease in conversions. But I don’t think this is the only reason for investing in technologies that reduce latencies and load times. Over time, the amount of data and processing associated with a web page or web service has increased considerably. If we kept using the same technologies, we would quickly outgrow the performance standards expected of modern services. I believe that building and running edge-native services future-proofs infrastructure and removes future innovation bottlenecks.

Next Steps

To learn more, take a look at GigaOm’s edge development platforms Key Criteria and Radar reports. These reports provide a comprehensive overview of the market, outline the criteria you’ll want to consider in a purchase decision, and evaluate how a number of vendors perform against those decision criteria.

If you’re not yet a GigaOm subscriber, sign up here.