Edge Computing: Drivers and Trends
Edge computing sounds mystical, but it has been around in some form for a while. Edge computing architectures are designed to
keep data close to its point of consumption rather than in the cloud. The shift from user-generated to machine-generated
data and the advent of technologies like IoT, artificial intelligence, virtual reality, and high-definition gaming and streaming
have created data volumes at galactic scale, demanding a decentralized internet in which far more data is created, processed
and stored at the edge.
Cloud Principles at the Edge
Edge is not here to displace the cloud, as initially
believed; it is an extension of the cloud. On-demand
resources and the abstraction of physical infrastructure,
delivered at locations close to consumers, are the principles
applied. The approaches to managing edge locations
resemble those of the cloud; it is simply a matter of which
workloads are better processed in the cloud versus at the
edge. Traditional cloud players are extending their edge
computing initiatives, such as AWS with AWS Outposts and
Microsoft with Azure Stack Edge. Technically, these are fully
assembled stacks of compute and storage resembling their
own data centers, just installed at the customer premises but
monitored, maintained and serviced by the vendors. Cloud
providers have made it clear that they are extending the
cloud through edge offerings.
New Capabilities for the Edge
Edge will entail a variety of architectures spread across
many locations. Vendors and enterprises alike have been
looking to extend their processes, tools and capabilities
across cloud and edge, and will need software that evolves
to run across multiple locations. The common design will
be Kubernetes-based: not just managing containerized
applications, but managing distributed fleets of Kubernetes
clusters. The first opportunities might arise from moderately
distributed environments, spanning a few regions and a few
clouds; this lays a strong foundation for greater scale in the future.
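The fleet-management pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not a production tool: the cluster context names are invented, and real deployments would use a GitOps controller or federation tooling rather than looping over `kubectl` directly.

```python
# Sketch: rolling one manifest out across a fleet of Kubernetes cluster
# contexts (one cloud region, several edge sites). Context names are
# hypothetical examples.
import subprocess

CLUSTERS = ["cloud-us-east", "edge-factory-01", "edge-retail-02"]

def apply_everywhere(manifest_path, clusters=CLUSTERS, dry_run=True):
    """Build (and optionally run) 'kubectl apply' for every cluster context."""
    commands = [
        ["kubectl", "--context", ctx, "apply", "-f", manifest_path]
        for ctx in clusters
    ]
    if dry_run:
        # Return the commands without executing, useful for review/testing.
        return commands
    return [subprocess.run(cmd, check=True) for cmd in commands]
```

The same deployment spec is pushed to every location, which is exactly why a consistent, Kubernetes-shaped control plane across cloud and edge matters.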
AI Enabled Edge
Many AI applications are very cloud-centric today. As new
technologies and use cases emerge, e.g., VR and autonomous
driving, latency will reach unacceptable levels given the
real-time nature of the required outcomes. More organizations
are embracing edge computing and hyperconverged
infrastructure to harness AI capabilities. While the cloud
created the initial enablement for AI, the edge will create scale.
AI will also deliver capabilities to manage and operate the
edge itself, e.g., self-healing applications, self-healing
infrastructure and self-repairing code.
Edge Thinking
Given the rapid evolution of this space, edge-oriented
thinking, design principles and resources are going
mainstream. The movement of technologies, use cases,
economics and consumers is increasingly in favor of
decentralization, even if edge computing at real scale is
still in its early days. Judging by the progress of cloud
computing (less than 15 years), the edge and the decentralized
internet are on a path to evolve and scale quickly. Rapidly
evolving specialized processors, hardware abstractions, and
micro and nano data centers are now accessible to
executives, investors, entrepreneurs and developers, which
will accelerate the creation of unique architectures.
Edge, Beyond Latency
The edge is closer to the point of consumption (end
user or device) than the cloud provider. The traditional
edge narrative has centered on new technologies, new
latency-sensitive use cases and new economics, all enabled
by minimizing latency. While latency mitigation is important, the
more valuable use case is network decongestion through traffic
reduction, called “Cloud Offload” in the industry. The sheer
growth of data generated by users, sensors and
devices will create a tipping point for cloud offload.
The costs of moving bandwidth-hungry data such as audio, video
and industrial telemetry are non-trivial. Edge computing creates
value from data at its point of generation; a subset of
this data can also be sent to the cloud for backups or to enable
deep learning.
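Cloud offload reduces to a simple idea: process raw readings locally and uplink only a compact summary. The sketch below is illustrative only; the field names and anomaly threshold are invented for the example.

```python
# Sketch of "cloud offload": summarize raw sensor readings at the edge
# and forward only aggregates plus anomalous raw values to the cloud.
# Threshold and payload shape are hypothetical.
def offload(readings, anomaly_threshold=90.0):
    """Return the compact payload worth sending over the uplink."""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "anomalies": anomalies,  # raw values kept only for outliers
    }

payload = offload([20.0, 21.5, 95.2, 19.8])
# One anomaly plus two aggregates cross the network instead of four raw readings.
```

The bandwidth saved scales with the ratio of raw readings to anomalies, which is why high-volume sensor and video workloads are the first candidates for offload.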
Edge Momentum
Certain workloads are more optimal on-premises, but
organizations want to reap the benefits of edge computing
without the burden and TCO of on-prem, creating the need
for a different kind of edge infrastructure based on cloud
principles. Edge computing momentum will evolve in waves
driven by needs, use cases and costs.
Wave 0: COVID-19 Edge
With the unfolding of the coronavirus pandemic, offices,
schools, content and commerce moving online have placed a
huge burden on the existing global network infrastructure.
Edge computing applications once considered nascent have
started moving quickly, given that the internet was not designed
to handle these volumes of traffic in a centralized manner.
Much of this traffic is local and bursty, since employees of the
same offices and students of the same schools are working
remotely, strengthening the case for the edge and catalyzing
quick migrations and edge-enabled applications.
Wave 1: Quasi Edge
This wave overlaps with the regional footprint of the cloud
providers; its primary driver will be enabling customers to
leverage lower latency and lower cost. Secondary drivers
include multi-cloud strategies, security, hybrid approaches
and avoiding vendor lock-in.
Wave 2: Location Centric Edge
This architecture will place infrastructure and applications
in hundreds of locations, not just within the footprint of the
cloud providers. CDN providers, by caching static content at
the location of consumption, have been early enablers of the
edge, with many more locations than the average cloud player.
CDNs have opened their infrastructure to a multitude of
workloads, offering VMs as a service, bare metal as a service,
containers as a service and serverless functionality,
mimicking cloud providers while extending to more regional
infrastructure options.
Wave 3: High-Density Edge
This wave will finally expand the edge to full scale, i.e., one
hop from the point of consumption. Micro and nano data
centers, down to a single rack, could be deployed in remote
areas, near base stations and elsewhere. Innovations in power,
cooling and clean energy will drive higher-density
infrastructure in smaller, ubiquitous data centers. Today, the
ROI of the High-Density Edge works out only marginally
better than that of the Location Centric Edge. Wave 3 will go
mainstream only as killer apps emerge: drones, autonomous
driving, smart cities, VR-enabled surgery, robotics and other
use cases whose need for short network routes, ultra-low
latency and safety will drive it.
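The "one hop" argument comes down to physics: distance alone sets a floor on round-trip latency before any processing happens. The arithmetic below uses the common rule of thumb that light in fiber covers roughly 200 km per millisecond; the specific distances are illustrative.

```python
# Back-of-the-envelope propagation latency floor (illustrative distances).
# Light in fiber travels ~200 km per millisecond (about 2/3 of c).
FIBER_KM_PER_MS = 200.0

def rtt_floor_ms(distance_km):
    """Minimum round-trip time imposed by distance alone, in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

cloud_region = rtt_floor_ms(2000)  # distant cloud region: 20.0 ms floor
edge_site = rtt_floor_ms(20)       # one-hop edge site:     0.2 ms floor
```

A 20 ms floor is already too high for control loops in robotics or autonomous driving, which is why those use cases push compute to within one hop of the device.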
Conclusion
Trends and use cases reveal the promise behind edge computing and the need to decentralize internet infrastructure,
applications and data for efficiency, cost, performance and speed. Some aspects of edge computing are hyped, while others
are less understood.
About the Author:
Nitin Kumar is a two-decade veteran of the hi-tech industry. He is currently the CEO of Appnomic and has
played a variety of hands-on executive roles, ranging from CEO, Chief Growth Officer, Chief
Transformation Officer, M&A Integration/Separation Leader and BU Head to Management Consulting
Partner (corporate and PE portfolio companies). His passion is propelling organizations to greater levels
of success through strong relationships and differentiated solutions. He is considered a business
builder, thought leader and pioneer of many innovative approaches. Nitin Kumar is a member of the
Forbes Technology Council and shares his ideas and thoughts on the forum regularly.
