Will The Cloud Find a New Home At The Edge?

Markus Nispel, Chief Technology Officer (CTO), EMEA | Published 26 Jan 2023

Organizations continue to extend and distribute their enterprise to the edge, where customers, employees, and assets are located, and they are connecting everything and everyone digitally. As the world becomes increasingly connected and distributed, the concept of the Infinite Enterprise is becoming more relevant than ever. However, the growing data demands of business applications are putting pressure on organizations to find ways to distribute data faster and reduce latency. This is driving the need for more advanced data management and processing capabilities beyond what is offered by the traditional models where applications operate solely in the public and private clouds provided by hyperscalers like Amazon, Google, and Microsoft.

The cloud has been the go-to solution for most organizations looking to store, process, and manage their data and applications. However, as the volume of data generated by enterprise networking devices, IoT devices, and 5G networks increases, the cloud may not always be the best option. Edge computing is a rapidly emerging trend that is driving digital transformation across industries, from manufacturing and smart buildings to education. In situations where low latency, real-time processing, and data sovereignty are important, edge computing may be a more suitable solution. Edge computing allows data to be processed at or near the source, reducing the need to transfer large amounts of data to the cloud and providing faster processing times.

According to Gartner, “by 2025, more than 50% of enterprise-managed data will be created and processed outside the data center or cloud.” But what exactly is driving this trend?

First, edge computing is evolving due to several factors, but primarily because of latency concerns. As more data is generated at the edge, the focus will also shift toward reducing bandwidth costs by placing more computing power closer to the source.

Second, the explosion of devices and data is one of the most significant trends in the technology industry today. According to Gartner, the number of Internet of Things (IoT) devices is projected to triple from 2020 to 2030, with a compound annual growth rate (CAGR) of 11%. Along with this growth in devices, the data they produce is also expected to increase exponentially, driven by larger data streams and bandwidth-intensive applications.

There are three key advantages to moving applications to the edge:

  1. Enhanced response time and application experience: More and more applications are being created for immersive experiences that require real-time processing and low latency. Augmented reality (AR) and virtual reality (VR) in the enterprise are the two most cited examples, as even a small delay in processing can greatly impact the user experience. However, businesses already use a wide range of applications that will benefit from edge processing, such as smart city solutions for traffic management, parking, and public transportation that provide real-time information to citizens and city officials. Low latency is particularly important for real-time applications such as autonomous vehicles and industrial automation, where a delay of even a few microseconds can have significant consequences. The edge, with its ability to process data locally, can provide the low latency and real-time decision making needed to support these use cases and, just as important, improve the overall user experience.
  2. Reduced bandwidth and compute costs: Although latency concerns have been a primary driver, cost savings will also drive the move toward the edge. The edge can act as a buffer, processing and filtering data before it is sent to the cloud, reducing the load on the cloud and improving its overall performance (a minimal sketch of this edge-side filtering follows the list below).
    • Gartner estimates that by 2025, “bandwidth cost will be the primary driver for new edge computing deployments, versus latency in 2021.” We already know the amount of data created at the edge is growing at exponential rates. So, by reducing the amount of data sent to the cloud, organizations can realize tremendous savings on bandwidth costs.
    • This also allows for more efficient use of resources, as the cloud can focus on more complex tasks, such as machine learning and analytics. However, let’s face it, compute costs in the traditional cloud model are also expensive. There can be significant savings in compute costs when data is processed at the edge, reducing the need to transfer it to the cloud at all. With the proper edge solution, even complex computation is possible and, according to Gartner, “by 2027, machine learning (ML) in the form of deep learning (DL) will be included in over 65% of edge use cases, up from less than 10% in 2021.”
  3. Increased data sovereignty: Data sovereignty is becoming an increasingly important consideration as organizations look to manage and process the growing volume and diversity of data generated at the edge. With the need for faster data distribution and reduced latency comes a growing need for advanced techniques for managing and processing data that preserve data sovereignty. This is driving demand for distributed data management and processing methods that keep data within geopolitical boundaries and ensure compliance with data privacy regulations and laws.
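To make the bandwidth argument in point 2 more concrete, here is a minimal sketch of edge-side preprocessing in Python. The sensor feed and the cloud ingestion endpoint are hypothetical placeholders rather than any specific product API; the idea is simply that an edge gateway aggregates a batch of raw readings into one compact summary and forwards only that summary upstream.

```python
import json
import statistics
import time
import urllib.request
from typing import List

# Placeholder endpoint for a cloud ingestion service -- not a real URL.
CLOUD_INGEST_URL = "https://cloud.example.com/ingest"


def read_sensor_batch(n: int = 600) -> List[float]:
    """Placeholder for reading raw samples from locally attached IoT sensors."""
    return [20.0 + 0.01 * i for i in range(n)]


def summarize(samples: List[float]) -> dict:
    """Reduce a large batch of raw samples to one small summary record."""
    return {
        "count": len(samples),
        "mean": statistics.fmean(samples),
        "min": min(samples),
        "max": max(samples),
        "ts": int(time.time()),
    }


def forward_to_cloud(summary: dict) -> None:
    """Send only the compact summary upstream instead of every raw sample."""
    body = json.dumps(summary).encode()
    req = urllib.request.Request(
        CLOUD_INGEST_URL, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req, timeout=5)


if __name__ == "__main__":
    raw = read_sensor_batch()           # e.g. 600 samples collected at the edge
    forward_to_cloud(summarize(raw))    # a single small record leaves the site
```

Even this trivial reduction turns hundreds of samples into a single record per interval, which is where the bandwidth savings come from.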

The Linux Foundation, a non-profit organization that promotes the use of Linux and open-source software, has a positive view of edge technologies. It sees edge computing as a way to bring the power of the cloud closer to the edge of the network, where it can be used to improve the performance and security of a variety of applications.

So where exactly does the edge reside? Depending on the technology, and depending on whom you talk to, you will get different answers. As shown in Figure 1, various terms are used when discussing “the edge.”

Figure 1 – Edge examples

The hyperscalers already offer cloud edge computing services. Gateways are placed at the edge of a network and are used to process and store data locally. They often run lightweight versions of cloud services, such as AWS Greengrass, Azure IoT Edge, and Google Cloud IoT Edge.

Discussions of 5G cellular are centered around the telco edge. The Radio Access Network (RAN) is the part of a mobile network that connects mobile devices to the core network. The RAN includes the base stations, or cell sites, that transmit and receive radio signals to and from mobile devices. Mobile Edge Computing (MEC) brings computing power closer to the edge of a mobile network by deploying servers and data centers at the base station level. Together, MEC and RAN technologies enable telco edge computing.

However, for the purpose of this blog post, I will focus mainly on the edge in relation to enterprise networking. The enterprise edge refers to the network and compute resources situated at the boundary of an enterprise network, near the end users or devices. These may include edge devices such as routers, switches, and access points, as well as general compute resources such as appliances and gateways. One could think of all of those components as a single, distributed compute platform that runs applications suited to those edge deployments.

Of course, there are also the client devices themselves. The device edge refers to the processing and analysis of data done directly on the devices that generate it, such as IoT devices, smartphones, and cameras. As the compute capacity on devices continues to increase, it will provide unique opportunities to rethink application processing.

Cloud technologies have revolutionized the way we think about deploying and scaling applications. Containers, in particular, have become increasingly popular due to their lightweight and portable nature, making them well-suited for edge computing. They provide a level of isolation and security that is essential for enterprise-level deployments. But what if we took this concept one step further?

Instead of always sending data and applications to the cloud, an alternative approach is to keep them local and use cloud-native toolchains to deploy them on the enterprise edge. An Intelligent Enterprise Edge can be built that utilizes the networking and excess compute infrastructure of the enterprise edge, such as access points, switches, routers, and other customer-premises equipment (CPE), in a distributed processing fashion.

This approach would make it possible to deploy and manage cloud-native applications at the enterprise edge. Computation could occur in partner data centers, customer data centers, the access-layer networking equipment (switches, APs, etc.), or any combination of these, providing a level of flexibility and scalability that traditional enterprise edge solutions cannot match.
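As one illustration of what such a cloud-native toolchain could look like, here is a minimal sketch that uses the Kubernetes Python client to schedule a containerized workload onto edge nodes. It assumes a lightweight Kubernetes distribution (for example K3s) is already running on the edge equipment and that those nodes carry a hypothetical "edge=true" label; the application name, image, and resource limits are placeholders, not a description of any specific product.

```python
# Requires the official Kubernetes Python client: pip install kubernetes
from kubernetes import client, config


def deploy_edge_app() -> None:
    config.load_kube_config()  # use whatever cluster credentials are configured locally
    container = client.V1Container(
        name="edge-analytics",                            # hypothetical application name
        image="registry.example.com/edge-analytics:1.0",  # placeholder image
        resources=client.V1ResourceRequirements(
            limits={"cpu": "250m", "memory": "128Mi"}     # sized for small edge hardware
        ),
    )
    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="edge-analytics"),
        spec=client.V1DeploymentSpec(
            replicas=1,
            selector=client.V1LabelSelector(match_labels={"app": "edge-analytics"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "edge-analytics"}),
                spec=client.V1PodSpec(
                    node_selector={"edge": "true"},       # schedule only onto edge nodes
                    containers=[container],
                ),
            ),
        ),
    )
    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)


if __name__ == "__main__":
    deploy_edge_app()
```

The appeal of this manifest-driven style is that the edge becomes just another scheduling target, so existing CI/CD pipelines could deploy to access-layer hardware much the way they deploy to the cloud today.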

Down the road, we can create applications that utilize coordinated data processing across a wireless mobile device and an access point in combination. For example, a mobile application could use the processing power of both the client device and the access point to provide real-time analytics and insights. Similarly, an edge IoT gateway can use data from IoT sensors to adjust its own settings and functions in real time, resulting in improved efficiency and reduced downtime.
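A minimal sketch of that last idea, assuming a hypothetical gateway with a locally attached sensor: the gateway tunes its own polling interval based on how volatile recent readings are, and the decision is made locally with no cloud round trip. All names and thresholds are illustrative.

```python
import random
import statistics
import time
from typing import List

POLL_FAST_S = 1.0    # poll quickly while readings are changing
POLL_SLOW_S = 10.0   # back off when readings are stable


def read_sensor() -> float:
    """Placeholder for a reading from a locally attached IoT sensor."""
    return 20.0 + random.gauss(0, 0.5)


def choose_interval(window: List[float]) -> float:
    """Pick the next polling interval from the volatility of recent readings."""
    if len(window) < 5:
        return POLL_FAST_S
    return POLL_FAST_S if statistics.pstdev(window) > 0.4 else POLL_SLOW_S


def control_loop(iterations: int = 10) -> None:
    window: List[float] = []
    for _ in range(iterations):
        window.append(read_sensor())
        window = window[-30:]               # keep a short rolling window
        interval = choose_interval(window)  # decision made locally, no cloud round trip
        time.sleep(interval)


if __name__ == "__main__":
    control_loop()
```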

To develop these new applications, one would expect additional services to be needed. To expand on the vision, at some point the telco and service provider edge could collapse into the Intelligent Edge ecosystem. The arrival of 6G cellular technology, expected as early as 2030, could certainly become a major factor in this integration.

To support the development of edge computing, the Linux Foundation has created the LF Edge organization, a unified community for open-source edge technology that fosters cross-industry collaboration across IoT, telecom, enterprise, and cloud ecosystems.

So, will the cloud find a new home at the edge? I think the answer is yes. As stated earlier, Gartner believes that 50% of enterprise-managed data will be created and processed outside the data center or cloud by 2025. We think that by 2030 the number could be closer to 75% or higher if an Intelligent Enterprise Edge ecosystem is created to bring the benefits of cloud technologies to the next level. The concept of the Intelligent Enterprise Edge provides a new way of thinking about enterprise edge computing and how it can be used to power next-generation applications and services. This is an exciting development for the enterprise networking industry, and we look forward to seeing how it evolves. Stay tuned for more blogs on the subject in the coming years.
