Exploring the Edge and the Future of a Decentralised Internet

The Internet of today is radically different from the Internet of yesterday, reflecting a fundamental shift in how we access connectivity and what capabilities that access grants. In its wake, profound change has swept across society, revolutionising everything it touches.

Had I told you at the beginning of the new millennium that soon everything, everywhere would be connected to the Internet, a shower of laughter would have ensued. On its own, that’s a fascinating encapsulation of how much has changed in a pretty minuscule space of time.

"What was once viewed as magic is now treated as a status-quo commodity."

But, don’t fret, the Internet and the diverse ecosystems which blossom in its presence aren’t stagnating anytime soon. In fact, by all measures, the pace of advancement is accelerating, with the technological fruits of past decades converging to catalyse breakthrough innovation in the next.

We are hurtling towards an inflection point. Many of the applications that define the Internet of today focus on consumption - think streaming, browsing the web and playing multiplayer games. Tomorrow, the gulf between consumption and production won’t be so extreme, lending itself to a new era with new possibilities.

The common foundation of those possibilities is data - the new oil. Picture a world in which the interpretation of data from millions of sensors, mounted on every tangible object, compels and equips society to exploit resources efficiently and to minimise waste.

There is no singular term to describe the above vision because it represents an intersection of distinct technologies. The Internet of Things (IoT), Big Data, Blockchain and beyond have equally important roles to play in a world underpinned by data, and more specifically, the production and understanding of it.

“As we abandon the concept of the Internet acting solely as a medium for consumption of services, the physical distance over which information must traverse will become a critical differentiator.”

It is precisely here that the concept of edge computing enters the equation. While centralisation of compute resources in the cloud has facilitated the Internet’s remarkable ascension thus far, such an architecture is not suited to emerging applications, most of which focus on the production and interpretation of data at the network edge - where sensors reside and where actionable insights can make a meaningful impact.

The Edge: Reduced Latency and Strain on Transport Networks, Enhanced Local Control, Resiliency and Security.


With the foundational objective of pushing processing resources from a centralised core to the extreme periphery of a network, edge computing paves the way for faster and more efficient analysis of data. These gains stem from the reduced geographical distance between the devices that generate data and the compute resources that interpret it.
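To put a rough number on that intuition, here is a back-of-the-envelope sketch in Python. The distances are illustrative assumptions, not measurements, and only propagation delay is modelled (queuing, processing and protocol overheads are ignored); light in optical fibre covers roughly 200,000 km per second, so distance alone sets a hard floor on round-trip latency.

```python
# Propagation delay only: a lower bound, not a benchmark.
# Assumed signal speed in optical fibre: ~200,000 km/s,
# i.e. about two-thirds of the speed of light in a vacuum.
FIBRE_KM_PER_MS = 200.0  # kilometres covered per millisecond

def rtt_floor_ms(distance_km: float) -> float:
    """Lower bound on round-trip time from propagation delay alone."""
    return 2 * distance_km / FIBRE_KM_PER_MS

# Illustrative distances from a device to its serving compute.
for label, km in [("Edge site, 20 km", 20),
                  ("Regional data centre, 400 km", 400),
                  ("Distant cloud region, 4,000 km", 4000)]:
    print(f"{label}: >= {rtt_floor_ms(km):.2f} ms round trip")
```

Even before real-world overheads are added, a distant cloud region cannot meet a sub-10ms budget, while a nearby edge site sits well inside it.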

This drive to distribute processing power across multiple distinct networks stands in sharp contrast to the cloud computing model, which adopts centralisation as a means to bolster security and prevent fragmentation. Historically, concentrating and consolidating resources in one place, the cloud, allowed synergies to be achieved.

However, the challenge of enabling a hyper-connected world is straining cloud architectures. Remember, a tidal wave of data is sweeping in as the number, density and intricacy of connected devices at the network edge explodes. Connectivity is no longer a luxury limited to smartphones; think of smartwatches, smart speakers and smart TVs.


Under a cloud architecture, every bit of data produced by devices must be transported from the network edge to a centralised data centre. At the risk of stating the obvious, this is neither an efficient nor a scalable way to move data, particularly considering that actionable insights, generated from the analysis of that data, must then be fed back downstream.

“Long-haul transportation of data across vast distances exerts unnecessary pressure on the backbone infrastructure that supports the Internet.”

Edge computing disrupts this model, reducing the occurrence of capacity bottlenecks and latency in the backbone and even middle-mile networks by eliminating the need to send every bit of data from the network edge to the central cloud and back again. This is of huge benefit, particularly for mission-critical applications.
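As a minimal sketch of that pattern, imagine a hypothetical edge node that summarises a window of raw sensor readings locally and forwards only a compact record, plus any out-of-range values, upstream. The readings and threshold below are invented purely for illustration.

```python
from statistics import mean

def summarise_window(samples: list[float], threshold: float) -> dict:
    """Collapse a window of raw readings into one compact upstream record."""
    return {
        "count": len(samples),
        "mean": round(mean(samples), 2),
        "max": max(samples),
        # Only out-of-range values travel upstream in full.
        "anomalies": [s for s in samples if s > threshold],
    }

raw = [21.1, 21.3, 21.2, 21.0] * 250   # 1,000 readings from one sensor
raw[500] = 35.7                        # a single out-of-range spike
summary = summarise_window(raw, threshold=30.0)
print(f"{summary['count']} raw readings reduced to one record: {summary}")
```

One compact record crosses the backbone instead of a thousand raw samples, and the insight, the anomalous spike, still reaches the cloud.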

As an extension of the above, a move to the edge will enhance availability. Time is money, and a reduction in downtime will be welcomed with open arms. Decentralisation of compute and storage resources creates multiple concurrent layers of physical redundancy, containing the blast radius of outages through isolation.

Moving away from performance, it is of utmost importance to consider the security implications of a decentralised architecture. With the relentless “cloudification” of things touching critical infrastructure such as water and electricity supply, we are inadvertently spawning new vulnerabilities, each of which could inflict serious damage if exploited.


Decentralisation of storage resources makes it significantly more difficult for malicious actors (black-hat hackers) to isolate and capture data, effectively quarantining the information fed from sensors in small, secure pockets across the network edge. In addition, the reduced geographical distance over which data must travel lowers the likelihood of man-in-the-middle attacks.

There are some key cost savings associated with edge computing too, notwithstanding the fact that cloud computing has traditionally been viewed as the most cost-effective approach. Processing of data at the edge permits the exploitation of previously under-utilised compute resources and reduces the urgency to invest in ultra-high-capacity transport networks.

Defining the Edge with three distinct implementations.

Unsurprisingly, not everyone is enthusiastic about effectively reverting the Internet to an architecture that more closely resembles its primitive state. Many eyebrows have been raised, and many more will be. The crux of the issue is the definition of the edge: where exactly is it?

To that question, there is no simple answer. Some will refer to the edge as being the place where distributed compute and storage resources process information that is fed from sensors. But, quite frankly, that’s a cop-out because it fails to recognise the diverse nature of the edge, which will vary in form and function depending on the implementation.

An Edge for the Telecoms Industry.


Perhaps the most widely explored edge architecture in recent times is one that can be adopted by fixed and mobile operators. In this model, a telecoms company leverages its existing assets to place compute resources across cell sites, aggregation nodes and central office sites. In effect, the network edge becomes the location of telecoms infrastructure.

As an operator-controlled edge architecture, this could support the provision of in-house services such as streaming to customers in the form of an integrated content delivery network (CDN). In the battle to compete with OTT players, such an advantage would be invaluable for telecoms companies.

It could also tie in with the emergence of neutral-host infrastructure and small cells at the edge, areas of particular focus as mid-band and mmWave-based 5G NR deployments demand densification of the site grid.

However, facilitating an edge architecture with such a high degree of decentralisation across a large geographical area is riddled with challenges. There are very real concerns relating to fundamentals including power and size, and whether the demand for them can be sustainably met by operators at distributed locations.

To add to the above, while this decentralised architecture paves the way for ultra-low latency communication (<10ms), vital for mission-critical applications, achieving a high level of interconnectivity with peering is a daunting feat. We should not lose sight of the fact that server-to-server latency will remain an important factor for many use cases, and an exclusive focus on device-to-server latency moving forward would be a mistake.
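A toy latency budget makes the distinction concrete. The figures below are assumptions chosen for illustration, but they show how an ultra-low device-to-edge round trip is quickly dwarfed once a request must also consult a distant, poorly peered server.

```python
def end_to_end_ms(device_to_edge_ms: float,
                  edge_to_peer_ms: float,
                  peer_hops: int = 1) -> float:
    """Total round trip: device <-> edge, plus any server-to-server hops."""
    return device_to_edge_ms + peer_hops * edge_to_peer_ms

# Edge-only path: comfortably within a 10 ms budget.
print(end_to_end_ms(device_to_edge_ms=8.0, edge_to_peer_ms=0.0))    # 8.0
# Same path, plus one lookup to a remote server with poor peering.
print(end_to_end_ms(device_to_edge_ms=8.0, edge_to_peer_ms=40.0))   # 48.0
```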

Data Centres. But smaller and more dispersed.


For some (conservative) data centre owners, the emergence of edge computing is viewed as a threat of unprecedented magnitude. But, if we take a step back and examine the bigger picture, it is vividly apparent that data centre owners are in a prime position to capitalise on the decentralisation of their assets.

In practice, an edge architecture for data centre owners means expanding compute and storage facilities into smaller cities and towns. As decentralisation is an expensive and time-consuming process, it will need to be methodically planned to ensure facilities are positioned where ultra-low latency matters most to applications.

Even with a large geographical distribution of facilities, it will be of paramount importance to maintain an adequate level of interconnectivity, with fibre routes to multiple telecoms operators. It is through this interconnectivity that data centre owners can differentiate their services from those offered by fixed and mobile operators.


The high level of interconnectivity at data centres sets them apart. This is a key feature for applications that need to access databases in the central cloud with minimal latency (server-to-server). Think about biometric authentication as one such application.

Unfortunately, just as challenges apply to an operator-controlled edge architecture, the same is true with one deployed by data centre owners. Oftentimes, multinational companies are the largest customers of data centres, and they require ease of access to many global markets.

It will be incredibly difficult for data centre owners to scale their geographical presence at the edge, particularly across countries with varying regulations on issues such as data protection. Moreover, there are bound to be complications in ensuring peering runs smoothly across edge facilities.

An Edge defined by software for software developers.


There is another, more abstract, method of enabling edge computing: using software to intelligently aggregate and exploit decentralised compute resources at the edge. With this architecture, edge infrastructure is transformed into something on-demand and omnipresent.

For clarity, the core driving force behind the aggregation of multiple edge facilities is the desire to create a unified platform on which software developers can test and deploy their products. Key to this vision will be a strong geographical presence and interconnectivity with a large number of telecoms operators and cloud platforms.

To simplify this complex architecture, we need to refer to an example. Imagine a developer intends to deploy their augmented reality (AR) game for Android and iOS across markets in Europe and North America. In this case, edge computing is required to minimise device-to-server latency.

To monetise the game, analytics and advertising services need to run alongside it, served from different servers on another cloud platform. Here, it is server-to-server latency that matters most. Thanks to a neutral, software-defined architecture that aggregates edge resources and integrates with different cloud platforms, the game can be deployed with relative ease across a large geographical area.
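To make this concrete, here is a purely illustrative sketch of the placement logic such a software-defined platform might apply. The site names, RTT figures and function shape are hypothetical assumptions, not any real provider’s API.

```python
# Assumed device-to-server RTTs (ms) at aggregated edge sites,
# and server-to-server RTTs (ms) from those sites to cloud regions.
EDGE_SITES = {"eu-west": 8, "eu-central": 9, "us-east": 7, "us-west": 9}
CLOUD_REGIONS = {"eu-cloud": 25, "us-cloud": 30}

def place(component: str, budget_ms: float) -> list[str]:
    """Return every location that satisfies the component's latency budget."""
    pool = EDGE_SITES if component == "game-session" else CLOUD_REGIONS
    return [site for site, rtt in pool.items() if rtt <= budget_ms]

# Latency-critical game sessions land at the edge; analytics and ad
# services tolerate higher server-to-server latency and stay in the cloud.
print(place("game-session", budget_ms=10))  # all four edge sites
print(place("analytics", budget_ms=50))     # both cloud regions
```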

Proliferation of the Edge does not spell the end for the Cloud.


The Internet has never functioned under a one-size-fits-all mantra, and the proliferation of edge computing won’t change that. For a large number of applications, particularly those sensitive to server-to-server latency, centralisation of resources in the cloud will continue to deliver the best performance and end-user experience.

Instead, I envisage a relationship of co-existence developing between the two architectures, with the centralised cloud working to support distributed resources at the network edge. The individual performance requirements of applications will dictate their suitability to a centralised or decentralised architecture.

“It is important that the industry doesn’t glorify edge computing to an extent that is of detriment to the cloud.”

Make no mistake, there are inherent benefits to a cloud architecture. It is dramatically more difficult to exert complete control over compute resources that are geographically dispersed, and this challenge will only be exacerbated by time and continued expansion.

Fragmentation is rife at the edge too, an issue that cloud platforms have averted through years of evolution. One should also question whether the demands for processing power and energy at the edge can be met, and at what cost.

However, despite the above, some applications will require access to both the edge and the cloud, setting the scene for the emergence of hybrid models. These hybrid developments and edge computing will be shaped, in no small part, by the advancement of peripheral technologies such as software-defined networking (SDN).

Conclusion: Embracing the Edge


Without a perpetual ambition to advance the Internet by reworking its underlying infrastructure, everything that depends on it stands still. In a world of inventors and visionaries, stagnation and hostility to change act as a glass ceiling. Smashing it requires us, as one whole society, to embrace new technologies and ultimately shape their evolution.

A paradigm shift is already in full swing - the mass production of data at the edge. While the cloud has facilitated this connectivity explosion thus far, it is simply not sustainable to transport every bit of data from the network edge to a centralised data centre. Distribution of compute resources not only unlocks radically new applications but also ring-fences data, strengthening security.

It is not all roses, however, and the importance of understanding which challenges lie ahead, and how best to tackle them, cannot be overstated. Power and size restrictions need to be reviewed, and achieving a high level of interconnectivity will be a battle in a world where fibre availability at the edge is still lacking.

But, for the Internet and our society, edge computing is an exhilarating development.
