
What is Edge Computing?

Edge computing is an emerging paradigm that brings computation and data storage closer to the sources of data generation, such as sensors and IoT devices. This approach addresses the limitations of traditional cloud computing by reducing latency, conserving bandwidth, and enhancing real-time data processing capabilities.


Cloud Computing and Edge Computing

Edge computing represents an emerging paradigm in computing, bringing cloud computing services closer to end users to provide faster processing speeds and rapid application response.

Modern internet-based applications, such as surveillance systems, virtual reality experiences, and real-time traffic monitoring, depend heavily on quick processing and low response latency. Typically, end users access these applications through resource-limited mobile devices, while the primary computing tasks are executed on remote servers in cloud computing infrastructure.

Cloud computing is a computing model that provides users with on-demand access to various computing resources, including storage and processing capabilities. The primary offerings of cloud computing encompass Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Each of these services enables users to conveniently access and utilize computing functionalities, such as data storage and processing, whenever required.

However, utilizing cloud services from mobile devices can introduce significant latency and complications related to mobility. Edge computing addresses these challenges by relocating computational tasks closer to the edge of the network, thus satisfying the demanding performance requirements of these applications.

What is Edge Computing?

Edge computing shifts data processing, applications, and services from centralized cloud servers to points closer to the users at the network’s edge. This enables content providers and developers to deliver their services more directly and efficiently to users.

Edge computing represents a refined version of cloud computing, designed specifically to lower latency by positioning services closer to end-users. By offering computing resources and services directly within the edge network, edge computing effectively reduces the workload placed on cloud infrastructure.

Nevertheless, it doesn’t replace cloud computing; instead, it enhances user experiences by supporting latency-sensitive applications. Similar to cloud providers, edge computing providers offer data storage, computation, and application services directly to users.
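
As a concrete illustration, the short Python sketch below times a TCP handshake to two endpoints and offloads a task to whichever answers faster. The hostnames edge.example.local and cloud.example.com are placeholders, and the handshake time is only a rough proxy for round-trip latency.

```python
import socket
import time

def connect_rtt(host: str, port: int = 443, timeout: float = 2.0) -> float:
    """Time one TCP handshake as a rough proxy for network round-trip latency (ms)."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.monotonic() - start) * 1000.0

# Placeholder hostnames: substitute a nearby edge endpoint and a remote cloud region.
edge_ms = connect_rtt("edge.example.local")
cloud_ms = connect_rtt("cloud.example.com")

# Offload the task to whichever endpoint answered faster.
target = "edge" if edge_ms < cloud_ms else "cloud"
print(f"edge: {edge_ms:.1f} ms, cloud: {cloud_ms:.1f} ms -> offload to {target}")
```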

Despite these similarities, edge and cloud computing differ substantially in several ways:

  • Server Location: edge computing places servers close to users within local networks, whereas cloud computing servers are centralized and accessible via the internet.
  • Latency: the centralized cloud model typically involves multi-hop communication paths, resulting in higher latency than the short communication routes of edge computing (a rough model is sketched after this list).
  • Jitter: those longer, multi-hop paths also cause cloud computing to experience higher jitter than edge computing.
  • Location Awareness: edge computing's distributed architecture provides location awareness and enhanced mobility support, unlike the centralized approach typical of cloud computing.
  • Security: because of shorter data paths, edge computing presents a reduced risk of data interception and attacks compared to cloud computing.
  • Users: cloud computing targets general internet users with global reach, while edge computing specifically targets local, edge-based subscribers.
  • Scalability: edge hardware generally has limited resources, making edge computing less scalable than the resource-rich, centralized cloud infrastructure.
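
For intuition on the latency difference, here is a back-of-envelope one-way latency model in Python. The per-hop delay, propagation constant, hop counts, and distances are illustrative assumptions, not measurements.

```python
# Illustrative one-way latency model: every router hop adds processing/queuing
# delay, and propagation delay grows with distance. All constants are assumptions.
PER_HOP_MS = 2.0                 # assumed delay added by each hop
PROPAGATION_MS_PER_100_KM = 0.5  # roughly 0.5 ms per 100 km of fibre

def one_way_latency_ms(hops: int, distance_km: float) -> float:
    return hops * PER_HOP_MS + (distance_km / 100.0) * PROPAGATION_MS_PER_100_KM

edge = one_way_latency_ms(hops=1, distance_km=5)       # nearby edge server
cloud = one_way_latency_ms(hops=12, distance_km=2000)  # distant cloud region
print(f"edge ~{edge:.1f} ms, cloud ~{cloud:.1f} ms one-way")
```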

Different Paradigms of Edge Computing

Mobile Edge Computing (MEC)


Introduced by the European Telecommunications Standards Institute (ETSI), Mobile Edge Computing (MEC) allows mobile users to access computing services directly at the base station.

MEC is a technology designed to give mobile users access to cloud services and other IT resources close to the Radio Access Network (RAN). Its primary aim is to reduce latency by relocating storage and computational resources from centralized networks to the network edge. 

MEC provides a business-centric cloud computing platform integrated within the RAN, placing it near mobile users to effectively support delay-sensitive and context-aware applications.

MEC has emerged as a key advancement in cellular base station development, enabling base stations to operate seamlessly alongside edge servers. Additionally, MEC exposes real-time RAN data, such as network load, user location, and congestion status, to application and content developers.

Utilizing this real-time information, MEC facilitates context-specific services, enhancing user satisfaction and improving the overall Quality of Experience (QoE). By empowering the edge network to manage computational tasks and services locally, MEC significantly reduces latency and bandwidth usage.
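
As a sketch of how such context awareness might be used, the Python snippet below adapts a video bitrate to a snapshot of RAN state. The RanInfo fields and the bitrate thresholds are assumptions for illustration, not part of any MEC specification.

```python
from dataclasses import dataclass

@dataclass
class RanInfo:
    """Hypothetical snapshot of RAN state exposed by a MEC platform."""
    cell_load: float   # 0.0 (idle) .. 1.0 (fully loaded)
    congested: bool

def pick_video_bitrate(ran: RanInfo) -> int:
    """Choose a streaming bitrate in kbit/s from current radio conditions."""
    if ran.congested or ran.cell_load > 0.8:
        return 1_500   # drop to SD to protect a congested cell
    if ran.cell_load > 0.5:
        return 4_000   # HD
    return 8_000       # full quality when the cell is lightly loaded

print(pick_video_bitrate(RanInfo(cell_load=0.85, congested=False)))  # -> 1500
```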

Fog Computing

Fog computing combines the strengths of both cloud computing and edge devices to deliver high-quality services, reduce latency, and support mobility, multi-tenancy, and various functionalities essential for modern computing systems.


A fog computing environment consists of three primary layers:

  1. the terminal layer
  2. the fog layer
  3. the cloud layer

Terminal Layer

The terminal layer includes geographically distributed end devices responsible for collecting data and transmitting it to higher layers for processing and storage. Typical devices at this layer include sensors, wearable gadgets, mobile phones, and smart vehicles.

Fog Layer

The fog layer, positioned between the terminal devices and the cloud, operates at the network edge.

Devices in this layer, called fog nodes, handle data storage, computation, and transmission tasks. Fog nodes can be either mobile or stationary, located strategically throughout the network. Common examples are routers, access points, switches, fog servers, and base stations.

Due to their computational capabilities, fog nodes enhance real-time data processing and analytics, optimizing services for latency-sensitive applications. Their close proximity to end devices enables efficient communication. Furthermore, by extending cloud resources into the fog layer, fog nodes gain additional computing and storage capacities.

Cloud Layer

The cloud layer contains powerful storage devices and servers capable of high-performance computing. It handles computational tasks that are not latency-sensitive and have been delegated from the fog layer.

The cloud provides infrastructure, platforms, and software services. Examples include compute and hosting services such as Amazon EC2 and DigitalOcean (IaaS), application platforms such as Google App Engine (PaaS), and network storage such as Apple iCloud.
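
The Python sketch below illustrates one way a fog node might split work across these layers, keeping latency-sensitive tasks local and deferring the rest to the cloud. The capacity and latency figures are assumed values for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    deadline_ms: float   # how quickly a result is needed
    cpu_units: float     # rough compute demand

FOG_CAPACITY = 4.0    # assumed spare compute on the fog node
CLOUD_RTT_MS = 120.0  # assumed terminal <-> cloud round trip

def place(task: Task) -> str:
    """Keep latency-sensitive work in the fog layer and push the rest to the cloud."""
    if task.deadline_ms < CLOUD_RTT_MS and task.cpu_units <= FOG_CAPACITY:
        return "fog"
    return "cloud"

print(place(Task("collision-warning", deadline_ms=50, cpu_units=1)))       # fog
print(place(Task("nightly-analytics", deadline_ms=60_000, cpu_units=32)))  # cloud
```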

Cloudlet Computing


Cloudlets represent a form of edge computing designed to deliver cloud-based services and applications close to end users, typically positioned just one network hop away. They can operate independently and continuously offer services even if the central enterprise cloud becomes temporarily unavailable, synchronizing later when cloud connectivity is restored. 
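
A minimal Python sketch of this behaviour is shown below: requests are served from local state immediately, and queued updates are pushed to the cloud only when connectivity returns. The CloudletStore class and push_to_cloud callback are hypothetical, not part of any cloudlet framework.

```python
import queue

class CloudletStore:
    """Serve requests locally and defer cloud synchronization until connectivity returns."""

    def __init__(self):
        self.local = {}               # state kept on the cloudlet
        self.pending = queue.Queue()  # updates waiting to reach the enterprise cloud

    def handle(self, key, value):
        self.local[key] = value          # answer immediately from the edge
        self.pending.put((key, value))   # remember to propagate later
        return value

    def sync(self, cloud_reachable: bool, push_to_cloud) -> int:
        """Flush queued updates once the enterprise cloud is reachable again."""
        sent = 0
        while cloud_reachable and not self.pending.empty():
            push_to_cloud(*self.pending.get())
            sent += 1
        return sent

store = CloudletStore()
store.handle("sensor-42", 17.3)   # keeps working even while the cloud is down
store.sync(True, lambda k, v: print("synced", k, v))
```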

 

Cloudlets provide significant benefits, including:

  • Reduced Communication Latency: Delivers faster responses due to proximity.
  • Improved Connectivity: Provides persistent local services even if the enterprise cloud is temporarily unavailable.
  • Extended Battery Life: Offloads intensive computational tasks from mobile devices, conserving battery resources.
  • Lower Bandwidth Consumption: Processes most data locally, minimizing the amount sent to the central cloud.
  • Reduced Storage Requirements: Requires less data storage in the cloud due to local data handling.
  • Lower Delays and Jitter: Achieves more stable and consistent data processing.
  • Enhanced Interactive Experiences: Especially beneficial for gaming and other latency-sensitive applications.
  • Improved User Satisfaction: Ensures smoother performance, improving the overall user experience.
  • Privacy and Security Assurance: Allows users to confidently use applications without significant privacy and security risks.

EndNote

This article provided an overview of edge computing, highlighting its key concepts, technologies, and application scenarios.

It discussed related paradigms such as cloudlets, fog computing, and mobile-edge computing (MEC), emphasizing their roles in addressing latency, connectivity, and data processing challenges.

Information presented here integrates insights from recent academic research to offer accurate, current perspectives. As edge computing technologies continue to evolve rapidly, ongoing research and development will likely refine and expand these concepts, further shaping the future landscape of distributed computing.
