Introduction 

Edge computing is the practice of bringing data storage and processing capabilities as close as possible to the users and applications that consume and generate the data. It is commonly called a transformative technology because it runs counter to the centralization practiced in data center and cloud adoption.

With cheaper hardware, higher bandwidth, and ever faster data generation, centralized architectures quickly become the bottleneck for sensor, telemetry, and event data. These data sources are expected to become even more relevant in the future as devices produce and consume data at an increasing rate. Applications based on Virtual Reality, Artificial Intelligence, and the Internet of Things have become widely adopted and require data to be processed in real time.

Limitations of traditional cloud architectures 

Traditional cloud architectures are an integral part of business operations and therefore form the backbone of data processing and storage. However, these architectures face growing challenges as real-time applications become more important.

One of the primary challenges in this context is latency. In traditional cloud architectures, data travels from devices to the facilities of a cloud provider, is processed there, and finally travels the same distance back. This round trip introduces a delay, and with it many potential points of failure, which is problematic for real-time applications.

Another key point is scalability. While your cloud provider can scale almost without limit, your budget would have to do the same. Although adding more virtual machines, adjusting bandwidth, or orchestrating more application containers in your environment is theoretically possible, the higher you go in the tiers, the more you pay.

Adding more sensors and devices to your central backbone inevitably leads to bottlenecks under the influx of incoming data. Edge computing solves this by shifting necessary computing and data processing to devices or servers close to the source, sending only defined and useful information to your cloud environment. 
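One way to picture this pattern (a minimal sketch; the sensor values, field names, and alert threshold are hypothetical and not tied to any specific product) is an edge node that aggregates raw readings locally and forwards only a compact summary to the cloud:

    import json
    import statistics
    import time

    def summarize_readings(readings, sensor_id):
        """Reduce a window of raw sensor readings to a compact summary."""
        return {
            "sensor_id": sensor_id,
            "window_end": time.time(),
            "count": len(readings),
            "mean": statistics.mean(readings),
            "min": min(readings),
            "max": max(readings),
        }

    def should_forward(summary, alert_threshold=75.0):
        """Only forward windows that contain something noteworthy."""
        return summary["max"] >= alert_threshold

    # 600 raw readings collapse into one small JSON document.
    raw = [20.1 + (i % 60) for i in range(600)]        # hypothetical telemetry window
    summary = summarize_readings(raw, sensor_id="edge-01")
    if should_forward(summary):
        print(json.dumps(summary))                     # hand this off to the cloud API

The edge node still sees every raw reading, but the cloud only receives the distilled result, which is exactly the traffic reduction described above.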

Benefits and Impact on existing cloud architectures 

A combination of edge and cloud computing requires a change in thinking about data handling and processing. By distributing computing resources closer to the data's origin, edge computing reduces network latency and thus improves real-time applications. This fundamental change also leads to more efficient use of your cloud assets and hence supports scalability and agility by offloading traffic and computation to the location where the data is actively used.

As organizations mature and adapt to this trend, optimizing existing cloud architectures for such a hybrid edge-cloud setup becomes crucial. The resulting topology introduces new technical challenges that need to be addressed. The three main technical challenges in hybrid edge-cloud architectures are:

  • Synchronization of data 
  • Exchange of data 
  • Conflicts in data  

These difficulties depend on the use case, application, and technology employed and are subject to organizational and regulatory requirements.
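To make the third point more tangible, one common (if simplistic) strategy for resolving conflicting writes arriving from several edge locations is a last-writer-wins merge based on record timestamps. The sketch below is illustrative only; the record layout is an assumption, and real deployments often need more elaborate approaches such as vector clocks, CRDTs, or application-specific merge logic:

    # A record as it might arrive from an edge site (assumed layout):
    # "key" identifies the entity, "updated_at" is the edge-local write time.

    def merge_last_writer_wins(cloud_state, incoming):
        """Keep whichever version of a record was written most recently."""
        key = incoming["key"]
        current = cloud_state.get(key)
        if current is None or incoming["updated_at"] > current["updated_at"]:
            cloud_state[key] = incoming

    cloud_state = {}
    merge_last_writer_wins(cloud_state, {"key": "pump-7", "status": "ok", "updated_at": 100})
    merge_last_writer_wins(cloud_state, {"key": "pump-7", "status": "warn", "updated_at": 90})
    print(cloud_state["pump-7"]["status"])   # "ok" -- the older, conflicting write loses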

Considerations 

Before organizations extend their existing environments with edge computing, there are several organizational and technical considerations to make. Moving data processing workloads out of a centralized, well-matured environment needs careful strategic planning. There are several factors to consider, including:

  • Workload and data processing needs 
  • Network connectivity 
  • Security measures 
  • Cost 

These factors generally apply to any architectural design but are of special importance when taking the step towards the edge. For that reason, it is worth looking at each of them in more detail.

Optimize Workload and Data Processing Needs 

Whether it is a legacy or a cloud application, it is always key to have a clear understanding of the hardware requirements, including computational demands such as CPU, RAM, and the expected data volume. Since edge devices run on fixed physical hardware rather than on ephemeral, elastically provisioned resources, careful consideration of these factors beforehand is important. Furthermore, it is key to have a good understanding of the application itself in order to balance its latency and performance requirements against the constraints of the hardware used.
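As a quick illustration (all numbers below are hypothetical), a back-of-the-envelope estimate of the expected data volume often makes the hardware discussion much more concrete:

    # Hypothetical sizing example: estimate daily raw data volume per edge site.
    sensors_per_site = 200         # assumed fleet size
    samples_per_second = 10        # assumed sampling rate per sensor
    bytes_per_sample = 64          # assumed payload size incl. metadata

    bytes_per_day = sensors_per_site * samples_per_second * bytes_per_sample * 86_400
    print(f"~{bytes_per_day / 1e9:.1f} GB of raw data per site and day")   # ~11.1 GB
    # This is the volume the edge node must handle locally, and what
    # filtering and aggregation keep off the uplink to the cloud.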

Ensure Resilient Networking 

Distributed computing relies heavily on stable network connections. But what happens when an ISP actually has an outage? How does it affect the application's performance or its output? It is necessary to plan for several types of failure by implementing mechanisms that keep the application running and resilient. Resiliency can be addressed with failover mechanisms, such as two redundant network connections between the edge and the next node. These can run active-active over a single connection type or combine wired and wireless communication.

Depending on the application and environmental factors (such as the physical location), it can be worth implementing offline capabilities for a total loss of connectivity. The period to bridge can range from several seconds to days.
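A simple way to bridge such gaps is a local store-and-forward buffer: records are queued on the device while the uplink is down and flushed once connectivity returns. The following sketch uses only the Python standard library; the persistence format, retry cadence, and the placeholder send function are assumptions:

    import json
    import sqlite3
    import time

    def send_to_cloud(record):
        """Placeholder for the real uplink call; returns False while offline."""
        return False

    class StoreAndForwardBuffer:
        """Queue records locally and flush them when the uplink is available."""

        def __init__(self, path="edge_buffer.db"):
            self.db = sqlite3.connect(path)
            self.db.execute("CREATE TABLE IF NOT EXISTS outbox (payload TEXT)")

        def enqueue(self, record):
            self.db.execute("INSERT INTO outbox VALUES (?)", (json.dumps(record),))
            self.db.commit()

        def flush(self):
            rows = self.db.execute("SELECT rowid, payload FROM outbox").fetchall()
            for rowid, payload in rows:
                if send_to_cloud(json.loads(payload)):
                    self.db.execute("DELETE FROM outbox WHERE rowid = ?", (rowid,))
            self.db.commit()

    buffer = StoreAndForwardBuffer()
    buffer.enqueue({"sensor_id": "edge-01", "value": 42.0, "ts": time.time()})
    buffer.flush()   # nothing is lost while offline; records stay queued for the next cycle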

Secure Distributed Environment 

Organizations should always prioritize and invest in effective security measures for cloud infrastructure. Expanding towards highly distributed environments introduces a significantly bigger attack surface. With vast numbers of devices connected to the cloud environment, effective security policies are indispensable. These policies should embody security best practices such as zero trust and least-privilege access. Adjustments are needed at the different firewall levels to keep the overall environment safe. Continuous monitoring and logging of edge traffic are equally important to keep pace.

Balance Cost Considerations 

On a high level, there are several key drivers for the costs associated with any edge computing project, namely:

  • Make or buy decision 
  • Location 
  • Hardware 

Make or buy: This is the most common, yet least specific, question, often asked at project initiation or when the frequently very specific requirements come into view. Although there is no easy answer, it is always advisable to look at existing products, such as Azure IoT Edge and Azure Stack Edge, first. The indirect costs of maintaining a custom solution, especially with respect to its security needs, quickly outweigh the cost of established third-party services.

Location: Another important aspect is whether an intermediate physical location is needed to serve the application's use case. Does the use case rely on scattered micro-locations? Do the traffic and data volume require intermediate bundling? Options range from single server blades to micro-data centers or even a full-blown data center.

Hardware: Finally, the choice of hardware has to balance these high-level decisions against the variety of application requirements. Getting this balance right ensures not only a successful but also a positive adoption of this highly distributed technology.

As a short takeaway, organizations can take advantage of edge computing by carefully analyzing the scope of their edge adoption, planning for failure, and defining and implementing effective security policies and controls, all while keeping an eye on the related costs.

Best Practices for Implementation and Adoption 

As indicated in the preceding sections, adopting edge computing capabilities needs strategic planning. On an organizational level, this requires more than the definition of use cases and technical requirements. Even more crucial than the technical aspects are organizational topics, such as:

  • Authority, e.g. Who is responsible for edge locations? 
  • Location, e.g. How do we ingest data into the cloud environment? 
  • Processes, e.g. What to do in case of an outage?  
  • Architecture, e.g. What infrastructure is required?  
    (Gateways, Nodes & Levels, Datacenter) 

Enabling Edge Computing on existing Cloud Architectures 

As organizations make progress in emerging fields such as AI and IoT, changes to existing infrastructure designs become inevitable. In fact, edge computing can be a useful addition to existing cloud environments. With immense potential for cost savings, performance improvements, and new applications comes another level of complexity. Navigating these complexities calls for an experienced partner with a thorough understanding of both the implicit and explicit steps needed to integrate this rapidly evolving area into your existing assets.

Work with us

We have experience helping customers with cloud transformation following industry best practices. We would love to talk to you about how we can make your edge computing efforts a success! Interested in how we approach your requirements? Have a look at our workshop process.