Gone are the days when cheap commodity hardware made home desktop computers widespread. That was the last great decentralization step, followed by the now-familiar centralized cloud architecture, with huge data centers performing practically every important task. Now edge computing is knocking at the door, ready for the next shift.
Over the past decades, computing infrastructure has swung back and forth between centralized and decentralized architectures. Currently, the cloud is the dominant centralized paradigm. Almost all web content is served from large data centers. Researchers rent servers from cloud providers to test their models and run their experiments. Enterprises execute their business logic on remote servers.
The cloud offers a convenient way for businesses, small or big, to get computing resources from a service provider instead of building their own data centers. Cloud services are everywhere, and AWS, DigitalOcean, Azure, Google Cloud, and VMware have become names every developer knows. But the winds are changing.
Computing on the Edge!
A paradigm shift from cloud to edge has been underway in the industry since the rise of Artificial Intelligence. Moreover, as the Internet of Things floods the digital space with devices, edge computing is gaining ground. Industry veterans have speculated that edge computing is going to edge out cloud computing. The short answer is no, but the longer answer is that with growing data volumes and the rapid adoption of AI, the cloud may not always be a viable option.
Edge computing brings computation physically close to the devices: either the device performs the computation itself, or a cloudlet deployed near the device acts as a miniature cloud, or a combination of both.
Edge devices and cloudlets are physically close, often only one network hop apart, whereas reaching the cloud usually takes several hops; a cloudlet can even be connected to its edge devices over a wired link. This yields lower latency and higher bandwidth, since a fog node serves far fewer devices than a central cloud, and keeping the data in the cloudlet shortens response times.
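To make the difference concrete, here is a rough back-of-the-envelope sketch. The per-hop and processing latencies are illustrative assumptions, not measured values:

```python
# Back-of-the-envelope latency comparison: one-hop cloudlet vs. multi-hop cloud.
# All numbers below are illustrative assumptions, not measurements.

def round_trip_ms(hops, per_hop_ms, processing_ms):
    """Round trip = request hops + response hops + server-side processing."""
    return 2 * hops * per_hop_ms + processing_ms

cloudlet_ms = round_trip_ms(hops=1, per_hop_ms=2, processing_ms=10)   # nearby fog node
cloud_ms = round_trip_ms(hops=8, per_hop_ms=10, processing_ms=10)     # distant data center

print(f"cloudlet: ~{cloudlet_ms} ms, cloud: ~{cloud_ms} ms")
```

Even with generous assumptions for the cloud, the round trip to a nearby cloudlet is an order of magnitude shorter, which is the whole point of moving computation to the edge.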
Wherever there is a vast amount of data, machine learning is not far behind. As such, edge devices and edge computing have a close relationship with machine learning. For example, a surveillance camera constantly generates images of the area it covers. It might use deep learning to recognize a specific object, perhaps a human or a car, and pan to keep that object in view. A self-driving car must decide its next action from the data streaming in from its sensors and cameras, and all of that data must be run through a machine learning model to produce a proper inference.
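As a minimal sketch of what such an edge inference loop might look like: the `capture_frame` and `detect_person` functions below are stand-ins for a real camera feed and a real on-device deep learning model (assumptions for illustration), and frames are only uploaded when something of interest is found.

```python
import random
import time

def capture_frame():
    """Placeholder for reading a frame from the camera (returns raw bytes here)."""
    return bytes(random.getrandbits(8) for _ in range(64))

def detect_person(frame):
    """Stand-in for an on-device deep learning model; a real deployment would run
    an optimized (e.g. quantized) detector on the camera's own hardware."""
    return random.random() < 0.05  # pretend ~5% of frames contain a person

def upload_to_cloud(frame):
    """Only interesting frames leave the device, saving bandwidth."""
    print(f"uploading {len(frame)} bytes for long-term storage / further analysis")

for _ in range(100):            # inference happens locally, frame by frame
    frame = capture_frame()
    if detect_person(frame):    # decision made at the edge, no cloud round trip
        upload_to_cloud(frame)
    time.sleep(0.01)            # illustrative per-frame time budget
```

The design choice is the same one the paragraph describes: the heavy, latency-sensitive step (inference) stays on the device, and the cloud only sees the small fraction of data worth keeping.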
An interesting use case is the work of Chang et al. on using machine learning for network edge caching. They claim that by accurately profiling user behavior in an area, using data from edge devices such as mobile phones and personal computers, and constructing clusters on that data with unsupervised learning, appropriate content can be proactively cached either on users' edge devices or on a cloudlet near the area to achieve low latency and preserve energy. As IoT gets more creative and seeps deeper into our daily lives, the possibilities for machine learning applications on the edge are endless.
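A heavily simplified sketch of the idea (not the authors' actual pipeline): cluster users by their content-request histories with k-means, then pre-cache each cluster's most popular items on the nearby cloudlet. The request matrix here is synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic request counts: 200 users x 50 content items (rows = users).
requests = rng.poisson(lam=1.0, size=(200, 50))

# Unsupervised step: group users with similar consumption patterns.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(requests)

# For each cluster, proactively cache its top-5 items on the local cloudlet.
for cluster in range(4):
    popularity = requests[labels == cluster].sum(axis=0)
    top_items = np.argsort(popularity)[-5:][::-1]
    print(f"cluster {cluster}: cache items {top_items.tolist()}")
```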
The Smart Milieu of Edge Computing!
Given that edge computing performs many duties in many places, talk of new paradigms and shifting focus can be rather vague. Let's take a deep dive into the smart tasks of smart edge computing:
Smart Factories
As a solution to the limitations inherent to IoT, organizations are exploring smart manufacturing with edge computing. Edge computing means placing most of the processing and storage elements of the IoT network closer to the points where data is gathered and where actions are required: distributing the IIoT's thinking and decision-making capabilities closer to its sensing and acting capabilities. With such an architecture, manufacturers can maximize the benefits of IoT while minimizing the risks and effects of its limitations. This helps manufacturing units increase responsiveness to service requests, minimize storage and bandwidth costs, and improve reliability by reducing the chance of failure. A typical sector in which this technology is used is the automotive industry.
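As a minimal sketch of that architecture, the edge node below inspects raw sensor readings locally, acts on alarms immediately, and forwards only periodic aggregates upstream. The sensor, threshold, and batch size are illustrative assumptions.

```python
import random
import statistics

TEMP_LIMIT_C = 85.0      # illustrative threshold for a machine temperature alarm
BATCH_SIZE = 60          # send one aggregate per 60 readings instead of 60 raw values

def read_sensor():
    """Placeholder for a real IIoT temperature sensor."""
    return random.gauss(70.0, 8.0)

buffer = []
for _ in range(300):
    reading = read_sensor()
    if reading > TEMP_LIMIT_C:
        # Act immediately at the edge: no need to wait for a cloud round trip.
        print(f"ALERT: {reading:.1f} C, triggering local shutdown logic")
    buffer.append(reading)
    if len(buffer) == BATCH_SIZE:
        # Only a small summary goes to the central cloud, saving bandwidth and storage.
        summary = {"mean": statistics.mean(buffer), "max": max(buffer)}
        print(f"to cloud: {summary}")
        buffer.clear()
```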
Smart Vehicles
The autonomous vehicle is the ultimate Internet of Things (IoT) device. Powerful enough to handle onboard computing tasks and well connected enough to interface with multiple networks and devices, autonomous vehicles will be in constant communication with the world around them, making split-second decisions based on the information flooding in from an array of sophisticated sensors. No connection will be more important, however, than the machine-to-machine (M2M) communication between a self-driving car and the other vehicles on the road.
Every autonomous vehicle broadcasts critical data on weather changes and road conditions, allowing other vehicles to learn about potential hazards like detours, debris, accidents, and flooding early enough to adjust accordingly. Much of this data can be exchanged between the vehicles themselves, without requiring them to interface with distant cloud servers. This communication effectively turns every vehicle on the highway into an extension of every other vehicle's sensors, providing the best possible picture of the highway environment. With so much data pouring in, it will be more important than ever for vehicles to have ready access to edge data centers where they can offload non-mission-critical data for later analysis.
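A minimal sketch of that split might look like the following: safety-relevant messages are handled immediately on the vehicle, while everything else is queued and shipped to a nearby edge data center when convenient. The message format, categories, and helper names are assumptions for illustration.

```python
from collections import deque

CRITICAL = {"obstacle", "hard_brake", "collision_warning"}

offload_queue = deque()  # non-critical data, uploaded to an edge data center later

def handle_message(msg):
    """Triage a V2V/sensor message: act locally if safety-critical, queue otherwise."""
    if msg["type"] in CRITICAL:
        # Handled onboard within the millisecond budget; no cloud round trip.
        print(f"onboard action for {msg['type']} at {msg['position']}")
    else:
        offload_queue.append(msg)  # e.g. road-surface stats, weather observations

def flush_to_edge_datacenter():
    """Batch-upload queued data for later analysis when connectivity allows."""
    print(f"uploading {len(offload_queue)} queued messages")
    offload_queue.clear()

handle_message({"type": "obstacle", "position": (48.1, 11.6)})
handle_message({"type": "road_surface", "position": (48.1, 11.6), "grip": 0.7})
flush_to_edge_datacenter()
```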
Smart Cities
For autonomous vehicles to reach their full potential, high traffic urban areas will need to step up their IoT game. Sensors relaying information about everything from road conditions to real-time reports on congestion will allow fully IoT-integrated smart cities to provide self-driving cars with valuable information to help them make better, more efficient decisions. By creating such a data-rich environment with ready access to local data center computing resources, cities can help to leverage the full potential of IoT edge devices.
While autonomous vehicles get most of the press, many cities are already making significant investments to prepare for the self-driving future. And it’s not just the wealthiest, most populous cities taking the first steps. It is all about how cities are positioning themselves to capitalize on the potential of autonomous vehicle technology.
Will the Cloud Be Replaced, or Co-exist with the Edge?
Because of the reduced latency and the cost savings that come with it, businesses increasingly prefer the edge over the cloud. They want to move processing and storage close to the application. When data can be processed close to the device, the reliability of the collected data increases and there is no delay while information passes through routers, servers, and firewalls. With the cloud, it is the opposite: data is pushed to and stored in the cloud, then pulled back when needed.
The main advantage of edge computing is speed. The IoT concept envisages quick responses to data, as in the case of self-driving or autonomous cars. To stay safe on the road, autonomous cars need to collect and process massive amounts of data: following lane rules, stopping at red lights, identifying pedestrians, and so on. All of this requires cars to process tons of data in real time, every millisecond of the ride. With so much to be processed so quickly, these cars cannot rely on cloud servers, where the data would have to be uploaded, processed, and the results awaited. Cloud services can be pretty fast, but not fast enough to respond in real time to immediate dangers. Edge computing can do all these jobs in real time. Instead of using the cloud for the processing itself, the collected data can still be stored in the cloud afterward, and vehicle manufacturers can use the data they have gleaned to offer smoother and safer rides.
Similarly, in industrial IoT there are scenarios where devices must be managed without adequate network bandwidth, which rules out cloud computing. In such scenarios, edge computing is the optimal solution: it simply moves processing power and storage closer to the application.
When it comes to edge and cloud computing, they are not necessarily in competition with each other. Instead, the two technologies can complement each other. It is up to the businesses to analyze their unique needs and figure out the right balance between how much processing should be done on the cloud and how much should be done on the edge devices.
Summary
Cloud computing is about centralizing processing and storage to provide a more efficient and scalable platform for computing. Edge computing is simply about pushing some of that processing and storage out to the devices that produce and consume the data, that is, to the edge. Edge computing will be one of the approaches we use when deploying in the cloud to support specific use cases, with the Internet of Things being the most prominent.
But edge computing replacing cloud computing is like a toe replacing a body. Cloud computing is a big, broad concept that spans all types of computing approaches and technology; it could be considered a macro technology pattern. Edge computing is simply a micro pattern, one that lets us do new tactical things with public and private clouds.
Edge computing is an approach within the larger landscape of cloud computing. It is likely to work in tandem with cloud computing, not replace it.