The volume of data created by IoT devices is increasing year by year. When large amounts of data are sent to a company’s data centre or the cloud for processing, the network can become congested, increasing latency and making the process inefficient. These challenges gave rise to the idea of edge computing. 

According to Gartner’s predictions, by 2025, about 75% of enterprise-generated data will be created and processed outside a traditional centralised data centre or cloud. Does this mean that the cloud era is coming to an end and edge computing will be the new king? Are edge and cloud computing competing technologies, or can they be combined?  

Let’s find out.  

What is edge computing? 

Edge computing enables data produced by IoT devices to be processed at or near the source of data generation, at the network edge (hence the name), instead of being transmitted to an external data centre for processing and analysis, which risks latency issues caused by bandwidth limitations or network disruptions. What is a network edge? It’s the point where the local network connects to third-party infrastructure.  

To better understand how edge computing works, let’s illustrate it with examples. In an Internet of Things setting such as a smart factory, the data sources are machines equipped with sensors or embedded devices. A wearable sleep tracker is another example of an edge solution: it analyses data locally to provide users with insights and recommendations. In more complex cases, like in-vehicle systems, edge computing aggregates various kinds of data, for example from GPS and proximity sensors, and processes it in real time to improve safety. 

Edge computing vs Cloud computing 

Edge computing and the cloud serve a similar purpose. The main difference lies in where the computing, data-processing and storage resources are located. As you already know, in edge computing these resources are decentralised and sit at the same point as the data source (at the network edge). In cloud computing, by contrast, the data is first transferred from a device to the cloud, where it’s stored and processed.  

From a business perspective, edge computing allows more efficient collection, filtering, processing and analysis of large amounts of data that would be expensive or technologically impractical to move to the cloud first. By keeping computation close to the data-generating IoT devices, edge computing eases latency and network congestion. Local servers perform essential data processing and analytics on the spot, allowing decisions to be made in real time. The data can then be sent to the cloud for more in-depth analysis, needed, for example, to make long-term process improvements.  
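To make this division of labour concrete, here is a minimal, hypothetical Python sketch of an edge node’s processing loop: raw sensor readings (simulated here) are filtered and acted upon locally, and only a compact summary is forwarded to the cloud for deeper, long-term analysis. The sensor values, thresholds and the upload_summary stub are illustrative assumptions, not part of any specific platform.

```python
import json
import random
import statistics
import time

TEMP_ALERT_C = 85.0  # illustrative threshold for a local, real-time decision
BATCH_SIZE = 10      # readings aggregated before anything leaves the edge


def read_sensor() -> float:
    """Simulate a temperature reading from a machine sensor.
    A real edge node would query the device driver or fieldbus instead."""
    return random.gauss(70.0, 10.0)


def upload_summary(summary: dict) -> None:
    """Hypothetical stand-in for sending a compact summary to a cloud endpoint.
    A real deployment might POST this JSON over HTTPS or publish it via MQTT."""
    print("-> cloud:", json.dumps(summary))


def edge_loop(cycles: int = 3) -> None:
    for _ in range(cycles):
        batch = []
        for _ in range(BATCH_SIZE):
            value = read_sensor()
            batch.append(value)

            # Real-time decision made locally, with no round trip to the cloud.
            if value > TEMP_ALERT_C:
                print(f"local alert: temperature {value:.1f} C exceeds threshold")

            time.sleep(0.01)  # stands in for the sensor sampling interval

        # Only an aggregated summary is sent upstream, saving bandwidth.
        upload_summary({
            "samples": len(batch),
            "mean_temp_c": round(statistics.mean(batch), 2),
            "max_temp_c": round(max(batch), 2),
        })


if __name__ == "__main__":
    edge_loop()
```

The key design point is that time-critical alerts are handled on the device itself, while the cloud only receives aggregated statistics it can mine later.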

What are the advantages and risks of edge computing? 

The benefits of edge computing include: 

Real-time data analysis 

Edge computing allows quick analysis and decision-making, as it reduces latency to milliseconds. That’s especially important, for example, in autonomous vehicles, where real-time system response can even save lives. The decision whether to stop a car before it reaches a pedestrian is extremely time-sensitive and should be made by an onboard computer. As the demand for IoT solutions grows across almost all industries, the demand for real-time data analysis and exchange will only keep rising. 

Latency reduction  

Edge computing increases processing speed and responsiveness in situations where latency cannot exceed a few milliseconds, such as health monitoring.  

Processing large amounts of data irrespective of bandwidth limitations 

By processing data at its source, edge computing reduces the need to transfer large data volumes between servers or to centralised cloud storage. That’s one of the key factors that popularised the idea of edge computing.  

However, edge computing also comes with certain risks and disadvantages. The key ones are related to: 

Security 

If not properly secured, the decentralised architecture of edge computing is prone to DDoS attacks, so it demands more control, monitoring and secure communication solutions. Extra security precautions for detecting and preventing attacks should extend to the sensors and embedded software in IoT devices.   

Investment costs 

Establishing an edge computing infrastructure requires a significant upfront investment in local hardware and network devices, as well as backup power sources in case of power outages. Ongoing maintenance costs also need to be taken into account.  

In conclusion 

Edge computing is recommended when real-time data processing and analysis are crucial, as with smart health monitoring devices or autonomous vehicles. It’s also the go-to option when the volume of data is so large that transferring it all to the cloud would be technologically impractical or too costly due to bandwidth limitations. Edge computing can also help prevent network overload by processing data locally and sending only the necessary files to the cloud.   

In practice, edge and cloud computing shouldn’t be perceived as rival technologies. Most organisations use a combination of both for maximum efficiency. Functions supporting on-the-spot decision-making are handled at the edge, while big data processing and analysis are done in the cloud, usually through advanced analytics and machine learning algorithms. 

Are you considering merging both edge and cloud computing? Our experts will advise you on the optimal solution. Visit our cloud offering page for more information.  

About the author

Małgorzata Kruszyńska

Business Researcher