What is Edge Computing?

Tahir Mehmood Sardar
3 min read · May 10, 2021


Edge computing is what it sounds like:

Computing takes place at the edge of corporate networks, with "the edge" defined as the place where end devices such as phones, laptops, industrial robots, and sensors access the rest of the network. The edge used to be simply the place where these devices connected so they could deliver data to, receive instructions from, and download software updates from a centrally located data centre or the cloud.

Now, with the explosion of the Internet of Things, that model has shortcomings. IoT devices gather so much data that the sheer volume requires larger and more expensive connections to data centres and the cloud. The nature of the work performed by these IoT devices is also creating a requirement for much faster connections between the data centre or cloud and the devices. For example, if sensors in valves at a petroleum refinery detect dangerously high pressure in the pipes, shutoffs need to be triggered as soon as possible.

If analysis of that pressure data takes place at a distant processing centre, the automatic shutoff instruction may come too late. But with processing power placed close to the end devices, latency is lower and that round-trip time can be cut significantly, potentially saving downtime, property damage, and even lives. Even with the introduction of edge devices that provide local computing and storage, there will still be a requirement to connect them to data centres, whether on-premises or in the cloud. For example, temperature and humidity sensors in agricultural fields gather valuable data, but that data doesn't have to be analyzed or stored in real time.
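The refinery scenario above can be sketched in a few lines. This is a hypothetical illustration, not a real control system: the valve names, readings, and the 850 kPa safety threshold are all assumptions made for the example.

```python
# Hypothetical sketch: an edge node checks sensor readings locally and
# decides on a shutoff immediately, instead of waiting on a round trip
# to a distant data centre. Threshold and readings are illustrative.

PRESSURE_LIMIT_KPA = 850  # assumed safety threshold, for illustration only

def edge_check(readings_kpa):
    """Return the valve IDs that must be shut off locally, right now."""
    return [valve for valve, kpa in readings_kpa.items()
            if kpa > PRESSURE_LIMIT_KPA]

readings = {"valve-1": 640, "valve-2": 910, "valve-3": 835}
to_shut = edge_check(readings)  # ["valve-2"]
# The shutoff happens at the edge; the event is merely *reported*
# upstream afterwards, so network latency never delays the safety action.
```

The point is where the decision runs: the comparison against the limit executes on the edge node itself, so the only latency in the safety path is local.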

Edge devices can collect, sort, and perform a preliminary analysis of the data, then send it along to where it must go: to centralized applications or some form of long-term storage, again either on-prem or in the cloud. Because this traffic may not be time-sensitive, slower, less expensive connections, possibly over the internet, can be used. And because the data is presorted, the volume of traffic that must be sent at all can also be reduced.
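That collect-sort-summarize step might look like the following sketch. The sample values and the choice of summary statistics are assumptions for illustration; a real deployment would pick whatever aggregate its central application needs.

```python
# Hypothetical edge-side presorting: instead of forwarding every raw
# reading over the expensive link, the edge device sends a compact summary.

def summarize(samples):
    """Reduce a batch of raw sensor samples to a small summary record."""
    ordered = sorted(samples)  # the "sort" step
    return {
        "count": len(ordered),
        "min": ordered[0],
        "max": ordered[-1],
        "mean": round(sum(ordered) / len(ordered), 2),
    }

raw = [21.4, 21.9, 22.1, 21.7, 35.0]  # e.g. field temperature readings
payload = summarize(raw)  # only this small record crosses the slow link
```

A batch of hundreds of readings shrinks to one small record, which is exactly why the slower, cheaper upstream connection becomes acceptable.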

So the upside of edge computing is faster response times for applications that require them, and slower growth of expensive long-haul connections to processing and storage centres. The downside can be security. With data being collected and analyzed at the edge, it's important to include security both for the IoT devices that connect to the edge devices and for the edge devices themselves.

They contain valuable data, but they are also networked elements that, if exploited, could compromise other devices that contain stores of valuable assets. With edge computing becoming more essential, it’s also important to make sure that the edge devices themselves don’t become a single point of failure. Network architects need to build in redundancy and provide failover contingencies in order to avoid crippling downtime if a primary node goes down.
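The failover contingency described above can be sketched as a simple try-the-next-node loop. The node names and the simulated outage in `send()` are invented for the example; real systems would use health checks and proper transport libraries.

```python
# Hypothetical failover sketch: readings go to a primary edge node, with a
# fallback to a backup so one node is not a single point of failure.
# Node names and the send() behaviour are illustrative assumptions.

def send(node, reading):
    """Pretend transport: the primary node is down in this illustration."""
    if node == "edge-primary":
        raise ConnectionError(f"{node} unreachable")
    return f"{node} accepted {reading}"

def send_with_failover(reading, nodes=("edge-primary", "edge-backup")):
    """Try each node in order; fail only if every node is unreachable."""
    for node in nodes:
        try:
            return send(node, reading)
        except ConnectionError:
            continue  # try the next node rather than losing the reading
    raise RuntimeError("all edge nodes unreachable")

result = send_with_failover(42)  # handled by edge-backup
```

The redundancy lives in the ordered node list: losing the primary degrades the path but never drops the data.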

The industry has already gone a long way toward addressing the demands of edge computing, and it is becoming mainstream. Its importance is likely to grow even more as real-time applications become more prevalent.

Regards.

Tahir Mehmood
