The Importance of Reducing Latency at the Edge

Given how central computing and networking are to modern business, there is no overstating the importance of minimizing latency. Even a brief delay or a small drop in speed can affect a company’s overall success.

Latency can affect many different industries and individuals. Regarding the latter, high latency is perhaps best associated with online gaming. Known as “lag,” high latency when gaming can lead to stuttering performance, freezing, and dropouts.

Yet latency can have far more serious consequences in business. Below is a closer look at the importance of reducing latency with edge computing.

What is Latency?

For a technical definition, latency is the delay between an input and the response to it. Say you press a button to start a program: the button press is the input, while the program starting up is the response.

Latency is typically measured in milliseconds (ms). While that is a tiny unit, even a few milliseconds of latency can spell disaster for some organizations. Low latency is the aim, as it means minimal delay between sending and receiving; high latency, on the other hand, means a relatively long delay.
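To make the definition concrete, here is a minimal sketch of how latency can be measured in practice: record the time the input is issued, wait for the response, and report the elapsed time in milliseconds. The function name and the timed operation are illustrative, not from any particular library.

```python
import time

def measure_latency_ms(operation):
    """Return the delay, in milliseconds, between issuing an
    operation (the input) and its completion (the response)."""
    start = time.perf_counter()   # moment the input is sent
    operation()                   # wait for the response
    return (time.perf_counter() - start) * 1000

# Example: time a trivial local operation. A real network request
# would typically take far longer than this.
latency = measure_latency_ms(lambda: sum(range(1000)))
print(f"{latency:.3f} ms")
```

The same pattern applies whether the "operation" is a local function call, a database query, or a network round trip.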

The biggest influence on latency is distance. The greater the distance between sender and receiver, the longer data takes to reach its destination. Hardware and software along the network path can also add latency, as can network congestion.
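A quick back-of-the-envelope calculation shows why distance matters so much. Assuming signals travel through optical fiber at roughly 200,000 km/s (about two-thirds of the speed of light in a vacuum), the best-case round-trip delay grows linearly with distance; the distances below are illustrative:

```python
# Approximate signal speed in optical fiber, in km/s (assumption).
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km):
    """Best-case round-trip propagation delay in milliseconds,
    from distance alone (no congestion or processing delay)."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

for distance in (10, 500, 5000):
    print(f"{distance:>5} km: {round_trip_ms(distance):.2f} ms round trip")
```

Real-world latency is always higher than this floor, since routers, servers, and congestion add their own delays on top of pure travel time.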

Edge Computers and the Importance of Reducing Latency

How important is it to keep latency to a minimum? For some organizations, a delay of just a few milliseconds can cost millions of dollars. With navigation equipment, it can even result in fatal disconnects. The aim of reducing latency with edge computers is to bring latency down to 10ms or less, though some business leaders believe it needs to be 5ms or lower.

This is why many companies use specialist computer suppliers like Things Embedded for bespoke edge systems. One of the main reasons is that edge computing is designed to minimize latency. With the edge, data processing is decentralized: computation is moved to the point where real-world actions are actually happening.

Because edge computing is widely distributed, latency drops: processing and telecommunications happen closer to where they are used. That’s not all. Another major advantage is improved consistency. With an edge application, data no longer has to travel hundreds or even thousands of kilometers; the journey can be just tens of kilometers, or even less when processing is done on-site. The result: latency is significantly reduced.
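The cloud-versus-edge trade-off can be sketched with the same propagation math. Using hypothetical distances and the same assumed ~200,000 km/s fiber signal speed, we can check each deployment against the 10ms target discussed earlier:

```python
# Assumed signal speed in optical fiber (km/s) and an example
# latency budget; the distances below are hypothetical.
FIBER_SPEED_KM_PER_S = 200_000
LATENCY_BUDGET_MS = 10

def round_trip_ms(distance_km):
    """Best-case round-trip propagation delay in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

for name, km in [("remote cloud region", 2_000),
                 ("nearby edge node", 20),
                 ("on-site edge device", 1)]:
    delay = round_trip_ms(km)
    verdict = "within" if delay <= LATENCY_BUDGET_MS else "exceeds"
    print(f"{name} ({km} km): {delay:.2f} ms, {verdict} the budget")
```

Note that the distant cloud region blows the budget on travel time alone, before any processing has even happened, while the edge options leave almost the entire budget available for actual work.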

When you think about how many edge applications are in use today – smart cameras and IoT devices among them – keeping latency low is essential. High latency can prevent these applications from functioning in real time, leading to processing delays.
