
Edge Applications Bring New App Performance Challenges


Modern observability solutions provide visibility into the workings of complex edge applications and help staff make data-driven decisions when investigating latency issues.

App performance requirements are incredibly stringent these days. Yet, the tough service level expectations for web and mobile apps seem quaint compared to the performance constraints imposed by edge applications. This is all the more reason businesses need modern observability solutions to better understand and eliminate performance issues in such applications.


Let’s put the issue into perspective. Today’s employees, customers, and other users are very demanding and have no tolerance for delays. Forty percent of users will abandon a website that takes longer than three seconds to load, and 53 percent will abandon a mobile app that fails to load within three seconds.

When it comes to edge, the tolerances are much tighter. IoT devices and applications typically require latencies no higher than 50 milliseconds, and sometimes as low as 10 milliseconds.

Worse, the consequences can be much more severe if an edge application incurs even slight performance degradation.

Take a common situation. An online user is about to make a purchase, but when they go to pay, the transaction gets held up. The problem might be that a third-party payment processing gateway is performing poorly. If the customer gets frustrated, they will abandon the purchase. The company may lose not just that sale; if the customer finds the same item on another site, it may lose that customer’s business forever.

That’s a significant impact for a poorly performing app. But the issue could be life-threatening if an edge app has delays. Just imagine if an autonomous car’s pedestrian detection system had a hiccup.


With edge applications, latency is the devil

Many new technologies, such as connected vehicles, AR/VR, and industrial automation, impose stringent new latency demands, with many applications requiring single-digit millisecond latency. Complicating matters, data traversing multiple networks between a data center and an edge device can take tens of milliseconds or more.
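That network cost is easy to check for yourself. The following minimal Python sketch (not from any specific vendor) times the round trip to open a connection to a backend endpoint and compares it against a 50-millisecond budget. The hostname, port, sample count, and budget are illustrative assumptions.

```python
# Minimal sketch: probe round-trip connection time from an edge node to a
# backend and compare it against an assumed 50 ms latency budget.
import socket
import statistics
import time

BACKEND_HOST = "backend.example.com"   # hypothetical data-center endpoint
BACKEND_PORT = 443
LATENCY_BUDGET_MS = 50                 # typical IoT budget cited above

def tcp_connect_ms(host: str, port: int, timeout: float = 1.0) -> float:
    """Return the time (in ms) to open a TCP connection to host:port."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

samples = [tcp_connect_ms(BACKEND_HOST, BACKEND_PORT) for _ in range(20)]
p95 = sorted(samples)[int(len(samples) * 0.95) - 1]
print(f"median={statistics.median(samples):.1f} ms  p95={p95:.1f} ms")
if p95 > LATENCY_BUDGET_MS:
    print("The backend round trip alone exceeds the edge latency budget.")
```

If even the bare connection time blows the budget, no amount of server-side tuning in the data center will save the application; the latency-sensitive work has to move closer to the device.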

There are several ways to reduce latency. One of the most important is to architect an edge app properly. For example, one might use a hub-and-spoke model that keeps latency-sensitive components at the edge.
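As a rough illustration of that idea, here is a minimal Python sketch, assuming a hub-and-spoke layout: the latency-sensitive decision runs locally on the edge “spoke,” and only batched summaries travel to the central “hub.” The read_sensor, actuate_locally, and send_to_hub functions are hypothetical stand-ins, as are the threshold and batch size.

```python
# Hub-and-spoke sketch: react locally on the edge, ship aggregates upstream.
import random
import time
from collections import deque

THRESHOLD = 80.0     # assumed limit that needs an immediate local reaction
BATCH_SIZE = 100     # summaries are shipped to the hub in batches

def read_sensor() -> float:
    # stand-in for reading a real local device
    return random.uniform(0.0, 100.0)

def actuate_locally(reading: float) -> None:
    # fast path: respond on the edge node itself, no network round trip
    print(f"local action for reading {reading:.1f}")

def send_to_hub(summary: dict) -> None:
    # slow path: upload an aggregate to the data center; timing is not critical
    print(f"uploading summary: {summary}")

buffer = deque()
for _ in range(500):                      # stand-in for a long-running loop
    reading = read_sensor()
    if reading > THRESHOLD:
        actuate_locally(reading)          # latency-sensitive work stays local
    buffer.append((time.time(), reading))
    if len(buffer) >= BATCH_SIZE:
        send_to_hub({"count": len(buffer),
                     "max": max(r for _, r in buffer)})
        buffer.clear()
```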

Another is to have insight into an edge application’s health. However, given the complexity of distributed applications, it can be hard to access real-time telemetry. This can impair troubleshooting and slow down root cause analysis.

What’s needed is a modern observability solution that provides visibility into the workings of complex edge applications. Such a solution could help staff make data-driven decisions and reduce the time to investigate operational issues.
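To give a sense of the kind of instrumentation such a solution consumes, the sketch below uses the OpenTelemetry Python SDK (pip install opentelemetry-sdk) to wrap the checkout flow described earlier in spans, so a slow third-party payment gateway stands out during root cause analysis. The span names and simulated delays are illustrative, and the console exporter stands in for whichever observability backend a team actually uses.

```python
# Minimal tracing sketch: each stage of a checkout becomes a span, so a slow
# external dependency shows up as an obviously long child span.
import time

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("edge.checkout.demo")

def process_payment() -> None:
    with tracer.start_as_current_span("checkout"):
        with tracer.start_as_current_span("validate-cart"):
            time.sleep(0.01)
        with tracer.start_as_current_span("payment-gateway-call"):
            time.sleep(0.2)   # simulated slow third-party gateway

process_payment()
```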


About Salvatore Salamone

Salvatore Salamone is a physicist by training who has been writing about science and information technology for more than 30 years. During that time, he has been a senior or executive editor at many industry-leading publications, including High Technology, Network World, Byte Magazine, Data Communications, LAN Times, InternetWeek, Bio-IT World, and Lightwave, The Journal of Fiber Optics. He is also the author of three business technology books.
