Why Edge Computing Is Here to Stay: Five Use Cases


Data loses its value when it can’t be analyzed fast enough. Edge computing and analytics can solve this challenge for enterprises ranging from oil and gas producers to banks and retailers.

Security cameras, phones, machine sensors, thermostats, cars and televisions are just a few of the items in daily use that create data that can be mined and analyzed. Add to it the data created at retail stores, manufacturing plants, financial institutions, oil and gas drilling platforms, pipelines and processing plants, and it’s not hard to understand that the deluge of streaming and Internet of Things (IoT) sensor data can — and will — very quickly overwhelm today’s traditional data analytics tools.

Organizations are beginning to look at edge computing as the answer. Edge computing puts micro data centers, or even small, purpose-built high-performance data analytics machines, in remote offices and locations to gain real-time insights from the data collected, or to promote data thinning at the edge by dramatically reducing the amount of data that must be transmitted to a central data center. By eliminating the movement of unnecessary data, analytics at the edge can simplify and drastically speed analysis while also cutting costs.
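In practice, data thinning at the edge can be as simple as summarizing a window of raw readings locally and forwarding only the summary plus anything out of range. Here is a minimal illustrative sketch; the record format and the acceptable range are assumptions for the example, not taken from any particular platform:

```python
# Hypothetical edge-side data thinning: reduce a window of raw sensor
# readings to one compact payload before transmitting upstream.

NORMAL_RANGE = (10.0, 90.0)  # assumed acceptable sensor range for this example

def thin(window):
    """Summarize a window of readings; keep raw values only for outliers."""
    outliers = [r for r in window if not (NORMAL_RANGE[0] <= r <= NORMAL_RANGE[1])]
    return {
        "count": len(window),
        "min": min(window),
        "max": max(window),
        "mean": sum(window) / len(window),
        "outliers": outliers,  # only anomalous raw values travel upstream
    }

payload = thin([42.0, 55.5, 91.2, 47.3, 8.9, 60.1])
# Six raw readings shrink to one summary record; 91.2 and 8.9 are flagged.
```

The central data center then receives a handful of fields per window instead of every reading, which is the cost and bandwidth saving the approach depends on.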

Why Data Is More Valuable at the Edge

Much like the time value of money, the time value of data means that the data you have this second won’t mean as much a week, a day, or even an hour from now. This, coupled with the proliferation of IoT sensor, social, and other streaming data, is driving organizations to use edge computing to provide the real-time analytics that affect the bottom line or, in some cases, stop a disaster before it starts.

Organizations are currently reliant on large and complex clusters for data analytics, and these clusters are rife with bottlenecks, including data transport, indexing, and extract, transform, and load (ETL) processes. While centralized infrastructures work for analyses that rely on static or historical data, many of today’s organizations need fast, actionable insight from correlating newly obtained information with legacy information in order to gain and maintain a strong competitive advantage.

An increasing amount of data is most valuable in the seconds after it’s collected—consider a fraudster or hacker accessing accounts—but that value evaporates in the time it takes to move the data to a centralized data center or upload it to the cloud. Losing value from data because of slow decisions is not acceptable, especially when an edge computing platform that eliminates data movement can provide the near-instant intelligence needed. Organizations cannot afford to wait days, weeks, or even months for insights. With data analytics at the edge, they do not have to.

Five Use Cases for Edge Computing

While we do know that edge computing is a trend on the rise, we don’t know yet how far-reaching it will be. But we do expect it to become an integral part of a more robust and productive data analytics infrastructure. Organizations in several industries can benefit from the results of real-time edge analytics, including:

  • IoT sensor data monitoring and analysis: IoT sensors are already creating massive amounts of data, and with the number of sensors collecting data growing, data volume is set to continue growing exponentially. Moving data analytics to the edge with a platform that can analyze batch and streaming data simultaneously enables organizations to speed and simplify analytics and get the insights they need, right where they need them.
  • Retail customer behavior analysis: Brick-and-mortar stores are looking for any competitive advantage they can get over web-based retailers, and near-instant edge analytics—where sales data, images, coupons used, traffic patterns, and videos are created—provides unprecedented insights into consumer behavior. This intelligence can help retailers better target merchandise, sales, and promotions, and help redesign store layouts and product placement to improve the customer experience. (One way this is accomplished is through edge devices such as beacons, which can collect information such as transaction history from a customer’s smartphone, then target promotions and sales items as customers walk through the store.)
  • Mobile data thinning: Mobile data, much like IoT sensor data, is being created with increasing rapidity, but the downside of all of this data is that some, or even most, of it isn’t needed to answer data analysis queries. How can a company home in quickly on which ad is having the most impact while eliminating the “noise” of other social activity happening around it? Edge analytics enables companies to better understand their data and analytics needs, thinning big data so that only the information relevant to the query is processed and analyzed.
  • Compliance analysis at financial branch locations: One major function of data analysis at financial institutions is to find, and stop, non-compliant transactions. When organizations have to take the time to move data back to the central data center or upload it to a cloud computing architecture for processing and analysis, the lag time decreases the value of the data. Using micro data centers in financial institution branches enables analytics to happen in real time, meaning that non-compliant transactions are caught and stopped much more quickly, which can have a real and positive impact on the bottom line. Gartner has analyzed the emergence of micro data centers.
  • Remote monitoring and analysis for oil and gas operations: Edge computing for manufacturing and oil and gas operations can mean the difference between normal operations and a disaster. Today’s traditional centralized data analytics infrastructures can tell you what caused downtime, or in the very real case of these types of operations, an explosion—but only after the fact. Near-instant analysis at the site, as the data is being created, can help these organizations see the signs of a disaster and take measures to prevent a catastrophe before it starts.
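For the monitoring cases above, the “signs of a disaster” are often readings that drift sharply away from their recent norm. One common edge-analytics pattern is a rolling-window deviation check; the sketch below is illustrative only, with an assumed window size and tolerance rather than any vendor’s implementation:

```python
from collections import deque

class DriftDetector:
    """Flag a reading that deviates sharply from the recent rolling mean.

    Illustrative edge-analytics sketch: the window size (5 readings) and
    tolerance (20% deviation) are assumptions chosen for the example.
    """
    def __init__(self, window=5, tolerance=0.2):
        self.history = deque(maxlen=window)
        self.tolerance = tolerance  # allowed fractional deviation from the mean

    def check(self, reading):
        alert = False
        if len(self.history) == self.history.maxlen:
            mean = sum(self.history) / len(self.history)
            if abs(reading - mean) > self.tolerance * mean:
                alert = True  # more than 20% away from the recent norm
        self.history.append(reading)
        return alert

detector = DriftDetector()
readings = [100, 101, 99, 100, 102, 101, 135]  # e.g. pressure; the last spikes
alerts = [detector.check(r) for r in readings]
# Only the final spike trips the detector.
```

Because the check needs only the last few readings, it runs comfortably on a small machine at the site, and only the alert, not the raw stream, needs to travel to the central data center.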


Edge Computing Architecture

Edge analytics is easier than ever with the new high-performance platforms being engineered. These micro data centers in field and remote locations use a fraction of the space, power, and cost of a traditional analytics infrastructure, yet deliver massive performance gains. It’s a win/win: insights come faster than ever before, and the lower operational, power, and administrative overhead of running the systems cuts costs.

Traditional analytics clusters will not work at the edge: their space, power, cooling, and other operational costs make them prohibitive, and they do not provide the speed or simplicity needed to make edge analytics a feasible undertaking. Innovation in both hardware and software has now produced platforms that meet the operational and performance requirements for a micro data center to be a viable option.

Businesses are looking past the x86 clustered processing architectures that have stymied innovation in real-time analytics and toward accelerated systems that provide the size, speed, and performance needed. These systems use hybrid computing technology, seamlessly integrating diverse computing technologies, whether x86, GPU, or FPGA, or any combination thereof. They are extremely compact and require very little power, yet still deliver performance several orders of magnitude greater than today’s traditional systems. In other edge scenarios, where resources may not be at a premium, these same versatile accelerated systems can complement incumbent infrastructures, making existing edge clusters function better.

A Shift in Corporate Thinking

Keeping data analytics at the edge does require a shift in corporate thinking, but the benefits far outweigh the cost of the transition. Scaling back central data analytics infrastructures to handle non-time-sensitive analysis while installing cost-efficient platforms purpose-built for edge analytics can have a real impact on an organization’s budget. Additionally, avoiding latency to get near-instant insights is the brass ring of data analytics, and eliminating the time needed to transport data to and from the edge is a major step toward achieving it.

With growing and increasingly dispersed sources of information and the pace of organizational change accelerating, the ability to analyze historical data alongside social, IoT sensor, and other streaming data in real time is invaluable. It’s a bold yet fair statement that organizations adopting edge computing to solve these problems will have a significant advantage over their competitors.





About Patrick McGarry

Patrick McGarry brings extensive technology and leadership experience in hardware and software engineering. He is currently Vice President of Engineering at Ryft.
