This post was produced in partnership with Hazelcast.

Stream Data from the Network Edge for Insight-Driven Advantage


As the number of edge devices grows and enterprises push more processing to the network edge, data streaming solutions will become increasingly common among, and essential for, data-driven organizations.

When I meet new people in my travels, nobody is too surprised that as a San Francisco Bay Area resident, I work in the software industry. But when I tell them that my company makes solutions for high-performance data processing applications such as data streaming, their eyes light up in recognition. And then they invariably ask, “Like Netflix?”

I then give a quick explanation of what streaming means to my industry. I realize that data streaming isn't well understood yet. Ironically, I talked a lot about data streaming years ago when I worked at a search engine software company, so the term isn't new. Unfortunately, the video streaming business hijacked the term, and now many people associate streaming with movies and music. But that's okay: given how rapidly the technology world is changing, and what streaming can mean to businesses, the confusion around terminology won't be a big deal for long.

Enterprises and organizations are generating huge volumes of information from new digital sources, such as Internet of Things (IoT) sensors, health monitors, industrial automation devices, and smart meters. Both the number of sources and the volume of data they produce grow every month. Data is produced continually, resulting in the transfer of vast volumes of information that could clog even the largest pipelines.

Most of these data sources reside at the network edge. In many environments, such as oil fields or remote manufacturing locations, it is impractical to process data where it is created. Perhaps processing power is insufficient or the environment is unsuitable for sensitive computer equipment. Even if companies wanted to send time-sensitive data from the edge to the data center for processing, poor network latency often creates bottlenecks.

Instant Insights, Smarter Operations

That’s where data streaming can help. Modern streaming solutions are based on a powerful technology foundation that is designed to process data right at the edge.

They can rapidly ingest, categorize, and process huge volumes of data from multiple sources. Or they can identify the essential data that must be transmitted to the data center, and send only that for processing—instead of the entire data stream. Often these solutions include intelligent technologies such as machine learning, which automates the process of deriving insights from huge volumes of data.
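
To make that concrete, here is a minimal sketch of an edge-side filtering job, written against the Hazelcast Jet 4.x pipeline API. The simulated source, the one-in-ten filter rule, and the logger sink are placeholders for a real sensor feed and a real data-center connector.

```java
import com.hazelcast.jet.Jet;
import com.hazelcast.jet.JetInstance;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.test.TestSources;

public class EdgeFilterJob {
    public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.readFrom(TestSources.itemStream(100))       // placeholder: simulated sensor events
         .withIngestionTimestamps()
         .filter(event -> event.sequence() % 10 == 0) // placeholder rule: forward 1 event in 10
         .writeTo(Sinks.logger());                    // placeholder for a data-center sink

        // Run the job on a local Jet node; at the edge this would join a small cluster.
        // Streaming jobs run until cancelled, so join() blocks here.
        JetInstance jet = Jet.newJetInstance();
        jet.newJob(p).join();
    }
}
```

In a real deployment, the filter would encode whatever "essential" means for the workload, so only that reduced stream ever leaves the edge.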

Together, edge computing and machine learning can help data streaming users run analytics for instant insight. They can spot conditions that signal impending failure and trigger maintenance or repairs.
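
As an illustration of that kind of condition spotting, here is a sketch, again against the Jet 4.x pipeline API, that averages simulated sensor readings over ten-second windows and flags any window whose average crosses a threshold. The random source, the 0.8 threshold, and the alert message are all hypothetical.

```java
import com.hazelcast.jet.Jet;
import com.hazelcast.jet.JetInstance;
import com.hazelcast.jet.aggregate.AggregateOperations;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;
import com.hazelcast.jet.pipeline.WindowDefinition;
import com.hazelcast.jet.pipeline.test.TestSources;

public class VibrationAlertJob {
    static final double ALERT_THRESHOLD = 0.8; // hypothetical vibration limit

    public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.readFrom(TestSources.itemStream(50,
                        (ts, seq) -> Math.random()))   // stand-in for a real sensor reading
         .withIngestionTimestamps()
         .window(WindowDefinition.tumbling(10_000))    // 10-second tumbling windows
         .aggregate(AggregateOperations.averagingDouble(r -> r))
         .filter(w -> w.result() > ALERT_THRESHOLD)    // keep only out-of-range windows
         .writeTo(Sinks.logger(w -> "ALERT: average vibration " + w.result()));

        JetInstance jet = Jet.newJetInstance();
        jet.newJob(p).join();
    }
}
```

A trained model could replace the fixed threshold, but the shape of the pipeline, namely window, aggregate, and flag, stays the same.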

I’ve seen some inspiring examples of how data streaming can help deliver valuable insights that help enterprises work faster, make better decisions, or even introduce innovative new solutions that offer competitive advantage, such as:

● An oil and gas company that monitors drill head stability, reducing equipment downtime and cutting the time needed to reach true vertical depth of a well by 10 percent
● A hospital that improves the outcomes of robotically assisted surgery by getting instant feedback on cameras, robotic devices, and patient monitors
● A media company that combines data from customers’ set-top boxes with chatbots to provide service representatives with the information needed to address customer calls and optimize the service experience

Essential Technology Features

It’s still early days for the application of data streaming. The companies deploying solutions today typically see a specialized opportunity that the technology can address. But as the number of edge devices grows and enterprises push more processing to the network edge, powerful data streaming solutions will become increasingly common among, and essential for, data-driven organizations.

The trick is to find the right technology: solutions that support ultra-fast processing at extreme scale; enable run-anywhere, portable edge computing; and scale elastically and seamlessly. When people ask me how to choose a technology vendor, I tell them to look for solutions that address not only their current needs but also chart a path for future growth. After all, you always need to plan ahead so you don’t find yourself pigeonholed into a limited infrastructure. Look for a high-speed yet lightweight stream processing engine with an integrated in-memory computing platform: that combination lets you get the most out of your hardware for application speed and scale, supports low-latency batch and stream processing in any environment, and offers scalable storage so you can cache data from third-party systems.

You probably won’t be surprised when I say that Hazelcast has the best solution for streaming data from the network edge. The Hazelcast In-Memory Computing Platform consists of Hazelcast IMDG, the most widely deployed in-memory data grid, and Hazelcast Jet, the most advanced in-memory stream processing solution. Our distributed caching architecture allows you to scale up to hundreds of terabytes and scale out for maximum efficiency when dealing with remote data or edge processing. This solution was built from the ground up for ultra-fast processing at extreme scale.
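
For a taste of the caching side, here is a minimal sketch assuming the Hazelcast IMDG 4.x API: a distributed map acts as a shared cache that any cluster member can read and write. The map name and the sample entry are placeholders.

```java
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;
import com.hazelcast.map.IMap;

public class EdgeCache {
    public static void main(String[] args) {
        // Start (or join) a Hazelcast IMDG cluster member.
        HazelcastInstance hz = Hazelcast.newHazelcastInstance();

        // A distributed map acts as a shared, scalable cache; the name is arbitrary.
        IMap<String, Double> readings = hz.getMap("sensor-readings");

        readings.put("drill-head-42", 0.73);                // cache a reading from another system
        System.out.println(readings.get("drill-head-42"));  // any cluster member can read it back

        hz.shutdown();
    }
}
```

Because entries are partitioned across cluster members, adding nodes scales both capacity and throughput, which is what makes scaling out for edge processing an architectural property rather than a single-node limit.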

The Hazelcast data streaming solution uses Intel® Optane™ DC Persistent Memory technology to deliver in-memory processing as an affordable alternative to solutions based solely on random-access memory (RAM). The Optane modules can run in volatile memory mode, which offers comparable speed at a lower cost and with higher capacities, or in persistent mode as a faster alternative to solid-state drives, a mode Hazelcast uses for faster recovery after an outage. Either way, you get the power and performance of in-memory technology, which can make data streaming a reality for your business.

Data streaming might not be as recognizable as Netflix. But for the companies that experience safer operations, greater responsiveness, higher performance, and enhanced insight, the benefits of data streaming are better than any movie night.

To learn more about the benefits of data streaming and how Hazelcast technology can help you get started developing a use case, read our special report, Streaming Data from the Network Edge: How In-Memory Technologies and Machine Learning Catalyze Innovation. Find out more about the architectural considerations of data streaming in our e-book, Stream Processing: Instant Insight Into Data As It Flows.


About Dale Kim

Dale Kim is the Senior Director of Technical Solutions at Hazelcast and is responsible for product and go-to-market strategy for the in-memory computing platform. His background includes technical and management roles at IT companies in areas such as relational databases, search, content management, NoSQL, Hadoop/Spark, and big data analytics. Dale holds an MBA from Santa Clara and a BA in computer science from Berkeley.
