Sponsored by Nstream

Streaming Data, Digital Twins, and Overcoming the Tyranny of Averages


Nstream helps businesses build applications that use streaming data to enable sophisticated business logic, which, in turn, has a valuable impact on the business.

Businesses have long used analytics to help make decisions and take actions. The insights gleaned from those analytics were based on historical data and the law of averages. Increasingly, businesses need more to stay competitive. They need real-time insights put into context and the ability to highly personalize the actions they take.

Fortunately, such capabilities are possible due to new technologies, including analysis that incorporates streaming data, digital twins, the development and use of streaming applications, and more. Unfortunately, many businesses find implementing these technologies challenging and beyond their skill sets.

Ready to Complete the Data Pipeline with Streaming Applications? [Visit Nstream.io]

Why the status quo must go

The trouble with traditional approaches is that businesses end up suffering what some call the tyranny of averages, where actions are taken based on past norms. For example, an apartment leasing company might set rental rates for the year based on last year’s rates. In other words, the business acts on what the data says happens “on average” instead of on what is actually happening right now, when action can still be taken.

In this example, a more sophisticated approach would be to factor in other data (average occupancy rates in the area, time of year, length of lease, number of simultaneous applications at this moment, etc.) to dynamically set rates for each applicant based on real-time market conditions.
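
To make that concrete, here is a minimal sketch (in Java) of how such a rate might be computed from live signals rather than last year’s average. The factors, weights, and thresholds are illustrative assumptions, not a prescribed pricing model.

```java
// Illustrative sketch only: the factors and weights below are assumptions,
// not a prescribed pricing model.
public final class DynamicRatePricer {

    /**
     * Adjusts a base rate using real-time signals instead of last year's average.
     *
     * @param baseRate         last year's rate, used only as a starting point
     * @param areaOccupancy    current average occupancy in the area (0.0 - 1.0)
     * @param seasonalFactor   e.g. 1.10 in peak leasing season, 0.95 off-season
     * @param leaseMonths      requested lease length in months
     * @param openApplications number of simultaneous applications right now
     */
    public static double quote(double baseRate,
                               double areaOccupancy,
                               double seasonalFactor,
                               int leaseMonths,
                               int openApplications) {
        double rate = baseRate * seasonalFactor;

        // A tight market (high occupancy) supports a higher rate; a soft market lowers it.
        rate *= 0.9 + 0.2 * areaOccupancy;

        // Longer leases reduce turnover risk, so offer a small discount.
        if (leaseMonths >= 12) {
            rate *= 0.97;
        }

        // Live demand signal: several competing applications at this moment.
        rate *= 1.0 + Math.min(openApplications, 5) * 0.01;

        return Math.round(rate * 100.0) / 100.0;
    }

    public static void main(String[] args) {
        // 95% occupancy, peak season, 12-month lease, 3 competing applications.
        System.out.println(quote(1500.0, 0.95, 1.10, 12, 3));
    }
}
```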

Regardless of the industry or application, the tyranny of averages is detrimental in modern business environments. For example, using averages, a maintenance organization would replace a part based on its mean lifetime. There are two problems with that. First, if the part fails before its average lifetime, it could disrupt a production line and incur added expenses to rush both a technician and the part to the machine’s location. That can be astronomically expensive if we’re talking, for example, about a drill on an isolated oil exploration platform in the middle of the ocean.

Second, replacing a part “on schedule” makes inefficient use of whatever extended life it might have beyond the average. That is an expense that could be deferred if the part were left in place for its real lifetime.

Retailers that act on average customer purchasing patterns suffer similar issues. Sending every customer the same promotion is wasteful when a targeted one, based on their specific buying patterns, would be much more effective. Increasingly, it is also important to time promotions and recommendations to a specific customer’s needs at a particular moment. A directed pitch while a customer dwells over a specific product on a website is far more effective than an offer built on a pool of customers’ average behavior.

Financial services organizations face the same issues. They need to combine real-time information about the market and the economy with a specific customer’s needs at that moment. It is no longer enough to offer everyone the same loans at the same rates based on averages for a class of customers.

Another field where the tyranny of averages is problematic is security. New threats and new attacks get missed if a business or a tool is looking for repeat patterns identified by analyzing past attacks. There needs to be a way to spot anomalies that are precursors to an attack.

What modern businesses in every field need is the ability to identify outliers and to individualize and personalize decisions and actions.

New thinking for old problems

One way to move beyond old, average-based analysis is to use streaming data to derive contextual insights in real time.

Why? Businesses need to understand what is going on and what can be done about it. They want to be able to make recommendations or suggestions about how to operate more optimally, productively, or cost-effectively. That’s where streaming data and context come in. Streaming data essentially represents what’s happening now or, more specifically, what’s changing now. Businesses then need to put that information into context. Is this good or bad, expected or unexpected? If it’s “bad,” they want to understand what can be done to improve the situation. If it’s “good,” how can the organization optimize or capitalize on the information?
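
As a simple illustration of putting a streamed value into context, the sketch below judges each new reading against an entity’s own recent behavior rather than a fixed, population-wide average. The window size and the three-sigma rule are assumptions chosen purely for illustration.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Minimal sketch: keep a short rolling window per entity and judge each new
// reading against that entity's own recent behavior. The window size and the
// 3-sigma threshold are illustrative assumptions.
public final class ContextualMonitor {
    private final Deque<Double> window = new ArrayDeque<>();
    private final int capacity;

    public ContextualMonitor(int capacity) {
        this.capacity = capacity;
    }

    /** Returns a human-readable judgment of the latest reading in context. */
    public String observe(double reading) {
        String judgment = "as expected";
        if (window.size() >= capacity) {
            double mean = window.stream().mapToDouble(Double::doubleValue).average().orElse(reading);
            double variance = window.stream().mapToDouble(v -> (v - mean) * (v - mean)).average().orElse(0.0);
            double stdDev = Math.sqrt(variance);
            if (Math.abs(reading - mean) > 3 * stdDev) {
                judgment = reading > mean ? "unexpectedly high: investigate or throttle"
                                          : "unexpectedly low: investigate or capitalize";
            }
            window.removeFirst();
        }
        window.addLast(reading);
        return judgment;
    }

    public static void main(String[] args) {
        ContextualMonitor monitor = new ContextualMonitor(20);
        for (int i = 0; i < 20; i++) {
            monitor.observe(100 + Math.random());   // normal readings
        }
        System.out.println(monitor.observe(250.0)); // an outlier, flagged in context
    }
}
```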

In a conversation with RTInsights, Chris Sachs, founder and CTO at Nstream (formerly Swim), put how this differs from traditional approaches into perspective:

“Big data, which came before, solved the problem of how to do the coarse steering of the enterprise on a quarter-by-quarter basis. Big data is very effective at directing or steering these large enterprise ships. What big data is not great at is the micro-adjustments, the small adjustments on an hour-by-hour, minute-by-minute, or even second-by-second basis.

That’s where streaming data enters the equation. Big data does the coarse steering. The goal of streaming data is to provide that fine steering, which is that next level of optimization and automation that businesses are looking for.”

There are many areas where such information is valuable. The approach can be used to detect credit card fraud or to support customer 360 efforts. One use case is that of a major retailer looking to reconcile the movement of various entities along the supply chain (truck, package, inventory, delivery time, customer notification, etc.).

Another use case is the idea of an “infrastructure state machine,” where the infrastructure could be the power grid, EV charging networks, wireless networks, retail inventory, supply chain inventory, or something else. There are many solutions for collecting “metrics” about these things. But most do not put those metrics in context, integrate them to construct the current state of the system, interpret what the integrated picture means, and finally apply automated business logic so the organization can take action more efficiently and quickly.
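
The sketch below illustrates the state-machine idea with a hypothetical EV charger: raw metrics are folded into a current state per asset, and business logic fires on state transitions rather than on individual metric samples. The states, thresholds, and actions are assumptions for illustration only.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Conceptual sketch of an "infrastructure state machine": metrics are
// integrated into a per-asset state, and business logic runs on transitions,
// not on every raw sample. States, thresholds, and actions are illustrative.
public final class ChargerStateMachine {

    enum State { AVAILABLE, CHARGING, FAULTED }

    private final Map<String, State> current = new ConcurrentHashMap<>();

    /** Fold a new metric sample into the asset's state and act on transitions. */
    public void onMetric(String chargerId, double amps, boolean faultFlag) {
        State next = faultFlag ? State.FAULTED
                   : amps > 1.0 ? State.CHARGING
                   : State.AVAILABLE;

        State previous = current.put(chargerId, next);
        if (previous != next) {
            applyBusinessLogic(chargerId, previous, next);
        }
    }

    private void applyBusinessLogic(String chargerId, State from, State to) {
        if (to == State.FAULTED) {
            // e.g. open a maintenance ticket and reroute drivers automatically
            System.out.println("Dispatch technician to " + chargerId);
        } else if (from == State.FAULTED) {
            System.out.println(chargerId + " recovered; return it to the routing pool");
        }
    }

    public static void main(String[] args) {
        ChargerStateMachine grid = new ChargerStateMachine();
        grid.onMetric("charger-17", 32.0, false); // starts charging
        grid.onMetric("charger-17", 0.0, true);   // fault detected -> action fires
    }
}
```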

Enter digital twins

Increasingly, businesses are using digital twin technology to take the insights from streaming data and its context and act on an individual basis. These digital twins are a different notion from the digital twins that have been around for a long time, which originally arose in manufacturing environments.

Here, we’re talking about a much more expansive definition of digital twins. Businesses want to create digital twins of any and all of their digital and physical assets, including vehicles, delivery routes, customers, access points, infrastructure, and much more. And they want to feed those digital twins with streaming data so that they have a live picture of everything that’s going on in their business and how it’s related. In this way, a business could create a real-time picture of a network or a food delivery service based on the business’s needs. Once it has a real-time picture of the entire landscape, it can see what’s happening, find the specific outliers (not the averages), and then take action.
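
A minimal sketch of that pattern: one small, stateful twin per real-world entity, each updated by streaming events, so the business can ask which specific entities are outliers at this moment. The truck example and the lateness threshold are assumptions for illustration, not a prescribed model.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Minimal sketch: one small stateful "twin" per real-world entity, updated by
// streaming events, so the business can ask which specific entities are
// outliers right now instead of reasoning about a fleet-wide average.
// The Truck entity and the lateness threshold are illustrative assumptions.
public final class TwinRegistry {

    static final class TruckTwin {
        final String truckId;
        double minutesBehindSchedule;

        TruckTwin(String truckId) { this.truckId = truckId; }
    }

    private final Map<String, TruckTwin> twins = new HashMap<>();

    /** Each streamed position/ETA update lands on exactly one twin. */
    public void onEtaUpdate(String truckId, double minutesBehindSchedule) {
        twins.computeIfAbsent(truckId, TruckTwin::new)
             .minutesBehindSchedule = minutesBehindSchedule;
    }

    /** The live picture: which specific trucks need attention at this moment? */
    public List<String> outliers(double thresholdMinutes) {
        List<String> late = new ArrayList<>();
        for (TruckTwin twin : twins.values()) {
            if (twin.minutesBehindSchedule > thresholdMinutes) {
                late.add(twin.truckId);
            }
        }
        return late;
    }

    public static void main(String[] args) {
        TwinRegistry fleet = new TwinRegistry();
        fleet.onEtaUpdate("truck-42", 5.0);
        fleet.onEtaUpdate("truck-77", 38.0);
        System.out.println(fleet.outliers(30.0)); // [truck-77] -> notify those customers
    }
}
```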

What’s needed for success?

These are all areas where Nstream and its platform can help. Built by developers for developers, Nstream enables developers to create domain-specific applications that unify static and dynamic data from message brokers, databases, and other data sources. With Nstream, they can create smart, interlinked real-world models—Web Agents—which power extremely responsive, scalable, and efficient continuous intelligence applications. And most importantly, Nstream lets developers build an application that is specific to their business.

Nstream Web Agents efficiently transform streaming, contextual, and historical data into real-time insights while reducing application complexity and taking advantage of the available compute resources closest to the data sources.

That lets businesses model the real-time state and context of every individual business entity with active digital twins called Web Agents. Nstream allows businesses to feed data and context into Web Agents from multiple in-motion and at-rest data sources. It also lets them execute arbitrary business logic on real-time state changes to any Web Agent, deriving new states, computing real-time analytics, or taking automated action in the process.
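
Nstream builds on the open-source swimOS runtime, where a Web Agent is typically expressed as a Java class whose lanes hold real-time state and run callbacks on every change. The sketch below is written in that style; the agent name, the “speed” lane, and the alert rule are illustrative assumptions, so consult Nstream’s documentation for actual application structure.

```java
import swim.api.SwimLane;
import swim.api.agent.AbstractAgent;
import swim.api.lane.ValueLane;

// Illustrative sketch in the style of the open-source swimOS Java API that
// Nstream builds on. The agent name, the "speed" lane, and the alert rule are
// assumptions for illustration; see Nstream's documentation for real usage.
public class VehicleAgent extends AbstractAgent {

  // A lane holds one piece of the twin's real-time state; other applications
  // can link to it and receive every change as a stream.
  @SwimLane("speed")
  ValueLane<Double> speed = this.<Double>valueLane()
      .didSet((newSpeed, oldSpeed) -> {
        // Arbitrary business logic runs on each state change.
        if (newSpeed != null && newSpeed > 75.0) {
          System.out.println(nodeUri() + " exceeded the speed limit");
        }
      });
}
```

Downstream applications can then link to individual lanes like this one and receive only the changes they care about, which is the multiplexed streaming behavior described next.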

Nstream lets businesses stream real-time state changes to other applications and enterprise systems via multiplexed streaming APIs, so applications can stream just the data they’re interested in without drowning in a firehose of irrelevant events. The solution also helps businesses visualize the real-time state of an entire system for automation oversight and for triaging anomalous situations that the automation can’t remediate on its own.

The bottom line is that Nstream provides the ability to build applications that use streaming data to enable sophisticated business logic, which, in turn, has a valuable impact on the business.

Ready to Complete the Data Pipeline with Streaming Applications? [Visit Nstream.io]

About Salvatore Salamone

Salvatore Salamone is a physicist by training who has been writing about science and information technology for more than 30 years. During that time, he has been a senior or executive editor at many industry-leading publications including High Technology, Network World, Byte Magazine, Data Communications, LAN Times, InternetWeek, Bio-IT World, and Lightwave, The Journal of Fiber Optics. He also is the author of three business technology books.
