Sponsored by Vantiq
Center for Real-Time Applications Development

Real-time Squared: The New Norm


Real-time systems must be updated and modified in real-time as soon as an issue is identified. Hence the need for real-time squared.

Real-time systems are characterized by their ability to size up events and react instantly to changes. But another attribute is becoming necessary due to the demand for accurate predictions in an unpredictable world: real-time applications and systems themselves must adjust in real-time to compensate for disruptions that invalidate the underlying data and assumptions used to create them.

The issue is getting extra attention these days due to major business disruptions brought on by the COVID-19 pandemic. Many predictive models used in financial services, retail/eCommerce, supply chain management, logistics, and more must be tested and vetted to ensure the assumptions used to make decisions are still accurate. If a model was developed using a certain dataset and that data changes, what would the implications be on the prediction? Also, what if the wrong analytics algorithms or machine learning models are used to make the predictions?

Take a financial services model that tries to predict customer payment waiver requests. The number of customers requesting one, two, or three waivers normally fits certain distribution patterns. Do these patterns hold up under this year's economic conditions, which have seen more than 40 million people file unemployment claims in the U.S. and global stock markets suffer dramatic falls?

COVID-19's impact on retail and eCommerce has thrown trusted models out the window. Recommendation engines and customer engagement services must now deal with a customer base that has shifted from a well-defined mix of brick-and-mortar and online business to one dominated almost exclusively by online.

The impact in other industries has been dramatic, too. A recent Harvard Business Review article on the challenges of predicting consumer demand in today's unpredictable times noted: "Covid-19 has shattered the demand forecasts that guide retailers and suppliers of consumer goods and services in figuring out how much to order or manufacture, where to stock inventory, and how much to advertise or discount."

When such problems occur in real-time systems, the result is bad decisions or wrong conclusions. Even worse, when forecasts break down or models no longer reflect the state of affairs, managers have a natural tendency to revert to gut instinct. Just look at what happened in U.S. grocery stores at the pandemic's start to see why a seat-of-the-pants approach fails. Perhaps grocery chains could have anticipated the explosive demand that led to problems keeping toilet paper and cleaning supplies stocked. But who knew sourdough starter kits and dried beans would be hoarded?

Worse, running operations based on gut instincts, outdated models, or data that is no longer relevant can introduce bias into decision-making processes. Bias is increasingly gaining the attention of regulators.

And in some cases, bias can garner incredibly bad publicity. One example that got global attention came when Apple introduced the Apple Card, which used an algorithm that discriminated against women in credit-scoring evaluations. The issue was raised after Apple co-founder Steve Wozniak and entrepreneur David Heinemeier Hansson received credit limits 10 to 20 times higher than their wives', even though the spouses shared bank accounts and had similar credit ratings.

Real-time system knowledge and transparency are essential

The first step in addressing this issue is to know when a potential problem exists. Too often, applications, automated processes, and systems are considered immutable and infallible. We certainly know that is not the case in all circumstances.

The question becomes: how do you find out if the data, models, and assumptions used to create an application or system are flawed or outdated? There needs to be a way to evaluate the validity and quality of the app or system to instill confidence in its output or actions. Businesses must deeply examine the data, models, and assumptions that underpin their real-time applications and systems to determine which elements and business rules to update.

What can help? The first thing is to stop treating real-time systems and applications like black boxes. The issue is coming up more and more in systems that use sophisticated analytics for forecasts and automation decisions. Businesses find they must justify outcomes and decisions. Accomplishing this requires extensive documentation about which data and algorithms are used, how they are used, and more. Adoption of approaches based on explainable or humble AI is becoming more common in use cases where AI is involved.
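One low-tech way to move away from black boxes is to record enough context with every automated decision to justify it later. The sketch below shows the idea; the field names, model version string, and scoring inputs are illustrative assumptions, not any specific product's API.

```python
# Sketch of a decision audit record: capture which model version ran,
# the inputs it saw, and the score and decision it produced, so the
# outcome can be explained and justified afterward.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class DecisionRecord:
    model_version: str   # which model/algorithm made the call
    features: dict       # the input data used
    score: float         # the model's raw output
    decision: str        # the resulting action
    timestamp: str       # when the decision was made (UTC)

def log_decision(model_version, features, score, decision):
    """Serialize one automated decision for an audit trail."""
    record = DecisionRecord(
        model_version=model_version,
        features=features,
        score=score,
        decision=decision,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
    # In practice this line would append to a durable audit store.
    return json.dumps(asdict(record))

entry = log_decision(
    "credit-model-2020.06",                     # hypothetical version tag
    {"income": 85000, "utilization": 0.31},     # hypothetical features
    0.87,
    "approve",
)
```

With records like this on hand, a business can answer "which data and algorithm produced this outcome?" without reverse-engineering the system.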

Systems must also be constantly evaluated and their outcomes tested. Is there model drift? Is the data outdated? Have the core business assumptions and rules changed? Are any of these factors affecting the accuracy of forecasts or the validity of decisions?
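One common way to check whether the data a model sees today still resembles the data it was trained on is a two-sample statistical test per feature. The sketch below uses SciPy's Kolmogorov-Smirnov test; the data, threshold, and function name are illustrative, and real monitoring would cover many features and account for repeated testing.

```python
# Minimal sketch of a data-drift check: compare a feature's training-era
# distribution against recent production values with a two-sample
# Kolmogorov-Smirnov test. Alpha and sample sizes are illustrative.
import numpy as np
from scipy.stats import ks_2samp

def feature_drifted(train_values, live_values, alpha=0.01):
    """Return True if the live distribution differs significantly
    from the training distribution."""
    statistic, p_value = ks_2samp(train_values, live_values)
    return bool(p_value < alpha)

rng = np.random.default_rng(42)
baseline = rng.normal(loc=0.0, scale=1.0, size=5000)  # training-era data
shifted = rng.normal(loc=1.5, scale=1.0, size=5000)   # post-disruption data

same_regime = feature_drifted(baseline, baseline[:2500])  # no drift expected
new_regime = feature_drifted(baseline, shifted)           # drift expected
```

A check like this can run on a schedule; a flagged feature then triggers the model review and retraining discussed next.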

Make continuous adjustments

Once issues are identified, applications and systems must be updated or refreshed. The process may involve using different algorithms, new datasets, more extensive model training, or something else.

Here is where that second "real-time" in real-time squared comes in. The systems and applications must be changed not in a week or a month or next year; they must be updated and modified as soon as an issue is identified.

Organizations are doing this through strategies including low-code development methods and continuous integration paired with continuous delivery or continuous deployment (CI/CD). CI/CD's goal has always been to release higher-quality software faster. Now, it plays a greater role in helping to ensure that problems such as dataset changes, model drift, and invalidated assumptions about market conditions are quickly corrected in the application or system.
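In a CI/CD pipeline, the model-update step can be guarded the same way code is: a validation gate that blocks deployment when the candidate model underperforms on recent data. The sketch below is a hypothetical gate with made-up data and an illustrative threshold, not a specific pipeline product's configuration.

```python
# Hypothetical CI/CD deployment gate: fail the build when a candidate
# model's accuracy on a recent holdout set falls below a threshold.
def accuracy(predictions, labels):
    """Fraction of predictions that match the true labels."""
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

def deployment_gate(predictions, labels, min_accuracy=0.90):
    """Return True only when the candidate model is good enough to ship."""
    return accuracy(predictions, labels) >= min_accuracy

# Illustrative holdout: true labels vs. the candidate model's predictions.
labels = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
predictions = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]  # one mismatch out of ten

ships = deployment_gate(predictions, labels)                 # passes at 0.90
blocked = deployment_gate(predictions, labels, min_accuracy=0.95)  # fails
```

Wiring a check like this into every build is what turns "update the model as soon as an issue is identified" from a policy into an automatic step.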

Salvatore Salamone

About Salvatore Salamone

Salvatore Salamone is a physicist by training who has been writing about science and information technology for more than 30 years. During that time, he has been a senior or executive editor at many industry-leading publications including High Technology, Network World, Byte Magazine, Data Communications, LAN Times, InternetWeek, Bio-IT World, and Lightwave, The Journal of Fiber Optics. He also is the author of three business technology books.
