Improving Advanced Analytics for Real-Time Decisioning


Advanced analytics – data mining, machine learning and predictive analytics – have long been established in managing risk and detecting fraud as well as in targeting customers. It used to be that these analytics were applied in batch, updating the database with a risk score or a customer segment based on yesterday’s data. In addition, these analytic models often took months to implement so the models themselves were based on data that might be months out of date.

But today’s world is real-time. Customers want answers in real-time. Supply chains operate in real-time. Fraud and risk are managed in real-time. This means companies must decide how to act in real-time. If they want to use analytics to make data-driven decisions, they must use up-to-the-minute data and analytic models that are fresh and accurate. This means changing how advanced analytic projects are managed and how the results are deployed.

Changes needed in analytics projects

The first change is that advanced analytic projects must be more focused than ever before on their role in decision making. Batch analytics were often semi-detached from the operational processes and systems that drive the business. With a move to real-time, analytics projects can no longer afford this and must be laser-focused on the operational decision making they are going to improve. It's no longer enough to know which metrics the project is trying to "move the dial" on. Now the team needs to understand how the decisions that impact those metrics are made and what the rules, policies, regulations and trade-offs are. Plus, they have to be able to see how their analytic model will plug into this decision-making framework. A model of the decision making is essential to correctly frame the real-time analytics opportunity.
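One way to picture such a decision model is as a dependency graph, DMN-style: the top-level decision depends on sub-decisions, which shows exactly where an analytic model would plug in. This is a minimal illustrative sketch; the decision and input names are invented for the example, not taken from any particular project.

```python
# Hypothetical decision model as a dependency graph: each decision lists the
# sub-decisions and analytic inputs it relies on. Names are illustrative.
DECISION_MODEL = {
    "approve_claim": ["fraud_risk", "policy_coverage"],  # top-level decision
    "fraud_risk": ["claim_history_score"],               # <- analytic model plugs in here
    "policy_coverage": [],                               # pure business rules
    "claim_history_score": [],                           # predictive input
}

def inputs_for(decision, model=DECISION_MODEL):
    """Walk the dependency graph to list everything a decision relies on."""
    deps = []
    for sub in model.get(decision, []):
        deps.append(sub)
        deps.extend(inputs_for(sub, model))
    return deps
```

Mapping the dependencies first makes it clear which sub-decision the new real-time model is meant to improve, and which rules and policies surround it.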

The second change is how fast the analytic models need to be developed. Analytic teams need to pull data from up-to-date sources, combine it with increasingly varied, high-volume, fast-moving "Big Data" sources – and do so quickly. Automated, scalable analytics workbenches are increasingly replacing hand-coded scripts and procedures so that analytic models can be developed faster and remain current once they are completed.
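The difference between a one-off script and a workbench-style pipeline is repeatability: register the data sources once, then rebuild the model on demand. The sketch below is a deliberately simplified assumption – the "model" is just an observed fraud rate per segment – but it shows the re-runnable shape such a pipeline takes.

```python
# Minimal sketch of a repeatable model-building pipeline replacing one-off scripts.
# The data sources and the per-segment rate "model" are simplified assumptions.
from collections import defaultdict

def build_model(records):
    """Fit a trivial model: the observed fraud rate for each customer segment."""
    totals, frauds = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["segment"]] += 1
        frauds[r["segment"]] += r["is_fraud"]
    return {seg: frauds[seg] / totals[seg] for seg in totals}

def refresh_pipeline(sources):
    """Pull from every registered source and rebuild the model in one step."""
    records = [r for source in sources for r in source()]
    return build_model(records)

# Registering sources as callables keeps the pipeline re-runnable on fresh data.
warehouse = lambda: [{"segment": "retail", "is_fraud": 0},
                     {"segment": "retail", "is_fraud": 1}]
stream = lambda: [{"segment": "online", "is_fraud": 0}]
model = refresh_pipeline([warehouse, stream])
```

Because the pipeline is a function of its sources rather than a hand-edited script, rerunning it against today's data is a one-line operation.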

The third change is deployment. Batch analytic models can be deployed in a variety of low-tech ways, such as running scripts against the database to score it or generating database extracts containing the highest-risk or best-value customers. Real-time analytic decision making means pushing the analytic model into a high-performance IT environment, wrapping the analytics with the rules of the business – regulations, policies and best practices – and pushing the whole decision-making component into the SOA/BPM/event-based infrastructure. Decision models help define how this should be done, but standards such as PMML and real-time analytics scoring services are also critical.
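Such a deployable component might look like the hedged sketch below: an analytic scorer wrapped in ordered business rules and exposed as a single callable service. The class, field names and thresholds are illustrative assumptions, not any specific SOA/BPM product's API.

```python
# Illustrative decision service: an analytic model wrapped with business rules
# and exposed as one component. All names and thresholds are invented examples.
class DecisionService:
    def __init__(self, scorer, rules):
        self.scorer = scorer  # analytic model (e.g., one exported via PMML)
        self.rules = rules    # ordered (predicate, action) business rules

    def decide(self, request):
        """Score the request, then apply the business rules in order."""
        score = self.scorer(request)
        for predicate, action in self.rules:
            if predicate(request, score):
                return action
        return "ACCEPT"  # default action if no rule fires

# Assumed example: fraud screening on a payment request.
scorer = lambda req: 0.9 if req["amount"] > 10_000 else 0.1
rules = [
    (lambda req, s: req["country"] == "embargoed", "REJECT"),  # regulation
    (lambda req, s: s > 0.8, "REVIEW"),                        # policy threshold
]
service = DecisionService(scorer, rules)
```

The point of the wrapper is that the calling process – a BPM flow or an event handler – sees one decision, not a raw score it must interpret itself.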

Finally, analytics teams need to think about model refresh and model management. Real-time analytics are not one-time exercises. Fire-and-forget analytics projects are a thing of the past. With rapidly changing streaming data, analytics models need to be continually evaluated for accuracy and be regularly updated. New data from existing data sources, as well as from new data sources, can be fed into automated and semi-automated learning loops to ensure that the model continues to perform well – and even improves.
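A simple form of this continual evaluation is to monitor recent prediction accuracy over a sliding window and flag the model for retraining when it degrades. The sketch below is a minimal assumption-laden example: the window size and accuracy threshold are invented defaults, and a production system would track richer metrics than raw accuracy.

```python
# Sketch of a model-refresh trigger: track accuracy on a sliding window and
# flag the model for retraining when it drops. Window and threshold are
# illustrative assumptions.
from collections import deque

class ModelMonitor:
    def __init__(self, window=100, min_accuracy=0.8):
        self.outcomes = deque(maxlen=window)  # rolling record of hits/misses
        self.min_accuracy = min_accuracy

    def record(self, predicted, actual):
        """Log whether the model's prediction matched the observed outcome."""
        self.outcomes.append(predicted == actual)

    def needs_refresh(self):
        """True once recent accuracy falls below the acceptable floor."""
        if not self.outcomes:
            return False
        accuracy = sum(self.outcomes) / len(self.outcomes)
        return accuracy < self.min_accuracy
```

Feeding observed outcomes back through a monitor like this is the simplest version of the learning loop: the same stream that the model scores also tells you when the model has gone stale.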

Real-time analytics are the future of advanced analytics. Successful analytics teams will think about decision making, tools for rapid model development, deployment into real-time infrastructure, and ongoing updates and refinement.


About James Taylor

James is the CEO of Decision Management Solutions and is a leading expert in how to use business rules and analytic technology to build real-time decision making systems. He is passionate about using decision management to help companies improve decision-making and develop an agile, analytic and adaptive business. He provides strategic consulting to companies of all sizes, working with clients in all sectors to adopt decision-making technology. James is a faculty member of the International Institute for Analytics and is the co-author of The MicroGuide to Process and Decision Modeling in BPMN/DMN: Building More Effective Processes by Integrating Process Modeling with Decision Modeling with Tom Debevoise (2014). He previously wrote Decision Management Systems: A practical guide to using business rules and predictive analytics (IBM Press, 2011) and Smart (Enough) Systems: How to Deliver Competitive Advantage by Automating Hidden Decisions (Prentice Hall) with Neil Raden. James has contributed chapters on Decision Management to multiple books as well as many articles to magazines, and writes a regular blog at JT on EDM. James also delivers webinars, workshops and training. He is a regular keynote speaker at conferences around the world.
