SAP Adds Machine Learning Algorithms to Analytics Cloud


SAP has deployed machine learning algorithms into their Analytics Cloud as part of an effort to deliver real-time analytics.

At the Strata Data Conference this week, SAP announced it has injected an array of machine learning algorithms into SAP Analytics Cloud as part of an effort to deliver real-time analytics that proactively suggest actions to be taken as data is captured.

Rather than waiting for end users to come up with specific queries, SAP Analytics Cloud is now employing machine learning algorithms to continuously provide insights concerning, for example, total spend analysis or customer segmentation.
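To make the idea concrete, here is a minimal sketch of the kind of automated customer segmentation such a system might surface without an end user writing a query. The toy spend figures and the simple one-dimensional k-means below are illustrative assumptions, not SAP's actual data or algorithm.

```python
import random

def segment_spend(values, k=2, iterations=20, seed=0):
    """Cluster scalar spend values into k segments with plain k-means.

    Returns the sorted centroid of each segment, e.g. an average spend
    for a "low spend" group and a "high spend" group when k=2.
    """
    rng = random.Random(seed)
    centroids = rng.sample(values, k)
    for _ in range(iterations):
        # Assign each value to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        # Recompute centroids; keep the old one if a cluster is empty.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Hypothetical monthly spend per customer, in dollars.
spend = [120, 95, 110, 2400, 2600, 130, 2500, 105]
centroids = segment_spend(spend, k=2)
```

In a product like SAP Analytics Cloud this kind of clustering would run continuously against captured data, so the segments (and any shifts in them) appear as suggested insights rather than as answers to a query someone had to think to ask.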

See also: Going inside SAP’s new data strategy

Those capabilities should go a long way toward eliminating a data science bottleneck that has emerged as organizations increasingly rely on Big Data to drive business decisions, says Mike Flannagan, senior vice president for SAP Analytics and Leonardo at SAP. Leonardo is the brand name SAP has applied to all its research and development efforts into machine learning algorithms and other emerging technologies. Those research efforts have produced a broad spectrum of machine learning algorithms that SAP is increasingly embedding into its application portfolio. In the case of SAP Analytics Cloud, those algorithms are based on a mixture of capabilities developed in-house and algorithms gained via the 2013 acquisition of KXEN, a provider of predictive analytics applications.

Because the machine learning algorithms are embedded inside its analytics applications, end users are no longer dependent on an internal data science team to construct an artificial intelligence (AI) model to glean insights from Big Data, says Flannagan.

That’s critical because the rate at which critical business decisions need to be made has accelerated considerably, says Flannagan.

“The cycle time for business decisions is being compressed,” says Flannagan. “There’s a lot more pressure on end users.”

Will we trust these analytics decisions?

While organizations are clearly making significant investments in building their own AI models, it's probable that most end users' first experience with AI models will come within an application provided by vendors such as SAP. Providers of packaged applications have more resources to build AI models, which take a considerable amount of time to train before they can be reliably employed in a production environment. In the case of SAP, those AI models will manifest themselves first in cloud applications before also being made available in the on-premises editions of its software.

The challenge organizations now face is making sure the data being captured to drive the machine learning algorithms employed in those AI models is reliable enough to drive advanced analytics. In addition to focusing on data quality and best practices for master data management, Flannagan says organizations would be well-advised to digitize every data capture process possible. The fewer instances in which organizations rely on humans to capture data, the less likely it is that errors will be introduced into a data set.
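One common way to act on that advice is to validate records automatically at the point of capture, before they reach the analytics pipeline. The sketch below shows the general pattern; the field names and validation rules are hypothetical, not drawn from any SAP product.

```python
def validate_record(record):
    """Return a list of data-quality problems found in a captured record.

    An empty list means the record passed all checks and can flow on to
    analytics; a non-empty list means it should be rejected or queued
    for review instead of silently polluting the data set.
    """
    errors = []
    # Every record must identify a customer.
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    # Spend amounts must be numeric and non-negative.
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append("invalid amount")
    return errors

good = validate_record({"customer_id": "C001", "amount": 120.0})
bad = validate_record({"customer_id": "", "amount": -5})
```

Checks like these are cheap to run on every record, which is part of why digitized capture tends to yield cleaner training data than manual entry: the validation happens once, consistently, instead of depending on each human operator.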

It may take a while for end users to trust the suggestions being surfaced by analytics applications employing machine learning algorithms. Many of them already regularly question the validity of reports generated by analytics applications that don’t always line up with their business experience.

But as the quality of the data being captured continues to improve, confidence in the suggestions and alerts being proactively surfaced by analytics applications should increase. At the very least, end users will become more aware of potential trends they should be tracking.
