Predictive Intelligence Only Works With High Quality Data



At a time when businesses are looking to tech for any way to gain a competitive advantage, the sales pitch for predictive intelligence is enticing: a set of tools that automates the analysis of collected data and provides warnings and forecasts, allowing businesses to be proactive instead of reactive.

Predictive intelligence can be embedded into many business processes, such as recognizing changes in consumer demand for a product type or flagging potential network failures and blind spots. Predictive maintenance, a branch of this intelligence, can also notify manufacturers of degradation or faults in heavy equipment. With this knowledge, businesses can play on the front foot.
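As a rough illustration of the predictive-maintenance idea, the sketch below flags when the rolling average of a hypothetical vibration reading drifts past a tolerance limit. The sensor values, window size, and threshold are invented for the example and not drawn from any particular product.

```python
# Minimal sketch: flag possible equipment degradation from sensor readings.
# The readings, window size, and tolerance limit are hypothetical.
from statistics import mean

def degradation_alerts(readings, window=5, limit=0.8):
    """Return indices where the rolling average of a vibration
    reading exceeds the tolerance limit."""
    alerts = []
    for i in range(window, len(readings) + 1):
        if mean(readings[i - window:i]) > limit:
            alerts.append(i - 1)  # last reading in the offending window
    return alerts

# Example: readings drift upward, simulating gradual wear.
readings = [0.42, 0.45, 0.44, 0.50, 0.61, 0.72, 0.85, 0.91, 0.95]
print(degradation_alerts(readings))  # -> [8]
```

Real systems replace the rolling average with trained statistical models, but the shape of the problem is the same: enough trustworthy readings to tell genuine wear from noise.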


“Predictive intelligence should be thought of as a guiding hand that helps businesses see and measure performance across all networks that impact the user experience, forecasts issues based on historical data and influences decision-making,” said Mohit Lad, co-founder and general manager of Cisco ThousandEyes, to VentureBeat.

The issue, as is usually the case with analytics and artificial intelligence tools, is one of data quality. Businesses have spent the past decade creating new ways to collect more and more data, often more than is necessary, but quality control has not improved at the same pace.

“It takes an enormous amount of data to predict the beginnings of a degradation or performance deterioration with a high degree of accuracy,” said Lad. “Although the volume of data needed to train a model has existed for some time, the data often wasn’t as clean as it needed to be. That caused flow-on effects in statistical models. Without good data, the models simply weren’t capable of producing granular assessments and actionable recommendations.”
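In practice, Lad’s point about clean data often comes down to a mundane step: filtering out missing or impossible records before a model ever sees them. The sketch below assumes a hypothetical table of network latency measurements and uses pandas and scikit-learn purely for illustration.

```python
# Minimal sketch of the "clean before you train" point: filter out
# obviously bad records before fitting a forecasting model.
# Column names, values, and thresholds are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "hour":       [1, 2, 3, 4, 5, 6, 7, 8],
    "latency_ms": [20, 22, None, 25, -999, 27, 29, 31],  # None / -999 = bad telemetry
})

# Drop missing values and physically impossible readings.
clean = df.dropna()
clean = clean[clean["latency_ms"] > 0]

model = LinearRegression().fit(clean[["hour"]], clean["latency_ms"])
print(model.predict(pd.DataFrame({"hour": [9]})))  # naive next-hour forecast
```

Leave the bad rows in, and the same two lines of model code produce a forecast skewed by values that never described the real network.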

According to a study conducted by University of Washington professor Debabrata Dey, losses attributable to poor data quality are estimated at between 8 and 12 percent of an organization’s revenue, which can run into billions of dollars each year for major companies.

“Poor data quality leads to several other intangible losses such as customer dissatisfaction, less effective decision-making, and the reduced ability to execute business strategies,” said Dey. 

There are plenty of ways for an organization to improve data quality. Some are fairly simple, such as refining which data is collected or improving the methods used to capture it. Others are culture-wide changes: employees need to understand that a single snapshot of data is not enough, and that predictive intelligence platforms need far more data points before they can reach granular levels of performance.
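In code, the simpler tips often amount to automated checks that run before data reaches a predictive platform. The sketch below is one possible shape for such a check; the field names, valid ranges, and sample records are all hypothetical.

```python
# A rough sketch of automated data-quality checks of the kind these tips
# point toward. Field names and valid ranges are hypothetical.
def quality_report(records, required=("customer_id", "order_value")):
    issues = {"missing_fields": 0, "duplicates": 0, "out_of_range": 0}
    seen = set()
    for rec in records:
        if any(rec.get(field) is None for field in required):
            issues["missing_fields"] += 1
        key = rec.get("customer_id")
        if key in seen:
            issues["duplicates"] += 1
        seen.add(key)
        value = rec.get("order_value") or 0
        if not (0 <= value <= 100_000):
            issues["out_of_range"] += 1
    return issues

records = [
    {"customer_id": "a1", "order_value": 120.0},
    {"customer_id": "a1", "order_value": 120.0},      # duplicate
    {"customer_id": "b2", "order_value": None},        # missing value
    {"customer_id": "c3", "order_value": 9_999_999},   # implausible
]
print(quality_report(records))
# {'missing_fields': 1, 'duplicates': 1, 'out_of_range': 1}
```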

“We have strong feelings on data quality: ‘Garbage in, garbage out.’ If the data you feed into the machine is no good, the resulting predictions will be useless,” said Veda Konduru, founder and CEO at VectorScient. “That means businesses must focus on more than just data quantity — they also need to put data quality into perspective when eyeing the right product for their business.”

David Curry

About David Curry

David is a technology writer with several years’ experience covering all aspects of IoT, from technology to networks to security.
