Naveego Tool Identifies Data Quality Issues in Minutes

Cloud service uses Hadoop and machine learning to find and fix issues in real time.

Before any organization can derive business value from the data it collects, it needs to know how reliable that data is. One of the primary reasons so many business executives don’t trust the reports generated by analytics applications is that they know the underlying data is suspect.

To help organizations gain more confidence in their data, Naveego has launched Naveego Accelerator, a data health analysis tool that assesses the accuracy of a data set in a matter of minutes.

“Our whole goal is to get rid of bad data,” says Naveego CEO Katie Horvath.

See also: 96% of All AI/ML Projects Have Training Problems

Naveego is a provider of a master data management (MDM) platform delivered as a cloud service. The Naveego Complete Data Accuracy Platform leverages Hadoop and machine learning algorithms to enable organizations to detect, manage and eliminate data accuracy issues across enterprise data sources in real time, regardless of structure or schema.

The Naveego cloud service can connect to the various data sources residing in the cloud or on-premises to provide a comprehensive inventory of data sources spanning the entire enterprise, says Horvath.

Naveego Accelerator is a self-service tool that takes that capability a step further by auto-profiling data in a few minutes and conducting comparisons across systems to calculate the percentage of records with consistency errors impacting business operations, says Horvath.
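
Naveego has not published the internals of that check, but the arithmetic it describes is straightforward. The sketch below is a hypothetical Python illustration, not Naveego’s actual code: it joins records from two systems on a shared key and reports the percentage that disagree on the compared fields.

```python
# Minimal sketch (not Naveego's implementation) of a cross-system
# consistency check: join records from two systems on a shared key
# and report the percentage that disagree on the compared fields.

def consistency_error_rate(system_a, system_b, key, fields):
    """Return the percentage of keyed records whose field values differ."""
    index_b = {rec[key]: rec for rec in system_b}
    compared = errors = 0
    for rec in system_a:
        match = index_b.get(rec[key])
        if match is None:
            continue  # records missing from one system are a separate metric
        compared += 1
        if any(rec.get(f) != match.get(f) for f in fields):
            errors += 1
    return 100.0 * errors / compared if compared else 0.0

crm = [{"id": 1, "email": "ann@example.com"}, {"id": 2, "email": "bo@example.com"}]
billing = [{"id": 1, "email": "ann@example.com"}, {"id": 2, "email": "bob@example.com"}]

print(consistency_error_rate(crm, billing, key="id", fields=["email"]))  # 50.0
```

Against real enterprise systems the join key itself is rarely that clean, which is where the platform’s profiling and matching come in.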

Old problem, costly new twists

Data quality has always been a major IT issue. Inconsistency in customer records stored in multiple applications often makes it extremely difficult for organizations to establish the “single truth” about any customer or supplier relationship.
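
A trivial example shows why. In the hypothetical snippet below, the same customer appears in a CRM and a billing system under superficially different values, so a naive exact comparison treats them as two different people; some normalization, a stand-in here for the fuzzy matching a real MDM platform applies, is needed before the records can even be compared.

```python
# Sketch of why a "single truth" is hard: the same customer is stored
# differently in two applications, so an exact comparison fails even
# though both records describe one person.

def normalize(record):
    """Canonicalize name and phone so equivalent records compare equal."""
    return {
        "name": " ".join(record["name"].lower().split()),
        "phone": "".join(ch for ch in record["phone"] if ch.isdigit()),
    }

crm_record     = {"name": "Smith,  John", "phone": "(231) 555-0147"}
billing_record = {"name": "smith, john",  "phone": "231-555-0147"}

print(crm_record == billing_record)                        # False: naive compare
print(normalize(crm_record) == normalize(billing_record))  # True: same customer
```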

However, what was once a nuisance that organizations learned to work around has now become a major artificial intelligence issue. Organizations are investing millions of dollars hiring data scientists to create sophisticated artificial intelligence (AI) models, only to discover those data scientists spend most of their time cleaning up data. In effect, data scientists have become the world’s most expensive digital maintenance workers.

Of course, it’s not long before those data scientists get frustrated to the point where they decide to quit, notes Horvath. The Naveego tools are intended to prevent that from happening by removing the drudgery associated with ensuring data quality.

See also: Not Good at Analytics? Not Ready for AI

Most organizations are investing in AI as part of a bigger bet on digital business transformation. Few of those bets are likely to pay off if the data on which processes will one day be automated is unreliable. There are already plenty of customer service issues involving existing legacy applications loaded with bad data. It’s quite another thing, however, to automate processes at scale on top of bad data that was fed into an AI model; the potential for catastrophic outcomes is high.

As such, it’s now only a matter of time before AI projects force organizations to reexamine data quality issues that many of them have long ignored.

Unfortunately, many of the senior leaders of those organizations are about to discover a whole host of data quality issues that have been swept under the proverbial IT rug for decades. The good news is that once those issues are addressed, many of those leaders might finally start to trust the data they’ve already spent millions to collect.
