Survey Shows Many Weak Links in the Data Supply Chain


Lack of automation and data security challenges are impeding the flow of enterprise information, known as the data supply chain.

Is real-time data still an illusion? There may be plenty of data surging across today’s enterprises, but much of it is stale, not reaching decision-makers or applications in time to deliver its full value. Furthermore, more than four in five executives believe data privacy and security requirements will limit data access even more.

That’s the finding of a survey of 525 executives conducted by 451 Research and underwritten by Immuta, which suggests that a large gap exists between data consumers and data suppliers in organizations. “Data suppliers are unable to efficiently deliver relevant data to a growing number of data consumers,” the report states, indicating that more than half (55%) of those surveyed report that data is often stale or out-of-date by the time it is consumed or analyzed.

The survey refers to the flow of enterprise information as the “data supply chain,” which has many weak or broken links. “Lack of automation and data security challenges are root causes of the friction, and unless resolved, organizations will struggle to be successful with DataOps and cloud migration,” states Paige Bartley, author of the survey report.

The report also finds privacy, security, and governance challenges to be particularly troublesome, with 84% of respondents reporting that data privacy and security requirements will limit access to data at their organizations over the next 24 months. “As data workflows and processes have become more complex over time – and as organizational demand for data grows – there are clear points of friction in the data supply chain,” said Bartley. “Chief among them is data suppliers that have limited resources, skills shortages, and little automation being tasked with trying to deliver a steady stream of relevant data to a growing number of data consumers.”

Among data suppliers, 38% report “lack of personnel or skills” and 29% report “not enough automation available” as top pain points. In addition, 90% of organizations report not having an “optimized” DataOps strategy, which would help automate the flow of data across enterprises, as well as support collaboration between data suppliers and engineers. While most (85%) say that their strategy is accelerated, emerging, or nascent, few believe they have achieved DataOps maturity.

At the same time, the survey finds that regulation is actually a driver for improvement: organizations subject to data privacy and protection regulations, such as GDPR, are more likely to report having a cloud-first strategy, face fewer challenges with data access and use, are more likely to have a dedicated data engineering team, and more frequently provide self-service analytics programs.

The challenge is “speed. Insights and business value cannot be generated from data quickly unless it can be shared, modeled, and analyzed in a frictionless manner,” Bartley explained. “Underlying problems in people, processes, and technology prevent real-time data use in most organizations today.”


External factors also have complicated data-driven insight initiatives. “As regulatory changes around the world bear down across industries, ensuring the privacy, security, and governance of this data has become paramount, further complicating workflows and responsibilities,” Bartley observes. “Even the universal march to adopt cloud-based data technologies has been hampered by concerns about security, compliance, and privacy functionality.”

The survey also shows that data quality is becoming more important than data quantity. Nearly all (90%) survey respondents report that data quality and trust will become more important than data volume or quantity to their organizations over the next 24 months.

“Organizations need to consider the different internal perspectives and potential mismatches between employee experiences and subjective realities,” Bartley concludes. “The survey shows that, overall, leadership personas and those in overseer roles tend to paint a rosier picture than the data suppliers and data consumers who report directly to them. This only adds to the friction. Schisms between perspectives of those in different roles are natural, but they can also be indicative of deeper problems in the data supply chain.”

“Make sure that all stakeholders in the data supply chain have their voices heard in strategic planning discussions,” Bartley urges. “And while data-driven strategy needs buy-in from the very highest levels of corporate leadership, it also needs organic input from the workers who interact directly with data and supporting technology.”



About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and industry analyst focusing on artificial intelligence, digital, cloud, and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
