Data: The Competitive Differentiator for Innovation


As businesses master the automation and orchestration of data and compliance processes, data is proving to be the competitive differentiator in this age of innovation.

Companies know data is strategic. But the evolution of tech capabilities and the ever-growing constraints surrounding the usage of data mean it’s time to change strategies. The hard truth is that data systems and flows were designed back when a business’s success was defined solely by its stable growth. Today, it’s a business’s ability to innovate that determines its longevity and success against a competitive landscape with near-equal access to technology. Data has become the fuel for new innovation strategies, and innovation is now at the core of business transformation.

Our experience working with companies at the leading edge of innovation has shown that data needs to be:

  • Fast, minimizing the time to delivery of innovative features, products, and services.
  • Flexible, so that innovation teams can seize opportunities and won’t be derailed by siloed processes or inevitable, unpredicted events.
  • Secure and compliant, because it’s the right thing to do—not to mention the severe business consequences of non-compliance.
  • Simple for all those in the innovation chain to use, even if, behind the scenes, the data system is wildly complex and lives across different platforms.
  • Designed for both human and machine-based innovation.

Let’s look at some examples.

Fast, flexible

“Our mission now is to do what we call data-driven engineering and data-driven R&D,” says Yves Caseau, Michelin Group’s Chief Digital & Information Officer. As a leading source of global expertise in engineering and materials science, Michelin generates vast amounts of data via its scientific research. However, that data only enables Michelin to innovate ahead of the competition if it’s available on-demand in usable form.

Therefore, when Caseau started in his position, he undertook the task of automating Michelin’s data processes. The result? Now, researchers, engineers, and scientists who need a specialized dataset to try out an idea can obtain it in minutes rather than days or weeks using the data processes of the past.

This same increase in responsiveness is essential to Michelin beyond the ideation phase. Throughout the production process, right up to the delivery of a new product or feature, Michelin needs to be able to spin up well-managed development environments rich with data. Caseau’s automated data approach provides the innovation teams with the ability to create comprehensive datasets on demand.

How? For this level of operational excellence, Michelin relies on its data platform’s API framework, which not only selects the data but also knows how to process it according to the team’s specific needs and the organization’s overall standards. And with platform innovation in mind, that API framework can also run on top of cloud-based data storage and orchestration, allowing Michelin to offer global cloud accessibility while centralizing data management processes.
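Michelin’s internal tooling is not public, so the following is only a minimal Python sketch of what an on-demand dataset request against such an API framework could look like. Every name here (`DatasetRequest`, `provision`, the sample records) is a hypothetical illustration, not Michelin’s actual interface:

```python
from dataclasses import dataclass, field

# Tiny in-memory stand-in for a research data source (illustrative values).
RECORDS = [
    {"compound": "winter", "year": 2020, "grip": 0.91},
    {"compound": "summer", "year": 2020, "grip": 0.88},
    {"compound": "winter", "year": 2019, "grip": 0.89},
]

@dataclass
class DatasetRequest:
    filters: dict = field(default_factory=dict)      # team-specific selection
    transforms: list = field(default_factory=list)   # org-standard processing steps

def provision(request: DatasetRequest) -> dict:
    """Select matching records, apply each transform, return a ready dataset."""
    rows = [dict(r) for r in RECORDS
            if all(r.get(k) == v for k, v in request.filters.items())]
    for transform in request.transforms:
        rows = [transform(row) for row in rows]
    return {"rows": rows, "status": "ready"}

# A researcher asks for winter-compound data from 2020, rounded for reporting.
req = DatasetRequest(
    filters={"compound": "winter", "year": 2020},
    transforms=[lambda r: {**r, "grip": round(r["grip"], 1)}],
)
dataset = provision(req)
```

The point of the design is that selection and processing live behind one contract, so a researcher states *what* they need and the platform decides *how* to deliver it.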

Secure and compliant

The concept of data fluidity may initially seem to pose a compliance problem as data privacy regulations continue to grow in number and scope. Innovation therefore requires a data system that remains highly accountable while preserving its agility. Data-integrity proof points must be part of the innovation chain, including, but not limited to, the ability to reconstitute any masked datasets and environments spun up during the innovation process.

Automating both the data creation and the data masking within a single data process has enabled organizations to accelerate innovation while ensuring they stay within strict and evolving regulatory boundaries. In short: we can now have agile compliance.
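As a concrete illustration of creating and masking data in a single automated step, here is a minimal Python sketch; the field names and salting scheme are assumptions for the example, not any specific vendor’s implementation. Deterministic hashing keeps a masked value consistent everywhere it appears, so joins and tests still work while no real identifier reaches the development environment:

```python
import hashlib

def mask_value(value, salt="demo-salt"):
    # Deterministic token: the same input always maps to the same output,
    # preserving referential integrity across masked datasets.
    # (Salt is a hard-coded placeholder here; real systems manage it securely.)
    digest = hashlib.sha256((salt + str(value)).encode()).hexdigest()
    return "anon-" + digest[:8]

def provision_masked(records, pii_fields):
    # Create the downstream dataset and mask sensitive fields in the same
    # pass, so no unmasked copy ever leaves the source system.
    masked = []
    for record in records:
        out = dict(record)
        for f in pii_fields:
            if f in out:
                out[f] = mask_value(out[f])
        masked.append(out)
    return masked

employees = [
    {"employee_id": "E-1001", "name": "Ada", "hours": 38},
    {"employee_id": "E-1002", "name": "Grace", "hours": 41},
]
safe = provision_masked(employees, pii_fields=["employee_id", "name"])
```

Because masking happens inside the provisioning step rather than as a later cleanup task, compliance travels with the data instead of chasing it.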

For example, in the face of the COVID-19 pandemic, new legislation rapidly emerged around the world. Ultimate Kronos Group (UKG), a leading human capital and workforce management company, realized it would have to get new functionality out to help clients “navigate changing legislation, better understand employees’ feedback, and communicate more effectively,” says John Machado, Chief Technology Officer at UKG. 

So, UKG quickly enhanced its software’s ability to handle taxes and payments while staying compliant with new legislation such as the Families First Coronavirus Response Act (FFCRA) and the Coronavirus Aid, Relief, and Economic Security (CARES) Act. Through rapid provisioning of data environments to its research sandboxes, the company was able to provide features crucial to newly dispersed workforces: immediate notifications to employees, contact tracing, and even a way for companies to organize charitable giving campaigns.

Simple, API-centric data architectures for humans and machines

Since machine learning systems program themselves by turning data into software, looping machine learning into frequently updated data streams is essential to making AI an effective part of the innovation pipeline.

For example, BNP Paribas is one of the world’s largest banks, with one of the world’s largest technology organizations. As Bernard Gavgani, the bank’s CIO, says, in the banking sector, “digital transformation has never been as urgent as today,” citing the migration of banking services to online mobile solutions and the ever-growing webs of business partners who depend upon rapid, sometimes massive, secure data flows. To enable the bank’s competitive differentiation, Gavgani has been moving the bank’s information system from an application-centric to a data-centric architecture.

Gavgani is highly alert to the rapidly increasing role of machine learning in virtually all business sectors, especially banking. He points to the scoring engine used to assess customers’ creditworthiness, pricing for SMEs on the retail banking side, and more effective marketing. In these ways and many others, AI is helping BNP Paribas seize opportunities as the world migrates to cashless economies and to new financial services and instruments.

He credits his group’s success in managing these complex processes to an architectural decision: “APIs are the core enablers for bank evolution, along with data and cloud initiatives.”

His group is now enabling other business departments to develop their own machine learning applications using this API-centric approach, aiming to accelerate the pace of innovation throughout the organization.
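BNP Paribas has not published its scoring engine, so the snippet below is only a hedged Python sketch of the API-centric idea: a model behind a small, stable contract that other departments can call without touching the underlying data pipeline. The feature names and weights are invented for illustration:

```python
# Invented linear model over pre-normalized features in [0, 1].
# The stable contract is the function signature, not the model inside it.
WEIGHTS = {"income": 0.4, "years_as_customer": 0.3, "missed_payments": -0.5}

def credit_score(features: dict) -> float:
    """Return a creditworthiness score in [0, 1]; unknown fields default to 0."""
    raw = sum(weight * features.get(name, 0.0)
              for name, weight in WEIGHTS.items())
    # Clamp so callers always receive a bounded value, even if the model
    # behind the contract is later swapped for something more sophisticated.
    return max(0.0, min(1.0, raw))
```

Consumers depend only on `credit_score(features) -> float`, which is what lets a central team upgrade the model while dozens of departmental applications keep working unchanged.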

Culminating in business transformation

These new data processes and architectures not only increase the speed with which new ideas are turned into products and services but also play a significant role in transforming the business itself.

For example, it was a significant achievement when Choice Hotels launched its new portal last year, easing and speeding the opening of new hotels for its franchisees. But by making more radical changes to its data processes, including building its new reservation distribution engine in the cloud, Choice Hotels now operates not just as a hotel chain but also as an online travel agency. “We have partners in Europe and Latin America that leverage Choice to book rooms in their hotels. As of 2019, we were able to service 1,100 different shopping requests per second,” notes Jason Simpson, Vice President of Engineering. Automating its data processes has enabled Choice Hotels to open up a new revenue stream and control more of its customers’ experiences.

The entire global economy is changing so fast that enterprises have to reinvent themselves to become data companies, accessible from everywhere, operating at lightning speed, providing maximum security, and adhering to a regulatory environment that is becoming more and more complex. Today’s leading companies don’t look much like yesterday’s, and neither do their data processes. Visionary leadership and a commitment to think in new ways are more crucial than ever.

It comes down to this: Rapid innovation is now a competitive business imperative that has to be supported with operational excellence. At the core of this innovation, across almost every vertical, are customer-facing applications that run the business and an ever-increasing audience of data consumers that need constant access to data in order to:

  • Deliver Site Reliability Engineering (SRE) for these customer-facing applications
  • Rewrite, rehost, or replatform these customer-facing applications
  • Aggregate and inspect the data coming from these customer-facing applications
  • Build the next set of features of these customer-facing applications
  • Create predictive models and software to increase the impact of these customer-facing applications

While the advent of public and private cloud and the embrace of DevOps have become table stakes, data has emerged as the new battleground. Every business today has an opportunity to modernize its data and an obligation to ensure all data privacy regulations are adhered to. This is an incredibly exciting time for businesses that are mastering the automation and orchestration of data and compliance processes: data is proving to be their competitive differentiator in this age of innovation.

About Alex Hesterberg

Alex Hesterberg is Chief Customer Officer at Delphix. He is responsible for leading Delphix’s solution consulting, implementation, pre-sales, customer success, support, education, and other services functions to ensure that Delphix’s customers and partners can optimize the value of the Delphix platform across different use cases and applications. He has held similar positions at Riverbed Technology and Pure Storage. Prior to those companies, he worked in customer success and services roles at Symantec, Veritas, and Turbonomic.  
