Why Data Integration is the Rodney Dangerfield of BI


Disrespected as mere digital plumbing, a modern data integration platform enables maximum value from the boom in SaaS application adoption.

Most finance professionals now have the ability to create attractively designed and formatted reports in minutes. But what good are blazing fast calculations and beautiful reports if the data informing important trends and patterns is suspect? The C-suite needs accurate data for those dazzling presentations. Data integration must play a central role.

As Deloitte says in its CFO Guide to Data Management Strategy, “for any size business, achieving desired outcomes starts with good data.” To optimize cash flow, forecast with greater accuracy, and coordinate action across the business, disparate streams of real-time, precision data must be integrated to feed scenario planning — yet only 16% of CFOs are getting it at the scale they need, according to Accenture.

Business Intelligence (BI) and Enterprise Performance Management (EPM) software enable finance professionals to gain better insight into business performance and make timely decisions based on constantly evolving conditions. But many finance chiefs find themselves data-rich and information-poor thanks to siloed and disparate systems. In recent years, there have been significant improvements in the range of software products that collect, manage, and move important information. However, a significant shortcoming exists in the ability to automate the chain of processes that get multiple applications to “talk to each other” coherently and ensure data consistency, reliability, accuracy, and compliance.

Services flowing through digital “pipes”

In the past, companies bought one BI technology solution stack – say, IBM or Oracle or SAP. The benefit was that applications in that stack should work together seamlessly – the drawback was that the available applications didn’t necessarily suit your particular business needs. You just had to work with whatever was offered.

Today, business technology needn’t be chained to a monolithic solution stack. You can just select among thousands of applications from myriad vendors based on added value alone. This is a result of the growth in cloud-based Software-as-a-Service (SaaS) offerings as a cost-effective alternative to the traditional IT deployment.

And this SaaS revolution is huge! One recent report found that spending on SaaS applications rose 14% in 2020, and the number of apps in use grew at twice the 2019 rate. Another annual trends survey estimates the average company uses around 137 unique SaaS apps, while an enterprise of more than 1,000 employees could easily have 288.

There is no single uber-application that suits all the needs of the office of the CFO. Rather, there are any number of purpose-built applications from one or more vendors, each of which is optimized to suit specific needs. You can buy the best account reconciliation platform, the best consolidation platform, the best planning platform, and the best business intelligence platform based on its added value.

So, what’s the hitch? While it sounds obvious, the process of integrating data successfully and precisely among those specialized applications is often overlooked and underappreciated. Gartner defines data integration in a 2020 Magic Quadrant as “the ability for different data stores to synchronize, to move data from one store to another, and to combine, de-duplicate and aggregate data from different stores.” Yet like the Rodney Dangerfield of business intelligence, data integration infrastructure gets “no respect.” It’s considered digital plumbing, invisible and largely forgotten — until a blockage, leak, or flood causes disaster.
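Gartner’s three verbs — combine, de-duplicate, and aggregate — can be made concrete with a minimal sketch. The snippet below is illustrative only: the two “stores,” the field names, and the amounts are all hypothetical, and a real integration platform would layer validation, scheduling, and alerting on top of this basic flow.

```python
# A minimal sketch of combining, de-duplicating, and aggregating records
# from two hypothetical data stores (e.g., an ERP and a CRM). All
# invoice IDs and amounts here are illustrative assumptions.

erp_invoices = [
    {"invoice_id": "INV-001", "amount": 1200.00},
    {"invoice_id": "INV-002", "amount": 450.50},
]
crm_invoices = [
    {"invoice_id": "INV-002", "amount": 450.50},  # duplicate of an ERP record
    {"invoice_id": "INV-003", "amount": 980.00},
]

# Combine: move records from both stores into one working set.
combined = erp_invoices + crm_invoices

# De-duplicate: keep one record per invoice_id.
deduped = {row["invoice_id"]: row for row in combined}.values()

# Aggregate: total the cleaned data for reporting.
total = sum(row["amount"] for row in deduped)
print(f"{len(deduped)} unique invoices totaling {total:,.2f}")
```

Trivial as it looks, this is exactly the step that fails silently when it is stitched together with ad-hoc scripts: a missed duplicate or a dropped record flows straight into the CFO’s report.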

Achieving data integration e pluribus unum

The peril of deploying a bunch of SaaS best-of-breed solutions from disparate vendors is that you may address individual use cases while still not solving your problems at an architectural level. Invariably, applications in the ecosystem need to share data and interact with each other. The ability to exchange data among the solutions becomes critical, as they’re only as good as the data that’s in them. The precise, reliable, and performant integration of these best-of-breed systems is what allows you to derive the greatest value from them across the board.

The complexity involved in managing the chain of multiple applications is a significant cost and burden for most Finance and IT departments, hindering a business’ ability to derive optimum value from its critical applications. Historically, data integration has required a particular technical skill set to develop and maintain. Creating an integration strategy and choosing from the data integration tools available can be daunting. Out of necessity, custom code threads together the various pieces of the finance application ecosystem. Unfortunately, an ecosystem automated by disparate tools and custom code is poorly equipped to surface the root cause of a problem anywhere in the process chain. As a result, companies often lean on consulting partners for help.

But by selecting the appropriate integration technology to harmonize and map the data among all these different inputs and outputs, financial executives can deliver on the promise to put critical data in the hands of the people who need it on a timely, predictable, and consistent basis. More importantly, they can be sure that the data is accurate without spending precious time on verification, and they can do it without the expense of professional consulting services.
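What “harmonizing and mapping the data” means in practice can be sketched in a few lines. The mapping table, account codes, and amounts below are hypothetical; the point is the pattern — local codes are translated to a single corporate chart of accounts, and unmapped codes fail loudly rather than slipping into downstream reports.

```python
# A minimal sketch of harmonizing data via a mapping table: records from a
# hypothetical subsidiary ledger use local account codes that must be
# translated to the corporate chart of accounts before consolidation.
# All codes and amounts are illustrative assumptions.

ACCOUNT_MAP = {
    "4000-REV": "REVENUE",
    "5000-COS": "COST_OF_SALES",
}

def harmonize(record, account_map):
    """Translate a record's local account code to the corporate code.

    Raises KeyError for unmapped codes, so bad data stops the process
    instead of silently flowing into reports.
    """
    return {
        "account": account_map[record["local_account"]],
        "amount": record["amount"],
    }

ledger = [
    {"local_account": "4000-REV", "amount": 10000.0},
    {"local_account": "5000-COS", "amount": -6500.0},
]

corporate_rows = [harmonize(r, ACCOUNT_MAP) for r in ledger]
print(corporate_rows)
```

A purpose-built integration platform wraps this same idea in tooling that business users — not just developers — can maintain, with the alerting and audit trail the hand-rolled version lacks.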

Key requisites of a data integration platform

The tsunami of data generated by dozens of disparate applications presents both an opportunity for finance teams to glean new insights and boost their value to the business, and a challenge in collecting, processing, and acting upon that data – and in ensuring its quality.

New tools using emerging technologies can automate data management and improve data quality better and faster than ever before. Automation tools were once considered only as a solution for larger enterprises, but a 2021 Xerox study found that 80% of small and medium-sized businesses consider automation of tasks and processes as critical to their survival in the post-pandemic environment.

There are a number of key characteristics to keep in mind when deciding on the data integration platform that suits your business and can deliver precision data to support critical requirements in finance, planning, reporting, and accounting. These can include:

  • Modern architecture
  • Built for business users rather than IT personnel
  • Integration of data from both cloud-resident and on-premises applications
  • Application-aware integration for widely used complex enterprise applications
  • High flexibility for mapping to a broad range of use-case requirements and versatile deployments
  • Ability to run automated processes on an ad-hoc basis
  • Ability of users closest to the applications and data (e.g., accountants) to manage their own mapping and data preparation
  • Enterprise readiness to scale with evolving business demands


CFOs who don’t take advantage of sophisticated financial reporting technology can be hamstrung in planning and strategy development. Pulling inaccurate or outdated data into reports puts companies at risk of making late or uninformed decisions.

Give your digital plumbing the respect it deserves by choosing a data integration system that ensures data integrity and precision. That way, your company can derive full benefit from the best SaaS point applications for its business needs.

Quin Eddy

About Quin Eddy

Quin Eddy is the chief executive officer and co-founder of New York City-based OneCloud, Inc. Previously, Quin was the Head of Customer Success at Anaplan. He also was CEO of Star Analytics and joined IBM as Director of Automation & Integration when IBM acquired the company. Quin holds a Bachelor’s degree from the University of Vermont.
