The logical data fabric is a vision of a unified data delivery platform that abstracts access to multiple data systems by hiding complexity and exposing data in business-friendly formats.
In today’s data-first economy, it is not unusual for organizations to run multiple data science, data engineering, and data platform teams tackling advanced analytics problems such as pricing, supply-chain optimization, and in-store shopping behavior to propel their business and gain a competitive advantage. As a result, one of the biggest challenges IT teams face today is serving a broad range of data consumers with varying skill sets. That is why the logical data fabric approach is rising to prominence.
Such an approach promises a flexible, real-time, and augmented data integration pipeline, combined with comprehensive data management capabilities, that serves both the most and the least data-savvy consumers within an organization. By applying knowledge graphs, data catalogs, and AI/ML to active metadata, this new data integration and data management approach enables faster, more automated data access and sharing.
A data fabric is a composable architecture: components such as a data catalog, knowledge graph, data preparation layer, recommendation engine, DataOps, and orchestration can be assembled from disparate tools working together. That said, some of the best-of-breed data fabrics are delivered as a single platform offering all of these capabilities. The logical data fabric, then, is a unified data delivery platform that abstracts access to multiple data systems for business consumers, hiding their complexity and exposing data in business-friendly formats while guaranteeing that data is delivered according to predefined semantics and governance rules. In the digital world we live in, it is not much of a stretch to say that this would make every CIO's dream come true.
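To make the abstraction idea concrete, here is a minimal, illustrative sketch of the core pattern: a catalog that maps business-friendly dataset names to heterogeneous backend adapters, so consumers request data by name without knowing which system serves it. All class and method names here (`SourceAdapter`, `LogicalDataFabric`, and so on) are hypothetical, invented for this sketch; a real data fabric product adds semantics, governance, and query pushdown on top of this routing layer.

```python
# Hypothetical sketch of a logical data fabric's access layer:
# business-friendly names are resolved through a catalog to adapters
# that hide the underlying systems from the consumer.

from abc import ABC, abstractmethod
from typing import Any


class SourceAdapter(ABC):
    """One adapter per underlying system (warehouse, SaaS API, files...)."""

    @abstractmethod
    def fetch(self, query: dict) -> list[dict[str, Any]]: ...


class InMemorySource(SourceAdapter):
    """Stands in for a real database or API in this sketch."""

    def __init__(self, rows: list[dict[str, Any]]):
        self.rows = rows

    def fetch(self, query: dict) -> list[dict[str, Any]]:
        # Apply simple equality filters; a real adapter would push
        # the query down to the source's native engine.
        return [r for r in self.rows
                if all(r.get(k) == v for k, v in query.items())]


class LogicalDataFabric:
    """Catalog + routing: consumers ask for business names, not systems."""

    def __init__(self):
        self._catalog: dict[str, SourceAdapter] = {}

    def register(self, business_name: str, adapter: SourceAdapter) -> None:
        self._catalog[business_name] = adapter

    def get(self, business_name: str, **filters) -> list[dict[str, Any]]:
        # The consumer never sees which backend served the data.
        return self._catalog[business_name].fetch(filters)


fabric = LogicalDataFabric()
fabric.register("customer_orders", InMemorySource(
    [{"id": 1, "region": "EU"}, {"id": 2, "region": "US"}]))
print(fabric.get("customer_orders", region="EU"))
# [{'id': 1, 'region': 'EU'}]
```

The key design choice this sketch illustrates is that governance and semantics can be enforced centrally in the fabric layer (inside `get`), rather than separately in every consuming application.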