Data Integration

Do You Need a Data Fabric?

Traci Curran

March 3, 2020


Are you challenged to provide faster access to integrated data across a diverse and distributed IT landscape?

Over the past few years, IT organizations have increasingly been asked to automate the systems and processes involved in integrating and preparing their data for reporting, leveraging capabilities like active metadata, Artificial Intelligence (AI) / Machine Learning (ML) algorithms, and knowledge graphs.

Bringing traditional data sources together and augmenting them using these modern capabilities requires a different design approach – that’s where a data fabric comes in.

What is a Data Fabric?

A data fabric is a design concept (an architecture) that provides a consistent set of data services and capabilities across on-premises and cloud environments. It enables you to abstract your data from systems that are physically and logically different into a common set of data objects so you can treat them as a unified enterprise data set.

This is particularly important in supporting digital transformation initiatives where business processes leverage different systems that are on-premises, spread across multiple clouds, and even deployed remotely (things like IoT and Mobile apps). By utilizing a data fabric, companies can achieve faster and more efficient data sharing across systems, which leads to more integrated business insights and increased business agility.

A data fabric comprises a set of capability layers that transform and abstract data from different sources on their way to the data consumer. Think of this as a value chain in which raw materials, through a series of value-add steps, are transformed into consumable finished goods. A typical data fabric involves five steps of refinement that take place across three layers of capabilities.

Raw data is first organized in a data catalog/metadata layer. Each data source includes contextual information called “metadata” about what the data is, when it was collected, what format it was in, etc. The cataloging layer performs a “rough sort” of the raw data.
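To make the cataloging layer concrete, here is a minimal sketch in Python. The `CatalogEntry` class and the example source names are hypothetical illustrations, not part of any specific product; the "rough sort" is shown as a simple grouping of sources by format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CatalogEntry:
    """Metadata describing one raw data source (hypothetical structure)."""
    source: str                  # where the data came from
    fmt: str                     # e.g. "csv", "json", "parquet"
    collected_at: datetime       # when the data was collected
    fields: list[str] = field(default_factory=list)

# Two illustrative sources registered in the catalog.
catalog = [
    CatalogEntry("crm_export", "csv", datetime.now(timezone.utc),
                 ["customer_id", "email"]),
    CatalogEntry("web_events", "json", datetime.now(timezone.utc),
                 ["customer_id", "page"]),
]

# "Rough sort": group sources by format so downstream layers
# know how each one needs to be parsed.
by_format: dict[str, list[str]] = {}
for entry in catalog:
    by_format.setdefault(entry.fmt, []).append(entry.source)

print(by_format)  # {'csv': ['crm_export'], 'json': ['web_events']}
```

In a real fabric the catalog would be a managed service rather than an in-memory list, but the principle is the same: every source carries metadata describing what it is, when it was collected, and what shape it takes.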

This catalog then informs a knowledge-graph layer, where analytics are applied to activate metadata and infer connections and relationships that may exist. AI/ML algorithms enrich the active metadata to help interpret the data, put it into context, and simplify it so that automation rules can be defined for data integration.
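One simple way to infer such relationships is to connect any two sources whose metadata shares field names, since shared fields are candidate join keys. The sketch below assumes a hypothetical catalog of three sources; real knowledge-graph layers use far richer signals (lineage, usage patterns, ML-based matching), but the idea is the same.

```python
from itertools import combinations

# Hypothetical catalog: each source lists the fields its metadata describes.
catalog = {
    "crm_export": {"customer_id", "email", "region"},
    "web_events": {"customer_id", "page", "timestamp"},
    "billing":    {"customer_id", "region", "amount"},
}

# Build a simple knowledge graph: an edge between two sources whenever
# their metadata shares a field name, labelled with the shared fields.
graph = {}
for a, b in combinations(catalog, 2):
    shared = catalog[a] & catalog[b]
    if shared:
        graph[(a, b)] = sorted(shared)

print(graph)
# {('crm_export', 'web_events'): ['customer_id'],
#  ('crm_export', 'billing'): ['customer_id', 'region'],
#  ('web_events', 'billing'): ['customer_id']}
```

Each edge in the resulting graph tells the integration layer which sources can plausibly be joined, and on which keys, which is exactly the input an automation rule needs.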

The data from the knowledge graph then moves into an integration layer where data from different sources is brought together and reconciled into a common, integrated data set. This data set is then used to drive data orchestration and automation that pushes relevant data to the individuals and systems that need to consume it.
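A minimal sketch of that integration-and-push step, using the shared `customer_id` key from the example above. The source names, the `notify` function, and the consumer name are all hypothetical; a production fabric would use connectors and event streams rather than in-memory dictionaries.

```python
# Records from two hypothetical sources, keyed by customer_id.
crm     = {"c1": {"email": "a@example.com"}, "c2": {"email": "b@example.com"}}
billing = {"c1": {"amount": 120.0}, "c2": {"amount": 75.5}}

# Reconcile into a single integrated data set: merge the fields
# from each source under the shared key.
integrated = {
    cid: {**crm.get(cid, {}), **billing.get(cid, {})}
    for cid in crm.keys() | billing.keys()
}

def notify(consumer: str, record: dict) -> None:
    """Stand-in for orchestration that pushes data to a consuming system."""
    print(f"pushing to {consumer}: {record}")

# Orchestration: push each integrated record to the systems that need it.
for cid in sorted(integrated):
    notify("reporting_dashboard", integrated[cid])
```

The integrated record for "c1" now combines the CRM email with the billing amount, which is the "common, integrated data set" the paragraph describes.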

What Problems Does a Data Fabric Address?

The data fabric design concept is intended to solve the age-old data problem: "how can I make things that are fundamentally different look and act similar enough to treat them as if they were the same?" As IT environments grow and evolve, the challenge gets more significant, and the urgency of a solution becomes more apparent. Common drivers include:

  • Data silos across business functions.
  • Diversity of data sources and types.
  • IT systems spread across physical operating environments (multi-cloud, on-premises, mobile, etc.).
  • Demand for real-time and event-driven data for decision making.
  • Growth in operational analytics and business-led data modeling activities.

IT systems are getting more complex while the business is demanding simpler, faster data for decision-making. A data fabric provides the capability to address both.

How Integration Platform as a Service Can Help Support Your Data Fabric

If you want to implement a data fabric, then before you can start cataloging and refining data through the layers of the fabric, you first need to collect it from all of its various sources and get it in one place. You also need to manage and orchestrate the connections from your data fabric to all the target consuming systems.

An IPaaS solution like Actian DataConnect provides the connectivity needed to make the design successful. The data fabric provides a platform that enables data transformation; the IPaaS solution manages connectivity, security, authorized access, and orchestration of the data flow across your organization.

Do You Need a Data Fabric?

If you have a complicated IT environment, an ever-evolving business environment, and decision-makers who demand real-time data, then you need to be looking at a data fabric. You also need to be looking at an integration platform like Actian DataConnect to manage the flow of data across your organization in a consistent and controlled way.

To learn more, visit


About Traci Curran

Traci Curran serves as Director of Product Marketing at Actian focused on the Actian Data Platform. With more than 20 years of experience in technology marketing, Traci has previously held senior marketing roles at CloudBolt Software, Racemi (acquired by DXC Corporation), as well as some of the world’s most innovative startups. Traci is passionate about helping customers understand how they can accelerate innovation and gain competitive advantage by leveraging digital transformation and cloud technologies.