Part 1: You Need a Data Fabric
Engineered decision intelligence is, when you think about it, just what it says: a deliberate, engineered process for deriving decision intelligence from data. The why of it is also straightforward: businesses need to make informed decisions while a situation is unfolding; after the fact is usually too late. Too many decision-making outcomes are unsuccessful because complex ecosystems of data in motion make it hard to assemble data in a timely and contextually relevant manner. That’s the problem engineered decision intelligence is trying to overcome.
But there’s an even more fundamental problem to overcome first. Right now there’s a huge arsenal of decision intelligence tools one can use—from basic query and reporting to analytics and advanced artificial intelligence and adaptive system applications. However, the insights these provide are only as good as the data that powers them. According to Gartner, that means engineered decision intelligence is about pairing these tools with a common data fabric and composability support—which enables the use of components from multiple data, analytics, and AI solutions—thus paving the way for decisions that are more accurate, repeatable, and traceable. To these benefits, I would add “timely,” because you need real-time decision-making capabilities in order to assess and share information as soon as it’s collected.
Let’s start by looking at why the data fabric is essential for engineered decision intelligence. I’ll be sharing insights on composability support and how to achieve it in a subsequent post.
What is a Data Fabric?
A data fabric is an architecture that provides a consistent set of data services and capabilities across your critical on-premises and cloud environments. It acts as a foundation that enables you to abstract data from systems that are physically and logically distinct to create a common set of data objects that you can treat as a unified enterprise data set.
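The abstraction described above can be illustrated with a minimal sketch. All names here are hypothetical and do not reflect any specific product's API; the point is simply that physically distinct sources register behind one logical interface, so consumers see a unified set of data objects.

```python
# Minimal illustration of the data-fabric idea: physically and logically
# distinct sources sit behind one logical interface, so consumers can
# treat them as a unified enterprise data set. Hypothetical sketch only.

class DataFabric:
    def __init__(self):
        self._sources = {}

    def register(self, name, fetch_fn):
        """Register a source by logical name with a function that fetches its rows."""
        self._sources[name] = fetch_fn

    def query(self, name):
        """Fetch rows from one source without knowing where it physically lives."""
        return self._sources[name]()

    def query_all(self):
        """Treat every registered source as part of one unified data set."""
        rows = []
        for fetch in self._sources.values():
            rows.extend(fetch())
        return rows


fabric = DataFabric()
# In practice these lambdas would wrap connectors to on-premises
# databases, cloud object stores, or edge deployments.
fabric.register("on_prem_orders", lambda: [{"order_id": 1, "region": "EU"}])
fabric.register("cloud_orders", lambda: [{"order_id": 2, "region": "US"}])

unified = fabric.query_all()
```

The consumer of `unified` never needs to know which rows came from the on-premises system and which came from the cloud; that is the essence of the abstraction.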
Why Do I Need a Data Fabric for Engineered Decision Intelligence?
Because engineered decision intelligence needs to work with data from systems that may be on-premises, in the cloud, spread across multiple clouds, and even deployed remotely at the network’s edge, the data fabric provides a way to weave these sources into a network of information to power your decision intelligence tools.
By using a data fabric, you can realize the full potential of your decision intelligence tools. Since they can access data across the enterprise faster and more efficiently, you’ll gain more integrated and accurate business insights and increased business agility. And, as decisions become more operationalized and standardized by the data fabric, they become more repeatable and traceable. Plus, as decision intelligence tools execute more iterations on new data exposed by the data fabric, they can learn from previous outcomes to produce more reliable and repeatable results.
Avalanche: The Foundation of a Modern Data Fabric
The Actian Avalanche™ hybrid-cloud data warehouse, integration, and management platform provides the critical capabilities you need to implement a modern data fabric and unlock the value of your data for engineered decision intelligence. This is roughly a three-step process involving integration services built into Avalanche:
- The first step is to build a metadata catalog of contextual information about the data you intend to access—such as where the data came from, how the data is defined, and when it was last updated. Metadata makes the data more easily searchable and provides insight into the data profiles used in decision intelligence.
- Avalanche then uses the metadata catalog to create a knowledge graph. This provides a semantic layer that represents each entity (things such as person, location, organization, product, etc.) and its relationships with other entities. Avalanche uses artificial intelligence and machine learning to enrich the metadata, which further enhances data interpretation and contextualization. This helps users get more relevant and faster query responses. The knowledge graph also makes it possible to view the data from multiple dimensions and to access the data using a variety of decision intelligence tools—without modifying the source data on the underlying systems.
- Lastly, the integration services in Avalanche use the knowledge graph to bring together the requisite enterprise data sources and reconcile them into a common data set. Avalanche integration services connect with Google Cloud Storage, Amazon S3, and Azure Data Lake Storage—as well as more than 200 applications, web-service APIs, JSON data, and even spreadsheets. Once your data sources are integrated, the data fabric drives data flow orchestration and automation to deliver information to users and decision intelligence tools.
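The three steps above can be sketched conceptually as follows. All names are hypothetical illustrations of the pattern (catalog, knowledge graph, integration), not Avalanche's actual interfaces.

```python
from dataclasses import dataclass
from collections import defaultdict

# Conceptual sketch of the three-step process; all names are
# hypothetical and do not reflect Avalanche's actual API.

# Step 1: a metadata catalog entry records context about each data set.
@dataclass
class CatalogEntry:
    name: str            # logical name of the data set
    origin: str          # where the data came from
    schema: dict         # how the data is defined
    last_updated: str    # when it was last refreshed

# Step 2: a knowledge graph relates the entities described in the catalog,
# providing a semantic layer over the sources.
class KnowledgeGraph:
    def __init__(self):
        self._edges = defaultdict(list)

    def relate(self, subject, relation, obj):
        self._edges[subject].append((relation, obj))

    def related_to(self, subject):
        return self._edges[subject]

# Step 3: integration reconciles the catalogued sources into one
# common data set, without modifying the underlying systems.
def integrate(catalog, fetchers):
    unified = []
    for entry in catalog:
        for row in fetchers[entry.name]():
            unified.append({"source": entry.origin, **row})
    return unified


catalog = [
    CatalogEntry("customers", "on_prem_crm", {"id": "int"}, "2021-06-01"),
    CatalogEntry("orders", "s3_bucket", {"id": "int"}, "2021-06-02"),
]

graph = KnowledgeGraph()
graph.relate("customer", "places", "order")

fetchers = {
    "customers": lambda: [{"id": 1}],
    "orders": lambda: [{"id": 10}],
}
unified = integrate(catalog, fetchers)
```

Note that each step builds on the previous one: the catalog supplies the context, the graph supplies the semantics, and integration uses both to assemble the common data set that decision intelligence tools consume.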
That’s a Wrap!
This is just a brief overview of the role a modern data fabric plays in the delivery of engineered decision intelligence. One of the key takeaways here is that a data fabric really matters in real-time decision-making use cases. If you want more insight into how Actian Avalanche can provide the data fabric functionality I’ve been talking about, you should read the three-part blog series by my colleague Lewis Carr, where he looks at the impact of COVID-19 on the retail industry and showcases how a modern data fabric can improve support for decision-makers. You’ll find that story here.
This article was originally published on The New Stack.
This article was co-authored by Lewis Carr.
A senior product marketing, product management, and business development professional spanning strategic vertical industries and horizontal solutions, focused on enterprise software, including data management and analytics, mobile and IoT, and distributed cloud computing.