Summary

  • Identifies common barriers to innovation, such as data silos, quality issues, and latency, which prevent CEOs and teams from trusting their data.
  • Explains that true innovation begins with high-quality, real-time data that allows organizations to move swiftly from raw information to confident decisions.
  • Highlights how the Actian Data Platform simplifies complex transformations, enabling users of all skill levels to access and analyze data without relying on IT.
  • Positions a unified data platform—combining integration, quality, and analytics—as the key to managing hybrid and multi-cloud environments through a single pane of glass.
  • Connects trusted data to a stronger data-driven culture, allowing businesses to explore new use cases, increase revenue, and gain a strategic edge.

One highlight of my job is being able to talk to customers and prospective customers throughout the year at various events. What I keep hearing is that data is hard, and this holds true for companies of all sizes. And they’re right. Data can be hard. It can be hard to integrate, manage, govern, secure, and analyze. Building pipelines to new data sources can also be hard.

Business and IT both need data that is accessible to all users and applications, cost-effective to store, and able to deliver real-time insights. Any data challenge limits these capabilities and presents a major barrier to innovation. That’s why we’ve made it our mission to make data easy and trustworthy.

Actian exists to provide the most trusted, flexible, and easy-to-use data platform on the market. We know that’s a bold promise and requires solving a lot of your data pain points. Yet we also know that to be truly data-driven, you must have uninterrupted access to trusted data.

Overcoming the Trust Barrier

At Actian, we’ve been saying for a long time that you need to be able to trust your data. For too many companies, that’s not happening, or it’s not happening promptly. For example, nearly half—48%—of CEOs worry about data accuracy, according to IBM, while Gartner found that less than half of data and analytics teams—just 44%—are effectively providing value to their organization.

These numbers are unacceptable, especially in the age of technology. Everyone who uses data should be able to trust it to deliver ongoing value. So, we have to pause and ask ourselves why this isn’t happening. The answer is that common barriers often get in the way of reaching data goals, such as:

  • Silos that create isolated, outdated, and untrustworthy data.
  • Quality issues, such as incomplete, inaccurate, and inconsistent data.
  • Skills gaps that leave users dependent on IT to connect and analyze data.
  • Latency issues that prevent real-time data access and limit timely insights.
  • On-premises data management problems that were carried, unresolved, into the cloud.

Organizations know they have some or all of these problems, but they often don’t know what steps are needed to resolve them. Actian can help. We have the technology and expertise to enable data confidence—regardless of where you are on your data journey.

Innovation Starts With Trustworthy Data

What if you could swiftly go from data to decision with full confidence and ease? It doesn’t have to be a pipe dream. The solution is readily available now. It ensures you’re using high-quality, accurate data so you have full confidence in your decision-making. It simplifies data transformations, empowering you to get the data you want, when and how you want it, regardless of your skill level and without relying on IT. Plus, you won’t have to wait for data because it’s delivered in real time.

The Actian Data Platform makes data easy to use, allowing you to meet the needs of more business users, analysts, and data-intensive applications. You can collect, manage, and analyze data in real time with our transactional database, data integration, data quality, and data warehouse capabilities working together in a single, easy-to-use platform.

The platform lets you manage data from any public cloud, multi- or hybrid cloud, and on-premises environment through a single pane of glass. The platform’s self-service data integration lowers costs while enabling you to perform more use cases without needing multiple data products.

What does all of this mean for your business? It means that data integration, access, and quality are easier than ever. It also means that you can trust your data to make confident decisions that accelerate your organization’s growth, foster new levels of innovation, support your digital transformation, and deliver other business value.

Enabling a Data-Driven Culture

With data volumes growing rapidly, having immediate access to high-quality data is essential, but challenging. Any problems with quality, latency, or integration will compound as data volumes grow, leading to potentially misinformed decision-making and mistrust in the data. Establishing data quality standards, making integration and access easy, and putting data in the hands of everyone who needs it advances the business, promotes a data-driven culture, and drives innovation. And this is where Actian can play a critical role.

What makes the Actian Data Platform unique, at a high level, is its ability to consolidate various data functions into a single platform, making data readily available and easy to use across your organization.

The platform handles extract, transform, and load (ETL), data transformation, data quality checks, and data analytics all in one place. Bringing everything and everyone together on a single platform lowers costs and reduces the resources needed to manage your data system. You benefit from real-time, trustworthy data across the entire organization, giving you full confidence in your data.

When you trust your data, you have the ability—and the confidence—to explore more use cases, increase revenues, reduce costs, fast-track innovation, win market share, and more for a strategic edge in your industry. Our customers are using data to drive new successes every day!

Additional Resources:


Blog | Data Intelligence | 7 min read

What is Data Sharing: Benefits, Challenges, and Best Practices

Summary

  • What data sharing is and why it matters for AI and analytics.
  • 10 concrete benefits—from trust to cost efficiency.
  • Challenge→solution guidance (privacy, security, scale, quality).
  • 6‑step playbook with KPIs and SLO examples to operationalize sharing.

Introduction

Data sharing is the intentional exchange of data between people, teams, systems, or organizations so that it can be discovered, trusted, and reused to create business value. Modern data sharing is not just transferring files — it requires cataloged metadata, access controls, quality SLAs, and governance that together enable secure, compliant, and measurable reuse of data as products. This article explains what data sharing is, the concrete benefits, the common challenges and mitigations, and a practical 6‑step implementation roadmap with metrics and sector checklists.

Definition and the AI Imperative

What data sharing really means

Data sharing includes the packaging, documentation, access controls, observability, and lifecycle management that allow data producers to publish reliable data products and data consumers to discover and consume them confidently. It covers internal sharing across domains and external sharing with partners, regulators, or customers.

Why data sharing matters now

Widespread AI adoption, real‑time analytics, and distributed architectures make high‑quality, discoverable data essential. Good data sharing accelerates AI initiatives, reduces duplicated engineering effort, and enables cross‑functional workflows by making authoritative data products available where and when they’re needed.

10 Concrete Benefits of Data Sharing

  1. Faster decision-making — timely access to trusted data reduces time‑to‑insight.
  2. Better collaboration — shared data products align business and analytics teams.
  3. AI readiness — consistent labeled datasets accelerate model training and validation.
  4. Cost efficiency — reuse reduces duplicate ingestion, storage, and integration effort.
  5. Higher data trust — standardized metadata, lineage, and SLOs increase confidence.
  6. Compliance posture — centralized policies and audit trails simplify reporting.
  7. Innovation velocity — external and cross‑domain sharing spurs new use cases.
  8. Operational resilience — shared observability helps detect and fix data issues faster.
  9. Revenue enablement — monetizable data products and partner integrations create new streams.
  10. Measurable outcomes — SLOs/SLIs enable objective measurement of data product health.

Key Challenges and How to Mitigate Them

Below are common challenges with practical mitigations you can implement.

1. Privacy & compliance

Challenge: Regulatory obligations and consent limit what you can share.
Mitigation: Classify data, enforce purpose‑based access, deploy masking/anonymization, and embed consent metadata. Maintain an auditable policy catalog.
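
As an illustration, here is a minimal Python sketch of pseudonymizing a direct identifier before sharing. The salt handling, field choice, and helper names are assumptions; a production setup would keep the salt in a secrets store and pair this with purpose-based access and consent metadata.

import hashlib

SALT = b"rotate-me-per-dataset"  # assumption: managed in a secrets store

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a stable, non-reversible token."""
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

def mask_email(email: str) -> str:
    """Keep the domain for aggregate analytics; hide the local part."""
    local, _, domain = email.partition("@")
    return f"{pseudonymize(local)}@{domain}"

print(mask_email("jane.doe@example.com"))  # 16-hex-char token + "@example.com"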

2. Security & access control

Challenge: Overexposure or misconfigured access causes breaches.
Mitigation: Use role‑based access, attribute‑based policies, encryption in transit and at rest, and automated entitlement reviews.
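
For example, a simplified attribute-based access decision can be sketched in a few lines of Python. The policy fields, roles, and purposes below are illustrative assumptions rather than a specific product’s model; real systems evaluate such policies in a central enforcement point.

from dataclasses import dataclass

@dataclass
class AccessRequest:
    role: str       # e.g. "analyst"
    purpose: str    # declared purpose of use
    clearance: str  # highest data classification the user may see

# Illustrative sharing policy for one data product.
POLICY = {
    "allowed_roles": {"analyst", "data_scientist"},
    "allowed_purposes": {"reporting", "model_training"},
    "allowed_clearances": {"internal", "restricted"},
}

def is_allowed(req: AccessRequest) -> bool:
    """Grant access only when role, purpose, and clearance all match."""
    return (
        req.role in POLICY["allowed_roles"]
        and req.purpose in POLICY["allowed_purposes"]
        and req.clearance in POLICY["allowed_clearances"]
    )

print(is_allowed(AccessRequest("analyst", "reporting", "internal")))  # True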

3. Data quality & trust

Challenge: Consumers don’t trust data they didn’t produce.
Mitigation: Publish quality metrics, lineage, and SLOs with each data product; require producers to attach data contracts and validation checks.
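
A minimal sketch of such a producer-side validation check, assuming a contract that declares required fields, expected types, and an error-rate threshold, might look like this in Python; the field names and threshold are hypothetical.

CONTRACT = {
    "required": {"order_id": str, "amount": float, "updated_at": str},
    "max_error_rate": 0.02,  # assumption: at most 2% failed field checks
}

def validate(records: list[dict]) -> dict:
    """Return simple pass/fail metrics a producer could publish with the data."""
    failed = sum(
        1
        for record in records
        for name, expected_type in CONTRACT["required"].items()
        if not isinstance(record.get(name), expected_type)
    )
    total_checks = len(records) * len(CONTRACT["required"])
    error_rate = failed / total_checks if total_checks else 0.0
    return {"error_rate": error_rate, "passed": error_rate <= CONTRACT["max_error_rate"]}

print(validate([{"order_id": "A1", "amount": 9.99, "updated_at": "2024-05-01"}]))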

4. Volume, latency & transport

Challenge: Moving massive datasets is slow and expensive.
Mitigation: Share by reference where possible (remote query, virtual views), use federated queries, and compress or stream only required slices.

5. Interoperability & format drift

Challenge: Heterogeneous formats and schemas block reuse.
Mitigation: Standardize schemas and APIs, provide sample queries and adapters, and version data products.

6. Governance and ownership confusion

Challenge: No clear owner leads to stale or conflicting data products.
Mitigation: Define domain ownership, publish SLAs, require stewards, and enforce lifecycle policies in the catalog.

6‑Step Best‑Practice Roadmap (Actionable)

Follow these steps to operationalize data sharing. Each step includes recommended KPIs.

Step 1 — Set outcomes & operating model

  • Actions: Define business use cases, data products, and success metrics.
  • KPIs: % of use cases with mapped data products; executive sponsor coverage.

Step 2 — Establish governance and policies

  • Actions: Create role definitions (producers/consumers/stewards), data classification, and sharing policies.
  • KPIs: Policy coverage (% of data products governed), compliance audit pass rate.

Step 3 — Cataloging & metadata-first design

  • Actions: Publish data products with rich metadata, business glossary, lineage, tags, and SLOs.
  • KPIs: Discoverability rate (search success), % data products with lineage and metadata.

Step 4 — Secure access controls & data contracts

  • Actions: Implement RBAC/ABAC, data contracts, encryption, and dynamic masking where needed.
  • KPIs: Unauthorized access incidents, time to grant/revoke access.

Step 5 — Observability & SLO-driven sharing

  • Actions: Instrument data products with SLIs (freshness, completeness, accuracy) and SLOs, and set alerts.
  • KPIs: SLO attainment rate, mean time to detect/resolve data incidents.

Step 6 — Marketplace, reuse & continuous improvement

  • Actions: Provide a data marketplace or exchange with pricing/consumption tracking, feedback loops, and lifecycle automation.
  • KPIs: Reuse rate, consumer satisfaction score, cost per data product.

Data Mesh, Data Products, and Marketplaces (Practical Guidance)

Domain ownership and data products

Adopt a product mindset: each domain publishes data products they own and maintain. Define explicit APIs, SLAs, metadata, and a lifecycle policy. This federates responsibility while keeping governance consistent.

Central marketplace features

A data marketplace should provide searchable catalog entries, usage and cost metrics, access workflows, contracts, and automated onboarding for new consumers. Coupling a marketplace with governance and observability reduces friction.

Suggested SLIs (examples) and typical SLO targets you can adapt (a worked freshness check follows the list):

  • Freshness: time since last update; SLO example: 95% of records updated within X hours.
  • Availability: query success rate; SLO example: 99% success.
  • Accuracy/Quality: % of records passing validation checks; SLO example: 98% pass rate.
  • Discoverability: % of searches that return relevant data products; SLO example: 80%+ success.
  • Access compliance: % of access events with policy checks; target: 100%.
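
As a worked example, the freshness SLI above can be computed directly from per-record update timestamps. This is a minimal Python sketch assuming the SLO “95% of records updated within 24 hours”; the window, target, and sample timestamps are placeholders.

from datetime import datetime, timedelta, timezone

SLO_WINDOW = timedelta(hours=24)  # assumption: "within 24 hours"
SLO_TARGET = 0.95                 # assumption: 95% of records

def freshness_attainment(updated_at: list[datetime], now: datetime) -> float:
    """Fraction of records updated within the SLO window."""
    if not updated_at:
        return 1.0
    fresh = sum(1 for ts in updated_at if now - ts <= SLO_WINDOW)
    return fresh / len(updated_at)

now = datetime.now(timezone.utc)
timestamps = [now - timedelta(hours=h) for h in (1, 5, 30)]  # sample data
attained = freshness_attainment(timestamps, now)
print(f"freshness={attained:.0%}, slo_met={attained >= SLO_TARGET}")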

Sector‑Specific Compliance Checklist

For any regulated use case:

  • Classify personal and sensitive data.
  • Apply minimization and purpose limits.
  • Attach consent and retention metadata.
  • Use encryption and least privilege.
  • Maintain audit logs and retention policies.
  • Validate cross‑border transfer rules and update contracts with partners.

Use Cases and Measurable Outcomes (Examples)

Healthcare (internal & cross‑provider sharing)

Outcome: Securely sharing longitudinal patient records reduces duplicate tests, improves continuity of care, and enables better population health analytics. Measure: decrease in integration time and fewer manual reconciliations.

Financial services (risk modeling)

Outcome: Shared canonical customer and transaction data enables faster, auditable risk models and reduced model training time. Measure: improved model retraining cadence and reproducible lineage for regulators.

Retail (personalization & supply chain)

Outcome: Sharing inventory, sales, and customer signals across teams helps optimize assortment and personalization. Measure: faster experiments and reduced time between data availability and campaign activation.

(Note: Use cases illustrate typical outcomes; adapt KPIs to your environment.)

What Can Go Wrong — Common Failure Modes and Prevention

  • Publishing poor or undocumented data products → prevent by requiring metadata, tests, and reviews.
  • Excessive copying of data → use virtual views and federated queries.
  • Stale or broken pipelines → instrument observability and SLOs with automated alerts.
  • Overexposure to partners → enforce contracts, purpose checks, and tokenized access.

Implementing With Your Data Stack (How Tooling Fits)

To operationalize these practices, you’ll typically combine:

  • A metadata catalog (discoverability, glossary, lineage).
  • Access control and entitlement systems (RBAC/ABAC, encryption).
  • Observability/monitoring (SLO/SLI tracking, lineage‑linked alerts).
  • A data marketplace or portal (consumption workflows, catalogs, contracts).

Actian’s data intelligence and data observability solutions can be used to integrate these capabilities into existing environments and workflows.

Next Steps

Start by mapping the highest‑impact use cases, defining the smallest viable data products, and publishing them to a catalog with SLAs and lineage. Use the 6‑step roadmap and the SLO suggestions above as your implementation checklist.

FAQ

  • What is the difference between internal and external data sharing?

Internal sharing happens within an organization to break silos; external sharing includes partners, suppliers, or regulators and requires stricter controls and contracts.

  • How do you measure successful data sharing?

Use KPIs such as reuse rate, SLO attainment (freshness/accuracy), discoverability, time‑to‑insight, and compliance audit pass rates.

  • When should you use federated queries vs. copying data?

Use federated access for large or frequently updated datasets to avoid duplication; copy slices when latency and performance require local materialization with clear update policies.

  • How do data products relate to Data Mesh?

Data Mesh emphasizes domain ownership and treating shared datasets as products with owners, SLAs, and discoverable metadata — a pattern that supports scalable sharing.

  • What are minimal controls for secure external sharing?

Data classification, encryption, contractual agreements, least privilege access, masking/anonymization, and full audit trails.


Blog | Manufacturing | 5 min read

Measuring and Reporting on Supply Chain Sustainability the Right Way


In an era where sustainability is not just a buzzword but a strategic imperative, the supply chain plays a pivotal role in shaping an organization’s environmental and social footprint. This article guides your business through the essential aspects of measuring and reporting sustainability within the supply chain, focusing on data management, goal and metric definitions, and adherence to reporting standards.

Data Management: Unraveling the Threads of Sustainability

In the intricate web of supply chain operations, data serves as the thread that weaves together the fabric of sustainability. Comprehensive data management is essential for measuring, monitoring, and optimizing sustainability initiatives within all aspects of your organization’s supply chain.

The first step in sustainable data management is collecting relevant information across the organization. Some examples of this data include energy consumption, water usage, waste generation, emissions, and social impact factors such as labor practices and community engagement. The challenge, however, is gathering data from diverse sources—including suppliers, manufacturers, logistics partners, and internal operations. Strategies for overcoming this include implementing data-sharing agreements with vendors, conducting regular audits, and leveraging emerging technologies like Internet of Things (IoT) sensors, blockchain, and the API integration capabilities of your data platform to track and trace environmental and social performance throughout the supply chain.

Once collected, sustainability data must be organized coherently and structured to facilitate fast analysis and decision-making. This means establishing a clear taxonomy and data schema that categorizes information according to relevant sustainability indicators, like carbon emissions or waste generation. This is where data visualization tools and dashboards come in handy, because they help present the information in a user-friendly format.
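
For illustration, such a taxonomy and record schema could be sketched in Python as follows; the categories, units, and field names are assumptions to adapt to your own indicators.

# Illustrative taxonomy: category -> metric definition and allowed tags.
SUSTAINABILITY_SCHEMA = {
    "emissions": {"metric": "co2e_tonnes", "scopes": ["1", "2", "3"]},
    "energy": {"metric": "kwh", "sources": ["grid", "renewable"]},
    "water": {"metric": "cubic_meters"},
    "waste": {"metric": "tonnes", "streams": ["landfill", "recycled"]},
    "social": {"metric": "audit_score"},
}

# One supplier reading, tagged by taxonomy category for fast roll-ups.
record = {
    "supplier_id": "SUP-001",
    "category": "emissions",
    "metric": "co2e_tonnes",
    "value": 12.4,
    "period": "2024-Q1",
}

assert record["category"] in SUSTAINABILITY_SCHEMA
assert record["metric"] == SUSTAINABILITY_SCHEMA[record["category"]]["metric"]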

Defining Goals and Metrics: Charting a Course for Sustainable Success

Once the data is collected and integrated, the next step is to establish goals and metrics for meaningful action and measurable progress. By breaking down silos and integrating data from various departments, sources, and stakeholders, organizations can gain a comprehensive understanding of their environmental and social impact across the entire supply chain. This integrated approach allows you to identify and establish goals that address the most significant areas of opportunity and risk.

Implementing policies to act on the data requires a strategic and proactive approach that aligns with your defined goals and metrics. Best practices include setting ambitious, yet achievable, targets based on data-driven insights and industry benchmarks. These targets should provide clear direction and accountability for sustainability efforts. Additionally, your organization should develop policies and procedures to track progress toward these targets, leveraging technology and data analytics to monitor performance in real-time to course correct as needed.

Establishing a culture of continuous improvement and accountability is essential, with regular reviews and updates to policies and targets based on evolving data insights and stakeholder expectations.

Reporting Standards: Navigating the Landscape of Transparency

Established reporting frameworks such as the Global Reporting Initiative (GRI) and the Sustainability Accounting Standards Board (SASB) play a crucial role in guiding organizations toward transparent and consistent sustainability reporting. These frameworks provide comprehensive guidelines and standardized metrics for measuring and disclosing environmental, social, and governance (ESG) performance.

Adhering to recognized reporting standards helps organizations enhance credibility and comparability in the eyes of stakeholders—including investors, customers, employees, and regulators. Consistent reporting enables investors to make informed investment decisions, customers to make ethical purchasing choices, and regulators to enforce compliance with environmental and social regulations.

The emergence of integrated reporting represents a paradigm shift in how organizations disclose their performance and make holistic decisions, moving beyond traditional financial metrics to encompass broader value creation for all stakeholders. Integrated reporting seeks to present financial and sustainability performance cohesively, acknowledging the interconnectedness between financial success and environmental and social impact.

By integrating financial and non-financial data into a single, comprehensive report, organizations can provide stakeholders with a holistic view of their long-term value-creation strategy. Integrated reporting encourages a more balanced and sustainable approach to business decision-making, where financial considerations are complemented by environmental and social considerations. As organizations increasingly recognize the importance of holistic value creation, integrated reporting, and integrated data in general, is key to communicating sustainability performance and demonstrating long-term resilience and viability.

Integration is Hard, but Actian Can Help

The Actian Data Platform offers invaluable capabilities to companies striving to enhance their ESG efforts and reporting accuracy. By providing a unified platform for data management, integration, and analytics, Actian empowers organizations to access, analyze, and leverage sustainability-related data from across the entire supply chain in real-time.

With these real-time insights into key ESG metrics, your company can make informed decisions that drive sustainable practices and optimize resource usage. Actian’s advanced integration capabilities empower your organization to identify trends, patterns, and opportunities for improvement, facilitating proactive interventions to minimize environmental impact and maximize social responsibility. Moreover, by streamlining data collection and aggregation, Actian enhances confidence that sustainability reports are comprehensive, accurate, and timely, bolstering credibility and trust with stakeholders.

Measuring and reporting sustainability in the supply chain requires a strategic and holistic approach. By mastering data management, defining clear goals and metrics, and adhering to reporting standards, businesses can not only enhance their environmental and social impact but also build trust with stakeholders. By making data easy, the Actian Data Platform enables you to drive and monitor sustainability initiatives across your entire supply chain.


Almost every organization is embarking on some sort of digital transformation initiative, and the role of a Chief Digital Transformation Officer (CDO) has emerged to lead these initiatives and oversee their success. Major responsibilities of the CDO include:

  • Define and implement a digital strategy for the company’s future that includes technologies such as the cloud, artificial intelligence, automation, Internet of Things (IoT), and social media.
  • Integrate digital initiatives with strategic planning to gain executive leadership commitment, budget, and resource allocation.
  • Work with cross-functional teams to generate innovative digital solutions for products, services, customer experiences, sales, marketing, and optimized business processes.
  • Own, prioritize, monitor, and manage the company’s digital innovation project portfolio.
  • Serve as an evangelist and a change agent, championing the use of digital technology and practices.

Top Challenges for the CDO

Executing the above activities is a demanding job. While specific challenges may vary based on the industry and organizational context, some common challenges faced by the CDO include resistance to change, budget justification, hard-to-replace legacy technologies, the skills gap, and demonstrating success, as discussed below.

Resistance to Change

A general sentiment expressed as “if it ain’t broke don’t fix it” competes with an overarching sense of urgency created by the need for digital transformation. To overcome the status quo, CDOs are constantly engaged in identifying and clearly communicating the pain points stagnation is causing. These often include compatibility and obsolescence issues, security and compliance risks, missing functionality, lack of scalability to meet business growth, expensive maintenance, inefficient workflows and processes that hamper business agility, and poor user experiences.

Budget Justification

It can be challenging to show that the cost of modernization is significantly less than the cost of maintaining legacy systems over time. The budget justification challenge is compounded by the fact that maintenance and innovation budgets are usually separate, along with the sentiment that money might be better spent on new opportunities rather than on systems that are working as intended, especially if the return on investment of modernization will take time to materialize.

These issues place a heavy onus on CDOs to highlight the opportunities that digital transformation presents. By embracing digital transformation, CDOs elaborate on the business value, such as operational efficiency, optimizing the customer experience, product innovation, data-driven decision-making, business agility, sustainability, and staff productivity.

Plus, it’s a lot easier to integrate digital technologies with a wide range of systems, processes, and functions across an organization than to integrate legacy ones. This is important because digital technologies play a critical role in optimizing the supply chain by improving visibility, efficiency, and collaboration. Legacy modernization in the realm of electronic commerce is another key example that can lead to a more agile and user-friendly online shopping experience that supports a greater choice of web and mobile interfaces.

Hard-to-Replace Legacy Technologies

As businesses attempt to modernize, many have legacy systems and infrastructure that are hard to replace. On-premises to cloud migration can be a long and risky journey. Migrating or replacing these systems while ensuring business continuity requires careful planning and resources. The CDO often oversees the development of a data migration strategy to ensure a smooth transition from legacy systems to modern platforms and their integration with existing applications, databases, and platforms. Identifying and mitigating risks associated with legacy system replacement is critical to avoid disruption of mission-critical systems. 

Talent Acquisition and Skill Gaps

Not only is attracting, developing, and retaining talent with the right digital skills a constant challenge, but existing legacy staff will need to be retrained and/or upskilled. Layoffs in technology may be in full swing, but demand in 2024 for digital transformation technical skills such as cloud, DevOps, security, privacy, development, artificial intelligence, automation, system updates, data integration, and analytics is high, according to Robert Half Technology’s 2024 IT salary report.

Showing Success

Demonstrating positive business outcomes is critical to continued success, but how to measure them isn’t easy. CDOs often use these types of key performance indicators (KPIs) to gauge impact:

  • Percentage increase in digital sales or revenue.
  • Customer satisfaction scores (CSAT), Net Promoter Score (NPS), and other customer engagement metrics.
  • Time to launch for digital products and services.
  • Percentage of users or employees adopting new digital tools and processes.
  • Cost savings achieved through process automation or efficiency gains.

Digital Transformation With Actian

Actian transforms business by enabling customers to make confident, data-driven decisions that accelerate their organization’s growth. We are committed to helping our customers secure their digital future by making it easy to modernize their databases and database applications, including flexible choices for on-premises to cloud migration.


New demands, supply chain complexity, truly understanding customers, and other challenges have upended the traditional business landscape and forced organizations to rethink their strategies and how they’re using data. Organizations that are truly data-driven have opportunities to gain new market share and grow their competitive advantage with proper data management. Those that don’t will continue to struggle—and in a worst-case scenario, may not be able to keep their doors open.

Data Management is Needed to Drive and Support Use Cases

As organizations face the threat of a recession, geopolitical instability, concerns about inflation, and uncertainty about the economy, they look to data for answers. Data has emerged as a critical asset for any organization striving to intelligently grow their business, avoid costly problems, and position themselves for the future.

As explained in the webinar “Using Data in a Downturn: Building Business Resiliency with Analytics,” successful organizations optimize their data to be proactive in changing markets. The webinar, featuring William McKnight from McKnight Consulting Group, notes that data is needed for a vast range of business uses, such as:

  • Gaining a competitive advantage.
  • Increasing market share.
  • Developing new products and services.
  • Entering new markets.
  • Increasing brand recognition and customer loyalty.
  • Improving efficiency.
  • Enhancing customer service.
  • Developing new technologies.

McKnight says that when it comes to prioritizing data efforts, you should focus on projects that are easy to do with your current technology set and skill set, those that align with your business priorities, and ones that offer a high return on investment (ROI).

Justifying Data and Analytics Projects During a Downturn

The webinar explains why data and analytics projects are needed during an economic downturn. “Trusted knowledge of an accurate future is undoubtedly the most useful knowledge to have,” McKnight points out. Data and analytics predict that future, giving you the ability to position your company for what’s ahead.

Economic conditions and industry trends can change quickly, which means you need trustworthy data to inform the analytics. When this happens, you can uncover emerging opportunities such as products or features your customers will want or identify areas of risk with enough time to take action.

McKnight explains in the webinar that a higher degree of accuracy in determining your future can have a significant impact on your bottom line. “If you know what’s going to happen, you can either like it and leave it, or you can say, ‘I don’t like that, and here’s what I need to do to tune it,’ and that’s the essence of analytics,” he says.

Applying Data Management and Customer Analytics to Achieve High-Value Results

Not surprisingly, the more data you make available for analytics, the more precise your analytics will be. As the webinar explains, artificial intelligence (AI) can help with insights. AI enhances analytics, provided it has the robust, high-quality data sets needed to deliver accurate and actionable results. The right approach to data and analytics can help you determine the next best step for your business.

You can also use the insights to drive business value, such as creating loyal customers and repeat buyers, and proactively adjusting your supply chain to stay ahead of changing conditions. McKnight says in the webinar that leading companies are using data management and customer analytics to drive ROI in a myriad of ways, such as optimizing:

  • Product placement in stores.
  • Product recommendations.
  • Content recommendations.
  • Product design and offerings.
  • Menu items in restaurants.

All of these efforts increase sales. Likewise, using data and analytics can drive results across the supply chain. For example, you can use data to optimize inventory and ensure fast delivery times, or incorporate real-time data on customer demand, inventory levels, and transportation logistics to have products when and where they’re needed. Similarly, you can take a data-driven approach to demand forecasting, then optimize product distribution, and improve visibility across the entire supplier network.

Data Best Practices Hold True in Soft Economies

Using data to drive the business and inform decision-making is essential in any economy. During an economic downturn, you may need to shift priorities and decide what projects and initiatives to pursue, and which to pause or discontinue.

To help with these decisions, you can use your data foundation, follow data management best practices, continue to use data virtualization, and ensure you have the ability to access accurate data in real time. A modern data platform is also needed to integrate and leverage all your data.

The Actian Data Platform offers integration as a service, makes data easy to use, gives users confidence in their data, improves data quality, and more. The platform empowers you to go from data source to decision with confidence, giving you the ability to better utilize data in an economic downturn or any other market conditions.


You’ve undoubtedly heard of APIs—ubiquitous yet often misunderstood. Curious to learn everything about APIs, or Application Programming Interfaces? Let’s uncover what they do, their benefits, and how they operate.

API—three letters without which companies today couldn’t seamlessly deploy their data strategies. An Application Programming Interface is a set of rules and protocols enabling two distinct software programs to communicate. It defines the methods and data formats allowed for information exchange, facilitating the integration of different applications or services.

The concept of APIs dates back to the early days of computing. In the 2000s, with the growth of the Internet and the rise of web services, APIs gained significant importance. Companies began providing APIs to enable the integration of their services with other applications and systems. In 2020, an estimated 2 billion euros were invested worldwide in API development.

How Does an API Work?

In the world of diplomacy, there are interpreters. In the IT universe, there are APIs. This simple comparison sums up the function of an API: it acts as an intermediary, receiving requests and returning structured responses. An API operates by defining endpoints accessible via HTTP requests. These endpoints represent specific functionalities of the application, and developers interact with them using standard HTTP methods such as GET, POST, PUT, and DELETE. Data is typically exchanged in JSON or XML format. The API specifies the required parameters, expected data types, and possible responses. HTTP requests carry information such as headers and request bodies, allowing data transmission. Responses provide status codes to indicate success or failure, accompanied by structured data.
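
To make this concrete, here is a minimal Python sketch of a client calling a hypothetical REST endpoint with GET and POST. The base URL, paths, and token are placeholders, not a real service; the requests library is a common third-party HTTP client (pip install requests).

import requests

BASE_URL = "https://api.example.com/v1"  # placeholder endpoint
HEADERS = {"Authorization": "Bearer <token>", "Accept": "application/json"}

# GET: read the resource behind an endpoint.
resp = requests.get(f"{BASE_URL}/customers/42", headers=HEADERS, timeout=10)
resp.raise_for_status()  # raises on 4xx/5xx status codes
customer = resp.json()   # parsed JSON response body

# POST: create a resource by sending a JSON request body.
created = requests.post(
    f"{BASE_URL}/customers",
    json={"name": "Ada Lovelace"},  # serialized into the request body
    headers=HEADERS,
    timeout=10,
)
print(created.status_code)  # e.g. 201 Created on success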

API documentation, usually based on specifications like Open API, describes in detail how to interact with each endpoint. Authentication tokens can be used to secure API access. In summary, an API acts as an external interface, facilitating integration and communication between different applications or services.

What are the Benefits of APIs?

Using APIs offers numerous advantages in the software and system integration realm. They simplify access to an application’s features, allowing developers to leverage external services without necessarily understanding their internal implementation. This promotes modularity and accelerates the development of interconnections between essential business solutions for your employees’ efficiency.

Furthermore, APIs facilitate integration between different applications, creating interconnected software ecosystems. The key advantage? Substantially improved operational efficiency. Updates or improvements can be made to an API without affecting the clients using it. Code reuse is encouraged, as developers can leverage existing functionalities via APIs rather than recreating similar solutions, resulting in significant cost savings and shorter development timelines that contribute to your business’s agility.

Finally, APIs offer an improved collaboration perspective between teams, as different groups can work independently using APIs as defined interfaces.

Different Types of APIs

APIs form a diverse family. Various types cater to specific needs:

Open API

Also known as an external API or public API, it is designed to be accessible to the public. Open APIs follow standards like REST or GraphQL, fostering collaboration by allowing third-party developers or other applications to access a service’s features and data in a controlled manner.

Partner API

Partner APIs, or partner-specific APIs, are dedicated to specific partners or trusted external developers. These APIs offer more restricted and secure access, often used to extend an application’s features to strategic partners without exposing all functionalities to the public.

Composite API

Behind the term Composite API lies the combination of several different API calls into a single request. The benefit? Simplifying access to multiple functionalities in a single call, reducing interaction complexity, and improving performance.

Internal API

Designed for use within an organization, this type of API facilitates communication between different parts of a system or between different internal systems. It contributes to the modularity and coherence of applications within the company.

Different API Protocols

If APIs can be compared to interpreters, the protocols they use are, in a sense, the languages that enable them to communicate. Four common protocols are:

SOAP (Simple Object Access Protocol)

Using XML, SOAP is a standardized protocol that offers advanced features such as security and transaction management. However, it can be complex and require significant resources.

XML-RPC (XML Remote Procedure Call)

The primary quality of this protocol is its simplicity. Based on XML, it allows the calling of remote procedures. Although less complex than SOAP, it offers limited features and is often replaced by more modern alternatives.

REST (Representational State Transfer)

Founded on HTTP principles, REST uses standard methods like GET, POST, PUT, and DELETE to manipulate resources. It commonly uses the JSON data format, from which it derives much of its simplicity, scalability, and flexibility.

JSON-RPC (JavaScript Object Notation Remote Procedure Call)

Lightweight and based on JSON, JSON-RPC facilitates the calling of remote procedures. It provides a simple alternative to XML-RPC and is often used in web and mobile environments.
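
For example, a minimal JSON-RPC 2.0 call could look like the following Python sketch. The endpoint URL and method name are hypothetical; the envelope fields ("jsonrpc", "method", "params", "id") come from the JSON-RPC 2.0 specification.

import requests

# JSON-RPC wraps the procedure call in a small JSON envelope.
payload = {
    "jsonrpc": "2.0",
    "method": "getInventory",      # hypothetical remote procedure
    "params": {"sku": "ABC-123"},
    "id": 1,                       # correlates the response to this request
}

resp = requests.post("https://rpc.example.com/api", json=payload, timeout=10)
body = resp.json()

# A success response echoes the id and carries "result"; failures carry
# an "error" object with a code and message instead.
print(body.get("result") or body.get("error"))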


Blog | Data Intelligence | 6 min read

Why a Data Catalog is Essential for Data Product Management


Data Mesh is one of the hottest topics in the data space. In fact, according to a recent BARC survey, 54% of companies are planning to implement or are already implementing Data Mesh. Implementing a Data Mesh architecture in your enterprise means adopting a domain-centric approach to data and treating data as a product. Data Product Management is, therefore, crucial in the Data Mesh transformation process. A 2024 Eckerson Group survey found that 70% of organizations have implemented or are in the process of implementing Data Products.

However, many companies are struggling to manage, maintain, and get value out of their data products. Indeed, successful Data Product Management requires establishing the right people, processes, and technologies. One of those essential technologies is a data catalog.

In this article, discover how a data catalog empowers data product management in data-driven companies.

Quick Definition of a Data Product

In a previous article on Data Products, we detailed the definition and characteristics of Data Products. We define a Data Product as being:

“A set of value-driven data assets specifically designed and managed to be consumed quickly and securely while ensuring the highest level of quality, availability, and compliance with regulations and internal policies.”

Let’s get a refresher on the characteristics of a Data Product. According to Zhamak Dehghani, the Data Mesh guru, to deliver the best user experience for data consumers, data products need to have the following basic qualities:

  • Discoverable
  • Addressable
  • Trustworthy and truthful
  • Self-describing semantics and syntax
  • Interoperable and governed by global standards
  • Secure and governed by a global access control

How can you ensure your sets of data meet the criteria for becoming a functional and value-driven Data Product? This is where a data catalog comes in.

What Exactly is a Data Catalog?

Many definitions exist of what a data catalog is. We define it as “A detailed inventory of all data assets in an organization and their metadata, designed to help data professionals quickly find the most appropriate data for any analytical business purpose.” Basically, a data catalog’s goal is to create a comprehensive library of all company data assets, including their origins, definitions, and relations to other data. And like a catalog for books in a library, data catalogs make it easy to search, find, and discover data.
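
As an illustration, the kind of metadata a catalog entry holds can be sketched in a few lines of Python; the field names and values below are illustrative, not a specific catalog’s schema.

from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """Metadata that makes a data asset findable and understandable."""
    name: str
    description: str
    owner: str
    source_system: str
    tags: list[str] = field(default_factory=list)
    upstream: list[str] = field(default_factory=list)  # lineage: origins

entry = CatalogEntry(
    name="sales.orders_daily",
    description="Daily order facts, one row per order.",
    owner="sales-data-team",
    source_system="erp",
    tags=["sales", "curated"],
    upstream=["erp.orders_raw"],
)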

Therefore, in an ecosystem where volumes of data are multiplying and changing by the second, it is crucial to implement a data cataloging solution – a data catalog answers the who, what, when, where, and why of your data.

But how does this relate to data products? As mentioned above, data products have fundamental characteristics that they must meet to be considered data products. Most importantly, they must be understandable, accessible, and made available for consumer use. A data catalog is therefore the perfect solution for creating and maintaining data products.

View our Data Catalog capabilities.

A Data Catalog Makes Data Products Discoverable

A data catalog collects, indexes, and updates data and metadata from all data sources into a unique repository. Via an intuitive search bar, data catalogs make it easy to find data products by typing a few keywords.

Our data catalog enables data users not only to find their data products but to fully discover their context, including their origin and transformations over time, their owners, and, most importantly, the other assets they are linked to for 360° data discovery. The Actian Data Intelligence Platform was designed so users can always discover their data products, even if they don’t know exactly what they are searching for. Indeed, our platform offers unique and personalized exploratory paths so users can search and find the information they need in just a few clicks.

A Data Catalog Makes Data Products Addressable

Once a data consumer has found the data product, they must be able to access it or request access to it in a simple, easy, and efficient way. Although a data catalog doesn’t play a direct role in addressability, it certainly can facilitate and automate part of the work. An automated Data Catalog solution plugs into policy enforcement solutions, accelerating data access (if the user has the appropriate permissions).

A Data Catalog Makes Data Products Trustworthy

We strongly believe that a data catalog is not a data quality tool. However, our catalog solution automatically retrieves and updates quality indicators from third-party data quality management systems. With the Actian Data Intelligence Platform, users can view their quality metrics via a user-friendly graph and instantly identify the quality checks that were performed, their quantity, and whether they passed, failed, or issued warnings. In addition, our Lineage capabilities provide statistical information on the data and reconstruct the lineage of the data product, making it easy to understand the origin and the various transformations over time. These features combined increase trust in data and ensure data users are always working with accurate data products.

A Data Catalog Makes Data Products Understandable

One of the most significant roles of a data catalog is to provide all the context necessary to understand the data. By efficiently documenting data, with both technical and business documentation, data consumers can easily comprehend the nature of their data and draw conclusions from their analyses. In the Actian Data Intelligence Platform, Data Stewards can easily create documentation templates for their Data Products and thoroughly document them, including detailed descriptions, associating Glossary Items, relationships with other Data Products, and more. By delivering a structured and transparent view of your data, the Actian Data Intelligence Platform’s data catalog promotes the autonomous use of Data Products by data consumers in the organization.

A Data Catalog Enables Data Product Interoperability

With comprehensive documentation, a data catalog facilitates data product integration across various systems and platforms. It provides a clear view of data product dependencies and relationships between different technologies, ensuring the sharing of standards across the organization. In addition, a data catalog maintains a unified metadata repository, containing standardized definitions, formats, and semantics for various data assets. Our platform is built on powerful knowledge graph technology that automatically identifies, classifies, and tracks data products based on contextual factors, mapping data assets to meet the standards defined at the enterprise level.

A Data Catalog Enables Data Product Security

A data catalog typically includes robust access control mechanisms that allow organizations to define and manage user permissions. This ensures that only authorized personnel have access to sensitive metadata, reducing the risk of unauthorized access or breaches. With the Actian Data Intelligence Platform, you create a secure data catalog, where only the right people can act on a data product’s documentation.

Start Managing Data Products in the Actian Data Intelligence Platform

Interested in learning more about how Data Product Management works in the Actian Data Intelligence Platform? Get a 30-minute personalized demo with one of our experts now.


Insights from Matt Aslett, VP Research Director at Ventana Research: Would you be surprised to know that by 2026, eight in 10 enterprises will have data spread across multiple cloud providers and on-premises data centers? This prediction by Ventana Research’s Matt Aslett is based, at least in part, on the trend of organizations increasingly using more than one cloud service in addition to their on-premises infrastructure.

Optimizing all of this data—regardless of where it lives—requires a modern data platform capable of accessing and managing data in hybrid environments. “As such, there is a growing requirement for cloud-agnostic data platforms, both operational and analytic, that can support data processing across hybrid IT and multi-cloud environments,” Aslett explains.

For many organizations, managing data while ensuring quality in any environment is a struggle. New data sources are constantly emerging and data volumes are growing at unprecedented rates. When you couple this with an increase in the number of data-intensive applications and analysts who need quality data, it’s easy to see why data management is more complex but more necessary than ever before.

As organizations are finding, data management and data quality problems can and will scale—challenges, silos, and inefficient data processes that exist on-premises or in one cloud will compound as you migrate across multiple clouds or hybrid infrastructures. That’s why it’s essential to fix those issues now and implement effective data management strategies that can scale with you. 

Replacing Complexity With Simplicity

Ventana Research also says that traditional approaches to data processing rely on a complex and often “brittle” architecture. This type of architecture uses a variety of specialized products cobbled together from multiple vendors, which in turn require specialized skill sets to use effectively.

As additional technologies are bolted onto the architecture, processes and data sharing become even more complex. In fact, one problem we see at Actian is that organizations continue to add new data and analytics products into ecosystems that are bogged down with legacy technologies. This creates a complicated tech stack of disparate tools, programming languages, frameworks, and technologies that create barriers to integrating, managing, and sharing data.

For a company to be truly data-driven, data must be easily accessible and trusted by every analyst and data user across the enterprise. Any obstacles to tapping into new data sources or accessing quality data, such as requiring ongoing IT help, encourage data silos and shadow IT—common problems that can lead to misinformed decision-making and cause stakeholders to lose confidence in the data.

A modern data platform that makes data easy to access, share, and trust with 100% confidence is needed to encourage data use, automate processes, inform decisions, and feed data-intensive applications. The platform should also deliver high performance and be cost-effective to appeal to everyone from data scientists and analysts who use the data to the CFO who’s focused on the IT budget.

Manageability and Usability Are Critical Platform Capabilities

Today’s data-driven environment demands an easy-to-use cloud data platform. Choosing the best platform to meet your business and IT needs can be tricky. Recognized industry analyst research can help by identifying important platform capabilities and identifying which vendors lead in those categories.

For example, Ventana Research’s “Data Platforms Value Index” is an assessment you can use to evaluate vendors and products. One capability the assessment evaluated is product manageability, which is how well the product can be managed technologically and by the business, and how well it can be governed, secured, licensed, and supported in a service level agreement (SLA).

The assessment also looked at the usability of the product—how well it meets the various business needs of executives, management, workers, analysts, IT, and others. “The importance of usability and the digital experience in software utilization has been increasing and is evident in our market research over the last decade,” the assessment notes. “The requirements to meet a broad set of roles and responsibilities across an organization’s cohorts and personas should be a priority for all vendors.”

The Actian Data Platform ranked second for manageability and third for usability, reflecting how the platform makes data easy to connect, manage, and analyze. These key capabilities are must-haves for data-driven companies.

Cut Prep Time While Boosting Data Quality

According to Ventana Research, 69% of organizations cite data prep as consuming the most time in analytics initiatives, followed by reviewing data for quality issues at 64%. This is consistent with what we hear from our customers.

These bottlenecks stem from data silos, data quality concerns, IT dependency, data latency, and not knowing the steps needed to optimize data to intelligently grow the business. Organizations must remove these barriers to go from data to decision with confidence and ease.

The Actian Data Platform’s native data integration capabilities can help. The platform allows you to easily unify data from different sources to gain a comprehensive and accurate understanding of all your data, allowing for better decision-making, analysis, and reporting. It supports any source and target data, offers elastic integration and cloud-automated scaling, and provides tools for data integration management in hybrid environments.

You benefit from codeless API and application integration, flexible design capabilities, integration templates, and the ability to customize and re-use integrations. Our integration also includes data profiling capabilities for reliable decision-making and a comprehensive library of pre-built connectors.

The platform is unique in its ability to collect, manage, and analyze data in real-time with its transactional database, data integration, data quality, and data warehouse capabilities. It manages data from any public cloud, multi- or hybrid cloud, and on-premises environments through a single pane of glass. In addition, the platform offers self-service data integration, which lowers costs and addresses multiple use cases, without needing multiple products.

As Ventana Research’s Matt Aslett noted in his analyst perspective, our platform reduces the number of tools and platforms needed to generate data insights. Streamlining tools is essential to making data easy and accessible to all users, at all skill levels. Aslett also says, “I recommend that all organizations that seek to deliver competitive advantage using data should evaluate Actian and explore the potential benefits of unified data platforms.”

At Actian, we agree. That’s why I encourage you to experience the Actian Data Platform for yourself or join us at upcoming industry events to connect with us in person.


Blog | Actian Life | 3 min read

The Trend Continues: Actian Once Again Named a Top Workplace


At Actian, we’re about enabling customers to trust their data. But, within our company, we also trust each other—our highly skilled, talented, and personable employees have confidence in each other and in our leadership team. That’s one of the reasons why Actian Careers stands out as a top choice for employment opportunities.

Our dedicated staff and employee-first approach to business make a significant difference in the services and technologies we provide to customers. They’re also why our employees recognize Actian for its culture, and why we just earned another award for being a Top Workplace.

Elevating the Employee Experience in the Virtual Workspace

Actian was recognized by Monster—a global leader in connecting people and jobs—with a 2024 Top Workplaces for Remote Work award. “These awards underscore the importance of listening to employees about where and when they can be their most productive and happiest selves,” explains Monster CEO Scott Gutz. “We know that this flexibility is essential to helping both employers and candidates find the right fit.”

The 2024 Top Workplaces for Remote Work award celebrates organizations with 150 or more employees that provide an exceptional remote working environment. The Top Workplaces employer recognition program has a 17-year history of researching, surveying, and celebrating people-first organizations nationally and across 65 regional markets.

The company Energage determines the awards through an employee survey. This means we received the award based on direct and honest employee feedback. Results of a confidential employee engagement survey were evaluated by comparing responses to research-based statements that predict high performance against industry benchmarks.

Proven History of Offering an Inclusive, Supportive, and Flexible Workplace

Actian offers a culture where people belong, are enabled to innovate, and can reach their full potential. It’s not just a place to work—it’s a place to thrive, belong, and make a difference.

Being honored with a Top Workplace award demonstrates that when we say we place employees first, we mean it and employees experience it every day. Some of the ways we engage and reward our staff include:

  • A Rewards and Recognition Program that showcases an individual’s work and contributions.
  • Professional development to empower employees to grow their skill set.
  • Seasonal events and regular gatherings—including some that are virtual.
  • A commitment to work-life flexibility.
  • Time off to volunteer and give back to communities.
  • Quarterly peer nominations to recognize colleagues for their work.

People feel welcome at Actian, which is why we’ve seen a pattern of being recognized for our workplace and culture. This includes receiving 10 Top Workplace awards for Culture Excellence in 2023, seven in 2022, and one each in 2021 and 2020.

These awards span Innovation, Work-Life Balance, Leadership, Cross-Team Collaboration, Meaningful Work, Employee Appreciation, and more. We’ve also been named a Top Workplace by other organizations based on employee feedback.

Join Us

It is the highest honor to have employees give us high marks for our workplace. If this sounds like an environment where you’d like to work, and you’re interested in bringing your talent to Actian, view our open career opportunities.


Blog | Data Intelligence | 5 min read

What is a Data Product Owner? Role, Skills and Responsibilities


In our previous article on Data Products, we discussed the definition, characteristics, and examples of data products, as well as the necessity of switching to a product-thinking mindset to truly transform your datasets into viable data products. Amid this shift toward a Data Mesh architecture, it is important to highlight a key part of data product management: data product ownership. Appointing the right people as stakeholders for your enterprise data products is crucial.

In this article, we go over the human side of data products – the role, responsibilities, and required skills of a Data Product Owner.

What are the Role and Skills of a Data Product Owner?

As the name suggests, Data Product Owners are the guarantors of the development and success of data products within an organization. They act as a bridge between data teams, stakeholders, and end-users, translating complex data concepts into actionable insights that drive value and innovation. To do so, Data Product Owners bring a distinctive set of technical skills: the ability to extract insights from data and identify patterns, familiarity with programming languages such as Python or R, and a strong foundation in data technologies such as data warehouses, databases, and data lakes.

In addition to technical skills, a Data Product Owner has strong business acumen: the ability to understand the business context, objectives, trends, and overall landscape, and to develop data strategies aligned with that context. They put data to work for decision-making by collecting and analyzing it rigorously.

Lastly, Data Product Owners have strong communication skills, with the ability to convey data insights to stakeholders across the company, from technical roles such as data scientists and developers to non-technical roles such as business users and analysts. They usually also bring experience in agile methodologies and the problem-solving skills needed to deliver successful data products on time.

What are a Data Product Owner’s Core Responsibilities?

The multifaceted nature of the role gives Data Product Owners a wide variety of responsibilities. In Data Mesh in Action, J. Majchrzak et al. list a Data Product Owner's tasks as:

  • Vision definition: They are responsible for determining the purpose of creating a data product, understanding its users, and capturing their expectations through the lens of product thinking.
  • Strategic planning of product development: They are in charge of creating a comprehensive roadmap for the data product’s development journey, as well as defining key performance indicators (KPIs).
  • Ensuring satisfaction requirements: Ensuring the data product meets all requirements is a critical responsibility. This includes providing a detailed metadata description and ensuring compliance with accepted standards and data governance rules (a minimal sketch of such a descriptor follows this list).
  • Backlog Management & Prioritization: The Data Product Owner makes tactical decisions regarding the management of the data product backlog. This involves prioritizing requirements, clarifying them, splitting stories, and approving implemented items.
  • Stakeholder Management: They must gather information to understand expectations and clarify any inconsistencies or conflicting requirements to ensure alignment.
  • Collaboration With Development Teams: Engaging with the data product development team is essential for clarifying requirements and making informed decisions on challenges affecting development and implementation.
  • Participation in Data Governance: The Data Product Owner actively contributes to the data governance team, influencing the introduction of rules within the organization and providing valuable feedback on the practical implementation of data governance rules.

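To make the metadata responsibility concrete, here is a minimal sketch of what a data product descriptor might look like, written in Python. All names and fields are illustrative, not a standard or any specific product's format; they simply mirror the kinds of attributes a Data Product Owner curates.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class DataProductDescriptor:
        """Illustrative metadata a Data Product Owner might maintain."""
        name: str                # e.g., "subscription-purchases"
        domain: str              # the business domain that owns the product
        owner: str               # the Data Product Owner
        version: str             # current version of the product
        location: str            # where consumers can access the data
        schema: Dict[str, str]   # column names mapped to data types
        quality_score: float     # 0.0 to 1.0, from automated quality checks
        provenance: List[str] = field(default_factory=list)       # upstream sources
        governance_tags: List[str] = field(default_factory=list)  # e.g., "GDPR"

        def is_compliant(self, required_tags: List[str]) -> bool:
            """Check the product against the organization's governance rules."""
            return all(tag in self.governance_tags for tag in required_tags)

    # Hypothetical usage
    product = DataProductDescriptor(
        name="subscription-purchases",
        domain="sales",
        owner="jane.doe",
        version="1.2.0",
        location="warehouse://sales/subscription_purchases",
        schema={"customer_id": "int", "amount": "decimal"},
        quality_score=0.97,
        governance_tags=["GDPR"],
    )
    print(product.is_compliant(["GDPR"]))  # True

A descriptor like this gives Data Consumers the context they need to trust the product, and gives the owner a concrete artifact to version and govern.
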
While the principle dictates one Data Product Owner for a specific data product, a single owner may oversee multiple products, especially if they are smaller or require less attention. The size and complexity of data products vary, leading to differences in the specific responsibilities shouldered by Data Product Owners.

What are the Differences Between a Data Product Owner and a Product Owner?

The relationship between a Product Owner and a Data Product Owner can vary based on specific characteristics and requirements. While in some instances these roles overlap, in others they distinctly diverge. In Data Mesh in Action, the authors distinguish three scenarios:

Case 1: The Dual Role

In this scenario, the Data Product Owner also serves as the Product Owner, and the development teams for the data product and the overall product are aligned. This configuration is most fitting when the data product extends from the source system and the complexity is manageable, not requiring separate development efforts.

An example would be a subscription purchase module providing data on purchases seamlessly integrated into the source system.

Case 2: Dual Ownership, Separate Teams

Here, the Data Product Owner holds a dual role as a Product Owner, but the teams responsible for the data product and the overall product development are distinct. This setup is applied when analytical data derived from the application is extensive, requiring a distinct backlog and a specialized team for execution.

An example would be a subscription purchase module offering analytical data supported by an ML model, enabling predictions of purchase behavior.

Case 3: Independent Entities

In this scenario, the roles of the Data Product Owner and Product Owner are distinct, and the teams responsible for the data product and the overall product development operate independently. This configuration is chosen when the data product is a complex solution demanding independent development efforts.

An example would be building a data mart supported by an ML model for predicting purchase behavior.

In essence, the interplay between the roles of Product Owner and Data Product Owner is contingent upon the intricacies of the data product and its relationship with the overarching system. Whether they converge or diverge, the configuration chosen aligns with the specific demands posed by the complexity and integration requirements of the data product in question.

Conclusion

In conclusion, as organizations increasingly adopt Data Product Management within a Data Mesh architecture, dedicated Data Product Owners become essential. Their capacity to connect technical intricacies with business goals, combined with a deep understanding of evolving data technologies, positions them as central figures in guiding the journey toward unleashing the full potential of enterprise Data Products.


Over the past decade, data catalogs have emerged as important pillars in the landscape of data-driven initiatives. However, many vendors on the market fall short of expectations, with lengthy timelines, complex and costly projects, bureaucratic Data Governance models, poor user adoption rates, and low value creation. This discrepancy extends beyond metadata management projects, reflecting a broader failure at the data management level.

The present situation reveals a disconnect between technical proficiency and business knowledge, a lack of collaboration between data producers and consumers, persistent data latency and quality issues, and unmet scalability across data sources and use cases. Despite substantial investments in both personnel and technology, companies find themselves grappling with a stark reality: the failure to adequately address business needs.

The good news, however, is that this predicament can be reversed by embracing an Enterprise Data Marketplace (EDM) and leveraging existing investments.

Introducing the Enterprise Data Marketplace

An EDM is not a cure-all but rather a transformative solution. It requires that companies reframe their approach to data, introducing a new entity: Data Products. A robust Data Mesh, as advocated by Zhamak Dehghani in her insightful blog post, becomes imperative, with the EDM serving as the experiential layer of the Data Mesh.

However, the landscape has evolved with a new breed of EDM – a Data Sharing Platform integrated with a robust federated Data Catalog:

EDM = Data Sharing Platform + Strong Data Catalog

This is precisely what the Actian Data Intelligence Platform accomplishes, and plans to enhance further. Here is our definition of an EDM:

An Enterprise Data Marketplace is an e-commerce-like solution, where Data Producers publish their Data Products, and Data Consumers explore, understand, and acquire these published Data Products.

The Marketplace operates atop a Data Catalog, facilitating the sharing and exchange of the most valuable Domain Data packaged as Data Products.

Why Complement Your Data Catalog With an Enterprise Data Marketplace?

We’ve compiled 5 compelling reasons to enhance your Data Catalog with an Enterprise Data Marketplace.

Reason #1: Streamline the Value Creation Process

By entrusting domains with the responsibility of creating Data Products, you unlock the wealth of knowledge possessed by business professionals and foster more seamless collaboration with Data Engineers, Data Scientists, and Infrastructure teams. Aligned with shared business objectives, these teams collectively adopt a Product Design Thinking mindset for the design, creation, and maintenance of valuable, ready-to-use Data Products.

Within this framework, teams autonomously organize themselves, streamlining ceremonies for the incremental delivery of Data Products, bringing fluidity to the creation process. As Data Products incorporate fresh metadata to guide Data Consumers on their usage, an EDM assumes a pivotal role in shaping and exploring metadata related to Data Products – essentially serving as the Experience Plane within the Data Mesh framework.

By adhering to domain-specific nuances, there is a notable reduction in both the volume and type of metadata, alongside a more efficient curation process. In such instances, a robust EDM, anchored by a potent Data Catalog like the Actian Data Intelligence Platform, emerges as the core engine. This EDM not only facilitates the design of domain-specific ontologies but also boasts automated harvesting capabilities from both on-premises and cloud-based data sources. Moreover, it empowers the federation of Data Catalogs to implement diverse Data Mesh topologies and grants end-users an effortlessly intuitive eCommerce-like Data Shopping experience.

Reason #2: Rationalize Existing Investments

By utilizing an EDM (alongside a powerful Data Catalog), existing investments in modern data platforms and people can be significantly enhanced. Eliminating intricate data pipelines (in many cases, data does not need to be moved at all) results in substantial cost savings. Similarly, cutting down on the complex, numerous, and often unnecessary synchronization meetings with cross-functional teams leads to considerable time savings.

The federated governance body can therefore maintain a focused approach, concentrating solely on Data Mesh-related activities. This targeted strategy optimizes resource allocation and accelerates the creation of incremental, delegated Data Products, reducing the time to value.

To ensure measurable outcomes, closely monitoring the performance of Data Products with accurate KPIs becomes paramount. This proactive measure enhances decision-making and contributes to the delivery of tangible results.

Reason #3: Achieve Better Adoption Than With a Data Catalog Only

An EDM, coupled with a powerful Data Catalog, plays a pivotal role in facilitating adoption. At the domain level, it aids in designing and curating domain-specific metadata easily understood by Domain Business Users. This avoids the need for a “common layer”, a typical pitfall in Data Catalog adoption. At the Mesh Level, it offers means to consume Data Products effectively, providing information on location, version, quality, state, provenance, platform, schema, etc. A dynamic domain-specific metamodel, coupled with strong search and discovery capabilities, makes the EDM a game-changer.

The EDM's added value lies in provisioning and access rights: it integrates with ticketing systems, dedicated data policy enforcement platforms, and features from modern data platform vendors, a concept termed Computational Data Governance.
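
As a rough illustration of Computational Data Governance, here is a minimal Python sketch in which access to a data product is granted programmatically from policies attached to it. The policy names and structures are hypothetical.

    # Map each policy tag to the teams it allows; illustrative values only.
    POLICIES = {"pii": {"marketing"}, "finance-sensitive": {"finance"}}

    def can_access(product_tags: list, consumer_team: str) -> bool:
        """Grant access only if the consumer's team is permitted by every
        policy attached to the data product."""
        return all(consumer_team in POLICIES.get(tag, set())
                   for tag in product_tags)

    print(can_access(["pii"], "marketing"))                       # True
    print(can_access(["pii", "finance-sensitive"], "marketing"))  # False

The point is that the rule itself is executable: access decisions follow declared policies rather than waiting on manual review.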

Reason #4: Clarify Accountability and Monitor Value Creation Performance

Applying Product Management principles to Data Products and assigning ownership to domains brings clarity to responsibilities. Each domain becomes accountable for the design, production, and life cycle management of its Data Products. This focused approach ensures that roles and expectations are well-defined.

The EDM then opens up Data Products to the entire organization, setting standards that domains must adhere to. This exposure helps maintain consistency and ensures that Data Products align with organizational goals and quality benchmarks.

In the EDM framework, companies establish tangible KPIs to monitor the business performance of Data Products. This proactive approach enables organizations to assess the effectiveness of their data strategies. Additionally, it empowers Data Consumers to contribute to the evaluation process through crowd-sourced ratings, fostering a collaborative and inclusive environment for feedback and improvement.
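
For illustration, here is a minimal Python sketch of the kind of KPI computation this implies, combining usage events with crowd-sourced ratings. The event format, product names, and numbers are hypothetical.

    from statistics import mean

    # Hypothetical inputs; in practice these would come from the
    # marketplace's usage logs and its rating feature.
    usage_events = [
        {"product": "subscription-purchases", "consumer": "marketing"},
        {"product": "subscription-purchases", "consumer": "finance"},
        {"product": "churn-predictions", "consumer": "marketing"},
    ]
    ratings = {"subscription-purchases": [5, 4, 4], "churn-predictions": [3]}

    def product_kpis(product: str) -> dict:
        """Two simple KPIs for one data product: distinct consumers
        and average crowd-sourced rating."""
        consumers = {e["consumer"] for e in usage_events
                     if e["product"] == product}
        return {
            "distinct_consumers": len(consumers),
            "average_rating": round(mean(ratings.get(product, [0])), 2),
        }

    print(product_kpis("subscription-purchases"))
    # {'distinct_consumers': 2, 'average_rating': 4.33}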

Reason #5: Apply Proven Lean Software Development Principles to Data Strategy

The creation of Data Products follows a paradigm similar to the Lean Software Development principles that revolutionized digital transformation. Embracing principles like eliminating waste, amplifying learning, deciding late, delivering fast, and building quality in is integral to the approach that a Data Mesh can enable.

In this context, the EDM acts as a collaborative platform for teams engaged in the creation of Data Products. It facilitates:

  • Discovery Features: Offering automatic technical curation of data types, lineage information, and schemas, enabling the swift creation of ad hoc products.
  • Data Mesh-Specific Metadata Curation: The EDM incorporates automatic metadata curation capabilities specifically tailored for Data Mesh, under the condition that the Data Catalog has federation capabilities.
  • 360 Coverage of Data Products Information: Ensuring comprehensive coverage of information related to Data Products, encompassing their design and delivery aspects.

In essence, the collaboration between an Enterprise Data Marketplace and a powerful Data Catalog not only enhances the overall data ecosystem but also brings about tangible benefits by optimizing investments, reducing unnecessary complexities, and improving the efficiency of the data value creation process.


In my previous blog on digital transformation, I wrote about the benefits of migrating mission-critical databases to the cloud. This time, I'm focusing on database modernization in applications. Application modernization can involve modernizing an application's code, features, architecture, and/or infrastructure. It's a growing priority: the 2023 Gartner CIO and Technology Executive Survey places it among the top four technology areas for spending, with 46% of organizations increasing their spending on application modernization. Further, Foundry, an IDG company, reports that 87% of its survey respondents cite modernizing critical applications as a key success driver.

7 Benefits of Database Application Modernization

Why all the recent interest in transitioning to modern applications? Application modernization and database modernization are closely intertwined processes that work together to enhance the overall agility, efficiency, performance, security, innovation, and capabilities of an organization’s business. Here’s how application modernization complements database modernization:

Accelerated Time to Market

Monolithic legacy applications are time-consuming to update. Modernized applications with a loosely coupled architecture can enable faster development cycles, reducing the time it takes to bring new features or products to market. Agile development methodologies often accompany application modernization, enabling incremental and iterative development so that teams can respond rapidly to changing business requirements.

Cloud-Enabled Opportunities

Moving applications to the cloud as part of an application modernization initiative provides an extensive list of advantages over on-premises deployments, including elasticity, scalability, accessibility, business continuity, environmental sustainability, and more.

Optimized User Experience

Modernizing applications offers many ways to increase user satisfaction and productivity, including more intuitive interfaces, personalization, improved response times, and better accessibility. Multi-channel support, such as mobile and web, and cross-platform compatibility extend reach, while advanced search and navigation, rich media incorporation, and third-party integrations add value for users.

Stronger Security and Compliance

Legacy applications built on outdated technologies may lack security features and defenses against contemporary threats, and may not meet regulatory compliance requirements. Modernizing applications allows for the implementation of the latest security measures and compliance standards, reducing the likelihood of security breaches and non-compliance.

Staff Productivity

Legacy systems can be difficult to maintain and may require significant technical resources for updates and support. Modern applications can improve staff efficiency, reduce maintenance expenses, and lead to better utilization of resources for strategic initiatives that deliver greater value to the business.

Easier Integration

Application modernization supports integration with technologies and architectural best practices that enhance interoperability, flexibility, and efficiency. Using technologies such as microservices, APIs, containers, standardized protocols, and/or cloud services, it’s easier to integrate modernized applications within complex IT environments.
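
As a sketch of this pattern, the following Python example uses Flask to wrap a legacy data operation in a small REST API. The endpoint, function, and data are hypothetical stand-ins rather than part of any specific product.

    from flask import Flask, jsonify

    app = Flask(__name__)

    def fetch_orders(customer_id: int) -> list:
        # Hypothetical stand-in for a legacy database query; a real service
        # would call the modernized database through its driver or service layer.
        return [{"order_id": 1001, "customer_id": customer_id, "total": 49.90}]

    @app.route("/api/v1/customers/<int:customer_id>/orders")
    def list_orders(customer_id: int):
        # A thin API layer lets other systems consume legacy data over a
        # standardized protocol instead of integrating point to point.
        return jsonify(fetch_orders(customer_id))

    if __name__ == "__main__":
        app.run(port=8080)

Once an interface like this exists, containers and API gateways can manage it like any other modern service.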

Support for Innovation

Legacy applications often make it difficult to incorporate newer technologies, hindering innovation. Modernizing applications gives organizations the ability to leverage emerging technologies, such as machine learning and the Internet of Things (IoT), for competitive advantage.

Database Application Modernization With Ingres NeXt

In summary, database application modernization is a strategic digital transformation initiative that can help organizations stay ahead in the digital age. However, application modernization can be expensive and risky without the right approach.

Ingres NeXt is designed to protect existing database application investments in OpenROAD while leveraging them in new ways to add value to your business, without costly and lengthy rewrites. Flexible options to modernize your OpenROAD applications include:

  • ABF and Forms-Based Applications – Modernize ABF applications to OpenROAD frames using the abf2or migration utility and extend converted applications to mobile and web applications.
  • OpenROAD and Workbench IDE – Migrate partitioned ABF applications to OpenROAD frames.
  • OpenROAD Server – Deploy applications securely in the OpenROAD Server to retain and use application business logic.

In addition, the Ingres NeXt Readiness Assessment offers a pre-defined set of professional services that can lower your risk of application modernization and increase your confidence in a successful cloud journey. The service is designed to help you understand the requirements for modernizing Ingres and ABF or OpenROAD applications, and to provide recommendations that inform your modernization strategy formulation, planning, and implementation.