Data Management

Getting Started With Actian Zen and BtrievePython

Johnson Varughese

July 1, 2024

Actian Zen and BtrievePython

Welcome to the world of Actian Zen, a versatile and powerful edge data management solution designed to help you build low-latency embedded apps. This is Part 1 of the quickstart blog series that focuses on helping embedded app developers get started with Actian Zen. In this blog, we’ll explore how to leverage BtrievePython to run Btrieve2 Python applications, using the Zen 16.0 Enterprise/Server Database Engine.

But before we dive in, let’s do a quick introduction.

What is Btrieve?

The Actian Zen Btrieve interface is a high-performance, low-level, record-oriented database management system (DBMS) developed by Pervasive Software, now part of Actian Corporation. It provides efficient and reliable data storage and retrieval by focusing on record-level operations rather than complex queries. Btrieve is known for its speed, flexibility, and robustness, making it a popular choice for applications that require high-speed data access and transaction processing.

What is BtrievePython?

BtrievePython is a modern Python interface for interacting with Actian Zen databases. It allows developers to leverage the powerful features of Btrieve within Python applications, providing an easy-to-use and efficient way to manage Btrieve records. By integrating Btrieve with Python, BtrievePython enables developers to build high-performance, data-driven applications using Python’s extensive ecosystem and Btrieve’s reliable data-handling capabilities.

This comprehensive guide will walk you through the setup on both Windows Server 2019 and Ubuntu 20.04, ensuring you have all the tools you need for success.

Getting Started With Actian Zen

Actian Zen offers a range of data access solutions compatible with various operating systems, including Android, iOS, Linux, Raspbian, and Windows (including IoT and Nano Server). For this demonstration, we’ll focus on Windows Server 2019, though the process is similar across different platforms.

Before we dive into the setup, ensure you’ve downloaded and installed the Zen 16.0 Enterprise/Server Database Engine for Windows or Linux on Ubuntu. Detailed installation instructions can be found on Actian’s Academy channel.

Setting Up Your Environment

Installing Python and BtrievePython on Windows:

      • Download and Install Python: Visit Python’s official website and download the latest version (we’re using Python v3.12).
      • Open Command Prompt as Administrator: Ensure you have admin rights to proceed with the installation.
      • Install BtrievePython: Execute pip install btrievePython. Note that this step requires an installed Zen 16.0 client or engine. If the BtrievePython installation fails, ensure you have Microsoft Visual C++ 14.0 or greater by downloading the Visual C++ Build Tools.
      • Verify Installation: Run pip list to check that btrievePython is listed. For an extra sanity check, try the short import test shown below.
      • Run a Btrieve2 Python Sample: Download the sample program from the Actian documentation and run it using python btr2sample.py 9 from an admin command prompt.
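
If pip list looks right but you want one more check before running the full sample, a two-line import test confirms that the package and the underlying Zen client libraries load correctly. This is only a minimal sketch; btr2sample.py from the Actian documentation remains the authoritative test.

    # Quick sanity check for the BtrievePython installation.
    # Run from the same admin command prompt used for pip install.
    import btrievePython as btrv

    print("btrievePython loaded from:", btrv.__file__)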

Installing Python and BtrievePython on Linux (Ubuntu):

      • Install pip: Use sudo apt install python3-pip to get pip, the Python package installer.
      • Set the PATH: Open a terminal window as a non-root user and run export PATH=$PATH:/usr/local/actianzen/bin
      • Install BtrievePython: Execute sudo pip install btrievePython, ensuring a Zen 16.0 client or engine is present.
      • Verify Installation: Run pip show btrievePython to confirm the installation.
      • Run a Btrieve2 Python Sample: After downloading the sample from the Actian documentation, run it with python3 btr2sample.py 9. A pared-down sketch of what the sample does follows below.
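
For orientation before you run it, here is a pared-down sketch of the kind of work btr2sample.py performs: create a Btrieve file, insert a few records, and look one up by key (the 9 passed on the command line). The class and method names follow the Btrieve 2 API as exposed by the btrievePython module, but treat the exact constructor arguments and signatures as assumptions and defer to btr2sample.py and the Btrieve 2 API documentation for the authoritative usage.

    # Pared-down sketch of a Btrieve2 Python program. Names and signatures are
    # assumptions based on the Btrieve 2 API; see btr2sample.py for the real thing.
    import math
    import struct
    import btrievePython as btrv

    FILE_NAME = "squares.btr"
    RECORD_FORMAT = "<Iqd"                 # key, square, square root (20 bytes)
    RECORD_LENGTH = struct.calcsize(RECORD_FORMAT)
    KEY_FORMAT = "<I"

    # Connect to the local Zen engine or client.
    client = btrv.BtrieveClient(0x4232, 0)

    # Describe the file: fixed-length records with one unsigned-binary key segment.
    file_attrs = btrv.BtrieveFileAttributes()
    file_attrs.SetFixedRecordLength(RECORD_LENGTH)
    key_segment = btrv.BtrieveKeySegment()
    key_segment.SetField(0, 4, btrv.Btrieve.DATA_TYPE_UNSIGNED_BINARY)
    index_attrs = btrv.BtrieveIndexAttributes()
    index_attrs.AddKeySegment(key_segment)

    # Create and open the file, then insert records for 1..10.
    client.FileCreate(file_attrs, index_attrs, FILE_NAME, btrv.Btrieve.CREATE_MODE_OVERWRITE)
    bfile = btrv.BtrieveFile()
    client.FileOpen(bfile, FILE_NAME, None, btrv.Btrieve.OPEN_MODE_NORMAL)
    for i in range(1, 11):
        bfile.RecordCreate(struct.pack(RECORD_FORMAT, i, i * i, math.sqrt(i)))

    # Retrieve the record whose key equals 9 and unpack it.
    key = struct.pack(KEY_FORMAT, 9)
    record = bfile.RecordRetrieve(btrv.Btrieve.COMPARISON_EQUAL,
                                  btrv.Btrieve.INDEX_1, key, RECORD_LENGTH)
    print(struct.unpack(RECORD_FORMAT, record))

    client.FileClose(bfile)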

Visual Guide

The setup process includes several steps that are best followed with visual aids. Here are some key screenshots to help guide you through the setup:

For the Windows Setup:

Downloading and setting up Python from the Python download site.

Command Prompt Operations: steps to install BtrievePython.

Code Snippet: the Btrieve2 Python sample program.

Verification and Execution: verifying the installation and running the Btrieve2 sample application.

For the Linux Setup:

Installation Commands: installing python3-pip.

BtrievePython Setup: setting the PATH as a non-root user and installing BtrievePython.

BtrievePython Installed: confirming the installation with pip show btrievePython.

Sample Execution: running the Btrieve2 sample app.

Conclusion

This guide has provided a thorough walkthrough on using BtrievePython with Actian Zen to run Btrieve2 Python applications. Whether you’re working on Windows or Linux, these steps will help you set up your environment efficiently and get your applications running smoothly. Actian Zen’s compatibility with multiple platforms ensures that you can manage your data seamlessly, regardless of your operating system.

For further details and visual guides, refer to the Actian Academy and the comprehensive documentation. Happy coding!

About Johnson Varughese

Johnson Varughese manages Support Engineering at Actian, assisting developers leveraging ZEN interfaces (Btrieve, ODBC, JDBC, ADO.NET, etc.). He provides technical guidance and troubleshooting expertise to ensure robust application performance across different programming environments. Johnson's wealth of knowledge in data access interfaces has streamlined numerous development projects. His Actian blog entries detail best practices for integrating Btrieve and other interfaces. Explore his articles to optimize your database-driven applications.
Data Management

Edge Computing With Actian Zen: Paving the Way for the Future

Kasey Nolan

June 26, 2024

Edge Computing with Actian Zen: Paving the Way for a Sustainable Future

Consider your morning commute–taking your kids to school, your morning coffee run, hurrying to the office–how much time do you spend in the car? And how much money are you spending filling up your tank? Or maybe you’re like me and desperately trying not to think about your carbon footprint every time you drive 20 minutes (each way!) to the grocery store.

Now imagine how it would be if your office was just a block or two down the street, daycare right next door, and your grocery store in-between. Imagine the time savings, cost savings, and reduction in your personal carbon emissions if you could do everything you need, but without having to travel as far. If you could snap your fingers to make it happen, would you?

That’s the question being asked across the world of data processing. Businesses are increasingly seeking efficient and sustainable ways to manage and process their data across the world. End-users are less patient and the sheer volume of data being transferred from one endpoint to the next has massive implications for energy consumption and overall latency.

One solution to this is edge computing, which is the data processing equivalent of reducing your commute from an hour to two minutes. Not only does edge computing use fewer resources and energy, but it’s faster and more efficient, making it a greener choice for managing data.

Understanding Edge Computing

Before delving into the sustainability benefits, it’s crucial to understand what edge computing is. Edge computing is a distributed computing framework where data is processed closer to where it is generated, rather than relying on a centralized data center or the cloud. If you’ve ever used the Target app for shopping, you may have noticed it gives a little warning for items that are low in stock: “Only 2 left at your store!” Retailers like Target use edge-enabled sensors to track products on shelves in real time, automating inventory management for a more reliable picture of what’s available locally.

If not for IoT sensors and edge computing, you likely wouldn’t get a real-time view of inventory – data would be collected via barcode scans, transferred to a centralized data center that could be states away, batch processed, and then synchronized with inventory systems. This could take minutes, hours, or even days depending on the company, and this process is rife with problems like latency, network reliability, bandwidth constraints, and high infrastructure costs. Not to mention that a central server represents a single point of failure – meaning if one store is down, they all are. Not a great experience for shoppers, and a great case for moving to the edge.

The Sustainability Edge

Because it’s 2024, sustainability and environmental, social, and governance (ESG) initiatives are paramount. For example, 90% of S&P 500 companies release ESG reports, and ESG initiatives are considered by 89% of investors when making investment decisions. Sticking with those high numbers, 89% of executives plan to increase their overall technology budget, and 28% say that at least one-fifth of their workforce is involved in emerging tech as part of their primary job function. That’s a huge amount of people who are actively considering both sustainability and emerging technologies in their day-to-day work, in their projections, and in their strategic initiatives.

Edge computing marries these two initiatives beautifully. For instance, 60% of companies are using edge to some degree today, and half of those have deeply integrated edge into their digital core. In fact, Forbes predicts a mass migration from the cloud to the edge in 2024. The sustainability advantages perfectly complement the cost savings and consumer benefits of edge computing as opposed to the traditional cloud.

Here are three primary ways edge computing supports ESG:

  1. Reduced Energy Consumption: Traditional data centers and cloud computing require substantial energy to power and cool the vast arrays of servers. This energy consumption not only translates into high operational costs but also contributes significantly to carbon emissions. Edge computing, on the other hand, decentralizes data processing, distributing it across multiple edge devices that are often located closer to the data source. This decentralization reduces the load on central data centers, leading to lower overall energy consumption.
  2. Optimized Bandwidth Usage: Transmitting large volumes of data to and from centralized data centers or the cloud can be bandwidth-intensive. This not only increases operational costs but also places a strain on network infrastructure. By processing data at the edge, organizations can significantly reduce the amount of data that needs to be transmitted over the network. This not only optimizes bandwidth usage but also reduces the associated energy consumption and emissions.
  3. Decreased Latency and Improved Efficiency: One of the inherent advantages of edge computing is the reduction in latency. By processing data closer to the source, edge computing eliminates the delays associated with transmitting data to distant data centers. This not only enhances the speed and responsiveness of applications but also improves overall system efficiency.

Actian Zen: A Sustainable Edge Solution

Edge computing doesn’t exist in a vacuum, and it takes the right toolkit to take advantage of all the benefits. You need to be sure you have the right database and a database management system (DBMS) that’s edge compatible.

Enter Actian Zen, a high-performance, embedded, and zero-administration DBMS designed for edge computing, IoT applications, and mobile environments. Known for its small footprint, low resource consumption, and ability to operate efficiently on a wide range of devices, Actian Zen provides a versatile and powerful DBMS that meets the needs of modern business across various industries.

Three main benefits Zen delivers include:

  1. Optimizing IT and Cloud Expenditures: Actian Zen is designed to operate efficiently on a wide range of devices, from IoT sensors to industrial gateways. Its compact size means it can be deployed on low-power devices, reducing the need for energy-intensive hardware. Additionally, by processing data locally at the edge, Actian Zen significantly reduces the need for extensive data transmission to central servers or cloud environments. This local processing minimizes bandwidth usage and decreases the load on centralized data centers, leading to lower operational costs associated with data storage and cloud services. Furthermore, the reduced reliance on large, energy-intensive data centers aligns with sustainability goals by lowering overall energy consumption and carbon emissions.
  2. Ensuring Compliance With Internal Policies and External Regulations: By enabling data processing at the edge, Actian Zen reduces the need for data transmission to centralized servers, thus saving bandwidth and energy. This local processing aligns with sustainability initiatives aimed at reducing energy consumption and emissions. Actian Zen also features role-based access, which allows for granular control over who can access and manipulate data, aligning with internal security policies and regulatory standards.
  3. Enabling Scalability and Flexibility to Accommodate Future Growth: With Actian Zen, developers can scale from a core set of libraries capable of single-user client data management to a full-fledged, enterprise-grade server. It’s capable of supporting thousands of users on multicore, VM cloud environments, or in Docker containers with Kubernetes orchestration and Helm chart deployment configuration.

Zen: The Sustainable Database Solution

As the demand for sustainable computing solutions grows, edge computing with Actian Zen emerges as a game-changer. By reducing energy consumption, optimizing bandwidth usage, and decreasing latency, Actian Zen not only enhances operational efficiency but also contributes to a greener future. If you’re looking to balance performance with sustainability, you’ll find Actian Zen’s edge computing capabilities to be a compelling choice. Embrace the power of edge computing with Actian Zen and take a step toward a more sustainable, efficient, and environmentally friendly future.

About Kasey Nolan

Kasey Nolan is Solutions Product Marketing Manager at Actian, aligning sales and marketing in IaaS and edge compute technologies. With a decade of experience bridging cloud services and enterprise needs, Kasey drives messaging around core use cases and solutions. She has authored solution briefs and contributed to events focused on cloud transformation. Her Actian blog posts explore how to map customer challenges to product offerings, highlighting real-world deployments. Read her articles for guidance on matching technology to business goals.
Data Architecture

Is the On-Premises Data Warehouse Dead?

Actian Corporation

June 26, 2024

Is the On-Premises Data Warehouse Dead?

As organizations across all industries grapple with ever-increasing amounts of data, the traditional on-premises data warehouse is facing intense scrutiny. Data and IT professionals, analysts, and business decision-makers are questioning its viability in our modern data landscape where agility, scalability, and real-time insights are increasingly important.

Data warehouse stakeholders are asking:

  • How do on-prem costs compare to a cloud-based data warehouse?
  • Can our on-premises warehouse meet data growth and business demands?
  • Do we have the flexibility to efficiently integrate new data sources and analytics tools?
  • What are the ongoing maintenance and management needs for our on-prem warehouse?
  • Are we able to meet current and future security and compliance requirements?
  • Can we integrate, access, and store data with a favorable price performance?

Addressing these questions enables more informed decision-making about the practicality of the on-premises data warehouse and whether a migration to a cloud-based warehouse would be beneficial. As companies like yours also look to answer the question of whether the on-premises data warehouse is truly a solution of the past, it’s worth looking at various warehouse offerings. Is one model really better for transforming data management and meeting current business and IT needs for business intelligence and analytics?

Challenges of Traditional On-Premises Data Warehouses

Data warehouses that serve as a centralized data repository on-premises, within your physical environment, have long been the cornerstone of enterprise data management. These systems store vast amounts of data, enabling you to integrate and analyze data to extract valuable insights.

Many organizations continue to use these data warehouses to store, query, and analyze their data. This allows them to get a return on their current on-prem warehouse investment, meet security and compliance requirements, and perform advanced analytics. However, the downside is that these warehouses increasingly struggle to meet the demands of modern business environments that need to manage more data from more sources than ever before, while making the data accessible and usable to analysts and business users at all skill levels.

These are critical challenges faced by on-premises data warehouses:

  • Scalability Issues. A primary drawback of on-premises data warehouses is their limited scalability—at least in a fast and efficient manner. Growing data volumes and increased workloads require you to invest in additional hardware and infrastructure to keep pace. This entails significant costs and also requires substantial time. The rigidity of on-premises systems makes it difficult to quickly scale resources based on fluctuating needs such as seasonal trends, marketing campaigns, or a business acquisition that brings in large volumes of new data.
  • Limited Flexibility. As new data sources emerge, you need the ability to quickly build data pipelines and integrate the information. On-premises data warehouses often lack the flexibility to efficiently handle data from emerging sources—integrating new data sources is typically a cumbersome, time-consuming process, leading to delays in data analytics and business insights.
  • High Operational Costs. Maintaining an on-premises data warehouse can involve considerable operational expenses. That means you must allocate a budget for hardware, software licenses, electricity, and cooling the data warehouse environment in addition to providing the physical space. You must also factor in the cost of skilled IT staff to manage the warehouse and troubleshoot problems.
  • Performance Restrictions. You can certainly have high performance on-premises, yet as data volumes surge, on-prem data warehouses can experience performance bottlenecks. This results in slower query processing times and delayed insights, restricting your ability to make timely decisions and potentially impacting your competitive edge in the market.

These are some of the reasons why cloud migrations are popular—they don’t face these same issues. According to Gartner, worldwide end-user spending on public cloud services is forecast to grow 20.4% to $675.4 billion in 2024, up from $561 billion in 2023, and reach $1 trillion before the end of this decade.

Yet it’s worth noting that on-prem warehouses continue to meet the needs of many modern businesses. They effectively store and query data while offering customization options tailored to specific business needs.

On-Prem is Not Even on Life Support

Despite the drawbacks of on-premises data warehouses, they are alive and doing fine. And despite some analysts predicting their demise for the past decade or so, reality and practicality tell a different story.

Granted, while many organizations have mandates to be cloud-first and have moved workloads to the cloud, the on-prem warehouse continues to deliver the data and analytics capabilities needed to meet the requirements of today’s businesses, especially those with stable workloads. In fact, you can modernize in place, or on-prem, with the right data platform or database.

You also don’t have to take an either-or approach to on-premises data warehouses vs. the cloud. You can have them both with a hybrid data warehouse that offers a modern data architecture combining the benefits of on-premises with cloud-based data warehousing. This model lets you optimize both environments for data storage, processing, and analytics to ensure the best performance, cost, security, and flexibility.

Data Warehouse Options Cut Across Specific Needs

It’s important to remember that your organization’s data needs and strategy can be uniquely different from your peers and from businesses in other industries. For example, you may be heavily invested in your on-prem data warehouse and related tools, and therefore don’t want to move away from these technologies.

Likewise, you may have a preference to keep certain workloads on-prem for security or low latency reasons. At the same time, you may want to take advantage of cloud benefits. A modern warehouse lets you pick your option—solely on-premises, completely in the cloud, or a hybrid that effectively leverages on-prem and cloud.

One reason to take a hybrid approach is that it helps to future-proof your organization. Even if your current strategy calls for being 100% on-premises, you may want to keep your options open to migrate to the cloud later, if or when you’re ready. For instance, you may want a data backup and recovery option that’s cloud based, which is a common use case for the cloud.

Is On-Prem Right For You?

On-premises data warehouses are alive and thriving, even if they don’t receive as much press as their cloud counterparts. For many organizations, especially those with stringent regulatory requirements, the on-prem warehouse continues to play an essential role in data and analytics. It allows predictable cost management along with the ability to customize hardware and software configurations to fit specific business demands.

If you’re curious about the best option for your business, Actian can help. Our experts will look at your current environment along with your data needs and business priorities to recommend the most optimal solution for you.

We offer a modern product portfolio, including data warehouse solutions, spanning on-prem, the cloud, and hybrid to help you implement the technology that best suits your needs, goals, and current investments. We’re always here to help to ensure you can trust your data and your buying choices.

About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Data Platform

Buyers Guide for Data Platforms 2024

Actian Corporation

June 26, 2024

Data Platforms Buyers Guide

The process of choosing the right technology for your specific business and IT needs can be complex, yet making the right decision is critical. So, how do you make an informed choice?

The product landscape changes fast, meaning the products you looked at even a few months ago may have changed significantly. And let’s face it – proofs of concept (POCs) are limited deployments with vendors showcasing their solutions for a brief period of time. You don’t want to find out later, after you’ve invested significant time and money, that a product won’t handle your specific workloads, or give you the security, scalability, and price-performance you need.

You need to know upfront how a product performs, from both a customer experience and a product experience perspective, in essential categories such as performance, reliability, manageability, and validation. Likewise, you want to know that the product has a strong roadmap for your future and that peer use cases are available.

The Need for Unbiased Assessments

Independent analyst reports and buying guides can help you make informed decisions. They offer unbiased, critical insights into the advantages and drawbacks of vendors’ products. The information cuts through marketing claims to help you understand how technologies, such as data platforms, truly perform to help you choose a solution with confidence.

These reports are typically based on thorough research and analysis, considering various factors such as product capabilities, customer satisfaction, and market performance. This objectivity can help you avoid the pitfalls of biased or incomplete information.

For example, the 2024 Ventana Research Buyers Guide for Data Platforms evaluated 25 data platform software providers, detailing their strengths and weaknesses. This broad perspective enables you to understand the competitive landscape and identify potential technology partners that align with your strategic goals.

The Buyers Guide is meticulously curated and structured into seven in-depth categories across Product and Customer Experience. A vendor’s overall placement is assessed through a weighted score and is only awarded to companies that meet a strict set of criteria, with the aim to streamline and aid vendor selection.

Ventana’s Market View on Data Platforms

Modern data platforms allow businesses to stay competitive and innovative in a data-driven world. They manage the storage, integration, and analysis of data, ensuring a single source of truth.

Data platforms should empower all users, especially non-technical users, with actionable insights. As Ventana Research stated in its 2024 Buyers Guide for Data Platforms, “Data platforms provide an environment for organizing and managing the storage, processing, analysis, and presentation of data across an enterprise. Without data platforms, enterprises would be reliant on a combination of paper records, time-consuming manual processes, and huge libraries of physical files to record, process and store business information.”

Today’s data platforms are typically designed to be scalable and flexible, accommodating the growing and evolving data needs of your business. They support a variety of data from new and emerging sources. This versatility ensures that you can continue to leverage your data as you expand and innovate.

Ventana’s Criteria for Choosing Data Platforms

Ventana notes that buying decisions should be based on research. “We believe it is important to take a comprehensive, research-based approach, since making the wrong choice of data platforms technology can raise the total cost of ownership, lower the return on investment and hamper an enterprise’s ability to reach its full performance potential,” according to Ventana.

Three key evaluation criteria from the 2024 Ventana Buyers Guide for Data Platforms are:

  1. Assess Your Primary Workload Needs and Future-Proof Them for GenAI. Determine whether your primary focus is on operational or analytic workloads, or both. Operational workloads include finance, supply chain, and marketing applications, whereas analytical workloads include business intelligence (BI) and data science. Ventana predicts that by 2027, personalized experiences driven by GenAI will increase the demand for data platforms capable of supporting hybrid operational and analytical processing.
  2. Evaluate Your Main Data Storage and Management Criteria. Determine the capabilities you need, then evaluate data platforms that align with those requirements. Criteria often include the core database management system, performance and query functionality, the ability to integrate data and ensure quality, whether the platform offers simple usability and manageability, and whether it meets cost, price-performance, and return on investment requirements.
  3. Consider Support for Data Workers in Multiple Roles. Consider the types of data you need to manage along with the key functionalities required by your users, from database administrators to data engineers to data scientists. According to Ventana, data platforms must support a range of users with different needs – across technology and business teams.

Have Confidence in Your Data Platform

In the rapidly evolving tech landscape, making informed choices is more important than ever. Analyst reports are invaluable resources that provide objective, comprehensive insights to guide those decisions.

Actian is providing complimentary access to the 2024 Ventana Research Data Platforms Buyers Guide. Read the report to learn more about what Ventana has to say about Actian and our positioning as Exemplary.

If you’re in the market for a single, unified data platform that’s recognized by an analyst firm as handling both operational and analytic workloads, let’s talk so you can have confidence in your buying decision.

About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Databases

The Rise of Embedded Databases in the Age of IoT

Kunal Shah

June 24, 2024

The Rise of Embedded Databases in the Age of IoT

The Internet of Things (IoT) is rapidly transforming our world. From smart homes and wearables to industrial automation and connected vehicles, billions of devices are now collecting and generating data. According to a recent analysis, the number of IoT devices worldwide is forecast to almost double, from 15.1 billion in 2020 to more than 29 billion in 2030. This data deluge presents both challenges and opportunities, and at the heart of it all lies the need for efficient data storage and management – a role increasingly filled by embedded databases.

Traditional Databases vs. Embedded Databases

Traditional databases, designed for large-scale enterprise applications, often struggle in the resource-constrained environment of the IoT. They require significant processing power, memory, and storage, which are luxuries most IoT devices simply don’t have. Additionally, traditional databases are complex to manage and secure, making them unsuitable for the often-unattended nature of IoT deployments.

Embedded databases, on the other hand, are specifically designed for devices with limited resources. They are lightweight, have a small footprint, and require minimal processing power. They are also optimized for real-time data processing, crucial for many IoT applications where decisions need to be made at the edge, without relaying data to a cloud database.

Why Embedded Databases are Perfect for IoT and Edge Computing

Several key factors make embedded databases the ideal choice for IoT and edge computing:

  • Small Footprint: Embedded databases require minimal storage and memory, making them ideal for devices with limited resources. This allows for smaller form factors and lower costs for IoT devices.
  • Low Power Consumption: Embedded databases are designed to be energy-efficient, minimizing the power drain on battery-powered devices, a critical concern for many IoT applications.
  • Fast Performance: Real-time data processing is essential for many IoT applications. Embedded databases are optimized for speed, ensuring timely data storage, retrieval, and analysis at the edge.
  • Reliability and Durability: IoT devices often operate in harsh environments. Embedded databases are designed to be reliable and durable, ensuring data integrity even in case of power failures or device malfunctions.
  • Security: Security is paramount in the IoT landscape. Embedded databases incorporate robust security features to protect sensitive data from unauthorized access.
  • Ease of Use: Unlike traditional databases, embedded databases are designed to be easy to set up and manage. This simplifies development and deployment for resource-constrained IoT projects.

Building complex IoT apps shouldn’t be a headache. Let us show you how our embedded edge database can simplify your next IoT project.

Benefits of Using Embedded Databases in IoT Applications

The advantages of using embedded databases in IoT applications are numerous:

  • Improved Decision-Making: By storing and analyzing data locally, embedded databases enable real-time decision making at the edge. This reduces reliance on cloud communication and allows for faster, more efficient responses.
  • Enhanced Functionality: Embedded databases can store device configuration settings, user preferences, and historical data, enabling richer functionality and a more personalized user experience.
  • Reduced Latency: Processing data locally eliminates the need for constant communication with the cloud, significantly reducing latency and improving responsiveness.
  • Offline Functionality: Embedded databases allow devices to function even when disconnected from the internet, ensuring uninterrupted operation and data collection (see the local-buffering sketch after this list).
  • Cost Savings: By reducing reliance on cloud storage and processing, embedded databases can help lower overall operational costs for IoT deployments.
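
To make the offline and low-latency points above concrete, here is a minimal sketch of the local-first pattern: every reading is written to the embedded database immediately, and buffered rows are forwarded upstream only when a connection is available. It assumes an embedded Zen database reached over ODBC; the DSN name, the readings table, and the push_upstream function are hypothetical placeholders, not real names.

    # Hypothetical local-first buffering at the edge. The "EDGEDB" DSN, the
    # readings table, and push_upstream() are placeholders for illustration only.
    import datetime
    import pyodbc

    def store_reading(sensor_id, value):
        """Persist a reading locally so nothing is lost while offline."""
        with pyodbc.connect("DSN=EDGEDB") as conn:
            conn.execute(
                "INSERT INTO readings (sensor_id, reading, recorded_at) VALUES (?, ?, ?)",
                sensor_id, value, datetime.datetime.utcnow(),
            )

    def flush_when_connected(push_upstream):
        """Forward buffered readings upstream and clear them once acknowledged."""
        with pyodbc.connect("DSN=EDGEDB") as conn:
            rows = conn.execute(
                "SELECT sensor_id, reading, recorded_at FROM readings"
            ).fetchall()
            if rows and push_upstream(rows):
                conn.execute("DELETE FROM readings")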

Use Cases for Embedded Databases in IoT

Embedded databases are finding applications across a wide range of IoT sectors, including:

  • Smart Homes: Embedded databases can store device settings, energy usage data, and user preferences, enabling intelligent home automation and energy management.
  • Wearables: Fitness trackers and smartwatches use embedded databases to store health data, activity logs, and user settings.
  • Industrial Automation: Embedded databases play a crucial role in industrial IoT applications, storing sensor data, equipment settings, and maintenance logs for predictive maintenance and improved operational efficiency.
  • Connected Vehicles: Embedded databases are essential for connected car applications, storing vehicle diagnostics, driver preferences, and real-time traffic data to enable features like self-driving cars and intelligent navigation systems.
  • Asset Tracking: Embedded databases can be used to track the location and condition of assets in real-time, optimizing logistics and supply chain management.

The Future of Embedded Databases in the IoT

As the IoT landscape continues to evolve, embedded databases are expected to play an even more critical role. Here are some key trends to watch:

  • Increased Demand for Scalability: As the number of connected devices explodes, embedded databases will need to be scalable to handle larger data volumes and more complex workloads.
  • Enhanced Security Features: With growing security concerns in the IoT, embedded databases will need to incorporate even more robust security measures to protect sensitive data.
  • Cloud Integration: While embedded databases enable edge computing, there will likely be a need for seamless integration with cloud platforms for data analytics, visualization, and long-term storage.

The rise of the IoT has ushered in a new era for embedded databases. Their small footprint, efficiency, and scalability make them the perfect fit for managing data at the edge of the network. As the IoT landscape matures, embedded databases will continue to evolve, offering advanced features, enhanced security, and a seamless integration with cloud platforms.

At Actian, we help organizations run faster, smarter applications on edge devices with our lightweight, embedded database – Actian Zen. And, with the latest release of Zen 16.0, we are committed to helping businesses simplify edge-to-cloud data management, boost developer productivity and build secure, distributed IoT applications.

About Kunal Shah

Kunal Shah is a product marketer with 15+ years in data and digital growth, leading marketing for Actian Zen Edge and NoSQL products. He has consulted on data modernization for global enterprises, drawing on past roles at SAS. Kunal holds an MBA from Duke University. Kunal regularly shares market insights at data and tech conferences, focusing on embedded database innovations. On the Actian blog, Kunal covers product growth strategy, go-to-market motions, and real-world commercial execution. Explore his latest posts to discover how edge data solutions can transform your business.
Data Intelligence

Data Shopping Part 2 – The Data Shopping Experience

Actian Corporation

June 24, 2024

Just as shopping for goods online involves selecting items, adding them to a cart, and choosing delivery and payment options, the process of acquiring data within organizations has evolved in a similar manner. In the age of data products and data mesh, internal data marketplaces enable business users to search for, discover, and access data for their use cases.

In this series of articles, excerpted from our Practical Guide to Data Mesh, discover all there is to know about data shopping, as well as the platform’s data shopping experience in its Enterprise Data Marketplace:

  1. How to shop for data products.
  2. The Data Shopping experience.

In our previous article, we discussed the concept of data shopping within an internal data marketplace, addressing elements such as data product delivery and access management. In this article, we will explore the reason behind the Actian Data Intelligence Platform’s decision to extend its data shopping experience beyond internal boundaries, as well as how our interface, Actian Studio, enables the analysis of the overall performance of your data products.

Data Product Shopping

In our previous article, we discussed the complexities of access rights management for data products due to the inherent risks of data consumption. In a decentralized data mesh, the data product owner assesses risks, grants access, and enforces policies based on the data’s sensitivity, the requester’s role, location, and purpose. This may involve data transformation or additional formalities, with delivery ranging from read-only access to fine-grained controls.

In a data marketplace, consumers trigger a workflow by submitting access requests, which data owners evaluate and determine access rules for, sometimes with expert input. For the marketplace, we have chosen not to integrate this workflow directly into the solution but rather to interface with external solutions.

The idea is to offer a uniform experience for triggering an access request but to accept that the processing of this request may be very different from one environment to another, or even from one domain to another within the same organization – this principle is inherited from classical marketplaces. Most marketplaces offer a single, consistent experience for making a purchase but connect to other systems for the operational implementation of delivery – the modalities of which can vary widely depending on the product and the seller.

This decoupling between the shopping experience and the operational implementation of delivery seems essential to us for several reasons.

The main reason is the extreme variability of the processes involved. Some organizations already have operational workflows, relying on a larger solution (data access requests are integrated into a general access request process, supported, for example, by a ticketing tool such as ServiceNow or Jira). Others have dedicated solutions supporting a high level of automation but whose deployment is not yet widespread. Still others rely on the capabilities of their data platform, and some even on nothing at all – access is obtained through direct requests to the data owner, who handles them without a formal process. This variability is evident from one organization to another but also within the same organization – structurally, when different domains use different technologies, or temporally, when the organization decides to invest in a more efficient or secure system and must gradually migrate access management to this new system.

Decoupling, therefore, allows offering a consistent experience to the consumer while adapting to the variability of operational methods.

For a data marketplace customer, the shopping experience is very simple. Once the data product(s) of interest is identified, they trigger an access request by providing the following information:

  1. Who they are – This information is already available.
  2. Which data product they want to access – This information is also already available, along with the metadata needed for decision-making.
  3. What they intend to use the data for – This is crucial since it drives risk management and compliance requirements.

With the Actian Data Intelligence Platform, once the access request is submitted, it is processed in another system, and its status can be tracked from the marketplace – this is the direct equivalent of order tracking found on e-commerce sites.

From the consumer’s perspective, the data marketplace provides a catalog of data products (and other digital products) and a simple, universal system for gaining access to these products.

For the producer, the marketplace plays a fundamental role in managing their product portfolio.

Enhance Data Product Performance With Actian Studio

As mentioned earlier, in addition to the e-commerce system, which is intended for consumers, a classical marketplace also offers tools dedicated to sellers, allowing them to supervise their products, respond to buyer inquiries, and monitor the economic performance of their offerings, as well as other tools, intended for marketplace managers, to analyze the overall performance of products and sellers.

Actian Data Intelligence Platform’s Enterprise Data Marketplace integrates these capabilities into a dedicated back-office tool, Actian Studio. It allows for managing the production, consolidation, and organization of metadata in a private catalog and deciding which objects will be placed in the marketplace – which is a searchable space accessible to the widest audience.

These activities primarily fall under the production process – metadata are produced and organized together with the data products. However, it also allows for monitoring the use of each data product, notably by providing a list of all its consumers and the uses associated with them.

This consumer tracking helps establish the two pillars of data mesh governance:

  • Compliance and risk management – By conducting regular reviews, certifications, and impact analyses during data product changes.
  • Performance management – The number of consumers, as well as the nature of their uses, are the main indicators of a data product’s value. Indeed, a data product that is not consumed has no value.

As a support tool for domains to control the compliance of their products and their performance, the Actian Data Intelligence Platform’s Enterprise Data Marketplace also offers comprehensive analysis capabilities of the mesh – the lineage of data products, scoring and evaluation of their performance, control of overall compliance and risks, regulatory reporting elements, etc.

This is the magic of the federated graph, which allows for exploiting information at all scales and provides a comprehensive representation of the entire data landscape.

About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Data Management

The Consequences of Poor Data Quality: Uncovering the Hidden Risks

Traci Curran

June 23, 2024

Costly Consequences of Poor Data Quality

Summary

Poor data quality quietly drains millions in revenue, productivity, and trust. This blog outlines the hidden financial, operational, and compliance risks that stem from inaccurate or incomplete data.

  • The average business loses $15 million annually due to poor data quality; in the U.S., this impact reaches $3.1 trillion across the economy.
  • Employees spend up to 27% of their time correcting bad data, which slows decision-making and increases operational costs.
  • Poor data undermines compliance efforts, damages brand reputation, and leads to missed market opportunities.

The quality of an organization’s data has become a critical determinant of its success. Accurate, complete, and consistent data is the foundation upon which crucial decisions, strategic planning, and operational efficiency are built. However, the reality is that poor data quality is a pervasive issue, with far-reaching implications that often go unnoticed or underestimated.

Defining Poor Data Quality

Before delving into the impacts of poor data quality, it’s essential to understand what constitutes subpar data. Inaccurate, incomplete, duplicated, or inconsistently formatted information can all be considered poor data quality. This can stem from various sources, such as data integration challenges, data capture inconsistencies, data migration pitfalls, data decay, and data duplication.

The Hidden Costs of Poor Data Quality

  1. Loss of Revenue
    Poor data quality can directly impact a business’s bottom line. Inaccurate customer information, misleading product details, and incorrect order processing can lead to lost sales, decreased customer satisfaction, and damaged brand reputation. Gartner estimates that poor data quality costs organizations an average of $15 million per year.
  2. Reduced Operational Efficiency
    When employees waste time manually correcting data errors or searching for accurate information, it significantly reduces their productivity and the overall efficiency of business processes. This can lead to delayed decision-making, missed deadlines, and increased operational costs.
  3. Flawed Analytics and Decision-Making
    Data analysis and predictive models are only as reliable as the data they are based on. Incomplete, duplicated, or inaccurate data can result in skewed insights, leading to poor strategic decisions that can have far-reaching consequences for the organization.
  4. Compliance Risks
    Stringent data privacy regulations, such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), require organizations to maintain accurate and up-to-date personal data. Failure to comply with these regulations can result in hefty fines and reputational damage.
  5. Missed Opportunities
    Poor data quality can prevent organizations from identifying market trends, understanding customer preferences, and capitalizing on new product or service opportunities. This can allow competitors with better data management practices to gain a competitive edge.
  6. Reputational Damage
    Customers are increasingly conscious of how organizations handle their personal data. Incidents of data breaches, incorrect product information, or poor customer experiences can quickly erode trust and damage a company’s reputation, which can be challenging to rebuild.

Measuring the Financial Impact of Poor Data Quality

  1. Annual Financial Loss: Organizations face an average annual loss of $15 million due to poor data quality. This includes direct costs like lost revenue and indirect costs such as inefficiencies and missed opportunities (Data Ladder).
  2. GDP Impact: Poor data quality costs the US economy approximately $3.1 trillion per year. This staggering figure reflects the extensive nature of the issue across various sectors, highlighting the pervasive economic burden (Experian Data Quality) (Anodot).
  3. Time Wasted: Employees can waste up to 27% of their time dealing with data issues. This includes time spent validating, correcting, or searching for accurate data, significantly reducing overall productivity (Anodot).
  4. Missed Opportunities: Businesses can miss out on 45% of potential leads due to poor data quality, including duplicate data, invalid formatting, and other errors that hinder effective customer relationship management and sales efforts (Data Ladder).
  5. Audit and Compliance Costs: Companies may need to spend an additional $20,000 annually on staff time to address increased audit demands caused by poor data quality. This highlights the extra operational costs that come with maintaining compliance and accuracy in financial reporting (CamSpark).

Strategies for Improving Data Quality

Addressing poor data quality requires a multi-faceted approach encompassing organizational culture, data governance, and technological solutions.

  1. Fostering a Data-Driven Culture
    Developing a workplace culture that prioritizes data quality is essential. This involves establishing clear data management policies, standardizing data formats, and assigning data ownership responsibilities to ensure accountability.
  2. Implementing Robust Data Governance
    Regularly auditing data quality, cleaning and deduplicating datasets, and maintaining data currency are crucial to maintaining high-quality data. Automated data quality monitoring and validation tools can greatly enhance these processes.
  3. Leveraging Data Quality Solutions
    Investing in specialized data quality software can automate data profiling, cleansing, matching, and deduplication tasks, significantly reducing the manual effort required to maintain data integrity.

The risks and costs associated with poor data quality are far-reaching and often underestimated. By recognizing the hidden impacts, quantifying the financial implications, and implementing comprehensive data quality strategies, organizations can unlock the true value of their data and position themselves for long-term success in the digital age.

About Traci Curran

Traci Curran is Director of Product Marketing at Actian, focusing on the Actian Data Platform. With 20+ years in tech marketing, Traci has led launches at startups and established enterprises like CloudBolt Software. She specializes in communicating how digital transformation and cloud technologies drive competitive advantage. Traci's articles on the Actian blog demonstrate how to leverage the Data Platform for agile innovation. Explore her posts to accelerate your data initiatives.
Data Integration

Introducing Actian’s Enhanced Data Quality Solutions

Traci Curran

June 18, 2024

We are pleased to announce that data profiling is now available as part of the Actian Data Platform. This is the first of many upcoming enhancements to make it easy for organizations to connect, manage, and analyze data. With the introduction of data profiling, users can load data into the platform and identify focus areas, such as duplicates, missing values, and non-standard formats, to improve data quality before it reaches its target destination.

Why Data Quality Matters

Data quality is the cornerstone of effective data integration and management. High-quality data enhances business intelligence, improves operational efficiency, and fosters better customer relationships. Poor data quality, on the other hand, can result in costly errors, compliance issues, and loss of trust.

Key Features of Actian’s Enhanced Data Quality Solutions

  1. Advanced Data Profiling
    Our advanced data profiling tools provide deep insights into your data’s structure, content, and quality. You can quickly identify anomalies, inconsistencies, and errors by analyzing your data sources and leveraging pre-defined rule sets to detect data problems. Users can also create rules based on the use case to ensure data is clean, correct, and ready for use.
    Data Quality Overview
  2. Data Cleansing and Enrichment
    Actian’s data cleansing and enrichment capabilities ensure your data is accurate, complete, and up-to-date. Our automated processes isolate data that does not meet quality standards so data teams can act before data is moved to its target environment.
  3. Data Quality Monitoring
    With real-time data quality monitoring, you can continuously assess the health of your data. Our solution provides ongoing validation, enabling you to monitor deviations from predefined quality standards. This continuous oversight helps you maintain data integrity for operational and analytics use.
    Data Quality Run History
  4. Flexible Integration Options
    Actian’s data quality solutions seamlessly integrate with various data sources and platforms. Whether you’re working with on-premises databases, cloud-based applications, or hybrid environments, our tools can connect, cleanse, and harmonize your data across all systems.
  5. User-Friendly Interface and Dashboards
    Our intuitive interface makes managing data quality tasks easy for users of all skill levels. Detailed reporting and dashboards provide clear visibility into data quality metrics, enabling you to track improvements and demonstrate compliance with data governance policies.

Transform Your Data into a Strategic Asset

Actian’s enhanced Data Quality solutions empower you to transform raw data into a strategic asset. Ensuring your data is accurate, reliable, and actionable can drive better business outcomes and gain a competitive edge.

Get Started Today

Don’t let poor data quality hold your business back. Discover how Actian’s enhanced Data Quality solutions can help you achieve your data management goals. Visit our Data Quality page to learn more and request a demo.

Stay Connected

Follow us on social media and subscribe to our newsletter for the latest updates, tips, and success stories about data quality and other Actian solutions.

About Traci Curran

Traci Curran is Director of Product Marketing at Actian, focusing on the Actian Data Platform. With 20+ years in tech marketing, Traci has led launches at startups and established enterprises like CloudBolt Software. She specializes in communicating how digital transformation and cloud technologies drive competitive advantage. Traci's articles on the Actian blog demonstrate how to leverage the Data Platform for agile innovation. Explore her posts to accelerate your data initiatives.
Data Management

Actian Zen 16.0 Introduces New Data Sync Utility

Emma McGrattan

June 17, 2024

We are thrilled to announce the general availability of Actian Zen 16.0, delivering up to 50% faster query processing, flexible cloud deployment options, improved developer productivity, and a new data synchronization utility called EasySync.

More than 13,000 organizations across the globe trust Actian Zen as their embedded edge database for making fast, confident decisions. With this release, Actian is committed to helping businesses simplify edge-to-cloud data management, boost developer productivity, and build secure, distributed IoT apps.

Actian Zen’s latest release solidifies its position as the go-to database for building low-latency embedded applications. These applications enable real-time data access, optimize operations, and deliver valuable insights faster than ever before. The Zen 16.0 release helps embedded edge developers work more efficiently at the edge with the capabilities outlined below.

Curious how the new capabilities can help? Let an Actian representative show you!

Let’s dive into the ways Zen 16.0 empowers users with the new capabilities.

Execute Queries Up to 50% Faster With Actian Zen

You can run faster, smarter applications on edge devices with Zen 16.0. Zen accesses frequently used data that’s stored in the L2 cache, speeding up results for queries using this data. Common queries, such as those for frequently used reports or analysis, will experience significantly faster results.

Another technique boosting query performance is page read-ahead, which makes it much faster to scan large data files. When a query is executed, the Zen MicroKernel engine anticipates the data and preloads pages from the data file into memory. This optimization lets the database engine read from disk less often, returning results faster.

Having ultra-fast data retrieval is perfect for applications requiring immediate insights from edge devices. This capability ensures real-time analytics and decision-making, enhancing the overall efficiency and responsiveness of your operations. For example, Tsubakimoto Chain Company, a global machinery manufacturer, relies on Actian Zen as the embedded database, sorting up to 10,000 items per hour on their high-speed material handling systems.

Deploy Your Way With Zen Container SDK

With containerization, developers can quickly set up and use Actian Zen, running in Docker containers, with Kubernetes orchestration and Helm Chart configuration. This makes deployment and management across various environments, including on-premises, cloud, and hybrid, much easier.

The containerization of Zen supports ARM 32 and ARM 64 processors for wider deployment options. The ARM architecture is increasingly prevalent in various devices, from smartphones to Internet of Things (IoT) gadgets. Container support for ARM allows developers to target a broader range of platforms with their applications.

Elevate Developer Experiences Leveraging a Btrieve 2 Python Package

The Btrieve2 Python SDK has gained popularity within the Python community. With this release, developers can now leverage the performance and flexibility of Btrieve databases from Python using the Btrieve2 Python package (a short usage sketch follows the list below):

  • Simplified Btrieve integration. The Btrieve2 Python package streamlines the process of working with Btrieve databases from Python applications. Developers can leverage familiar Python syntax for database operations, reducing the learning curve and development time.
  • Broader developer reach. Availability on PyPI makes the Btrieve2 package easily discoverable and installable using the familiar pip command. This expands the potential user base for Btrieve-compatible applications.
  • Simplified distribution and management. PyPI provides a centralized repository for package distribution and version management. You can easily share and update your Btrieve2 package, ensuring users have access to the latest version.
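Here is a minimal sketch of what that integration can look like. It is based on the Btrieve 2 API that btrievePython wraps; the exact class names, constructor arguments, and method signatures shown are assumptions, so follow the btr2sample.py program in the Actian documentation for the authoritative calls.

# Hypothetical sketch; exact btrievePython names and signatures are assumptions.
import btrievePython as btrv

client = btrv.BtrieveClient()   # assumed constructor; the official sample may pass identifiers
bfile = btrv.BtrieveFile()

# Assumed signature: (file object, path, owner name, open mode)
status = client.FileOpen(bfile, "demo.btr", None, btrv.Btrieve.OPEN_MODE_NORMAL)
if status == btrv.Btrieve.STATUS_CODE_NO_ERROR:
    print("demo.btr opened through the Zen engine")
    client.FileClose(bfile)
else:
    print("FileOpen failed with status", status)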

Zen 16.0 also boosts developer productivity with features such as LIKE with ESCAPE syntax and literal matching for concise, readable queries. Additionally, Zen now supports JSON nested-object queries to simplify data retrieval from JSON formats, allowing developers to focus on core logic and accelerate development cycles. Lastly, SQL query logging improves performance debugging by revealing database interactions, helping identify bottlenecks and optimize query performance.
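As a quick illustration of the LIKE with ESCAPE syntax, the following sketch runs a Zen SQL query over ODBC. It assumes the pyodbc package and a Zen ODBC DSN named DEMODATA; the table and column names are purely illustrative.

# Minimal sketch, assuming pyodbc and a Zen ODBC DSN named DEMODATA.
import pyodbc

conn = pyodbc.connect("DSN=DEMODATA")
cur = conn.cursor()

# LIKE with ESCAPE: the backslash marks "_" as a literal underscore rather than
# a single-character wildcard, so this matches codes that actually contain "_v2".
cur.execute(r"SELECT code FROM products WHERE code LIKE '%\_v2%' ESCAPE '\'")
for row in cur.fetchall():
    print(row.code)

conn.close()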

Enable Real-Time Data Streaming With Zen and Apache Kafka

Real-time data streaming – particularly with Apache Kafka – is a popular method for moving data from the edge to the cloud, and vice versa. Zen support for Kafka lets you benefit from streaming-based edge applications.

Combining Zen replication features with Apache Kafka can create a real-time data pipeline. Zen acts as the source database, replicating changes to a secondary database for analytical workloads. Kafka serves as a high-throughput messaging system, efficiently streaming data updates to analytics engines for immediate processing and insights.

Zen’s support for Kafka also allows you to build apps for real-time data processing. This is crucial for scenarios requiring immediate responses to data updates such as fraud detection or sensor data analysis.
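The sketch below illustrates the shape of such a pipeline. It is not Zen’s built-in Kafka integration: the read_changed_records() helper is a hypothetical stand-in for however you capture changed rows on the Zen side, and the producer calls use the kafka-python package.

# Illustrative Zen-to-Kafka pipeline sketch; read_changed_records() is hypothetical.
import json
from kafka import KafkaProducer

def read_changed_records():
    # Placeholder change feed -- replace with your Zen-side capture logic.
    yield {"device": "sorter-7", "items_per_hour": 9800}

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for record in read_changed_records():
    producer.send("zen-sensor-updates", value=record)

producer.flush()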

Move and Sync Data Easier Using Zen EasySync

A pre-built data synchronization utility called EasySync saves time and effort compared to writing custom replication logic, allowing you to focus on core application functionality. EasySync lets you move and copy data more easily than ever:

  • Data consistency and availability. Zen offers a robust replication mechanism for ensuring data is kept synchronized across multiple servers or geographically dispersed locations. This minimizes downtime and data loss risk in cases of hardware failures, network outages, or planned maintenance.
  • Reduced development complexity. By providing a pre-built data synchronization solution, Actian Zen saves you time and effort compared to implementing custom replication logic from scratch or paying for a separate data sync solution. This allows you to focus on core application functionality.

In Industrial IoT (IIoT) environments, the ability to replicate data to the database from handheld devices, without requiring a gateway and without writing new code, opens up new use cases and opportunities. This enables real-time data collection and faster decision-making for process control, remote monitoring, and field service.

Drive Better Outcomes at the Edge

Zen simplifies edge-to-cloud data management with secure, scalable storage and seamless cloud synchronization. We listened to customer feedback and looked at market trends to ensure Zen continues to deliver new and sustainable value for your IoT and edge devices.

For example, you asked us for longer index keys with more descriptive names. We delivered: index keys can now exceed 255 characters, enabling you to create more granular indexes that target the data needed for specific queries. You benefit from improved query speed, especially for complex searches or filtering, while creating data models with more expressive and descriptive field names to improve code readability and maintainability.

You can use Zen Mobile, Zen Edge, and Zen Enterprise to support modernization efforts, optimize embedded apps, and simplify edge-to-cloud data management. The surge in data from IoT and edge devices, alongside rapidly growing data volumes, makes extracting actionable insights a key differentiator.

Empower your team to achieve embedded edge intelligence with Zen 16.0. Packed with productivity-boosting features and flexible deployment options, Zen 16.0 helps you build the future of IoT.

Get started today!

About Emma McGrattan

Emma McGrattan is CTO at Actian, leading global R&D in high-performance analytics, data management, and integration. With over two decades at Actian, Emma holds multiple patents in data technologies and has been instrumental in driving innovation for mission-critical applications. She is a recognized authority, frequently speaking at industry conferences like Strata Data, and she's published technical papers on modern analytics. In her Actian blog posts, Emma tackles performance optimization, hybrid cloud architectures, and advanced analytics strategies. Explore her top articles to unlock data-driven success.
Data Intelligence

Data Shopping Part 1 – How to Shop for Data Products

Actian Corporation

June 17, 2024

Just as shopping for goods online involves selecting items, adding them to a cart, and choosing delivery and payment options, the process of acquiring data within organizations has evolved in a similar manner. In the age of data products and data mesh, internal data marketplaces enable business users to search for, discover, and access data for their use cases.

In this series of articles, get an excerpt from our Practical Guide to Data Mesh and discover all there is to know about data shopping as well as the Actian Data Intelligence Platform’s Data Shopping experience in its Enterprise Data Marketplace:

  1. How to shop for data products.
  2. The Data Shopping experience.

As mentioned above, all classic marketplaces offer a very similar “checkout” experience, which is familiar to many people. The selected products are placed in a cart, and then, when validating the cart, the buyer is presented with various delivery and payment options.

The actual delivery usually takes place outside the marketplace, which provides tracking functionality. Delivery can be immediate (for digital products) or deferred (for physical products). Some marketplaces have their own logistics system, but most of the time, delivery is the responsibility of the seller. The delivery time is an important element of customer satisfaction – the shorter it is, the more satisfied users are.

How does this shopping experience translate into an Enterprise Data Marketplace? To answer this question, we need to consider what data delivery means in a business context and, for that, focus on the data consumer.

The Delivery of Data Products

A data product offers one or more consumption protocols – these are its outbound ports. These protocols may vary from one data product to another, depending on the nature of the data – real-time data, for example, may offer a streaming protocol, while more static data may offer an SQL interface (and instructions for using this interface from various programming languages or in-house visualization tools).

For interactive consumption needs, such as in an application, the data product may also offer consumption APIs, which in turn may adhere to a standard (REST, GraphQL, OData, etc.). Or it may simply let consumers download the data in a file format.

Some consumers may integrate the data product into their own pipelines to build other data products or higher-level uses. Others may simply consume the data once, for example, to train an ML model. It is up to them to choose the protocol best suited to their use case.
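For example, a consumer whose data product exposes a REST outbound port might pull data with a few lines of Python. The URL, access token, and response shape below are placeholders, not a real Actian endpoint.

# One possible consumption path -- a REST outbound port -- sketched with requests.
import requests

resp = requests.get(
    "https://dataproducts.example.com/sales/v1/orders",   # hypothetical outbound port
    headers={"Authorization": "Bearer <token-granted-by-the-data-owner>"},
    params={"since": "2024-06-01"},
    timeout=30,
)
resp.raise_for_status()

for order in resp.json():
    print(order)   # feed a pipeline, train a model, or load into a local store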

Whatever protocols are chosen, they all have one essential characteristic: they are secure. This is one of the universal rules of governance – access to data must be controlled, and access rights supervised.

With few exceptions, the act of purchase therefore simply involves gaining access to the data via one of the consumption protocols.

Access Rights Management for Data Products

However, in the world of data, access management is not a simple matter, and for one elementary reason: consuming data is a risky act.

Some data products can be desensitized – removing the personal or sensitive data that poses the greatest risk. But this desensitization cannot be applied to the entire product portfolio; otherwise, the organization forfeits the opportunity to leverage data that is nonetheless highly valuable (such as sensitive financial or HR data, commercial data, market data, customer personal data, etc.). In one way or another, access control is therefore a critical activity for the development and widespread adoption of the data mesh.

In the logic of decentralization of the data mesh, risk assessment and granting access tokens should be carried out by the owner of the data product, who ensures its governance and compliance. This involves not only approving the access request but also determining any data transformations needed to conform to a particular use. This activity is known as policy enforcement.

Evaluating an access request involves analyzing three dimensions:

  • The data themselves (some carry more risk than others) – the what.
  • The requester, their role, and their location (geographical aspects can have a strong impact, especially at the regulatory level) – the who.
  • The purpose – the why.

Based on this analysis, the data may be consumed as is, or they may require transformation before delivery (data filtering, especially for data not covered by consent, anonymization of certain columns, obfuscation of others, etc.). Sometimes, additional formalities may need to be completed – for example, joining a redistribution contract for data acquired from a third party, or complying with retention and right-to-forget policies, etc.
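To make the what / who / why evaluation concrete, here is a purely illustrative Python sketch of how a data owner’s decision logic might be expressed. In practice this logic lives with the data owner and the platform’s policy engine; the tags, roles, and rules below are invented for the example.

# Purely illustrative sketch of the what / who / why evaluation described above.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    dataset_tags: set      # the what, e.g. {"pii", "finance"}
    requester_role: str    # the who
    requester_region: str
    purpose: str           # the why

def required_transformations(req: AccessRequest) -> list:
    transforms = []
    if "pii" in req.dataset_tags and req.purpose != "regulatory-reporting":
        transforms.append("anonymize personal columns")
    if "pii" in req.dataset_tags and req.requester_region != "EU":
        transforms.append("filter out rows lacking consent for transfer")
    return transforms

print(required_transformations(
    AccessRequest({"pii", "finance"}, "analyst", "US", "ml-training")))
# -> ['anonymize personal columns', 'filter out rows lacking consent for transfer']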

Technically, data delivery can take various forms depending on the technologies and protocols used to expose them.

For less sensitive data, granting read-only access may suffice – this simply involves declaring an additional user. For sensitive data, fine-grained permission control is necessary, at the column and row levels. Most modern data platforms support native mechanisms to apply complex access rules through simple configuration – usually using data tags and a policy enforcement engine. Setting up access rights then involves creating the appropriate policy or integrating a new consumer into an existing policy. For older technologies that do not support sufficiently granular access control, it may be necessary to create a specific pipeline that transforms the data to ensure compliance, stores them in a dedicated space, and grants the consumer access to that space.

This is, of course, a lengthy and potentially costly approach, which can be optimized by migrating to a data platform supporting a more granular security model or by investing in a third-party policy enforcement solution that supports the existing platform.
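As a tiny illustration of tag-driven access rules, the sketch below masks columns based on data tags. Modern platforms apply rules like these natively at query time; the tags and masking choices here are invented solely to make the idea concrete.

# Tag-driven column masking, shown in plain Python only to make the idea concrete.
COLUMN_TAGS = {"email": "pii", "salary": "sensitive", "region": "public"}

def mask_row(row, allowed_tags):
    return {col: (val if COLUMN_TAGS.get(col, "public") in allowed_tags else "***")
            for col, val in row.items()}

print(mask_row({"email": "a@b.com", "salary": 80000, "region": "EMEA"}, {"public"}))
# -> {'email': '***', 'salary': '***', 'region': 'EMEA'}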

Data Shopping in an Internal Data Marketplace

In the end, in a data marketplace, data delivery, which is at the heart of the consumer experience, translates into a more or less complex workflow, but its main stages are as follows:

  • The consumer submits an access request – describing precisely their intended use of the data.
  • The data owner evaluates this request – in some cases, they may rely on risk or regulatory experts or require additional validations – and determines the required access rules.
  • An engineer in the domain or in the “Infra & Tooling” team sets up the access – this operation can be more or less complex depending on the technologies used.

Shopping for the consumer involves triggering this workflow from the marketplace.

For the Actian Data Intelligence Platform’s marketplace, we have chosen not to integrate this workflow directly into the solution but rather to interface with external solutions.

In our next article, discover the Actian Data Intelligence Platform Data Shopping experience and the technological choices that set us apart.

About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Databases

Embracing Database Modernization: Future Proofing Your Business

Actian Corporation

June 12, 2024

Modernizing your database and apps to align with evolving business needs can improve efficiency, security, compliance, and user experiences while reducing costs and enhancing scalability.

Modernizing your IT infrastructure is the process of updating and even transforming your technologies, systems, and processes to better meet your current and future needs. Taking a strategic approach to modernization, including modernizing your database and apps, can deliver a range of benefits that include:

  • Informed Decision-Making. The ability to integrate all relevant data for transactional processing and timely, accurate insights for better decision-making.
  • Improved Efficiency. Automation, cloud computing, and advanced analytics can be leveraged to reduce operational costs and improve productivity.
  • Simplified Compliance. Modern systems are typically better equipped than legacy technologies to meet compliance and regulatory requirements.
  • Robust Security. Modern IT systems often offer enhanced security features and provide regular updates that protect against threats, helping ensure your data is secure.
  • Easy Scalability. Modern infrastructures, especially those that are cloud-based, provide immediate scalability to handle changing workloads.
  • Improved Customer Experiences. State-of-the-art IT systems support chatbots, personalized marketing, and real-time transactions for enhanced customer experiences.
  • Reduced Costs. Upfront investments to modernize can be significant, but they reduce technical debt and offer longer-term savings due to efficiency, less maintenance, and reduced downtime.

A strategic approach to modernization that aligns with evolving business and IT needs helps ensure you can capture and optimize the right data and make it usable across your organization. The recently enhanced Actian Ingres can play a pivotal role in your modernization journey.

Innovation Demands Modernization

Organizations like yours must have a solid data foundation for driving innovation—and innovate at a pace that allows you to seize trends, meet shifting customer preferences, and offer breakthrough products and features before your competitors do. This requires a high-performance database that delivers trusted, rapid insights without expecting you to use a multitude of different tools.

A modern approach to your database—and your IT infrastructure as a whole—can open new opportunities, such as bringing increased levels of automation, easier system integrations, and the ability to modernize at your pace in the environment you choose, whether it’s on-premises, in the cloud, or a hybrid setting.

The problem with many legacy systems is that they can’t easily integrate with other systems, which makes it difficult to share data seamlessly; they don’t scale to handle growing data volumes; and they require IT help to add data pipelines and utilize the data. All of these issues create barriers to rapid insights and limit your ability to take a data-driven approach to innovation, decision-making, personalized customer engagement, and other business areas.

The linchpin for success ultimately comes down to your approach to data. And it’s why a modern database that supports better management and utilization of data—without ongoing IT help—is required.

Benefits of Database Modernization

A database with modern features and capabilities delivers benefits such as fast data querying, high levels of efficiency, advanced security, and seamless data manageability without requiring advanced skills. Database modernization can deliver:

  • Flexible Modernization Paths. Your platform should give you the flexibility and agility to modernize according to your needs. For example, if you choose, you should be able to modernize in-place for better performance and security, or migrate data workloads to the cloud or multiple clouds to meet company mandates for a cloud-first approach to data.
  • Phased Approach to the Cloud. If you want to move to the cloud, it should be at your pace, allowing you to migrate as you’re ready. This way, you can move data backup and recovery capabilities to the cloud, which is a common cloud use case, but keep other workloads on-premises until you’re ready to move them to the cloud. A phased approach supports a smooth transition with minimal disruption.
  • Advanced Capabilities. Modernization entails more than upgrading technologies. It encompasses aligning technology with business priorities to enable you to reach desired outcomes faster—and have confidence in the results. A modern database with user-friendly capabilities lets you deliver new value across your organization while fostering a data-driven culture.
  • Optimized User Experiences. A modern database provides fast, reliable access to data. Features such as automated scaling and easy integration with various applications, along with the ability to support complex queries and large datasets, lead to more engaging user experiences and increased productivity.

Bridging the Skills Gap

Modernization efforts simplify data access and management while reducing the time spent on manual tasks such as wrangling and prepping your data. A successful modernization approach also bridges the skills gap for analysts and other data users by making data easy to access and use. The result is more people throughout the company being able to utilize data, which helps unlock the full potential of your data.

A modern approach to building apps complements a modern database by enabling rapid app development, scalability, and integration with cloud services for increased agility and a faster path to innovation. Modernizing your database along with your app-building processes can help you better predict market changes, shorten time to market, accelerate data-driven innovation, and maintain a competitive advantage.

Actian offers a solution to deliver modern apps quickly. OpenROAD is a database-centric, object-oriented, 4GL rapid application development tool for developing and deploying mission-critical, n-tier business applications. It simplifies app modernization by letting you reuse your existing business logic, making it much easier to offer modern user interfaces.

Trusted Support for Database Modernization

Modernizing your database and applications delivers myriad benefits, yet you must take an eyes-wide-open approach. Complex interdependencies, your data infrastructure, operating systems, and hardware can pose risks when modernizing, so you must consider how they will be impacted.

Through our Ingres NeXt Initiative, Actian offers professional services tailored to your modernization needs, transforming your mission-critical Actian Ingres database and OpenROAD applications into open, extensible platforms while reducing risk and accelerating modernization. This expert support ensures a smooth modernization journey while preserving existing investments.

About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Data Management

Actian Ingres 12.0: Modernize Your Way – Trusted, Reliable, and Adaptable

Douglas Dailey

June 11, 2024

Our trusted and reliable database delivers performance and flexibility, empowering customers to modernize at their own pace.

As the director of product management for Actian, I’m thrilled to share first-hand insights into the latest enhancements to Actian Ingres. This major release embodies our commitment to customer-driven innovation and reinforces our position as a trusted technology partner.

Actian Ingres 12.0 builds upon the core strengths that have made Ingres a go-to transactional database for decades. We’ve invested heavily in performance, security, and cloud-readiness to ensure it meets customers’ modernization needs.

Actian Ingres offers modernization at your own pace, in a low-risk fashion, using any of the options described below.

Choice and Flexibility

This release is all about giving customers the power of choice. Whether you’re committed to on-premises deployments, ready to embrace the cloud, or looking for a hybrid solution, Actian Ingres 12.0 adapts to your modernization strategy.

We have options for Lift/Shift to VM, containerization via Docker and Kubernetes, and plans for bring your own license (BYOL) on the AWS Marketplace. Customers who want to take a phased approach have several options: move first to Linux on-premises, then to virtual machines (VMs) in the cloud, and finally to containers. We’re here to help, and we want customers to know we have a cloud story to support them on their journey.

Core Enhancements

We understand that familiarity and reliability are crucial to our users. That’s why Actian Ingres 12.0 strengthens core capabilities alongside exciting new features. We’ve doubled down on investments in these areas to ensure that Ingres remains a database that delivers new and sustainable value; this commitment keeps it relevant for the long term.

Reliability and security are paramount for our customers. Ingres 12.0 strengthens protection against brute-force and Denial of Service (DoS) cyber-attacks and tightens DBMS security around user privileges to better protect users, roles, and groups.

We’ve added User Defined Function (UDF) support for Python and JavaScript, offering a powerful way to extend the functionality of the database or streamline processes. The use of containers offers an isolated execution environment that keeps the DBMS secure.
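The registration syntax for Ingres 12.0 UDFs is documented by Actian, so it is not reproduced here; the snippet below only shows the kind of self-contained Python logic a UDF might wrap once registered, with an invented function name and rule.

# Illustrative Python logic that a UDF could wrap; registration is engine-specific.
def shipping_band(weight_kg):
    """Classify an order's shipping band from its weight in kilograms."""
    if weight_kg <= 2:
        return "small"
    if weight_kg <= 20:
        return "standard"
    return "freight"

print(shipping_band(5.0))   # -> "standard"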

The X100 analytics engine continues to attract attention for its superior performance: users have seen significant gains for OLAP-related workloads by moving them to X100 tables, which emphasize speed and efficiency.

Most notably, we introduced table and schema cloning in this release. This translates into huge savings for warehouse-oriented customers, eliminating storage overhead and latency because no data is duplicated. Imagine a simple SQL-based table clone command that can clone not just one but many tables in a single executed statement, opening new possibilities for data sharing and analytics down the line.

Cloud Enablement

Cloud adoption can be complex, but we’re here to make the journey smooth. Migrations can be challenging, which is why we provide support every step of the way. Ingres 12.0 is more adaptive to meet current and emerging business challenges while helping customers who want to move to the cloud to do so at their own pace.

This release brings a long-awaited backup-to-cloud capability for Actian Ingres that fits most data protection strategies. For many organizations, the ability to back up and restore data as part of an off-site disaster recovery strategy is the first objective. This type of backup strengthens business continuity.

Users already deploy Ingres on Linux using Docker and leverage Kubernetes to simplify orchestration. With Ingres 12.0, we now support disaster recovery using IngresSync, a DR utility formerly only available through Professional Services. IngresSync allows users to set up a read-only standby server – yet another reason to step into the cloud with confidence, knowing you can distribute workloads and have disaster recovery options.

Performance Matters

Our development team was granted 5 patents, with an additional 3 currently pending. This is the type of innovation that helps differentiate us in areas of performance optimization. These patents cover advances in User Defined Functions (UDFs), index optimization, and continued differentiation with the in-memory storage, tracking, and merging of changes in X100 Positional Delta Trees (PDTs). This is a tremendous show of passion for perfection by our amazing developers.

We invested in additional performance testing and standardization on the industry TPC-H, TPC-DS, and TPC-C benchmarks, making strides release over release – even more so when it comes to complex X100 queries. These types of investments uncover various edge cases and costing scenarios that we can improve so users of any workload type can benefit. Of course, your mileage may vary.

Customers also benefit from more efficient workload management tailored to their specific business needs. Workload Manager 2.0 offers the capability to establish priority-driven queues, enabling resources to be allocated based on predefined priorities and user roles. During peak workload periods, the system can intelligently handle incoming queries by prioritizing specific queues and users, guaranteeing that important tasks are handled promptly while upholding overall system performance and efficiency.

For example, if business leaders require immediate information for a quarterly report, their queries are prioritized accordingly. Conversely, in situations where real-time transactions are crucial, prioritization is adjusted to maintain system efficiency.

Modernize With Confidence

Modernizing applications can be daunting. OpenROAD, a database-centric rapid application development (RAD) tool for developing and deploying business apps, continues to make this process easier with improvements to the abf2or and WebGen utilities shipped with the product.

Empowering customers to transform their apps and up-level them for the web and mobile helps them stay current in a rapidly evolving developer space. This area of work can be the most challenging of all, but the ability to convert "green screen" applications to OpenROAD, and then on to web and mobile, is a great starting point.

OpenROAD users can expect to see a new gRPC-based architecture for the OpenROAD Server. This architecture reduces administration, enhances concurrency support, and is more lightweight thanks to its use of HTTP/2 and protocol buffers. Our developers were excited to move forward with this project and see it as a big jump from COM/DCOM.

The new gRPC architecture is also microservices-friendly and can be packaged into a separate container. Because of this, we’ve got our sights set on containerized deployment of the Server in the cloud. In the meantime, we’ve distributed Docker files with this release so that customers can do some discovery and exploration.

Driven by Customer Feedback

Actian Ingres 12.0 can help customers expand their data capabilities footprint, explore new use cases, and reach their modernization goals faster. We’ve focused on enabling customers to strategically grow their business using a trusted database that keeps pace with new and emerging business needs.

We want customer feedback as we continue to innovate. Many of the database enhancements are based on direct customer input. We talked with users across industries about which features and capabilities they like and what they wanted to see added. Their feedback was incorporated into our product roadmap, which ensures that Ingres continues to meet their evolving requirements. Plus, with our commitment to best-in-class support and services, every customer can be assured that we’re here to help, no matter where they are on their modernization journey.

Ingres is more than just a database. It’s a trusted enabler that helps customers become future-fit and innovate faster without barriers. Whether you’re upgrading to 12.0 for the new capabilities and improvements, migrating to the cloud, modernizing applications, or leveraging built-in X100 capabilities for real-time analytics against co-located transactional data, Ingres 12.0 has something for everyone.

About Douglas Dailey

Douglas Dailey is Director of Product Management for Actian's cloud and on-prem databases, tools, and connectivity solutions. Over 15 years, Doug has built data virtualization platforms and steered technology investments in IBM Netezza, DB2 Replication, and Informix. He's led sessions at major data events (e.g., IBM's Think conference) and authored whitepapers on data hub and data fabric topologies. On the Actian blog, Doug focuses on hybrid data strategies, replication, and emerging use cases. Check out his latest articles to understand modern data architectures.