Databases

Smart Stores, Savvy Shoppers: Data’s Role in Reinventing Retail

Kasey Nolan

August 12, 2024

Navigating the Complexity of Modern Retail With Data

In today’s digital age, retail is evolving at a breakneck pace. Gone are the days when a great product and a welcoming smile were enough to secure customer loyalty. Modern shoppers demand seamless, personalized experiences, whether they’re browsing online from their couch or strolling through a brick-and-mortar store.

Customer loyalty has also evolved. In the past, shoppers would often stick with a single brand or store out of habit or familiarity. However, today’s consumers are more informed and have more choices at their fingertips. Loyalty is no longer guaranteed by proximity or tradition; it must be earned through consistent, high-quality, and personalized experiences. 

To stay competitive, retailers need to harness the power of data to anticipate needs, optimize operations, and create memorable shopping experiences that keep customers coming back—across every channel and each interaction.  

Leveraging Data to Improve Customer Acquisition and Loyalty

To improve customer acquisition and loyalty, retailers must leverage a variety of data types that often exist in different silos within the retail environment. 

1. Behavioral Data

Behavioral data is all about tracking customers’ online browsing history, click patterns, and purchase history on websites and mobile apps. For example, understanding which products a customer frequently views but does not purchase can help retailers craft targeted promotions.

In stores, IoT devices and sensors can track how customers move through physical aisles, identifying popular paths and frequently visited sections. This information allows retailers to optimize store layouts and product placements to enhance the shopping experience and increase sales. 

2. Transactional Data

Analyzing purchase history provides insights into customer preferences and buying habits. Retailers can identify trends, like which products are frequently bought together, or which times of year certain items are in high demand. This data aids in inventory management, ensuring that popular products are always available to meet customer demand.  
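
To make this concrete, here’s a minimal sketch of co-purchase analysis using pandas; the order data and column names are hypothetical placeholders for a real transaction feed:

```python
# A minimal sketch of co-purchase analysis with pandas; the order
# data and column names are hypothetical placeholders.
from collections import Counter
from itertools import combinations

import pandas as pd

orders = pd.DataFrame({
    "order_id": [1, 1, 2, 2, 2, 3, 3],
    "product":  ["milk", "bread", "milk", "bread", "eggs", "milk", "eggs"],
})

# Count how often each pair of products appears in the same order.
pair_counts = Counter()
for _, basket in orders.groupby("order_id")["product"]:
    for pair in combinations(sorted(basket.unique()), 2):
        pair_counts[pair] += 1

# The most frequently co-purchased pairs surface cross-sell candidates.
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```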

3. Demographic Data

Collecting demographic information such as age, gender, location, and income levels helps retailers segment their customer base and create targeted marketing campaigns. Understanding the geographic distribution of customers can inform decisions about where to open new stores or focus advertising, while data on age group preferences allows retailers to tailor their marketing messages to the right audience.

4. Psychographic Data

Psychographic data is all about customer interests, values, and lifestyle choices. Retailers can gather this information through online browsing behavior, social media interactions, and other engagement tools. By aligning marketing messages with customers’ values and interests, retailers can build stronger emotional connections and brand loyalty.

5. Feedback Data

Finally, customer feedback collected through reviews, surveys, and direct interactions offers invaluable insights into customer satisfaction and areas for improvement. Positive reviews can be leveraged in marketing campaigns to build trust and attract new customers. Negative feedback can highlight pain points and opportunities for improvement. By addressing customer concerns promptly, retailers can improve their products and services and boost customer loyalty and retention.  

Connect, Manage, and Analyze With Confidence Using the Actian Data Platform

Knowing what data to look for is only part of the solution. Integrating it to get a full view of your business is another issue entirely. Retailers often struggle with data scattered across various systems like POS, CRM, and e-commerce platforms, and need help connecting, managing, and analyzing the data points to make fast, accurate, data-driven decisions. This entails capturing data both in on-premises systems and in the cloud. That’s why retailers need a hybrid platform that enables:

Connecting Data

Imagine a customer browsing your online store, adding items to their cart, and later deciding to complete the purchase in a physical store. With connected data, you can track their journey seamlessly, offering personalized recommendations and ensuring inventory is synchronized across channels. This level of integration creates a cohesive shopping experience that delights customers and drives loyalty. 
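
As a rough illustration of what “connected” means in practice, the sketch below (with hypothetical event feeds and field names) stitches one customer’s online and in-store events into a single, time-ordered journey:

```python
# A simplified sketch of stitching one customer's journey across
# channels; the event feeds and field names are hypothetical.
import pandas as pd

online_events = pd.DataFrame({
    "customer_id": ["c42", "c42"],
    "channel": ["web", "web"],
    "event": ["view_product", "add_to_cart"],
    "sku": ["sku-1", "sku-1"],
    "ts": pd.to_datetime(["2024-08-01 19:02", "2024-08-01 19:05"]),
})

store_sales = pd.DataFrame({
    "customer_id": ["c42"],
    "channel": ["store"],
    "event": ["purchase"],
    "sku": ["sku-1"],
    "ts": pd.to_datetime(["2024-08-03 14:20"]),
})

# One unified, time-ordered view of the customer journey.
journey = (
    pd.concat([online_events, store_sales])
      .sort_values("ts")
      .reset_index(drop=True)
)
print(journey)
```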

The Actian Data Platform delivers this by seamlessly connecting these data sources and providing a unified view of operations. This integration not only streamlines workflows but also ensures that all departments have access to accurate and up-to-date information.

Managing Data

Managing vast amounts of data can be daunting, but the Actian Data Platform makes it easy. The platform’s ability to handle data from multiple sources means you can manage everything from sales transactions and customer profiles to inventory levels and supply chain logistics. Secure data management also protects sensitive customer information—like customer names—while still allowing you to target customers for marketing activities, building trust and confidence in your brand.

Analyzing Data

The true power of data lies in its analysis. The Actian Data Platform supports analytics capabilities that transform raw data into meaningful insights. Retailers can identify trends, forecast demand, and make data-driven decisions that improve their bottom line. Whether it’s optimizing inventory or personalizing marketing campaigns, the possibilities are endless. 

Drive the Future of Retail With Confidence

The Actian Data Platform is a game-changer for the retail industry, offering unparalleled capabilities in connecting, managing, and analyzing data. By leveraging this powerful tool, retailers can achieve greater efficiency, enhance customer experiences, and accelerate strategic growth. Actian’s commitment to innovation and excellence ensures that businesses like yours are equipped to meet the challenges of today’s data-driven world. Discover the future of retail with Actian through a custom demo.

About Kasey Nolan

Kasey Nolan is Solutions Product Marketing Manager at Actian, aligning sales and marketing in IaaS and edge compute technologies. With a decade of experience bridging cloud services and enterprise needs, Kasey drives messaging around core use cases and solutions. She has authored solution briefs and contributed to events focused on cloud transformation. Her Actian blog posts explore how to map customer challenges to product offerings, highlighting real-world deployments. Read her articles for guidance on matching technology to business goals.

Data Integration

Connect Disparate Data Sources With Confidence

Derek Comingore

August 8, 2024

Now more than ever, businesses in every vertical are inundated with vast amounts of data coming at them from various sources. And those sources keep growing as data is created by an ever-expanding number of apps, systems, and devices. Whether it’s customer interactions, supply chain operations, or financial transactions, data is the lifeblood of the modern enterprise.

However, the sheer volume and variety of data that’s available creates a significant challenge. You must ensure information is accessible, accurate, trusted, and actionable. This is where data integration has become crucial.

As I shared during a recent TDWI conference in San Diego, unifying data from multiple sources enables you to utilize the full potential of all your data to drive informed decision-making and strategic growth of the business. This includes hybrid data integration, which connects data from across cloud and on-premises environments.

Four Business Reasons to Integrate Your Hybrid Data

Unifying disparate data sources while ensuring quality and compliance is essential for success. Following proven approaches for integration, implementing a robust data integration strategy that supports your growth objectives, and using a modern data platform are all required to connect your data.

Reasons to bring your data together in a single platform include:

1. Overcoming Data Silos

Silos isolate data sets, making them inaccessible to other parts of your organization. These silos can arise from different software systems, geographic locations, or employees using their own data because they don’t trust or can’t easily access enterprise data. Data silos hinder collaboration and lead to incomplete insights. Data integration breaks down these silos, providing a unified view that enhances collaboration, decision-making, and comprehensive analysis.

2. Ensuring Data Consistency & Quality

With data flowing from so many sources, maintaining consistency and quality becomes a daunting task. Inconsistencies and inaccuracies in data can lead to flawed analysis and poor decision-making. By contrast, comprehensive data integration ensures that data is standardized, trusted, and accurate, providing a single source of truth that gives you confidence in your outcomes. Consistent, high-quality data is critical for accurate reporting and reliable business intelligence.

3. Enhancing Operational Efficiency

A unified view of critical information allows analysts and decision-makers to identify trends, optimize processes, and allocate resources more effectively. Integrated data also streamlines workflows, reduces redundancies, and minimizes errors, leading to improved operational efficiency. That’s because data integration helps you operate more smoothly, respond to market changes with agility, and maintain a competitive edge.

4. Supporting Compliance & Security

In our era of stringent regulatory requirements, ensuring compliance is essential. Modern data integration platforms offer robust security features and compliance controls, helping you manage sensitive data across various environments. This includes implementing data quality rules, orchestration workflows, and secure data transfers, which are essential for maintaining regulatory compliance and protecting data integrity.

Four Benefits of Hybrid Data Integration

The ability to master data integration and achieve seamless operations across cloud, on-premises, or hybrid environments can unlock significant value across the business. With hybrid data integration, you realize these benefits:

1. Improved Organizational Decision-Making

Connected hybrid data provides a comprehensive view of business operations, enabling data-driven decision-making. By having access to accurate, up-to-date data, business leaders can make more informed choices that drive strategic growth and competitive advantage. When hybrid data is fully integrated, decision-making improves across all aspects of the business.

2. Increased Efficiency & Cost Savings

Bringing together data pipelines reduces the time and resources required for ongoing data management. This efficiency, coupled with automated data processes, translates into cost savings, reduced manual intervention, and optimized resource utilization. Plus, integrated data reduces the need for multiple data management tools, especially when using the right platform, which further lowers costs.

3. Enhanced Collaboration & Coordination

Data integration encourages data sharing across various departments and systems. When you have a data platform that offers easy data integration and accessibility, analysts and organizational teams can seamlessly share data and work together using the same information. Enhanced coordination leads to better alignment of efforts, more cohesive strategies, and improved overall performance, which also improves trust in your data.

4. Barrier-Free Access to Valuable Insights

Integrated data offers richer, more contextual insights than single data sets. This lets you uncover details that may have previously been hidden. These details give you a better understanding of customers, markets, and internal operations. As a result, you can make informed decisions, develop highly targeted strategies, and respond more effectively to changing market conditions.

Four Best Practices to Integrate Hybrid Data

One of the main questions I get asked during presentations is how to get started with data integration—especially with data spanning the cloud and on-premises systems. Many analysts and other data users are accustomed to complex processes that require IT help or advanced skill sets.

That is no longer the case! With the right strategy and data platform, hybrid data integration is easier than you may think. Here are four steps to ensure success:

1. Assess Your Data Integration Needs

Determining your organization’s specific needs is the essential first step. You’ll want to identify the data sources that need to be integrated, the types of data being handled, and the business processes that will benefit from integration. This assessment helps you choose the right data integration tools and strategy.

2. Pick the Right Data Platform

Select a robust data platform that simplifies data integration processes and makes it easy to build data pipelines to new sources. Also look for a platform that offers flexibility, scalability, and ease of use. Features such as codeless API integration, pre-built connectors, and data profiling capabilities significantly streamline the integration process and reduce the time to value. 

3. Ensure Data Quality & Governance

Comprehensive integration should not come at the expense of data quality. Maintaining quality is a continuous process that entails enacting data quality rules, performing regular data profiling, and establishing governance policies to ensure integrated data remains accurate and reliable. This approach helps mitigate data inconsistencies and ensures compliance with internal and regulatory standards.
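
As a simple illustration of what rule-based checks can look like in practice, here’s a minimal sketch; the columns, rules, and regex are illustrative, not prescriptive:

```python
# A minimal sketch of rule-based data quality checks; the columns,
# rules, and regex are illustrative, not prescriptive.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@example.com", None, "b@example", "c@example.com"],
})

rules = {
    "customer_id is unique": customers["customer_id"].is_unique,
    "email is populated": customers["email"].notna().all(),
    "email looks valid": customers["email"]
        .str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False).all(),
}

for rule, passed in rules.items():
    print(f"{'PASS' if passed else 'FAIL'}: {rule}")
```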

4. Benefit From Automated Processes

Automating data integration processes greatly reduces manual efforts and minimizes errors. Integration tools and data pipeline orchestration can automate data workflows. Automation enhances efficiency while also enabling real-time data integration to deliver timely insights.

Consider a Complete Data Platform That Simplifies Integration

Data integration is a necessity for businesses that want to thrive in our data-driven world. It requires a modern platform that allows you to connect data in hybrid environments without using a variety of tools. For example, the Actian Data Platform offers end-to-end integration, data warehousing capabilities, and analytics across your entire hybrid environment.

This single, unified data platform offers real-time insights along with superior price performance. Users across all skill levels can connect, manage, and analyze data using a fully integrated suite of data solutions, eliminating the need for multiple tools or manual code.

We can meet you wherever you are on your data journey while making data easy to access and use. Our platform can help you go from data to decisions with confidence, enabling you to:

  • Increase revenue
  • Reduce costs
  • Mitigate risk
  • Win market share
  • Support a data-driven culture

With the Actian Data Platform, you also benefit from native integration and data quality, flexible deployment, and ease of use. In addition, our dashboards give you visibility into activities so you can see information in an easy-to-understand format. Curious how the platform can transform data integration and management at your organization? Get a custom demo. I think you’ll be impressed.

About Derek Comingore

Derek Comingore has over two decades of experience in database and advanced analytics, including leading startups and Fortune 500 initiatives. He successfully founded and exited a systems integrator business focused on Massively Parallel Processing (MPP) technology, helping early adopters harness large-scale data. Derek holds an MBA in Data Science and regularly speaks at analytics conferences. On the Actian blog, Derek covers cutting-edge topics like distributed analytics and data lakes. Read his posts to gain insights on building scalable data pipelines.

Data Analytics

A Day in the Life of a Marketing Operations Specialist

Savannah Bruggeman

August 2, 2024

My day begins early, fueled by a strong cup of coffee, a protein smoothie, and a quick glance at the day’s agenda. As a marketing operations specialist, my role revolves around leveraging data to drive strategic decisions to improve our marketing efforts. I need a holistic, cross-channel view across the entire global marketing organization. I also need to be able to trust my data, having the confidence to know that it’s giving me the most accurate and up-to-date information.

The first task is usually a review of content performance metrics. This morning, I’m diving into the performance of content we created to support the new Actian Zen 16.0 launch. I not only need to be able to slice and dice content metrics such as views, clicks, and scroll depth, but I also have to be able to layer in lead acquisition information to see if I can attribute any new leads to the launch content. To do this effectively, I need de-siloed, integrated data that I can trust, so having a platform that allows me to connect a multitude of sources together is imperative.
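
As a simplified sketch of that kind of analysis, the example below joins content metrics with lead data using pandas; the tables and the “first_touch_url” attribution field are hypothetical stand-ins for real marketing sources:

```python
# A simplified sketch of layering lead acquisition onto content
# metrics; the tables and the "first_touch_url" attribution field
# are hypothetical stand-ins for real marketing sources.
import pandas as pd

content = pd.DataFrame({
    "url": ["/blog/zen-16-launch", "/blog/other-post"],
    "views": [1200, 300],
    "clicks": [240, 40],
})

leads = pd.DataFrame({
    "lead_id": [101, 102, 103],
    "first_touch_url": ["/blog/zen-16-launch", "/blog/zen-16-launch",
                        "/blog/other-post"],
})

# Attribute each new lead to the content piece that first brought it in.
attributed = (
    leads.groupby("first_touch_url").size()
         .rename("new_leads").reset_index()
)
report = content.merge(attributed, left_on="url",
                       right_on="first_touch_url", how="left")
print(report[["url", "views", "clicks", "new_leads"]])
```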

Using Real-Time Dashboards to Spot Trends

Mid-morning is typically spent in a strategy meeting with the marketing team. For example, today I pulled up and shared real-time dashboards to present the latest performance trends and customer behaviors. We discussed optimizing our current launch efforts and brainstormed new strategies based on the data-driven insights I presented, helping us make informed decisions about our marketing efforts and resources.

By lunchtime, it’s time to step away from my computer, grab another cup of coffee, and have lunch. Knowing that my data is being integrated, stored, and managed, and dashboards are up to date allows me to feel good about taking 15 minutes to myself, sitting outside, and playing with my cat.

The afternoon is dedicated to taking a deeper dive into campaign content performance. I look through a number of sources to understand how content performs in various markets and channels. This segmentation helps tailor our messaging for upcoming campaigns, ensuring that we target the right audience.

The Need for Trusted, Easy-to-Use Data

Actian products play a pivotal role in my daily routine. The Actian Data Platform allows me to unify all my marketing data into a single dataset in a warehouse that is built for easy, no-code reporting and analytics. Plus, the pre-built marketing connectors and APIs to marketing data sources enable self-service, so I don’t have to wait for IT or rely on them to get the insights I need. Most importantly, there is no fear of stale or duplicative data with native data quality rules. My critical dashboards are reliable and function as expected.

Reliable Tools Are a Marketer’s Best Friend

Wrapping up my day, I feel confident that the Actian Data Platform has empowered me and others across our global marketing team to make informed decisions and optimize our marketing strategies effectively. With its efficiency and reliability, the Actian Data Platform is an indispensable tool in my daily workflow, driving better outcomes for our marketing initiatives.

Customers are using our products for similar use cases. Learn how the AA uses the Actian Data Platform to make split-second decisions that deliver faster results to their customers.

About Savannah Bruggeman

Savannah Bruggeman is a Marketing Operations Specialist at Actian, bringing a data-driven mindset to campaign optimization. A recent Loyola University Chicago graduate, Savannah has quickly integrated fresh ideas into Actian's marketing processes. She specializes in marketing tech, analytics, and streamlining lead generation workflows. Her blog contributions cover marketing automation, lead management, and performance tracking. Explore her articles for actionable insights on driving marketing ROI.

Data Management

Your Guide to Application Modernization With HCL Informix®

Nick Johnson

August 1, 2024

The imperative to modernize extends across all aspects of the IT landscape. Companies face an urgent need to enhance business agility, break down organizational silos, accelerate innovation, reduce time-to-market, optimize costs, and transform their IT workforce. Achieving these goals requires strategic decisions about how and where to modernize. Your organization needs to leverage the full spectrum of available tools and services and rethink its approach to developing, deploying, operating, and maintaining applications.

When done right, application modernization can transform your company and unlock new revenue sources with new or expanded use cases. But this does not happen overnight. It takes a concerted, organization-wide effort to rethink legacy systems, adopt hybrid architectures, and embrace DevOps best practices. You can achieve greater agility, scalability, and cost savings by gradually migrating applications to microservices, optimizing infrastructure, and automating deployment pipelines.

Utilize HCL Informix to Create Your Application Modernization Plan

Insufficient research, discovery, and planning is the most common modernization mistake. Often, businesses will wait to modernize their applications until something breaks. But you don’t have to learn the hard way. It just takes a little planning.

Define Your Strategic Goals

What “success” looks like may differ across your organization. Maybe you’re most concerned with maintaining the stability and performance of your existing infrastructure, but your research and development teams may want to test and explore new technologies and concepts. Establishing your goals and objectives and separating the “must-haves” from the “nice-to-haves” is essential. Be sure to review these goals with leadership to set expectations.

Audit Your Applications

Conduct thorough discovery by auditing your existing applications and determining which level of modernization is appropriate for each. Compare applications to current standards and best practices in security, availability, scalability, infrastructure automation, monitoring, proactive failure prevention, and disaster recovery. Use the audit to determine what types of applications you want to migrate, enhance, re-build, or build from scratch.
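
One lightweight way to structure such an audit is a scoring rubric. The sketch below is a hypothetical example, not a standard; the categories, weights, and thresholds should be adapted to your organization:

```python
# A hypothetical audit-scoring rubric; the categories, 1-5 scores,
# and thresholds are examples to adapt, not a standard.
AUDIT_CATEGORIES = [
    "security", "availability", "scalability",
    "infrastructure_automation", "monitoring", "disaster_recovery",
]

def modernization_path(scores: dict[str, int]) -> str:
    """Map 1-5 audit scores to a coarse recommendation."""
    avg = sum(scores[c] for c in AUDIT_CATEGORIES) / len(AUDIT_CATEGORIES)
    if avg >= 4:
        return "migrate as-is"   # already close to best practice
    if avg >= 2.5:
        return "enhance"         # targeted improvements suffice
    return "re-build"            # gaps too large to patch

scores = {c: 3 for c in AUDIT_CATEGORIES} | {"security": 1, "monitoring": 2}
print(modernization_path(scores))  # -> enhance
```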

Determine Strategic Importance

Not all applications are equally important to your organization’s success. You will want to modernize your applications with the highest business-criticality first or those that best align with your business strategy. Those could be the applications that drive the most revenue or the ones where security and data privacy are of utmost importance. Rank these applications and work with one of our Actian partners or specialists to devise a unique modernization strategy for each mission-critical application.

Establish Your Data Landscape

Choose an architecture (cloud, on-prem, hybrid), operating system (Windows, Linux, etc.), and data management services to combine with your updated HCL Informix® instance. Devise a comprehensive architecture diagram, including critical integrations and connectors, to map any potential vulnerabilities that must be addressed.

Get Ready for a Cultural Shift

Modernization is a team sport. While your team will be familiar with your HCL Informix database and tools like HCL Informix 4GL and the Informix Warehouse Accelerator, there may be a steep learning curve to managing a cloud-native or hybrid application. Additionally, there is a whole host of tools, data warehouses, and microservices available from cloud providers that your team may have to learn. Devise a training program tailored to your database administrators, developers, architects, IT staff, and any other roles that will be impacted, so they can proficiently manage your apps once they’re migrated and modernized.

For more best-practice approaches to modernizing your Informix applications, download our free eBook.

> Get the eBook

For additional best practices, or to customize a strategy for your organization, connect with one of our Actian partners or specialists.

About Actian

Actian makes data easy. We deliver cloud, hybrid cloud, and on-premises data solutions that simplify how people connect, manage, and analyze data. We transform business by enabling customers to make confident, data-driven decisions that accelerate their organization’s growth. Our data platform integrates seamlessly, performs reliably, and delivers at industry-leading speeds. Learn more about Actian, a division of HCLSoftware: www.actian.com.

Informix® is a trademark of IBM Corporation in at least one jurisdiction and is used under license.

About Nick Johnson

Nick Johnson is a Senior Product Marketing Manager at Actian, driving the go-to-market success for HCL Informix and Actian Zen. With a career dedicated to shaping compelling messages and strategies for databases, Nick brings a wealth of experience from his impactful work at leading technology companies, including Neo4j, Microsoft, and SAS.

Data Platform

Understanding the Value of a True Data Platform

Phil Ostroff

July 31, 2024

There are many existing and emerging database solutions throughout the market today. Some offer specific functional capabilities, such as data integration, data warehousing and data analytics, whereas others offer a combination of these services, plus additional capabilities, under what is generally called a “platform.” But what really makes a true data “platform” these days? And how does the answer to this question help you, as someone approaching the market looking for a data platform, decide what’s best from a cost and functionality perspective?

What Makes a True Data Platform?

This is such an interesting question. The data journey itself consists of several stages, with handoff points that data moves through to ultimately get to a usable state. There are many use cases and requirements when it comes to what you can do with data, and why and where you can deploy it. You also need to take into account the different industries that are affected by various local or regional requirements. Additionally, there are different user types or personas who have different data needs.

Still, the data journey remains mostly the same across organizations, and many vendors in the market today claim to provide data “platforms” that support this journey. In reality, they are really only providing data warehousing with other capabilities “bolted on” via partnerships or hastily executed acquisitions. This leaves the data journey fragmented with the potential for increasing costs, reduced data quality, and increased risk.

A true data platform goes beyond only storing information. Within a single offering, it acts as a central hub, seamlessly ingesting data from various sources, organizing it efficiently, and making it readily available for analysis. This empowers users across the organization, from data scientists to business analysts, to unlock valuable insights that drive better decision-making. This holds regardless of industry, the geographic reach of the business, or the stakeholder involved.

What Are the Advantages of a True Data Platform?

A true platform, one that covers data integrations, warehousing and analytics, brings forth the following key advantages:

Streamlined Workflow

A single platform creates a smooth flow of data from its various sources (via integration) to the data warehouse for storage and then directly to the analytics tools for insights. This eliminates the need to export and import data between separate systems, saving time and effort, while avoiding the need for a variety of tools.

Improved Data Consistency

With everything on one platform, data maintains consistency throughout the process. This reduces errors caused by data transformations or discrepancies between different systems.

Faster Time to Insights

By eliminating the need to move data between multiple platforms, you can get to valuable insights much faster. This allows for quicker decision-making based on real-time data.

Simplified Management

Managing a single platform is significantly easier than managing a collection of separate tools. This reduces the IT burden and frees up resources for other tasks.

Enhanced Data Quality

A unified platform often has built-in features for data cleansing and transformation, ensuring the quality of data used for analysis.

Reduced Costs

While a single platform may have a higher upfront cost than individual solutions, it can be more cost-effective in the long run. You’ll save on licensing fees for multiple products and potentially reduce IT maintenance costs.

What Risks Do You Run if You Choose Something Else?

If you look closely at some database solution offerings, you will discover that they are not true platforms. Rather, they are components of an ecosystem: a key offering (typically a data warehouse) merged with third-party data integrations and analytics tools.

These are not true platforms due to the interdependency of each system’s connectivity with other systems. This can present a series of issues:

Data Silos and Inconsistencies

Separate systems can create data silos, where information gets trapped and isn’t readily available for analysis. This can lead to inconsistencies and discrepancies in the data across different tools.

Complex Workflows

Moving data between separate systems can be a complex and time-consuming process, involving manual steps and data transformations. This can slow down the process of getting insights from your data.

Increased Costs

The cost of licensing and maintaining multiple tools can add up quickly. Additionally, the need for additional IT resources to manage these separate systems can further increase costs.

Delayed Insights

The complexity of data movement between systems can lead to delays in getting insights from your data. This can hinder your ability to make timely decisions based on real-time information.

Reduced Data Quality

The process of moving data between systems can introduce errors and inconsistencies. Without built-in data quality checks, it’s harder to ensure the accuracy of your data for analysis.

Management Challenges

Managing and maintaining multiple data tools requires significant IT expertise. This can be a burden for smaller organizations or those with limited IT resources.

Security Concerns

Each data handoff point from one solution to another presents a weakness that could be exploited by hackers. Reducing security threats is extremely important, especially given the frequency and sophistication of recent security breaches.

Potential Customer Service Issues

Dealing with multiple vendor solutions across the data journey may lead to disparate or disconnected customer experiences. This, in turn, leads to additional time to fix issues and causes frustration.

Additionally, specific businesses may decide to make their own “platforms” by piecing together various solutions that they feel best address their own data journey requirements. In speaking with several of these businesses, we’ve learned they have run into most, if not all, of the issues listed above.

What they believe is a time- and money-saving approach to creating a system that supports their data journey becomes an expensive, heavily decentralized and ungovernable nightmare. Data gets siloed quickly and different teams handle different aspects of the data journey, which can cause a breakdown in communications and procedures, and things slowly spiral out of control.

Consider the Actian Data Platform for All Data Requirements

The Actian Data Platform provides end-to-end integration, data warehouse and analytics capabilities across your entire environment at unmatched price-performance. The platform allows you to collect, manage and analyze data with one solution, eliminating the disadvantages noted above.

Key benefits of the Actian Data Platform include:

Data Quality Monitoring

Turn your data into a trusted, strategic asset with built-in data quality rules and transformations. Enjoy features like automatic rule generation, reusable rules and rule sets, and intuitive dashboards.

High Concurrency

With Actian’s high concurrency capabilities, you can support a multitude of simultaneous queries, transactions and analytical tasks without sacrificing performance.

Vector Processing and CPU Cache Maximization

With vector processing and CPU cache maximization at its core, the data platform delivers scalability, high performance and data processing speed for real-time analytic workloads.

Advanced Columnar Storage

By embracing advanced columnar storage, the Actian Data Platform empowers users to derive insights from their data at unparalleled speeds, making it a robust choice for data analytics, reporting and business intelligence applications.
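
The intuition behind that speed can be sketched in a few lines of Python. The example below illustrates the general columnar-storage principle, not Actian’s internal implementation: summing one field stored as a contiguous column is faster than pulling the same field out of interleaved row-oriented records:

```python
# Illustrates the general principle behind columnar layouts, not
# Actian's internal implementation: scanning one column touches
# contiguous memory, while extracting the same field from
# row-oriented records is a strided walk.
import timeit
import numpy as np

n = 2_000_000

# Row-oriented layout: each record's fields are interleaved in memory.
rows = np.zeros(n, dtype=[("price", "f8"), ("qty", "f8"), ("disc", "f8")])

# Column-oriented layout: the same field as one contiguous array.
price_column = np.ascontiguousarray(rows["price"])

print(timeit.timeit(lambda: rows["price"].sum(), number=50))   # strided scan
print(timeit.timeit(lambda: price_column.sum(), number=50))    # contiguous scan
```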

REAL Real-Time Analytics

The patented technology of the Actian Data Platform allows you to keep your analytics dataset up-to-date without affecting downstream query performance – ideal for speedy analytic outcomes.

Separation of Compute and Storage

Unlike traditional monolithic systems in which computing and storage are tightly coupled, Actian’s architecture decouples these components, allowing you to independently scale compute and storage resources based on specific needs.

Don’t just take our word for it! Our platform was ranked “Exemplary” in the recent Ventana Data Platforms Buyer’s Guide (2024). This guide is based on actual product testing, looking at both the product and customer experience for each solution featured in the study.

Our platform performed exceptionally well in the “Manageability” category, a key functional area that makes a true platform stand out. Ventana’s team stated that “The growing importance of simplifying Manageability is critical and should be a priority for all software provider evaluations.” Actian also scored highly in customer service categories, an area in which other well-known vendors failed to score at all.

[Chart: the Actian Data Platform receiving “Exemplary” marks in the Ventana Data Platforms Buyer’s Guide]

The Actian Data Platform provides all of the capabilities you need to confidently take on the data journey required for your business, regardless of your industry, geography and end-user requirements.

So, if you’re looking at ecosystem-type solutions packaged together as a “platform,” or if you’re thinking of putting together several separate vendor solutions to meet your needs, consider how the Actian Data Platform can save you time, money and effort while giving you complete confidence in your data.

Consider a demo of the platform. You won’t be disappointed!

About Phil Ostroff

Phil Ostroff is Director of Competitive Intelligence at Actian, leveraging 30+ years of experience across automotive, healthcare, IT security, and more. Phil identifies market gaps to ensure Actian's data solutions meet real-world business demands, even in niche scenarios. He has led cross-industry initiatives that streamlined data strategies for diverse enterprises. Phil's Actian blog contributions offer insights into competitive trends, customer pain points, and product roadmaps. Check out his articles to stay informed on market dynamics.

Data Security

On-Premises vs. Cloud Data Warehouses: 7 Key Differences

Actian Corporation

July 31, 2024

In the constantly evolving landscape of data management, businesses are faced with the critical decision of choosing between on-premises and cloud data warehouses. This decision impacts everything from scalability and cost to security and performance.

Understanding deployment options is crucial for data analysts, IT managers, and business leaders looking to optimize their data strategies. At a basic level, stakeholders need to understand how on-premises and cloud data warehouses are different—and why those differences matter.

Having a detailed knowledge of the advantages and disadvantages of each option allows data-driven organizations to make informed buying decisions based on their strategic goals and operational needs. For example, decision-makers often consider factors such as:

  • Control over the data environment.
  • Security and compliance needs.
  • The ability to customize and scale.
  • Capital expenditure vs. operational expense.
  • Maintenance and management of the data warehouse.

The pros and cons, along with their potential impact on data management and usage, should be considered when implementing or expanding a data warehouse. It’s also important to consider future needs to ensure the data warehouse meets current, emerging, and long-term data requirements.

Location Is the Biggest—But Not the Only—Differentiator

On-premises data warehouses have been enabling enterprises since the 1980s. Early versions had the ability to integrate and store large data volumes and perform analytical queries.

By the 2010s, as organizations became more data-driven, data volumes began to explode—giving rise to the term “big data”—and technology advanced for data storage, processing power, and faster analytics. The data warehouse, usually residing in on-premises environments, became a mainstay for innovative businesses. During this time, the public cloud became popular and cloud data storage also became available, including cloud-based data warehouses.

The biggest difference between an on-premises data warehouse and a cloud version is where the infrastructure is hosted and managed. An on-premises data warehouse has the infrastructure physically located within the organization’s facilities, whereas cloud versions leverage storage in hyperscaler environments.

With on-premises data warehouses, the company is responsible for purchasing, setting up, and maintaining the hardware and software—which requires the proper skillset and resources to perform effectively. With a cloud data warehouse, the infrastructure is hosted by a cloud service provider. The provider manages the hardware and software, including maintenance, updates, and scaling. Here’s a look at other key differences.

7 Primary Differences and Their Business Impact

The fundamental differences in data location have several implications:

Overall Cost Structure

On-premises data warehouses typically require a significant upfront capital expenditure for hardware and software. This is in addition to ongoing costs for maintenance, upgrades, power, and cooling.

Cloud solutions operate on a subscription or pay-as-you-go model, which essentially avoids large capital expenditures and instead uses operational expenses. Having cloud service providers handle routine maintenance, backups, and disaster recovery can reduce the operational burden on an organization’s internal IT teams. The cloud option can ultimately be more cost-effective, provided workloads are predictable enough to avoid surprise usage-based costs.
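
As a toy illustration of how the two cost models compare over time (every figure below is a made-up placeholder, not pricing guidance):

```python
# A toy five-year cost comparison; every figure is a made-up
# placeholder, not pricing guidance for any real product.
years = 5

# On-premises: large upfront capex plus annual upkeep (opex).
onprem_capex = 500_000
onprem_annual_upkeep = 60_000  # maintenance, power, cooling, staff
onprem_total = onprem_capex + onprem_annual_upkeep * years

# Cloud: no capex, subscription/usage-based opex.
cloud_monthly = 9_000
cloud_total = cloud_monthly * 12 * years

print(f"on-prem 5-year total: ${onprem_total:,}")  # $800,000
print(f"cloud   5-year total: ${cloud_total:,}")   # $540,000
```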

Scalability

Scaling an on-premises data warehouse can be complex and time intensive, often requiring the organization to install additional hardware. Cloud data warehouses offer near-infinite scalability, allowing organizations to quickly and easily scale up or down based on demand—this is one of the primary benefits of the cloud option.

Deployment and Management

With an on-premises data warehouse, deployment can be time-consuming, involving a physical setup and extensive configurations that can take weeks or months. Managing the data warehouse also requires specialized IT staff to handle day-to-day operations, security, and troubleshooting.

The cloud speeds up deployment, often requiring just a few clicks to provision resources. The cloud provider largely handles management, freeing up internal IT staff for other tasks. Because cloud data warehouses can be up and running quickly, organizations can start deriving value sooner.

Control and Customization

Operating the data warehouse on-premises gives organizations complete control over their data and infrastructure. This gives extensive options for customization to meet specific business and data needs.

One trade-off with cloud solutions is that they do not offer the same level of control and customization compared to on-premises infrastructure. As a result, organizations may face limitations when fine-tuning specific configurations and ensuring complete data sovereignty in the cloud.

Flexibility to Meet Workloads

An on-premises data warehouse is typically limited by its physical infrastructure and the capacity that was initially implemented—unless the environment is expanded. Upgrades and changes can be cumbersome and slow. By contrast, a cloud-based data warehouse allows for quick adjustments to computing and storage resources to meet changing workload demands.

Security and Compliance

Security is managed internally with on-premises solutions, giving organizations full control, but also full responsibility. Compliance with changing industry regulations may require significant effort and resources to stay current. At the same time, organizations in industries such as finance and healthcare, where data privacy and security are paramount, may want to keep data on-prem for security reasons.

With cloud data warehouses, security and compliance of the physical hardware is managed by the cloud service provider, which often has security certifications in place. However, organizations must choose a provider that meets their specific compliance requirements. They also need internal staff who are knowledgeable about cloud configuration to ensure the infrastructure is set up correctly as part of the shared responsibility model.

Performance and Latency

These are two critical factors for data warehousing, especially when seconds—or even milliseconds—matter. On-premises solutions are known for their high performance due to their dedicated resources, while latency is minimized because data processing occurs locally. Cloud solutions may experience latency issues, but they benefit from the continuous optimization and upgrades provided by cloud vendors.

Make Informed Buying Decisions With Confidence

When deciding between on-premises and cloud data warehouses, organizations should consider specific requirements for current and future usage. Considerations include:

  • Data Volume and Growth Projections. Cloud solutions are better suited for businesses expecting rapid data growth because they offer immediate scalability.
  • Regulatory and Compliance Needs. On-premises solutions may be beneficial for organizations with strict compliance requirements because they offer complete control over data security, access, and compliance measures. This helps ensure that sensitive information is handled according to specific regulatory standards.
  • Budget and Financial Considerations. Cloud solutions offer lower initial costs and financial flexibility, which is beneficial for organizations with limited capital.
  • Business Agility. The cloud’s ability to rapidly scale and deploy resources makes it a good option for organizations that prioritize agility. Scalability allows them to respond swiftly to market changes, efficiently manage workloads, and accelerate the development and deployment of new applications and services.
  • Performance Requirements. On-premises solutions may be preferred by businesses needing high performance and low latency for workloads. Due to the proximity of data storage and computing resources, along with dedicated hardware, on-prem data warehouses can offer a performance advantage, although it’s important to note that cloud versions can offer real-time insights, too.

Consider Both Approaches With a Hybrid Solution

Choosing between on-premises and cloud data warehouses involves weighing the benefits and trade-offs of each option. Although the primary difference between on-premises and cloud data warehouses is the location and management of the infrastructure, the distinction cascades into other areas. This impacts myriad factors, such as costs, scalability, flexibility, security, and more.

By understanding key differences, data professionals, IT managers, and business decision-makers can make informed choices that align with their strategic goals. Organizations can ensure optimal data management and business success while having complete confidence in their data outcomes.

Organizations that want the benefits of both on-prem and cloud data warehouses can take a hybrid approach. A hybrid cloud data warehouse combines the scalability and flexibility of cloud with the control and security of on-premises solutions, enabling organizations to efficiently manage and analyze large volumes of data. This approach allows for seamless data integration and optimizes costs by utilizing current on-premises investments while benefiting from the scalability and flexibility offered by the cloud.

What does the future of data warehousing look like? Visit the Actian Academy for a look at where data warehousing began and where it is today.

 

actian avatar logo

About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.

Data Integration

3 Key Considerations for Crafting a Winning Data Quality Business Case

Traci Curran

July 30, 2024

In today’s rapidly evolving digital landscape, the integrity and reliability of your data can make or break your business. High data quality is not just a nice-to-have; it’s fundamental for informed decision-making, effective data management, and maintaining a competitive edge.

Ensuring your data’s accuracy, consistency, and reliability can significantly enhance operational efficiency and drive strategic initiatives. You must have confidence in your data. However, making the case for investments in the right technology to improve data quality can be challenging. It requires a well-crafted business case that clearly demonstrates its value and expected return on investment.

Understanding Data Quality

Data quality encompasses several key attributes:

  • Accuracy: How well data reflects real-world entities or events.
  • Consistency: The uniformity of data across different systems.
  • Completeness: The presence of all required data fields.
  • Timeliness: The availability of up-to-date information.
  • Validity: Adherence to specific formats and business rules.
  • Uniqueness: Absence of duplicate entries.

High-quality data offers numerous benefits, including improved efficiency, better customer satisfaction, enhanced compliance and risk management, and more effective use of emerging technologies like Generative AI (GenAI).

Recognizing the Need for Strong Data Quality

In today’s data-driven world, recognizing the need for strong data quality is crucial for any business aiming to stay competitive and efficient. Prioritizing data quality should be at the top of your agenda.

Watch for these indicators of potential data quality problems:

  • Discrepancies in data reports.
  • Poor marketing email delivery rates.
  • Declining business development efficacy.
  • Missing fields in CRM systems.
  • Increased customer or vendor complaints.
  • Inventory management issues.
  • Rising data storage and processing costs.
  • Increasing email opt-outs.

Benefits of High-Quality Data

High-quality data can transform your business operations, making them more efficient and driven by reliable insights. For instance, Actian customers like Ceva Logistics and Ebix Health rely on high-quality data to ensure that every decision is based on accurate, up-to-date, and complete information, enabling better customer relations and streamlined operations.

3 Steps to Craft a Winning Data Quality Business Case

1. Assess Current Data Quality

Start by conducting a thorough data quality assessment. Use data profiling tools to examine and understand the content, structure, and relationships within your data. This step involves reviewing data at both column and row levels and identifying patterns, anomalies, and inconsistencies, which will provide valuable insights into the quality of your data. Data auditing should also be part of this process, assessing the accuracy and completeness of data against predefined rules or standards. This initial assessment will help you pinpoint the specific areas where your data quality needs improvement.
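
For instance, a first-pass profile can be produced in a few lines with pandas; the sample records below are hypothetical stand-ins for a real customer extract:

```python
# A minimal profiling pass with pandas; the sample records are
# hypothetical stand-ins for a real customer extract.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@example.com", "b@example.com", "b@example.com", None],
    "signup_date": ["2024-01-02", "2024-02-10", "2024-02-10", None],
})

# Column-level profile: types, null rates, and distinct counts.
profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_pct": (df.isna().mean() * 100).round(1),
    "distinct": df.nunique(),
})
print(profile)

# Row-level checks: exact duplicates are a common anomaly to flag.
print("duplicate rows:", int(df.duplicated().sum()))
```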

2. Align Data Quality With Business Objectives

Next, ensure that your data quality initiatives align with your business objectives. Identify the link between business processes, key performance indicators (KPIs), and data assets. Engage with data and analytics leaders to capture their expectations and understand what is considered the “best fit” for the organization. This alignment guarantees that the data quality improvements you plan directly contribute to your business’s overall success and strategic goals.

3. Track Progress and Measure Impact

Finally, it’s crucial to track the progress of your data quality initiatives and measure their impact. Develop an organization-wide shared definition of data quality, identify specific quality metrics, and ensure continuous measurement of these metrics. Implement a data quality dashboard that provides all stakeholders with a comprehensive snapshot of data quality, helping them see past trends and design future process improvements. Regularly communicate the results and improvements to stakeholders to maintain transparency and foster a culture of continuous improvement in data quality.

By following these steps, you’ll craft a winning business case for data quality that highlights the necessity for investment and aligns closely with your strategic business goals, ensuring sustained support and success. You’ll also build confidence in your data and decision-making.

To learn more, download our new guide, 3 Considerations for Creating a Winning Data Quality Business Case, and start your journey to trusted data.

About Traci Curran

Traci Curran is Director of Product Marketing at Actian, focusing on the Actian Data Platform. With 20+ years in tech marketing, Traci has led launches at startups and established enterprises like CloudBolt Software. She specializes in communicating how digital transformation and cloud technologies drive competitive advantage. Traci's articles on the Actian blog demonstrate how to leverage the Data Platform for agile innovation. Explore her posts to accelerate your data initiatives.

Data Management

The Developer’s Guide to Choosing the Right Embedded Database

Kunal Shah

July 29, 2024

In today’s digital landscape, applications are increasingly complex, demanding efficient data management solutions. Embedded databases, with their lightweight footprint and high performance, have become essential tools for developers building applications for various platforms, from mobile devices to edge computing environments. However, the plethora of options available can be overwhelming. This guide aims to equip developers with the knowledge to select the ideal embedded database for their specific needs.

Understanding Embedded Databases

An embedded database is a database management system (DBMS) integrated directly into an application, rather than running as a separate process. This architecture offers several advantages, including:

  • Performance: Reduced network latency and overhead.
  • Reliability: No external dependencies.
  • Security: Data resides within the application’s boundaries.
  • Flexibility: Tailored to specific application requirements.

However, embedded databases also come with limitations, such as scalability and concurrent access capabilities. It’s crucial to understand these trade-offs when making a selection.

Key Considerations for Database Selection

Before diving into specific database options, let’s outline the key factors to consider when choosing an embedded database:

  • Data Model: Determine whether your application requires a key-value, document, or relational data model.
  • Data Volume and Complexity: Evaluate the size and structure of your dataset.
  • Performance Requirements: Assess the required read and write speeds, transaction throughput, and latency.
  • Storage Constraints: Consider the available storage space on the target platform.
  • Concurrency: Determine the number of concurrent users or processes accessing the database.
  • ACID Compliance: Evaluate if your application requires strict ACID (Atomicity, Consistency, Isolation, Durability) guarantees.
  • Platform Compatibility: Ensure the database supports your target platforms (e.g., mobile, embedded systems, cloud).
  • Development and Maintenance Effort: Consider the learning curve and ongoing support requirements.
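
One way to make this evaluation systematic is a simple weighted scorecard, sketched below; the criteria weights and candidate scores are placeholders to fill in from your own requirements:

```python
# A simple weighted scorecard for comparing candidate embedded
# databases; the weights and 1-5 scores are placeholders.
weights = {
    "data_model_fit": 3, "performance": 3, "footprint": 2,
    "concurrency": 2, "acid": 2, "platform_support": 3, "maintenance": 1,
}

candidates = {
    "db_a": {"data_model_fit": 4, "performance": 5, "footprint": 5,
             "concurrency": 3, "acid": 5, "platform_support": 4,
             "maintenance": 4},
    "db_b": {"data_model_fit": 5, "performance": 3, "footprint": 3,
             "concurrency": 4, "acid": 3, "platform_support": 5,
             "maintenance": 3},
}

def total_score(scores: dict[str, int]) -> int:
    """Weighted sum across all criteria."""
    return sum(weights[k] * scores[k] for k in weights)

for name, scores in sorted(candidates.items(),
                           key=lambda kv: -total_score(kv[1])):
    print(name, total_score(scores))
```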

Types of Embedded Databases

1. Key-Value Stores

    • Ideal for simple data structures with fast read and write operations.
    • Use cases: Caching, configuration settings, user preferences.

2. Document Stores

    • Suitable for storing complex, hierarchical data structures.
    • Use cases: Content management systems, IoT data, application state management.

3. Relational Databases

    • Offer structured data storage with ACID compliance.
    • Use cases: Financial applications, inventory management, analytics.

4. Time-Series Databases

    • Optimized for handling time-stamped data with high ingestion and query rates.
    • Use cases: IoT sensor data, financial time series, application performance monitoring.

Database Selection for Embedded App Development

Mobile Apps

  • Prioritize performance, low storage footprint, and offline capabilities.
  • Consider embedded document stores.
  • Optimize for battery life and device resources.

IoT Devices

  • Focus on low power consumption, high performance, and limited storage.
  • Key-value stores or embedded time-series databases are often suitable.
  • Consider data compression and encryption for security.

Database Selection for Edge-to-Cloud Data Management

Edge Processing

  • Emphasize low latency, high throughput, and offline capabilities.
  • Time-series databases or embedded document stores can be effective.
  • Consider data aggregation and filtering at the edge to reduce cloud load.
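
As a small sketch of that last point, the example below summarizes a window of raw sensor readings locally so only a compact aggregate needs to travel to the cloud (the reading format and uplink call are hypothetical):

```python
# A sketch of edge-side aggregation: summarize raw sensor readings
# locally so only compact aggregates travel to the cloud. The
# reading format and uplink call are hypothetical.
import json
import statistics
import time

def summarize(window):
    """Collapse a window of raw readings into one compact record."""
    values = [r["temp_c"] for r in window]
    return {
        "window_end": time.time(),
        "count": len(values),
        "mean": round(statistics.mean(values), 2),
        "max": max(values),
    }

window = [{"temp_c": 21.0 + i * 0.1} for i in range(60)]  # 1 reading/sec
payload = json.dumps(summarize(window))  # ~4 fields instead of 60 readings
print(payload)
# upload(payload)  # hypothetical uplink to the cloud tier
```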

Data Synchronization

  • Choose a database that supports efficient data replication and synchronization.
  • Consider hybrid approaches combining embedded and cloud databases.
  • Ensure data consistency and integrity across environments.

Conclusion

Selecting the right embedded database is crucial for the success of your application. By carefully considering the factors outlined in this guide and evaluating the specific requirements of your project, you can make an informed decision. 

Remember that the right embedded database is the one that meets your application’s needs while optimizing performance, security, and developer productivity. 

At Actian, we help organizations run faster, smarter applications on edge devices with our lightweight, embedded database – Actian Zen. Optimized for embedded systems and edge computing, Zen boasts a small footprint with fast read and write access, making it ideal for resource-constrained environments.

Zen offers seamless data synchronization from edge to cloud, is fully ACID compliant, and supports SQL and NoSQL data access from popular programming languages, allowing developers to build low-latency embedded apps.
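
As a hedged sketch of what SQL access might look like from Python, the example below assumes a Zen database reachable through an ODBC DSN named “ZENDB”; driver setup, DSN, and table names all vary by environment:

```python
# A hedged sketch of SQL access over ODBC from Python; the DSN
# ("ZENDB"), table, and columns are assumptions about your setup.
import pyodbc

conn = pyodbc.connect("DSN=ZENDB")
cursor = conn.cursor()
cursor.execute(
    "SELECT id, reading FROM sensor_log WHERE reading > ?", 100
)
for row in cursor.fetchall():
    print(row.id, row.reading)
conn.close()
```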

About Kunal Shah

Kunal Shah is a product marketer with 15+ years in data and digital growth, leading marketing for Actian Zen Edge and NoSQL products. He has consulted on data modernization for global enterprises, drawing on past roles at SAS. Kunal holds an MBA from Duke University. Kunal regularly shares market insights at data and tech conferences, focusing on embedded database innovations. On the Actian blog, Kunal covers product growth strategy, go-to-market motions, and real-world commercial execution. Explore his latest posts to discover how edge data solutions can transform your business.
Data Management

Enhance Financial Decisions With Real-Time Data Processing

Actian Corporation

July 26, 2024

Actian Zen datapoints showing Intelligent Edge Era

Article by Ashley Knoble and Derek Comingore

Cloud computing has been a dominant computing model since 2002, when Amazon Web Services (AWS) launched. In 2012, Cisco coined the term “Fog Computing,” a form of distributed computing that brings computation and data persistence closer to the edge.

Fog computing, also known as edge computing, set the stage for the current Intelligent Edge era. The Intelligent Edge is the convergence of both machine learning and edge computing, resulting in intelligence being generated where data is born. The benefits of the Intelligent Edge are many, including:

  • Reduced bandwidth consumption.
  • Accelerated time-to-insights.
  • Smart devices that take automated actions.

The Intelligent Edge requires TinyML (Tiny Machine Learning) and traditional analytics running on smaller, less powerful devices. With smaller devices come reduced disk capacities, so software install footprints must shrink accordingly.

Harnessing a single data management platform that accommodates a variety of intelligent edge use cases is preferred for consistency, reduced security surface, and data integration efficiencies. With increased data management and analytics on edge devices, security needs also increase. Security features such as data encryption quickly become required.

Embedded Databases for Edge Computing

Unlike traditional databases, embedded databases are ideal for edge computing environments for key reasons that include:

  • Small Footprint. Embedded databases require minimal storage and memory, making them ideal for devices with limited resources. This allows for smaller form factors and lower costs for edge devices.
  • Low Power Consumption. Embedded databases are designed to be energy efficient, minimizing the power drain on battery-powered devices, which is a critical concern for many edge applications.
  • Fast Performance. Real-time data processing is essential for many edge applications. Embedded databases are optimized for speed, ensuring timely data storage, retrieval, and analysis at the edge.
  • Reliability and Durability. Edge devices often operate in harsh environments. Embedded databases are designed to be reliable and durable, ensuring data integrity even in case of power failures or device malfunctions.
  • Strong Security. Security is paramount in the edge landscape, so embedded databases incorporate robust security features to protect sensitive data from unauthorized access.
  • Ease of Use. Unlike traditional databases, embedded databases are designed to be easy to set up and manage. This simplifies development and deployment for resource-constrained edge projects.

Introducing Actian Zen: An Embedded Database for Use Cases at the Edge

Actian Zen is our best-in-class multi-model embedded database for disruptive intelligent edge applications. With Zen, both partners and customers build intelligent applications running directly on and near the edge.

Additionally, traditional server and cloud-based deployments are supported. This results in a cohesive end-to-end data architecture for efficient data integration and reduced security vulnerability. Intelligent edge and edge-to-cloud applications can be deployed with confidence.

Analytics can be run directly where the data is being generated, utilizing Zen’s database technology. Actian Zen saves organizations time and simplifies what is otherwise a complicated and fragmented data architecture. Customers and partners obtain millisecond query response times with Zen’s microkernel database engine. And with native ANSI SQL support, users easily connect their favorite dashboard and data integration tools.
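
As a hedged illustration of that SQL access, with Zen's ODBC driver installed, a standard tool such as pyodbc can query edge data using plain ANSI SQL. The DSN and table names below are assumptions made for the sake of the example:

import pyodbc  # assumes the Zen ODBC driver and a DSN named "ZENDB" are configured

conn = pyodbc.connect("DSN=ZENDB")
cursor = conn.cursor()

# Standard ANSI SQL over data generated at the edge (table name is illustrative)
cursor.execute("SELECT sensor_id, AVG(reading) FROM telemetry GROUP BY sensor_id")
for sensor_id, avg_reading in cursor.fetchall():
    print(sensor_id, avg_reading)

conn.close()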

The Family of Proven Zen Products

Zen is a feature-rich intelligent edge database designed to solve a wide spectrum of industry use cases and workloads. As such, Actian offers Zen in three editions, each tailored to a distinct class of use cases.

  • Zen Mobile is designed for smart IoT and mobile devices. Deployment is achieved by embedding Zen directly in the application as a lightweight library.
  • Zen Edge offers an edition custom tailored for edge gateways and complex industrial devices.
  • Zen Enterprise enables customers and partners to solve their largest data management workloads and challenges. Zen Enterprise accommodates thousands of concurrent users while offering flexible deployment options including traditional on-premises and cloud environments.

Key Features and Benefits for Edge Environments

By leveraging Zen, companies gain immediate access to business and operational insights. Both partners and customers reduce total cost of ownership (TCO), cut expenses through reduced dependence on cloud computing and storage technologies, and improve sustainability.

Employee training is also reduced by using a single cohesive data platform. In parallel, when data must be propagated to the cloud, Zen provides a rich set of data access APIs supported by popular development frameworks and platforms.

Harness Edge Intelligence Today

With the arrival of the Intelligent Edge era comes a new set of technology and business requirements. Actian Zen, a lightweight multi-model embedded database, is at the forefront of the Intelligent Edge era. And, with the latest release of Zen 16.0, we are committed to helping companies simplify and solve for both intelligent edge and edge-to-cloud applications.

Get started today by contacting us or downloading the Actian Zen Evaluation Edition.

actian avatar logo

About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Actian Life

Get to Know Actian’s 2024 Interns

Katie Keith

July 24, 2024

Actian's 2024 interns

We want to celebrate our interns worldwide and recognize the incredible value they are bringing to our company. As a newly inducted intern myself, I am honored to have the opportunity to introduce our incredible new cohort of interns!

Andrea Brown headshot

Andrea Brown (She/Her)
Cloud Operations Engineer Intern

Andrea is a Computer Science major at the University of Houston-Downtown. She lives in Houston and in her free time enjoys practicing roller skating and learning French. Her capstone project focuses on using Grafana for monitoring resources and testing them with k6 synthetics.

What she likes most about the intern program so far is the culture. “Actian has done such a great job cultivating a culture where everyone wants to see you succeed,” she notes. “Everyone is helpful and inspiring.” From the moment she was contacted for an internship to meeting employees and peers during orientation week, she felt welcome and knew right away she had made the right choice. She has no doubt this will be a unique and unforgettable experience, and she is looking forward to learning more about her capstone project and connecting with people across the organization.

Claire Li headshot

Claire Li (She/Her)
UX Design Intern

Claire is based in Los Angeles and is studying interaction design at ArtCenter College of Design. For her capstone project, she will create interactive standards for the Actian Data Platform and apply them to reusable components and the onboarding experience to enhance the overall user experience.

“Actian fosters a positive and supportive environment for interns to learn and grow,” she says.

Claire enjoys the collaborative atmosphere and the opportunity to tackle real-world challenges. She looks forward to seeing how she and her fellow interns will challenge themselves to problem-solve, present their ideas, and bring value to Actian in their unique final presentations. Outside of work, she spends most of her weekends hiking and capturing nature shots.

Prathamesh Kulkarni headshot

Prathamesh Kulkarni (He/Him)
Cloud QA Intern

Prathamesh is working toward his master’s degree in Computer Science at The University of Texas at Dallas. He is originally from Pune, India.

His capstone project aims to streamline the development of Actian’s in-house API test automation tool and research the usability of GitHub Copilot in API test automation.

By automating these tasks, he and his team can reduce manual effort and expedite the creation of effective and robust test automation solutions. The amazing support he has received and the real value of the work he has been involved in have been highlights of his internship so far. He says it’s been a rewarding experience to apply what he has learned in a practical setting and see the impact of his contributions.

A fun fact about him is that he loves washing dishes—it’s like therapy to him, and he even calls himself a professional dishwasher! He is also an accomplished Indian classical percussion musician and holds a degree in that field.

Marco Brodkorb headshot

Marco Brodkorb
Development Vector Intern

Hailing from Thuringia, Germany, Marco is working on his master’s degree in Computer Science at Technische Universität Ilmenau. He began his work as an Actian intern by writing unit tests and then moved on to integrating a new compression method for strings called FSST.

As part of his master’s thesis, he is integrating a more efficient range join algorithm that uses ad hoc generated UB-Trees.

Naomi Thomas headshot

Naomi Thomas (She/Her)
Education Team Intern

Naomi is from Florida and is a graduate student at the University of Central Florida pursuing a master’s degree in Instructional Design & Technology. She has five years of experience working in the education field with an undergraduate degree in Education Sciences.

For her capstone project, Naomi is diving into the instructional design process to create a customer-facing course on DataConnect 12.2 for Actian Academy. She is enjoying the company culture and the opportunity to learn from experienced instructional designers and subject matter experts. “Everyone has been incredibly welcoming and supportive, and I’m excited to be working on a meaningful project with a tangible impact!” she says.

A fun fact about her is that she has two adorable dogs named Jax and King. She enjoys reading and collecting books in her free time.

Linnea Castro headshot

Linnea Castro (She/Her)
Cloud Operations Engineer Intern

Linnea is majoring in Computer Science at Washington State University. She is working with the Cloud Operations team to convert Grafana observability dashboards into source code—effective observability helps data tell a story, while converting these dashboards to code will make the infrastructure that supports the data more robust.

She has loved meeting new people and collaborating with the Cloud team. Their morning sync meetings bring together people across the U.S. and U.K. She says that getting together with the internship leaders and fellow interns during orientation week set a tone of connection and possibility that continues to drive her each day. Linnea is looking forward to continuing to learn about Grafana and get swifter with querying. To that end, she is eager to learn as much as she can from the Cloud team and make a meaningful contribution.

She has three daughters who are in elementary school and is a U.S. Coast Guard veteran. Her favorite book is “Mindset” by Dr. Carol Dweck because it introduced her to the concept and power of practicing a growth mindset.

Alain Escarrá García headshot

Alain Escarrá García (He/Him)
Development Vector Intern

Alain is from Cuba and just finished his first year of bachelor studies at Constructor University in Bremen, Germany, where he is majoring in Software, Data, and Technology. Working with the Actian Vector team, his main project involves introducing microservice architecture for user-defined Python functions. In his free time, he enjoys music, both listening to it and learning to play different instruments.

Matilda Huang headshot

Matilda Huang (She/Her)
CX Design Intern

Matilda is pursuing her master’s degree in Technology Innovation at the University of Washington. She is participating in her internship from Seattle. Her capstone project focuses on elevating the voice of our customers. She aims to identify friction points in our current feedback communication process and uncover areas of opportunity for CX prioritization.

Matilda is enjoying the opportunity to collaborate with members from various teams and looks forward to connecting with more people across the company.

Liam Norman headshot

Liam Norman (He/Him)
Generative AI Intern

Liam is a senior at Harvard studying Computer Science. His capstone project involves converting natural language queries into SQL queries to assist Actian’s sales team.

So far, his favorite part of the internship was meeting the other interns at orientation week. A fun fact: In his free time, he likes to draw cartoons and play the piano.

Laurin Martins headshot

Laurin Martins (He/Him)
Development Vector Intern

Laurin is from a small village near Frankfurt, Germany, called Langebach and is studying for a master’s degree in IT at TU Ilmenau. His previous work for Actian includes his bachelor thesis “Multi-key Sorting in Vectorized Query Execution.”

After that, he completed an internship to implement the proposed algorithms for a wide variety of data types. He is currently working on his master’s thesis titled “Elastic Query Processing in Stateless x100.” He plans to further develop the ideas and implementation presented in his master’s thesis in a Ph.D. program in conjunction with TU Ilmenau.

In his free time, he discovered that Dungeons and Dragons is a great evening board game to play with friends. He is also the software development lead at a startup (https://healyan.com).

Kelsey Mulrooney headshot

Kelsey Mulrooney (She/Her)
Cloud Security Engineer Intern

Kelsey is from Wilmington, Delaware, and majoring in Cybersecurity at the Rochester Institute of Technology. She is involved in implementing honeypots—simulated systems designed to attract and analyze hacker activities.

Kelsey’s favorite part about the internship program so far is the welcoming environment that Actian cultivates. She looks forward to seeing how much she can accomplish in the span of 12 weeks. Outside of work, Kelsey enjoys playing percussion, specifically the marimba and vibraphone.

Justin Tedeschi headshot

Justin Tedeschi (He/Him)
Cloud Security Engineer Intern

Justin is from Long Island, New York, and an incoming senior at the University of Tampa. He’s majoring in Management Information Systems with a minor in Cybersecurity. At Actian, he’s learning about vulnerabilities in the cloud and how to spot, understand, and prevent them.

The internship program provides access to a variety of resources, which he’s taking full advantage of, including the chance to interact with knowledgeable and understanding colleagues. A fun fact about Justin is that he used to be a collegiate runner—one year at the University of Buffalo, a Division 1 school, then another year at the college he’s currently attending, which is Division 2.

Guillermo Martinez Alacron
Development Vector Intern

Hailing from Mexico, Guillermo is studying Industrial Engineering and participating in an exchange at TU Ilmenau in Germany. As part of his internship, he is working on the design and implementation of a quality management system in order to obtain the ISO 9001 certification for Actian. He enjoys Star Wars, rock music, and sports—and is especially looking forward to the Olympics!

Joe Untrecht headshot

Joe Untrecht (He/Him)
Cloud Operations Engineer Intern

Joe is from Portola Valley, California, which is a small town near Palo Alto. He is heading into his senior year at the University of Wisconsin-Madison, majoring in Computer Science. He loves the school and cannot recommend it enough. One interesting fact about him is that he loves playing Hacky Sack and is about to start making custom hacky sacks. Another is that he loves all things Star Wars and believes “Revenge of the Sith” is clearly the best movie. His favorite dessert is cookies and milk.

His capstone project involves cloud resource monitoring. He has been learning how to use the various services on Amazon Web Services, Google Cloud, and Microsoft Azure while practicing how to visualize the data and use the services on Grafana. He has had an immense amount of fun working with these platforms and doesn’t think he has ever learned more than in the first three weeks of his internship. He views the internship as a great opportunity to improve his skills and build new ones. He is “beyond grateful” for this opportunity and excited to continue learning about Actian and working on his capstone project.

Jon Lumi headshot

Jon Lumi (He/Him)
Software Development Intern

Jon is from Kosovo and is a second-year Computer Science student at Constructor University in Bremen, Germany. He is working at the Actian office in Ilmenau, Germany, and previously worked as a teaching assistant at his university for first-year courses.

His experience as an Actian intern has been nothing short of amazing because he has not only had the opportunity to grow professionally through the guidance of supervisors and the challenges he faced, but also to learn in a positive and friendly environment. Jon is looking forward to learning and experiencing even more of what Actian offers, and having a good time along the way.

Davis Palmer headshot

Davis Palmer (He/Him)
Engineering Intern, Zen Hardware

Davis is double majoring in Mechanical Engineering and Applied Mathematics. He’s also earning a minor in Computer Science at Texas A&M University.

His capstone project, with the Actian Zen team, consists of designing and constructing a smart building outfitted with a variety of IoT devices. He “absolutely loves” the work he has been doing and all the people he has interacted with. Davis is looking forward to all of the intern events for the rest of the summer.

Matthew Jackson headshot

Matthew Jackson (He/Him)
Engineering Intern, Zen Hardware

Matthew is working with the Actian Zen team. He grew up only a few miles from Actian’s office in Round Rock, Texas. Going into his junior year at Colorado School of Mines in Golden, Colorado, he’s working on two majors: Computer Science with a focus on Data Science, and Electrical Engineering with a focus on Information & Systems Sciences (ISS).

Outside of school, he plays a bit of jazz and other genres as a keyboardist and trumpeter. He is a huge fan of playing winter sports like hockey, skiing, and snowboarding. This summer at Actian, he is working alongside another hardware engineering intern for Actian Zen, Davis Palmer, to build a smart model office building to act as a tech demo for Zen databases. His part of the project is performing all the high-level development, which includes conducting web development, developing projects with facial recognition AI, and other tasks at that level of abstraction. He is super interested in the project assigned to him and is excited to see where it goes… 

Fedor Gromov
Development Vector Intern

Fedor is from Russia and working at the Actian office in Germany. He is attending a master’s program in Computer Science at Constructor University in Bremen. He’s working with the microservices team on adding ONNX support. His current hobby is bouldering.

Katie Keith headshot

Katie Keith (She/Her)
Employee Experience Intern

Katie is from Vail, Colorado, and an upcoming senior at Loyola University in Chicago. She is receiving her BBA in Finance with a minor in Psychology. For her capstone project, she is working with the Employee Experience team to put together a Pilot Orientation Program for the new go-to-market strategy employees.

She has really enjoyed Actian’s company culture and getting to learn from her team. Katie is looking forward to cheering on her fellow interns during their capstone presentations at the completion of the internship program. In her free time, she enjoys seeing stage productions and reading. She is super thankful to be part of the Actian team!

Katie Keith headshot

About Katie Keith

Katie Keith is pursuing a BBA in Finance at Loyola University in Chicago, contributing to Actian's Employee Experience team. She has collaborated on a Pilot Orientation Program for new go-to-market employees, leveraging her academic research and interpersonal skills. Katie has studied the intersection of psychology and business, providing unique perspectives on employee engagement. Her blog entries at Actian reflect her passion for organizational development and onboarding. Stay tuned for her insights on creating impactful employee experiences.
Data Integration

Efficient Integrations: How to Slash Costs and Boost Efficiency

Traci Curran

July 23, 2024

Efficient Integrations with Actian

In today’s dynamic global business climate, the drive for efficiency and cost reduction has never been more pressing. The key to unlocking these gains lies in efficient integrations, which optimize data workflows and streamline operations. With the increasing complexity and volume of data, the need for seamless integration across various platforms and systems can profoundly impact both top-line growth and bottom-line savings. Efficient integrations enhance operational efficiency and pave the way for innovation and competitive advantage. Your organization can significantly improve financial performance by harnessing the power of data integration and leveraging the right technology.

To create efficient integrations within your organization, focus on several key areas: optimizing business operations, leveraging automation to enhance efficiency, implementing cost-effective reporting and analytics, and using cloud integration to reduce expenses. Each of these components is crucial for developing a strategy that reduces costs and increases efficiency. By understanding and adopting these integration practices, you’ll streamline data workflows and set the foundation for scalable growth and improved business agility. Let’s explore how transforming your approach to integration can turn challenges into opportunities for optimization and innovation.

Optimizing Business Operations for Efficient Integration

Streamlining Data Management

  1. Adopt Best Practices: Implementing data management best practices ensures streamlined operations and aids in decision-making. By eliminating data silos, seamless data integration becomes possible, presenting a coherent perspective of your business operations.
  2. Harness Automation: The synergy of data analytics and integration workflow automation transforms raw data into actionable insights, reshaping decision-making processes.
  3. Enhance Accessibility: Ensuring data accessibility is critical. Modern BI tools provide row-level security, allowing tailored data access while maintaining confidentiality. This enables employees to access relevant data promptly, fostering a proactive approach in all business endeavors.

Enhanced Business Insights

  1. Utilize BI Tools: Business Intelligence (BI) tools transform large datasets into actionable insights, facilitating strategic planning and resource optimization. These tools provide a comprehensive overview of various business aspects, enhancing decision-making capabilities.
  2. Leverage Data Analytics: Data analytics is pivotal in decoding customer behavior and steering companies toward smarter decisions. It helps identify areas of excess and untapped resources, allowing for more effective resource allocation.
  3. Continuous Improvement: Business process improvement should be continuous as businesses evolve and expand. Implementing data and application integration tools can provide insights into potential bottlenecks and optimization opportunities, improving operational efficiency.

Automation and Efficiency

Reducing Manual Work With Automation

  1. Streamline Repetitive Tasks: Automation technologies significantly reduce the time spent on repetitive tasks such as data entry and scheduling, which are often cited as productivity killers. By automating these tasks, employees can focus on more strategic activities contributing to the organization’s growth.
  2. Enhance Workflow Efficiency: Implementing automation can eliminate the need for manual intervention in routine tasks, allowing processes to operate more smoothly and reliably. This speeds up operations and reduces the risk of errors, making workflows more efficient.

Improving Process Accuracy

  1. Minimize Human Errors: One of the most significant advantages of automation is its ability to perform tasks with high precision. Automated systems are less prone to the lapses in concentration that affect human workers, ensuring that each task is performed accurately and consistently.
  2. Increase Data Integrity: Automation minimizes human errors in data handling, from entry to analysis, enhancing the reliability of business operations. This improved accuracy is crucial for making informed decisions and maintaining high-quality standards across the organization.

Cost-Effective Reporting and Analytics

Simplifying Reporting

  1. Refinement of Business Information Management Systems: Simplifying your business information management systems reduces complexity and can cut reporting and governance costs across your organization by up to 15%.
  2. Automation of Reporting Processes: By automating manual steps in your reporting process, you can achieve quicker, more responsive, and more accurate financial reporting. This frees up resources and minimizes the scope for human error, allowing for better decision-making and potential spending reductions.
  3. Enhanced Data Integrity and Accuracy: Implementing workflow automation reduces errors and increases data integrity, crucial for accurate reporting and informed decision-making.

Utilizing Data Warehousing

Cloud-Based Solutions: Transitioning to cloud-based data warehousing solutions like Actian can offer scalability, flexibility, and significant cost savings by reducing the operational pain points associated with traditional hardware.

Cost Optimization Strategies: Employing data compression, optimized ETL processes, and consumption-based pricing models in data warehousing can control expenses and align costs with usage, thereby reducing overall storage and management costs.

Data and application integration solutions offer substantial benefits that can transform your organization. By streamlining operations, enhancing data accessibility, and fostering real-time decision-making, these solutions drive efficiency and innovation. They enable seamless communication between systems, reduce redundancy, and improve data accuracy. Furthermore, integrating disparate applications and data sources provides a unified view of business processes, empowering your organization to respond swiftly to market changes and customer needs. Ultimately, embracing data and application integration is a strategic move that supports growth, scalability, and a competitive edge in today’s fast-paced business environment.

Traci Curran headshot

About Traci Curran

Traci Curran is Director of Product Marketing at Actian, focusing on the Actian Data Platform. With 20+ years in tech marketing, Traci has led launches at startups and established enterprises like CloudBolt Software. She specializes in communicating how digital transformation and cloud technologies drive competitive advantage. Traci's articles on the Actian blog demonstrate how to leverage the Data Platform for agile innovation. Explore her posts to accelerate your data initiatives.
Data Management

Real-Time Data Processing With Actian Zen and Kafka Connectors

Johnson Varughese

July 17, 2024

data processing with actian zen and apache kafka

Welcome back to the world of Actian Zen, a versatile and powerful edge data management solution designed to help you build low-latency embedded apps. In Part 1, we explored how to leverage BtrievePython to run Btrieve2 Python applications, using the Zen 16.0 Enterprise/Server Database Engine.

This is Part 2 of the quickstart blog series that focuses on helping embedded app developers get started with Actian Zen. In this blog post, we’ll walk through setting up a Kafka demo using Actian Zen, demonstrating how to manage and process real-time financial transactions seamlessly. This includes configuring environment variables, using an orchestration script, generating mock transaction data, leveraging Docker for streamlined deployment, and utilizing Docker Compose for orchestration.

Introduction to Actian Zen Kafka Connectors

In the dynamic world of finance, processing and managing real-time transactions efficiently is a must-have. Actian Zen’s Kafka Connectors offer a robust solution for streaming transaction data between financial systems and Kafka topics. The Actian Zen Kafka Connectors facilitate seamless integration between Actian Zen databases and Apache Kafka. These connectors support both source and sink operations, allowing you to stream data out of a Zen Btrieve database into Kafka topics or vice versa.

Source Connector

The Zen Source connector streams JSON data from a Zen Btrieve database into a Kafka topic. It employs change capture polling to pick up new data at user-defined intervals, ensuring that your Kafka topics are always updated with the latest information from your Zen databases.

Sink Connector

The Zen Sink connector streams JSON data from a Kafka topic into a Zen Btrieve database. You can choose to stream data into an existing database or create a new one when starting the connector.
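
As a rough sketch, a sink configuration mirrors the source example shown later. Note that the connector class name and the Zen-specific parameters below are assumptions patterned on the source connector's naming; consult the Zen Kafka Connector documentation for the exact property names:

{
    "name": "financial-transactions-sink",
    "config": {
        "connector.class": "com.actian.zen.kafka.connect.sink.BtrieveSinkConnector",
        "db.filename.param": "transactions_sink.mkd",
        "server.name.param": "financial_db",
        "tasks.max": "1",
        "topics": "transactionLog",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.storage.StringConverter"
    }
}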

Setting Up Environment Variables

Before diving into the configuration, it’s essential to set up the necessary environment variables. These variables ensure that your system paths and library paths are correctly configured, and that you accept the Zen End User License Agreement (EULA).

Here’s an example of the environment variables you need to set:

export PATH="/usr/local/actianzen/bin:/usr/local/actianzen/lib64:$PATH"
export LD_LIBRARY_PATH="/usr/local/actianzen/lib64:/usr/lib64:/usr/lib"
export CLASSPATH="/usr/local/actianzen/lib64"
export CONNECT_PLUGIN_PATH='/usr/share/java'
export ZEN_ACCEPT_EULA="YES"

Configuring the Kafka Connectors

The configuration parameters for the Kafka connectors are provided as key-value pairs. These configurations can be set via a properties file, the Kafka REST API, or programmatically. Here’s an example JSON configuration for a source connector:

{
    "name": "financial-transactions-source",
    "config": {
        "connector.class": "com.actian.zen.kafka.connect.source.BtrieveSourceConnector",
        "db.filename.param": "transactions.mkd",
        "server.name.param": "financial_db",
        "poll.interval.ms": "2000",
        "tasks.max": "1",
        "topic": "transactionLog",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "value.converter": "org.apache.kafka.connect.storage.StringConverter",
        "topic.creation.enable": "true",
        "topic.creation.default.replication.factor": "-1",
        "topic.creation.default.partitions": "-1"
    }
}

You can also define user queries for more granular data filtering using the JSON query language detailed in the Btrieve2 API Documentation. For example, to filter for transactions greater than or equal to $1000:

"\"Transaction\":{\"Amount\":{\"$gte\":1000}}"

Orchestration Script: kafkasetup.py

The kafkasetup.py script automates the process of starting and stopping the Kafka connectors. Here’s a snippet showing how the script sets up connectors:

import requests

def main():
    # Map a display name to (connector config, Kafka Connect REST port)
    requestMap = {}
    requestMap["Financial Transactions"] = ({
        "name": "financial-transactions-source",
        "config": {
            "connector.class": "com.actian.zen.kafka.connect.source.BtrieveSourceConnector",
            "db.filename.param": "transactions.mkd",
            "server.name.param": "financial_db",
            "poll.interval.ms": "2000",
            "tasks.max": "1",
            "topic": "transactionLog",
            "key.converter": "org.apache.kafka.connect.storage.StringConverter",
            "value.converter": "org.apache.kafka.connect.storage.StringConverter",
            "topic.creation.enable": "true",
            "topic.creation.default.replication.factor": "-1",
            "topic.creation.default.partitions": "-1"
        }
    }, "8083")

    # Register each connector with the Kafka Connect REST API
    for name, requestTuple in requestMap.items():
        input("Press Enter to continue...")
        (request, port) = requestTuple
        print("Now starting " + name + " connector")
        try:
            r = requests.post("http://localhost:" + port + "/connectors", json=request)
            print("Response:", r.json())  # call .json() to decode the response body
        except Exception as e:
            print("ERROR: ", e)
    print("Finished setup!...")

    # Delete the connectors again on shutdown
    input("\n\nPress Enter to begin shutdown")
    for name, requestTuple in requestMap.items():
        (request, port) = requestTuple
        try:
            r = requests.delete("http://localhost:" + port + "/connectors/" + request["name"])
        except Exception as e:
            print("ERROR: ", e)

if __name__ == "__main__":
    main()

When you run the script, it prompts you to start each connector one by one, ensuring everything is set up correctly.

Generating Transaction Data With data_generator.py

The data_generator.py script simulates financial transaction data, creating transaction records at specified intervals. Here’s a look at the core function:

import sys
import os
import signal
import json
import random
from time import sleep
from datetime import datetime

# Make the Zen Python bindings importable
sys.path.append("/usr/local/actianzen/lib64")
import btrievePython as BP

class GracefulKiller:
    """Flips a flag on SIGINT/SIGTERM so the main loop can exit cleanly."""
    kill_now = False

    def __init__(self):
        signal.signal(signal.SIGINT, self.exit_gracefully)
        signal.signal(signal.SIGTERM, self.exit_gracefully)

    def exit_gracefully(self, *args):
        self.kill_now = True

def generate_transactions():
    client = BP.BtrieveClient()
    assert client is not None

    collection = BP.BtrieveCollection()
    assert collection is not None

    # Create (if necessary) and open the collection named by the environment
    collectionName = os.getenv("GENERATOR_DB_URI")
    rc = client.CollectionCreate(collectionName)
    rc = client.CollectionOpen(collection, collectionName)
    assert rc == BP.Btrieve.STATUS_CODE_NO_ERROR, BP.Btrieve.StatusCodeToString(rc)

    interval = int(os.getenv("GENERATOR_INTERVAL"))
    kill_condition = GracefulKiller()

    # Insert one random transaction document per interval until signaled to stop
    while not kill_condition.kill_now:
        transaction = {
            "Transaction": {
                "ID": random.randint(1000, 9999),
                "Amount": round(random.uniform(10.0, 5000.0), 2),
                "Currency": "USD",
                "Timestamp": str(datetime.now())
            }
        }
        print(f"Generated transaction: {transaction}")
        documentId = collection.DocumentCreate(json.dumps(transaction))
        if documentId < 0:
            print("DOCUMENT CREATE ERROR: " + BP.Btrieve.StatusCodeToString(collection.GetLastStatusCode()))
        sleep(interval)

    rc = client.CollectionClose(collection)
    assert rc == BP.Btrieve.STATUS_CODE_NO_ERROR, BP.Btrieve.StatusCodeToString(rc)

if __name__ == "__main__":
    generate_transactions()

This script runs an infinite loop, continuously generating and inserting transaction data into a Btrieve collection.
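
Outside of Docker, the generator can also be exercised directly on a machine with the Zen client installed; for example (the values mirror the Docker Compose file shown below):

export GENERATOR_DB_URI="transactions.mkd"
export GENERATOR_INTERVAL="5"
python3 data_generator.py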

Using Docker for Deployment

To facilitate this setup, we use a Docker container. Here’s the Dockerfile that sets up the environment to run our data generator script:

FROM actian/zen-client:16.00
USER root
RUN apt update && apt install python3 -y
COPY --chown=zen-svc:zen-data data_generator.py /usr/local/actianzen/bin
ADD _btrievePython.so /usr/local/actianzen/lib64
ADD btrievePython.py /usr/local/actianzen/lib64
USER zen-svc
CMD ["python3", "/usr/local/actianzen/bin/data_generator.py"]

This Dockerfile extends from the Actian Zen client image, installs Python, and includes the data generation script. By building and running this Docker container, we can generate and stream transaction data into Kafka topics as configured.

Docker Compose for Orchestration

To manage and orchestrate multiple containers, including Kafka, Zookeeper, and our data generator, we use Docker Compose. Here’s the docker-compose.yml file that brings everything together:

version: '3.8'
services:
  zookeeper:
    image: wurstmeister/zookeeper:3.4.6
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka:2.13-2.7.0
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT
      KAFKA_LOG_RETENTION_HOURS: 1
      KAFKA_MESSAGE_MAX_BYTES: 10485760
      KAFKA_BROKER_ID: 1
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
  actianzen:
    build: .
    environment:
      GENERATOR_DB_URI: "transactions.mkd"
      GENERATOR_LOCALE: "Austin"
      GENERATOR_INTERVAL: "5"
    volumes:
      - ./data:/usr/local/actianzen/data

This docker-compose.yml file sets up Zookeeper, Kafka, and our Actian Zen data generator in a single configuration. By running docker-compose up, we can spin up the entire stack and start streaming financial transaction data into Kafka topics in real-time.
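
A typical invocation builds the generator image, starts the stack in the background, and tails the generator's logs:

docker-compose build
docker-compose up -d
docker-compose logs -f actianzen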

Visualizing the Kafka Stream

To give you a better understanding of the data flow in this setup, here’s a diagram illustrating the Kafka stream:

actian zen database with kafka source connector

In this diagram, the financial transaction data flows from the Actian Zen database through the Kafka source connector into the Kafka topics. The data can then be consumed and processed by various downstream applications.
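
For example, a minimal downstream consumer, sketched here with the third-party kafka-python package and assuming the broker is reachable on localhost:9092, could subscribe to the topic and react to each transaction:

from kafka import KafkaConsumer  # pip install kafka-python
import json

consumer = KafkaConsumer(
    "transactionLog",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

# Each record carries one transaction document produced by the source connector
for record in consumer:
    txn = record.value["Transaction"]
    if txn["Amount"] >= 1000:
        print(f"Large transaction {txn['ID']}: ${txn['Amount']}")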

Checking the logs confirms the stream is healthy:

  • Kafka Connect: Instances are joining groups and syncing properly, with tasks and connectors configured and started as expected.
  • Financial Transactions: Transactions from both New York and San Francisco are processed and logged correctly, including a variety of credit and debit actions with varying amounts and timestamps.

Zen and Kafka Connectors

Conclusion

Integrating Actian Zen with Kafka Connectors provides a powerful solution for real-time data streaming and processing. By following this guide, you can set up a robust system to handle financial transactions, ensuring data is efficiently streamed, processed, and stored. This setup not only demonstrates the capabilities of Actian Zen and Kafka but also highlights the ease of deployment using Docker and Docker Compose. Whether you’re dealing with financial transactions or other data-intensive applications, this solution offers a scalable and reliable approach to real-time data management.

For further details and visual guides, refer to the Actian Academy and the comprehensive documentation. Happy coding!

Johnson Varughese headshot

About Johnson Varughese

Johnson Varughese manages Support Engineering at Actian, assisting developers leveraging ZEN interfaces (Btrieve, ODBC, JDBC, ADO.NET, etc.). He provides technical guidance and troubleshooting expertise to ensure robust application performance across different programming environments. Johnson's wealth of knowledge in data access interfaces has streamlined numerous development projects. His Actian blog entries detail best practices for integrating Btrieve and other interfaces. Explore his articles to optimize your database-driven applications.