Data Intelligence

Data Governance Framework | S02-01 – Organizing Your Data Office

Actian Corporation

May 17, 2021


This is the first episode of the second season of “The Effective Data Governance Framework” series.

Divided into three episodes, this second season will focus on Adaptation. It consists of:

  • Organizing your Data Office
  • Building a data community  
  • Creating Data Awareness

For this first episode, we will give you the keys to building your data personas and setting up a clear and well-defined Data Office. 

Season 1: Alignment

Evaluate your Data maturity

Specify your Data strategy

Getting sponsors

Build a SWOT analysis

Season 2: Adapting

Organize your Data Office

Organize your Data Community

Creating Data Awareness

Season 3: Implementing Metadata Management With a Data Catalog

The importance of metadata

6 weeks to start your data governance journey

In the first season, we shared our best practices to help you align your data strategy with your company. For us, it is essential to:

  • Assess the maturity of your data.
  • Specify your Data Strategy by building OKRs.
  • Get sponsorship.
  • Build an effective SWOT analysis.

In this first episode, we will teach you how to build your Data Office.

The Evolution of Data Offices in Companies

We believe in Agile Data Governance.

Previous implementations of data governance within organizations have rarely been successful. The Data Office often focuses too much on technical management or a strict control of data.

For data users who strive to experiment and innovate around data, Data Office behavior is often synonymous with restrictions, limitations, and cumbersome bureaucracy.

Some will have gloomy visions of data locked up in dark catacombs, only accessible after months of administrative hassle. Others will recall the wasted energy at meetings, updating spreadsheets and maintaining wikis, only to find that no one was ever benefiting from the fruits of their labor.

Companies today are conditioned by regulatory compliance to guarantee data privacy, data security, and to ensure risk management.

That said, taking a more offensive approach to improving the use of data in an organization, by making sure the data is useful, usable, and exploited, is a crucial undertaking.

Adopting modern organizational paradigms and new ways of interacting is a good way to set up an efficient, flat Data Office organization.

Below are the typical roles of a Data Office, although very often, some roles are carried out by the same person:

  • Chief Data Officer
  • Data-related Portfolio/Program/Project Managers
  • Data Engineers/Architects
  • Data Scientists
  • Data Analysts
  • Data Stewards

Creating Data Personas

An efficient way of specifying the roles of Data Office stakeholders is to work on their personas.

By conducting one-on-one interviews, you will learn a lot about each stakeholder: their context, goals, and expectations. The OKRs map is a good guide for building these personas, as it helps you ask precise questions.

Here is an example of a persona template:

Some Useful Tips:

  • Make it fun: choose an avatar or a photo for each team member, write a short personal and professional bio, list their intrinsic values, and work on the look and feel.
  • Build one persona for each person; don’t build personas for teams.
  • Be very precise in the persona definition interviews, and rephrase if necessary.
  • Treat people with respect and consider all ideas equally.
  • Print the personas and display them on the office walls for all Data Office team members to see.

Building Cross-Functional Teams

In order to get rid of data and organizational silos, we recommend organizing your Data Office into Feature Teams (see the literature on the Spotify feature-team framework).

The idea is to build cross functional teams to address a specific feature expected by your company.

The Spotify Model Defines the Following Teams:

Squads

Squads are cross-functional, autonomous teams that focus on one feature area. Each Squad has a unique mission that guides the work they do.

In season 1, episode 2, in our OKRs example, the CEO has 3 OKRs and the first OKR (Increase online sales by 2%) has generated 2 OKRs:

  • Get the Data Lake ready for growth, handled by the CIO
  • Get the data governed for growth, handled by the CDO.

There would then be 2 squads:

  • Feature 1: get the Data Lake ready for growth
  • Feature 2: get data governed for growth.
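The cascade above can be sketched as a small tree of OKRs in which each child OKR becomes the mission of one Squad. The objectives and owners come from the example; the `OKR` class itself is purely illustrative, not part of any formal framework:

```python
# Minimal sketch: an OKR cascade where each child OKR maps to one Squad.
from dataclasses import dataclass, field

@dataclass
class OKR:
    objective: str
    owner: str                                    # role accountable for the objective
    children: list = field(default_factory=list)  # OKRs derived from this one

# The CEO-level OKR from season 1, episode 2, and the two OKRs it generated.
ceo_okr = OKR("Increase online sales by 2%", "CEO", [
    OKR("Get the Data Lake ready for growth", "CIO"),
    OKR("Get the data governed for growth", "CDO"),
])

def squads_for(okr):
    """Each child OKR becomes the mission of one autonomous Squad."""
    return [f"Squad: {child.objective} ({child.owner})" for child in okr.children]

print(squads_for(ceo_okr))
```

Keeping the OKR map and the Squad list derived from one another this way makes it hard for a Squad to drift away from a stated objective.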

Tribes

One level up, multiple Squads working on the same feature area coordinate with each other. Together, they form a Tribe. Tribes help build alignment across Squads. Each Tribe has a Tribe Leader who is responsible for helping coordinate across Squads and encouraging collaboration.

In our example, for the Squad in charge of the feature “Get Data Governed for growth”, our OKRs map tells us that there is a Tribe in charge of “Get the Data Catalog ready”.

Chapter

Even though Squads are autonomous, it’s important that specialists (Data Stewards, Analysts) align on best practices. Chapters are the family that each specialist has, helping to keep standards in place across a discipline.

Guild

Team members who are passionate about a topic can form a Guild, which essentially is a community of interest (for example: data quality). Anyone can join a Guild and they are completely voluntary. Whereas Chapters belong to a Tribe, Guilds can span different Tribes. There is no formal leader of a Guild. Rather, someone raises their hand to be the Guild Coordinator and help bring people together.

Here is an example of a Feature Team organization:

Don’t miss next week’s S02 E02:

Building Your Data Community, where we will help you adapt your organization in order to become more data-driven.


About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Data Intelligence

Data Governance Framework | S01-E04 – SWOT Analysis

Actian Corporation

May 9, 2021


This is the fourth episode of our series “The Effective Data Governance Framework”. Split into three seasons, the series begins with a first season focused on Alignment: understanding the context, finding the right people, and preparing an action plan for your data-driven journey.

This episode will give you the keys to building a concrete and actionable SWOT analysis.

Season 1: Alignment

Evaluate your data maturity

Specify your data strategy

Getting sponsors

Build a SWOT analysis

Season 2: Adapting

Organize your data office

Organize your data community

Creating data awareness

Season 3: Implementing Metadata Management With a Data Catalog

The importance of metadata

6 weeks to start your data governance journey

In our previous episode, we discussed the different means to obtain the right level of sponsorship to ensure endorsement from decision makers.

This week, we will teach you how to build a concrete and actionable SWOT analysis to assess the company Data Governance Strategy in the best possible way.

What is a SWOT Analysis?

Before we give our tips and tricks on building the best SWOT analysis possible, let’s go back and define what a SWOT analysis is. 

A SWOT analysis is a technique used to determine and define your Strengths, Weaknesses, Opportunities, and Threats (SWOT). Here are some examples:

Strengths

This element addresses the things your company or department does especially well. This can be a competitive advantage or a particular attribute of your product or service. An example of a “strength” for a data-driven initiative would be “Great data culture” or “Data shared across the entire company”.

Weaknesses

Once your strengths are listed, it is important to list your company’s weaknesses. What is holding your business or project back? Taking our example, a weakness in your data or IT department could be “Financial limitations”, “Legacy technology”, or even “Lack of a CDO”. 

Opportunities

Opportunities refer to favorable external factors that could give an organization a competitive advantage. Few competitors in your market, emerging needs for your product: all of these are opportunities for a company. In our context, an opportunity could be “Migrating to the Cloud” or “Extra budget for data teams”.

Threats

The final element of a SWOT analysis is Threats – everything that poses a risk to either your company itself or its likelihood of success or growth. For a data team, a threat could be “Stricter regulatory environment for data” for example.
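Taken together, the four elements can be captured in a simple structure. The entries below reuse the article's own examples; the `SWOT` class and its empty-quadrant check are just an illustrative sketch, handy for spotting a quadrant nobody has filled in before the workshop:

```python
# Minimal sketch of a SWOT analysis as a four-quadrant structure.
from dataclasses import dataclass, field

@dataclass
class SWOT:
    strengths: list = field(default_factory=list)
    weaknesses: list = field(default_factory=list)
    opportunities: list = field(default_factory=list)
    threats: list = field(default_factory=list)

    def missing_quadrants(self):
        """Return the quadrants that still have no entries."""
        return [name for name, items in vars(self).items() if not items]

swot = SWOT(
    strengths=["Great data culture"],
    weaknesses=["Legacy technology", "Lack of a CDO"],
    opportunities=["Migrating to the Cloud"],
)
print(swot.missing_quadrants())  # the 'threats' quadrant is still empty
```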

How to Start Building a Smart SWOT Analysis

Building a good SWOT analysis means adopting a democratic approach that will ensure you don’t miss important topics.

There are 3 principles you should follow:

Gather the Right People

Invite stakeholders from across your Data Governance Team, from Business to IT, including CDO and CPO representatives. You’ll find that different groups within your company will have entirely different perspectives that will be critical to making your SWOT analysis successful.

Throw Your Ideas Against the Wall

Doing a SWOT analysis consists, in part, of brainstorming meetings. We suggest handing out sticky notes and encouraging the team to generate ideas on their own to start things off. This prevents groupthink and ensures that all voices are heard.

This first ceremony should be no more than 15 minutes of individual brainstorming. Then put all the sticky notes up on the wall and group similar ideas together.

You can allot additional time to enable anyone to add notes at this point if someone else’s idea sparks a new thought.

Rank the Ideas

It is now time to rank the ideas. We suggest giving a certain number of points to each participant. Each participant will rate the ideas by assigning points to the ones they consider most relevant. You will then be able to prioritize them with accuracy.
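This point-based ranking step can be sketched as a simple tally: each participant distributes a fixed budget of points, and ideas are ordered by their totals. The ballots and point budgets below are hypothetical:

```python
# Minimal sketch of dot-voting: sum each idea's points across participants,
# then rank ideas from highest to lowest total.
from collections import Counter

def rank_ideas(votes):
    """votes: a list of {idea: points} dicts, one ballot per participant."""
    totals = Counter()
    for ballot in votes:
        totals.update(ballot)          # add this participant's points
    # most_common() sorts by total, highest first: the prioritized list.
    return [idea for idea, _ in totals.most_common()]

# Three participants, 5 points each (hypothetical ideas from a SWOT wall).
ballots = [
    {"Legacy technology": 3, "Lack of a CDO": 2},
    {"Lack of a CDO": 4, "Migrating to the Cloud": 1},
    {"Legacy technology": 2, "Migrating to the Cloud": 3},
]
print(rank_ideas(ballots))  # "Lack of a CDO" (6 points) comes out on top
```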

Data Platform

Actian on Google Cloud Delivers High-Speed and Pay-for-What-You-Need

Actian Corporation

May 6, 2021


The Actian Data Platform (formerly Avalanche), with the fastest columnar analytics engine available in the market today, is now generally available on Google Cloud Platform (GCP) via the Google Cloud Marketplace. This is big news for organizations that need the flexibility to scale dynamically while paying only for the resources they use, and that want an ultra-fast, MPP columnar database engine capable of delivering faster, deeper analytical insights while making it easier than ever for end users to self-access the data they need to do their jobs. If you ever needed a good reason to embrace the power of the cloud, this is it.

Google Cloud is the smallest of the big three providers but is growing fast. The experience of building on Google Cloud is hard to beat as well: Google has taken the time to build a great developer experience, and given the great experience Actian has had building on Google, we expect that many others will find similar benefits in using Google Cloud to facilitate their move to the cloud.

Use Cases – Where Actian on GCP Shines

Interest in data analysis only grows as organizations become more sophisticated and seek greater insights—into their customers’ inclinations, the efficiency of their business processes, and their ability to seize short-lived opportunities. Organizations of all sizes, in a wide range of geographies and industries — including financial services, automotive, healthcare, and retail/e-commerce — are taking advantage of the ultra-high performance and scalability that Actian on GCP offers right out of the virtual box. Some examples are below:

  1. A Fortune 10 financial data services company uses Actian on GCP alongside their trading applications to deliver an edge for its customers, enabling them to perform ad hoc analytics on significant amounts of data returning requested results with sub-second response time.
  2. A top Western European company in the automotive industry has used Actian on GCP to accelerate the delivery — and increase the accuracy — of price quotations for prospects and customers. Data involved in risk assessment, for example, accident reports, driving records, and so forth can change rapidly, so the ability to deliver quotations with up-to-the-moment data provides more accurate risk assessments, delivering a competitive advantage in a highly competitive industry.
  3. Service providers in the healthcare claims payment space leverage Actian operational analytics to extract insights from high volumes of claim data and provide decision-makers with the most up-to-date data for analytical review to eliminate errors, abuse and fraud. Elsewhere in the healthcare sector, innovative clinical trial service units rely on Actian on GCP to analyze large volumes of clinical trial data for patterns and trial performance insights.
  4. A leading French retailer relies on the highly performant analytics engine of Actian on GCP to increase marketing effectiveness and efficiency. The company’s ability to draw insights more rapidly from its own data enables it to optimize marketing spend and gain a better ROI.

Ultra-High Performance

GigaOM, a leading independent organization known for benchmarking database performance, recently put Actian to the test against competing cloud products, including AWS Redshift, Snowflake, Azure, and BigQuery.  In terms of price/performance, Actian beat every one of them. Handily.

And you don’t need to take our word for it: Read the full GigaOM Report.

So, What’s New About Actian?

Under the hood, the Actian database and integration engines are tried and tested. They have been evolving for years in response to real-world demands. Actian on GCP is the latest iteration of this proven platform, and it incorporates cutting-edge technological advances, including advances in security and user management.

Some of the exciting new features just announced:

  1. Separation of Storage and Compute – Users of Actian on GCP can scale compute and storage resources independently and automatically. Google Cloud Storage provides all the storage you need – and you only pay for the storage you need.
  2. Google Marketplace Integration – Forget about complicated and time-consuming scripted installations. Actian is now containerized and available directly through Google Marketplace. This has some great benefits, including the ability to subscribe and set up Actian on GCP as a managed SaaS offering with just a few clicks. With the ability to procure Actian through Google, retirement of Google commits can be used for Actian.
  3. Google SSO – Google Account users can gain rapid access to Actian on GCP via Google Single Sign On (SSO).
  4. Role-Based User Management – Actian now provides a GUI-based user management system that enables fine-grained, role-based permission management based on user roles. Actian administrators can use SQL to manage even more fine-grained permissions, but the GUI-based management system provides a fast and easy way to set up role-based user profiles.
  5. Kubernetes Containerization Management – All backend Actian services are now containerized and managed by the Google Kubernetes Engine (GKE). GKE automatically handles the orchestration and management of its underlying pods/containers, which dramatically accelerates backend service execution. Operations such as deleting a warehouse from Actian on GCP can take only seconds—whereas the same operation on AWS or Azure would take minutes. Such optimizations ensure that organizations pay only for what they really need (and don’t prolong resource use unnecessarily). Kubernetes containerization also paves the way for an even better experience in Actian where patches/updates/upgrades are concerned. Read more about why Kubernetes is cool here: Take full advantage of the separation of compute and storage resources with Actian on GKE.
  6. Query Result Caching – Now enabled by default for all warehouses, query result caching accelerates access to insight. After a query is executed against persistent data, its result is placed in cache. If the same query is run again, Actian returns the cached result rather than re-running the query. This significantly improves data warehouse performance and frees up resources to support the execution of novel queries.
  7. REST API – A new REST API enables users to load Actian warehouses directly. The API has a direct link to the underlying data warehouse engine, thus enabling ultra-fast loads to a warehouse.
  8. 1AU Instance Availability – For testing purposes or those use cases that require only a single node (rather than a multi-node cluster), a single Actian Unit (1AU) instance of Actian is available on GCP. The 1AU instance includes the full Actian feature set, including the recent enhancements described above.
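The query result caching described in point 6 behaves, conceptually, like classic memoization: the first execution hits the engine and stores the result, and an identical re-run returns the cached copy. The sketch below illustrates the idea only; `run_query` is a stand-in, not a real Actian API, and the real platform also invalidates cached results when the underlying persistent data changes, which this sketch omits:

```python
# Conceptual sketch of query result caching as memoization.
cache = {}
executions = []  # records which queries actually reached the engine

def run_query(sql):
    """Stand-in for an expensive warehouse query execution."""
    executions.append(sql)
    return f"result of {sql}"

def cached_query(sql):
    """Return the cached result if this exact query ran before."""
    if sql not in cache:
        cache[sql] = run_query(sql)
    return cache[sql]

first = cached_query("SELECT count(*) FROM sales")
second = cached_query("SELECT count(*) FROM sales")  # served from cache
print(first == second, len(executions))  # identical result, one execution
```

The payoff mirrors the text: repeated queries stop consuming compute, freeing resources for novel queries.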

Current Features of the Actian Data Platform, and a Few to Come Shortly

Amidst the excitement about the new features and benefits Actian is announcing with the GA release of Actian on GCP, it’s worth stepping back to remember all the other exciting features that have recently been announced, including a robust user interface for managing the lifecycle of data warehouses (now available across Google Cloud, Azure, and AWS). Actian also recently refreshed the platform UI for its built-in Query Editor, adding the ability to use multiple tabs, improved methods for viewing information about database tables, and a host of new charting capabilities.

While Actian on GCP includes pre-installed sample data to help you kick-start the evaluation process, you can also load your own data using out-of-the-box templates designed to connect to data sources such as Salesforce, or to data sources on your own desktop such as Excel spreadsheets. Getting data into Actian on GCP is easy, and connecting to it from your favorite BI and visualization tools (such as Looker, Tableau, and Power BI) is even easier, as Actian provides connectivity information directly in the platform.

As mentioned earlier, the Actian Data Platform now utilizes Kubernetes in a completely rewritten backend infrastructure. The new backend enables automation of management functions such as patching and updates while also enabling better utilization of resources. For example, deleting a warehouse on AWS or Azure can take a few minutes, while on Actian on Google Cloud it takes only a few seconds. These enhancements help our customers pay only for what they actually use.

Stay tuned for more updates, as Actian is heavily investing in the platform and looking to bring even more great features to it very soon. Want to hear more? Join our virtual Hybrid Data Conference on May 25th in the North American Eastern time zone and May 27th in the Central European time zone, from 11 AM to 4 PM respectively. I’ll go into much of what was briefly touched on above in far more detail, along with many of my Actian colleagues. You can find out more information here.

Data Intelligence

What are the Differences Between a Data Analyst and a Business Analyst?

Actian Corporation

April 29, 2021


The roles of a Data Analyst and a Business Analyst are often unclear, even though their missions are very different. Their functions are more complementary than not. Let’s have a look at these two highly sought-after profiles.

Data is now at the heart of all decision-making processes. According to a study conducted by IDC on behalf of Seagate, the volume of data generated by companies worldwide is expected to reach 175 zettabytes by 2025.

In this context, collecting information is no longer enough. What matters is the ability to draw conclusions from this data in order to make informed decisions.

However, the interpretation methods used and the way to exploit data can be very different. The ever-changing nature of data has created new domains of expertise with titles and functions that are often misleading or confusing.

What separates the missions of the Data Analyst from those of the Business Analyst may seem tenuous. And yet, their functions, roles, and responsibilities are very different, and complementary.

Business Analyst & Data Analyst: A Common Ground

If the roles of a Business Analyst and a Data Analyst are sometimes unclear, it is because both missions are inherently linked to creating value from enterprise information.

What distinguishes them is the nature of this information.

While a Data Analyst works on numerical data, coming from the company’s information systems, the Business Analyst can exploit both numerical and non-numerical data.

A Data Analyst ensures the processing of data within the company, extracting valuable analytical trends that enable teams to adapt the organization’s strategy. The Business Analyst then provides answers to concrete business issues, based on a sample of data that may exceed the data portfolio generated by the company.

A Wide Range of Skills

Data Analysts must have advanced skills in mathematics and statistics. A true expert in databases and computer languages, this data craftsman often holds a degree in computer engineering or statistics.

The Business Analyst, on the other hand, has a less data-oriented profile (in the digital sense of the term). While they use information to fulfill their missions, they are always in direct contact with management and all of the company’s business departments. Although a Business Analyst may have skills in algorithms, SQL databases, or even XML, these are not necessarily essential prerequisites.

A Business Analyst must therefore demonstrate real know-how in communicating, listening, and understanding the company’s challenges. For a Data Analyst, on the other hand, technical skills are essential: SQL, Python, data modeling, and Power BI, along with IT and analytics expertise, allow them to exploit data in an operational dynamic.

The Differences in Responsibilities and Objectives

The Data Analyst’s day-to-day work consists above all of enhancing the company’s data assets. To this end, he or she will be responsible for data quality, data cleansing and data optimization.

Their objective? To provide internal teams with usable databases in the best conditions and to identify all the improvement levers likely to impact the data project. 
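The data quality and cleansing work described above can be sketched in a few lines: dropping incomplete records, normalizing text fields, and removing duplicates. The field names and rules here are purely illustrative:

```python
# Minimal sketch of routine Data Analyst cleansing: discard rows missing a
# key, normalize emails, and deduplicate on (customer_id, email).
def cleanse(rows):
    seen = set()
    clean = []
    for row in rows:
        if row.get("customer_id") is None:   # incomplete record: drop it
            continue
        email = (row.get("email") or "").strip().lower()
        key = (row["customer_id"], email)
        if key in seen:                      # duplicate record: drop it
            continue
        seen.add(key)
        clean.append({"customer_id": row["customer_id"], "email": email})
    return clean

raw = [
    {"customer_id": 1, "email": " Ann@Example.com "},
    {"customer_id": 1, "email": "ann@example.com"},   # duplicate after normalizing
    {"customer_id": None, "email": "x@example.com"},  # missing id
]
print(cleanse(raw))  # a single clean row remains
```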

The Business Analyst will benefit from the work of the Data Analyst and will contribute to making the most of it by putting the company’s native data into perspective with peripheral data and information. By reconciling and enhancing different sources of information, the Business Analyst will contribute to the emergence of new market, organizational or structural opportunities to accelerate the company’s development.

In short, the Data Analyst is the day-to-day architect of the company’s data project. The Business Analyst is the one who intervenes, in the long run, on the business strategy. To meet this challenge, they rely on the quality of the Data Analyst’s work.

Data Intelligence

Data Governance Framework | S01-E03 – Getting Sponsorship

Actian Corporation

April 28, 2021

This is the third episode of our series “The Effective Data Governance Framework”. Split into three seasons, the series begins with a first season focused on Alignment: understanding the context, finding the right people, and preparing an action plan for your data-driven journey. This third episode will give you the keys to getting good sponsorship for your data projects.

Season 1: Alignment

Evaluate your Data maturity

Specify your Data strategy

Getting sponsors

Build a SWOT analysis

Season 2: Adapting

Organize your Data Office

Organize your Data Community

Creating Data Awareness

Season 3: Implementing Metadata Management With a Data Catalog

The importance of metadata

6 weeks to start your data governance journey

In the previous episode, we discussed how best to use OKRs to draft your enterprise data strategy, ensure focus, accountability, and engagement from stakeholders with as much transparency as possible, and negotiate objectives at all levels.

To a certain extent, the OKRs should help you get good sponsorship.

In this third episode, we will share insights on how best to get sponsorship.

In order to trigger an Effective Data Governance Initiative, you will need to go through the following steps, with caution:

Step 1: Identify Potential Sponsors

The first step consists of identifying all the potential sponsors and setting up one-on-one (or one-to-many, if you involve several colleagues) meetings to ensure endorsements and move forward on the Data Governance you want to put in place. You have learned a lot from the OKR meetings and now have the substance to secure their support.

Step 2: Prepare Your Storytelling

The second step is to prepare a story for each sponsor. Again, based on the workshops you were involved in on the company Data Strategy, you should be able to draft a personalized story.

You have 3 forms of storytelling which can be combined if needed:

  • Use a testimony and real story to strengthen yours.
  • Use a metaphor to illustrate the data concepts when they feel too complex.
  • Use a “springboard” story from a specific characteristic to give the big picture.

Step 3: Present Yourself

The third step consists of getting ready to describe who you are, what you do, and why you do it through the prism of each sponsor.

Step 4: Asking for Money

The fourth step consists of getting ready to ask for money. Asking for money involves proposing different scenarios with different outcomes, a detailed analysis of the costs, a quantitative view of the financial benefits, and then an ROI analysis.

Step 5: Commit to Deliverables

The fifth step is to commit to deliverables. There won’t be endorsement if you don’t commit to tangible deliverables and results, as well as a time frame.

How to Maximize Your Chances for Getting the Sponsors Aligned:

Ask for More Than You Need

Don’t sell yourself short; be prepared for a cut in your funding expectations and plan accordingly.

Get a Champion

In the list of sponsors, try to build a good relationship with one in particular and ask for help and insights to maximize your chances of winning.

Be Impeccable in All Aspects

When you’re courting a sponsor, always keep to your word, always be on time or early for an appointment. Let him or her know you are a person of integrity. Don’t forget to share the OKRs Map in which the sponsor is involved down to your own OKR.

Be Brief and Sharp

Ask for what you want, but don’t take up a lot of potential sponsors’ time doing it.

Get Commitments

At the end of the sponsorship process, you should be able to get the following outcomes:

  • Get understanding and alignment.
  • Get funding (means and resources).
  • Get help in removing impediments (and build a fast track in the organization hurdles).
  • Get a schedule to organize feedback ceremonies.
Data Platform

Actian Data Platform on Google Cloud: For the Data-Driven Enterprise

Actian Corporation

April 27, 2021


Today, Actian is excited to announce the delivery of Actian Data Platform on Google Cloud Marketplace. Actian is initially available in the us-east1 region, which will be expanded to include US West, German, and Irish cloud regions in the coming months. If you are interested in trying out Actian Data Platform on Google Cloud, please check out our listing here.

Actian is the highest-performance warehouse available on Google Cloud and beats the alternatives on price-performance by a factor of 8 to 12x. Our price-performance advantage takes advantage of the fact that Google Cloud has the best backhaul and high-bandwidth networking infrastructure of any cloud provider, ensuring lightning-fast data access even across regions. Leveraging these superior networking and storage capabilities in Google Cloud, Actian delivers 20% better query performance compared to other cloud providers.

We have been collaborating closely with Google Cloud’s architects and storage teams over the past few months. From the standpoints of ease of use, integrated connectivity, and real-time decision-making, this collaboration ensures that Actian will deliver the best cloud data warehouse experience available—with the lowest TCO among all cloud data warehouses. Organizations in the midst of digital transformation can gain access to the real-time insights needed to make the operational analytics decisions that competitive advantage demands.

Actian is deployed in the form of containers and microservices on the latest compute nodes leveraging Google Kubernetes Engine (GKE), and Google Cloud Storage (GCS).  Actian supports agile data processing by enabling the rapid ingestion of data into the data warehouse. It can also be used to query data stored in Google data lakes via external tables.

Multi-Cloud and Hybrid Deployment

Because Actian can be used in a multi-cloud configuration, organizations can configure Actian to operate across multiple cloud providers. This enables organizations to finally realize the true potential of hybrid by bringing the compute power of Actian to the place where their data resides. In addition, Actian can also be deployed on-premises, allowing organizations to leverage the same database engine, the same physical data model, the same ETL/ELT tools, and the same BI tools of their choosing both in the cloud and on-prem.

Built for Google Cloud

Actian is well integrated into the Google ecosystem. Looker has been a long-time visualization partner of Actian, and now as part of Google Cloud, provides a great option for customers looking for a native BI tool on Google Cloud. We have also preconfigured connectors to pull data from Google Cloud Storage and DataProc into Actian. These complement a slew of over 200 built-in data and application sources that Actian can pull data from.

Over the next few months, we will be adding further integrations with Google tools like Kubeflow and Cloud Data Fusion.


About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Data Intelligence

What is the BCBS 239?

Actian Corporation

April 26, 2021

bcbs-239-blog-zeenea

To give banks complete visibility into their risk exposure, the Basel Committee defined 14 key principles, formalized in a standard called BCBS 239.

Its objective? Give banks access to reliable and consolidated data. Let’s get into it.

In 2007, the world economy was teetering on the brink of collapse, and many supposedly stable banking institutions were later pushed to the edge of bankruptcy by the failure of the American bank Lehman Brothers in 2008. In response to a crisis of unprecedented severity, a wave of regulation swept over the world, giving birth to BCBS 239, also known as the Basel Committee on Banking Supervision’s standard number 239.

Published in 2013, the Basel Committee’s standard number 239 was intended to create the conditions for transparency in banking institutions by defining a clear framework for the aggregation of financial risk data. In practice, its objective is to enable financial and banking institutions to produce precise reports on the risks to which they are exposed. BCBS 239 is a binding framework, and it contributes to the stability of the global financial system, which was severely tested during the 2007–2008 financial crisis.

BCBS 239: A Little History

The Basel Committee was created in 1974 at the instigation of the G10 central bank governors. Since 2009, the organization has had 27 member countries and is dedicated to strengthening the safety and soundness of the financial system and establishing standards for prudential supervision.

BCBS 239 is one of the Basel Committee’s most emblematic standards because it guards against the abuses that led to the 2007–2008 crisis.

Indeed, the growth and diversification of banking institutions’ activities, as well as the multiplication of subsidiaries within the same group, created a certain opacity that generated inaccuracies in the banks’ reporting.

Once accumulated, these inaccuracies could represent billions of dollars of uncertainty, hindering quick and reliable decision-making by managers. The critical size reached by financial institutions demanded guarantees of reliable decision-making based on consolidated, high-quality data. This is the very purpose of BCBS 239.

The 14 Founding Principles of BCBS 239

Although BCBS 239 was published in 2013, the thirty or so G-SIBs (global systemically important banks) that had to comply were given until January 1, 2016 to do so. Domestic systemically important banks (D-SIBs) had three more years to comply.

Since January 1, 2019, G-SIBs and D-SIBs must therefore comply with the 14 principles set out in BCBS 239. 

Eleven of them apply primarily to banking institutions; the other three are addressed to supervisory authorities. The 14 principles of BCBS 239 fall into four categories: governance and infrastructure, risk data aggregation capabilities, reporting capabilities, and prudential supervision.

Governance and Infrastructure

In the area of governance and infrastructure, there are two principles. The first is the deployment of a data quality governance system to improve financial communication and the production of more accurate and relevant reports, making decision-making processes faster and more reliable.

The second concerns the IT infrastructure and requires banks to put in place a data architecture that enables the automation and reliability of the data aggregation chain.

Risk Data Aggregation Capabilities

The section on risk data aggregation capabilities brings together four key principles: data accuracy and integrity, completeness, timeliness, and adaptability.

Four pillars that enable decisions to be based on tangible, reliable and up-to-date information.

Reporting Capabilities

The third component of BCBS 239 concerns the improvement of risk reporting practices.

This is an important part of the standard, bringing together five principles: the accuracy and precision of information; the completeness of information relating to the risks incurred, to guarantee a true and fair view of the institution’s risk exposure; the clarity and usefulness of reporting; the frequency with which it is updated; and the integrity of its distribution.

These reports must be transmitted to the persons concerned.

Supervision

The last three principles apply to the control and supervisory authorities. They set out the conditions for monitoring banks’ compliance with the first 11 principles. They also provide for the implementation of corrective actions and prudential measures and set the framework for cooperation with supervisory authorities. 

Thanks to BCBS 239, data becomes one of the levers of stability in a globalized economy.

Data Intelligence

Data Governance Framework | S01-E02 – Data Strategy

Actian Corporation

April 23, 2021


This is the second episode of our series “The Effective Data Governance Framework”.

Split into three seasons, this first part will focus on Alignment: understanding the context, finding the right people, and preparing an action plan in your data-driven journey. 

This second episode will give you the keys to putting in place an efficient enterprise data strategy through the setting up of Objectives and Key Results (OKRs).

Season 1: Alignment

Evaluate your data maturity

Specify your data strategy

Getting sponsors

Build a SWOT analysis

Season 2: Adapting

Organize your data office

Organize your data community

Creating data awareness

Season 3: Implementing Metadata Management With a Data Catalog

The importance of metadata

6 weeks to start your data governance journey

In our previous episode, we addressed the Data Maturity of your company through different angles.

In the form of a workshop, we shared our Data Governance Maturity Audit, which enables you to establish your starting point through a Kiviat diagram.

In this episode, we help you define your company Data Strategy effectively.

What is the First Step in Defining Your Data Strategy?

We recommend that you use the OKR (Objectives and Key Results) framework to build your data strategy efficiently.

Before stepping into the topic itself, let’s delve into what OKRs mean, how they are built and then share some useful tips with you.

What Exactly are OKRs?

Here, an “Objective” is something you want to achieve that is aspirational for all employees. A “Key Result” is how you plan to measure progress toward that objective quantitatively.

We recommend you limit the number of Key Results per Objective to three.

There are many benefits to putting in place enterprise-wide OKRs. Their 5 key benefits are:

  • More focus.
  • More accountability.
  • More engagement.
  • Better alignment.
  • More transparency.

In the Effective Data Governance framework, OKRs are cascaded: Key Results flow from the executives involved in the data strategy down to the individuals involved at an operational level. While the Actian Data Intelligence Platform favors a “bottom-up” approach, the OKR-setting exercise is “top-down”.

It is very important that, at each level, every individual is able to understand the OKRs at the levels above and how their own OKRs contribute to the overall company data strategy.

We recommend you set a reasonable deadline for each OKR. By proceeding this way, all derived OKRs will be consistent with the deadlines set at the highest levels. We also recommend you constantly share, display, and explain the OKR map to all stakeholders.

You Will Ensure Engagement, Alignment and Transparency.

We suggest you negotiate the OKRs, especially their deadlines, rather than imposing them.

An Example of Setting up OKRs in Your Company

You can start with CEO OKRs on the Data Strategy if he/she is involved. At the highest level, one OKR will result in one dedicated OKR map.

On the lower levels, you can have several key results per team or employee.

For example, let’s take a CEO with 3 OKRs that impact the Data Strategy as shown below:

Then, working from the top level OKRs, you will be able to deduce the OKRs for CXOs and Top Executives like the Chief Data Officer, the Chief Information Officer, the Chief Product Officer, the VP of Sales, and so on.

For each Executive, there will be OKRs assigned to those reporting directly to them (such as heads of Analytics, heads of IT Architecture, heads of HR, etc), followed by OKRs for Teams (data governance data/IT architecture, analytics, business intelligence, data science, etc.) and finally, OKRs carried out by individuals, as shown.

Now take the OKR1 from the CEO, which relates to increasing online sales by 2% by 30/06/2021.

This OKR map shows the cascade of related OKRs carried out by C Levels and executives, teams and individuals resulting from the CEO OKR1.

As you can see in the OKR map above, we take into account the deadlines at all levels, resulting in a monthly overview of individual OKRs.

As an example, as described, the CEO OKR1 generates OKR1 for the CDO, which consists of the following:

  • Objective: Have the data catalog ready for the Data Lake
  • Key Result: Have 100% of their data assets coming from the Data Lake governed
  • Deadline: March 30th, 2021

And at the level below, a data steward carries the following OKR1:

  • Objective: Have all of the data assets from the Data Lake documented
  • Key Result: Have 100% of the data assets available for the analytics teams
  • Deadline: March 30th, 2021
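The cascade described above can be sketched as a small data structure that enforces the deadline-consistency rule, namely that a derived OKR never finishes later than its parent. This is a minimal illustration only; the class and field names are ours, not part of any Actian tooling:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class OKR:
    owner: str
    objective: str
    key_results: list          # we recommend at most three per objective
    deadline: date
    children: list = field(default_factory=list)

    def add_child(self, child: "OKR") -> "OKR":
        # A derived OKR must not finish later than its parent's deadline.
        assert child.deadline <= self.deadline, "child deadline exceeds parent's"
        self.children.append(child)
        return child

# The CEO OKR1 from the example, cascaded to the CDO and then a data steward.
ceo = OKR("CEO", "Increase online sales by 2%",
          ["+2% online sales"], date(2021, 6, 30))
cdo = ceo.add_child(OKR("CDO", "Have the data catalog ready for the Data Lake",
                        ["100% of Data Lake assets governed"], date(2021, 3, 30)))
steward = cdo.add_child(OKR("Data Steward", "Have all Data Lake assets documented",
                            ["100% of assets available to analytics teams"],
                            date(2021, 3, 30)))
```

Walking such a tree top-down gives you the OKR map; the assertion makes any deadline change at a lower level fail fast instead of silently breaking the levels above.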

Tips on How to Best Set up Your OKRs in the Long Run

We recommend you follow up on OKRs every quarter at levels 1 and 2, and more frequently at the team and individual levels.

Any change to a deadline may have an impact at a higher level. Rather than letting the change ripple through the chain of OKRs, we suggest absorbing the impact by reducing the OKR’s scope to an MVP as much as possible in order to keep the pace.

Some other tips include:

  • Select one OKR at the CEO (or a lower) level and practice before generalizing the OKR practice,
  • Consider the OKR practice as an OKR in itself and monitor it,
  • Appoint one person in charge of the implementation of the OKRs to make sure that the team follows the agreed-upon OKR practices. That person will coach the team on the OKR processes and will administer the OKR tools (you can find some here).

The Actian Data Intelligence Platform Customer Success Team and Professional Services will help you initialize the OKR Map best suited to your Data Strategy. You will benefit from our expertise in data-related topics, especially in data governance/cataloging.

Typically, a data governance project in which the platform is involved may generate between 2 and 10 workshops (each lasting from two hours to half a day) to draft and initiate the corporate data strategy for the first 3 to 6 months.

Data Intelligence

Data Governance Framework | S01-E01 – Evaluate Your Maturity

Actian Corporation

April 14, 2021


This is the first episode of our series “The Effective Data Governance Framework”. Split into three seasons, this first part will focus on Alignment: understanding the context, finding the right people, and preparing an action plan for your data-driven journey. Our first episode will give you the keys to evaluating the maturity of your company’s data strategy, so you can visualize where your efforts should lie in your data governance implementation.

Data is the Oil of the 21st Century

With GAFA (Google, Apple, Facebook, and Amazon) paving the way, data has in recent years become a crucial enterprise asset and has taken a substantial place in the minds of key data and business people alike.

The importance of data has been amplified by new digital services and uses that disrupt our daily lives. Traditional businesses that lag behind in this data revolution are inevitably put at a serious competitive disadvantage.

To be sure, all organizations and all sectors of activity are now impacted by the new role data represents as a strategic asset. Most companies now understand that in order to keep up with innovative startups and powerful web giants, they must capitalize on their data.

This shift in the digital landscape has led to widespread digital transformations the world over with everybody now wanting to become “Data-Driven”.

The Road to Becoming Data-Driven

In order to become data-driven, one has to look at data as a business asset that needs to be mastered first and foremost, and then exploited.

The data-driven approach is a means to collect, safeguard and maintain data assets of the highest quality whilst also tackling the new data security issues that come with the territory. Today, data consumers must have access to accurate, intelligible, complete, and consistent data in order to detect potential business opportunities, minimize time-to-market, and undertake regulatory compliance.

The road to the promised land of data innovation is full of obstacles.

Data legacy, with its heavy silos and the all too often tribal nature of data knowledge, rarely bodes well for the overall quality of data. The advent of Big Data has also reinforced the perception that the life cycle of any given data must be mastered in order for you to find your way through the massive volume of the enterprise’s stored data.

It’s a challenge that encompasses numerous roles and responsibilities, processes and tools.

The implementation of data governance is, therefore, a chapter that any data-driven company must write.

However, our belief that the approaches to data governance from recent years have not kept their promises is borne out by our own field experience along with numerous and ongoing discussions with key data players.

We strongly believe in adopting a different approach to maximize the chances of success. Our Professional Services and Customer Success teams provide our customers with the expertise they need to build effective data governance, through a more pragmatic and iterative approach that can adapt to a constantly changing environment.

We call it the Effective Data Governance Framework.

Our Beliefs on Data

Awareness of the importance of data is a long journey that every company has to make. But each journey is different: company data maturity varies a lot; expectations and obligations can also vary widely.

Overall success will come about with a litany of small victories over time.

We have organized our framework in 3 steps.

Alignment

Evaluate your Data maturity

Specify your Data strategy

Getting sponsors

Build a SWOT analysis

Adapting

Organize your Data Office

Organize your Data Community

Creating Data Awareness

Implementing Metadata Management With a Data Catalog

The importance of metadata

6 weeks to start your data governance journey

Season 1, Episode 1: Alignment

This first season is designed to help your organization align itself with your data strategy by ensuring an understanding of the overall context.

What follows will help you, and all the key sponsors, identify the right stakeholders from the get-go. This first iteration will help you evaluate the data maturity of your organization through different angles.

In the form of a workshop, our Data Governance Maturity Audit will help you visualize, through a Kiviat Diagram, your scores as shown below:

Data Maturity Audit: Important Questions to Ask

Organization

Is an organizational structure with different levels of governance (exec, legal, business, …) in place? Are there roles and responsibilities at different specified levels (governance committees, tech leaders, data stewards, …)?

Data Stewards

Are the data stewards in charge of coordinating data governance activities identified and assigned to each area or activity?

Accountabilities

Have the roles, responsibilities and accountability for decision-making, management and data security been clearly defined and communicated (to the data stewards themselves, but also to everyone involved in the business)?

The Means

Do data stewards have sufficient authority to quickly and effectively correct data problems while ensuring that their access does not violate personal or sensitive data policies?

The Requirements

Have policy priorities affecting key data governance rules and requirements been defined? Is there an agreement (formal agreement or verbal approval) on these priorities by the key stakeholders (sponsors, policy makers, exec)?

Life Cycle Management

Have standard policies and procedures for all aspects of data governance and data management lifecycle, including collection, maintenance, use and dissemination, been clearly defined and documented?

Compliance

Are policies and procedures in place to ensure that all data is collected, managed, stored, transmitted, used, and destroyed in such a way that confidentiality is maintained in accordance with the security standards in force (GDPR, for example)?

Feedback

Has an assessment been conducted to ensure the long-term relevance and effectiveness of the policies and procedures in place, including the assessment of staffing, tools, technologies and resources?

Process Visions

Do you have a mapping describing the processes used to monitor compliance with your established policies and procedures?

Transparency

Have the policies and procedures been documented and communicated in an open and accessible way to all stakeholders, including colleagues, business partners and the public (e.g., via a publication on your website)?

Overview
Does your organization have an inventory of all the data sources (from software packages, internal databases, data lakes, local files, …)?

Managing Sensitive Information
Does your organization have a detailed, up-to-date inventory of all data that should be classified as sensitive (i.e., data at risk of being compromised or corrupted by unauthorized or inadvertent disclosure), personal, or both?

Level of Risks
Has your data been organized according to the level of risk of disclosure of personal information potentially contained in the records?

Documentation Rules
Does your organization have a written and established rule describing what should be included in a data catalog? Is it clear how, when and how often this information is written and by whom?

Information Accessibility
Does your organization give everyone concerned with data access to the data catalog? Is the data they need indexed in the catalog?

Global Communication
Does your organization communicate internally on the importance data can play in its strategy?

Communication Around Compliance
Does your organization communicate with its employees (at least those who are directly involved in using or manipulating data) about current regulatory obligations related to data?

Working for the Common Good
Does your organization promote the sharing of datasets (those that are harder to find and/or only used by a small group for example) via different channels?

Optimizing Data Usage
Does your organization provide the relevant people with training on how to read, understand and use the data?

Promoting Innovation
Does your organization value and promote the successes and innovations produced (directly or not) by the data?

Collecting & Storing Data
Does your organization have clear information on the reason for capturing and storing personal data (operational need, R&D, legal, etc.)?

Justification Control
Does your organization have a regular verification procedure to ensure the data collected is consistent with the information mentioned above?

Anonymization
Have anonymization or pseudonymization mechanisms been put in place for personal data, whether direct or indirect?
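As one common approach to the pseudonymization mentioned above, a keyed hash replaces direct identifiers with stable tokens, so records can still be joined across datasets while the original values stay hidden. This is a minimal sketch, not a compliance recipe: the field names are illustrative, the key here is a placeholder, and under the GDPR the secret key must itself be protected, since whoever holds it can re-link the data:

```python
import hmac
import hashlib

# Placeholder key for illustration only; in practice, store it in a vault
# and keep it away from the pseudonymized datasets.
SECRET_KEY = b"rotate-me-and-store-in-a-vault"

def pseudonymize(value: str) -> str:
    # Keyed hashing (HMAC-SHA256): deterministic, so joins still work,
    # but not reversible without the secret key.
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"customer_id": "C-1029", "email": "jane@example.com", "basket_total": 42.50}
safe = {**record,
        "customer_id": pseudonymize(record["customer_id"]),
        "email": pseudonymize(record["email"])}
```

The non-identifying fields pass through unchanged, which is what keeps the pseudonymized dataset useful for analytics.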

Detailed Procedure
Has the organization established and communicated policies and procedures on how to handle records at all stages of the data life cycle, including the acquisition, maintenance, use, archiving or destruction of records?

Data Quality Rules
Does the organization have policies and procedures in place to ensure that the data is accurate, complete, up-to-date and relevant to the users’ needs?

Data Quality Control
Does the organization conduct regular data quality audits to ensure that its quality control strategies are up-to-date and that corrective actions taken in the past have improved the quality of the data?

Data Access Policy
Are there policies and procedures in place to restrict and monitor access to data in order to limit who can access what data (including assigning differentiated access levels based on job descriptions and responsibilities)?

Are these policies and procedures consistent with local, national, … privacy laws and regulations (including the GDPR)?

Data Access Control
Have internal procedural controls been put in place to manage access to user data, including security controls, training and confidentiality agreements required by staff with personal data access privileges?

General Framework
Has a comprehensive security framework been defined, including administrative, physical, and technical procedures to address data security issues (such as access and data sharing restrictions, strong password management, regular selection and training of staff, etc.)?

Risk Assessment
Has a risk assessment been undertaken?

Does this risk assessment include an assessment of the risks and vulnerabilities related to both intentional and malicious misuse of data (e.g., hackers) and inadvertent disclosure by authorized users?

Risk Mitigation Plan
Is there a plan in place to mitigate the risks associated with intentional and unintentional data breaches?

Prevention
Does the organization monitor or audit data security on a regular basis?

Recovery Plan
Have policies and procedures been established to ensure the continuity of data services in the event of a data breach, loss, or another disaster (this includes a disaster recovery plan)?

Flow Regulation
Are policies in place to guide decisions on data exchange and reporting, including sharing data (in the form of individual records containing personal information or anonymized aggregate reports) internally with business profiles, analysts/data scientists, decision-makers, or externally with partners?

Usage Contracts and Legal Commitment
When sharing data, are appropriate procedures, such as sharing agreements, in place to ensure that personal information remains strictly confidential and protected from unauthorized disclosure? Note that data sharing agreements must fall in line with all applicable regulations, such as the GDPR.

These agreements can only take place if data sharing is permitted by law.

Control of Product Derivatives
Are appropriate procedures, such as obfuscation or deletion, in place to ensure that information is not inadvertently disclosed in general reports and that the organization’s reporting practices remain in compliance with the laws and regulations in force (for example, GDPR)?

Stakeholder Information
Are stakeholders, including the individuals whose data are kept, regularly informed about their rights under the applicable laws or regulations governing data confidentiality?

Our interactive toolkit will allow you to visualize where your efforts should lie when implementing a data governance strategy.
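One simple way to turn a questionnaire like this into the axes of a Kiviat (radar) diagram is to score each dimension as the share of "yes" answers. The sketch below is purely illustrative; the dimensions and the 0–100 scoring are our assumptions, not the actual scoring rules of the Data Governance Maturity Audit:

```python
# Hypothetical audit results: each dimension collects a few yes/no answers.
answers = {
    "Organization":  [True, True, False],
    "Data Stewards": [True, False, False],
    "Compliance":    [True, True, True],
    "Transparency":  [False, False, True],
}

def dimension_scores(answers: dict) -> dict:
    # Score each axis on a 0-100 scale: percentage of "yes" answers.
    return {dim: round(100 * sum(a) / len(a)) for dim, a in answers.items()}

scores = dimension_scores(answers)
# e.g. {'Organization': 67, 'Data Stewards': 33, 'Compliance': 100, 'Transparency': 33}
```

Plotting one axis per dimension then makes the weakest areas of your data governance visible at a glance.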

Data Intelligence

What is a Data Product Manager?

Actian Corporation

April 8, 2021


Data Product Management has been a regular topic of discussion among Data Science, Engineering, and Product Management teams over the last few years, particularly when Data Science products and Machine Learning are involved.

The role of the Data Product Manager has many similarities to that of a Software Product Manager in that a keen understanding of customer business requirements is crucial. There are, however, some key differences in their respective responsibilities and the skill sets needed.

In What Business Environment Does a Data Product Manager Usually Navigate?

It is fair to say that Machine Learning-dependent products impact our daily lives. Social media platforms (LinkedIn, Facebook, Twitter), Google, Uber, and Airbnb have all developed highly sophisticated ML algorithms to improve the quality of their products.

Today, however, Data Science products are by no means the exclusive preserve of the top tech companies. They have also become a common feature in a variety of enterprise domains such as predictive analytics, supply chain management, crime detection, fraud detection, and staff-turnover prevention, to mention but a few.

Data Product Managers are often called for when Data Science Products are involved, in other words when the core business value in the spotlight depends on Machine Learning and Artificial Intelligence.

What Does a Data Product Manager Do?

Again, the role of the Data Product Manager is analogous to most Product Management roles in that it is geared towards developing the best possible product for the customers/users. That remains the key focus for the Data Product Manager.

There are, however, some subtle differences when it comes to the remit of the Data Product Manager.

The population that the Data Product Manager caters to is often wide and can include Data Scientists, Data Engineers, Data Analysts, Data Architects, and even developers and testers. Such a diverse pool of expectations requires a solid understanding of each of these fields for the Data Product Manager to grasp the use case of each stakeholder, not to mention strong people skills to navigate these different universes unscathed.

To demonstrate the diverse range of skills involved in this new role: the ideal Data Product Manager will have a broad understanding of Machine Learning algorithms, Artificial Intelligence, and statistics. They will have some coding experience (enough to dip their toes in if needed), be good at math, understand Big Data technologies, and have second-to-none communication skills.

The Data Product Manager can even be assigned the responsibility of centralizing access to data at the enterprise level*.

Here, they might be asked to come up with new ways to manage, collect, and exploit data in order to improve the usability and quality of the information. This part of the job may involve choosing suitable Data Management software to centralize and democratize access to data sets for all the parties mentioned above, breaking down silos between teams and facilitating data access for all.

They may then choose a Data Catalog platform with a powerful knowledge graph and a simple search engine…such platforms do exist.

*How, in this instance, does the role of the Data Product Manager differ from that of the Data Steward, you may ask? After all, isn’t it up to the Data Steward to curate, manage, handle permissions for, and make the data available to data consumers? One way to consider the distinction between the two roles is to see the Data Steward as the custodian of the data of the present, and the Data Product Manager as the custodian and innovator of the data of the future.

Data Platform

Compute and Storage Resources With Actian Data Platform on GKE

Actian Corporation

March 31, 2021


On-Premise, You’re Grounded

The emergence of the Hadoop Distributed File System (HDFS) and the ability to create a data lake of such unprecedented depths – on standard hardware no less! – was such a breakthrough that the administrative pain and the hardware costs involved with building out an HDFS-based analytic solution were acceptable casualties of innovation. Today, though, with an analytic tool like the Actian Data Platform (formerly known as Avalanche) containerized, running in the cloud, and taking advantage of Google Kubernetes Engine (GKE), there’s no reason to put up with those pains. Indeed, because Actian on GKE treats compute and storage as separate resources, organizations can gain access to the power of Actian — to meet all their analytic needs, on both a day-to-day and peak-season basis — more easily and cost-effectively than ever before.

Consider: when Hadoop first appeared, the cloud was not considered an option for data analytics. Building out an HDFS-based data lake involved adding servers and storage resources on-premises — which also meant investments in ancillary infrastructure (networks, load balancers, and so on) as well as on-site personnel to manage and maintain the growing number of cabinets taking over the data center. The cost of analytic insight was driven still higher by the fact that all these compute and storage resources had to be deployed with an organization’s peak processing demands in mind. No matter that those peaks only occurred occasionally — at the end of the quarter or during the busy holiday shopping season — the cluster performing the analytics needed to be ready to support those demands when they arrived. Was much of that CPU power, RAM, and storage space idle during the non-peak periods? Yes, but that was the price to be paid for reliable performance during periods of peak demand.

But peak-period performance was not the only element driving up the cost of an on-prem, HDFS-based data lake. If an organization needed to store large amounts of data, the distributed nature of HDFS required it to deploy more compute resources to manage the additional storage — even if there was already excess compute capacity within the broader analytic cluster. Additionally, no one added just a little storage when expanding capacity: even if you only needed a few GB of additional storage, you’d deploy a new server with multiple terabytes of high-speed storage and grow into that space over quite a long time. Further, every organization had to figure this out for itself, tying up skilled IT resources that could have been used elsewhere.

Unbinding the Ties on the Ground

Actian has broken the links between compute and storage. Running in the cloud on GKE, Actian scales compute and storage independently, creating great opportunities — and potentially great cost savings — for organizations seeking flexible, high-performance, cloud-based analytical solutions.

We’ve already talked about the administrative advantages of running the Actian Data Platform as a containerized application on GKE. Actian can be deployed faster and more easily on Google GKE because all the components are ready to go: there are no configuration scripts to run and no application stacks to build in the wrong order. What we didn’t mention (or at least expand upon) in our last blog on the topic is that you don’t have to configure Actian on GKE to meet those peak-performance spike demands. You can deploy Actian with just your day-to-day performance needs in mind. Nor did we mention that you don’t need to provision storage for each worker node in the cluster.

How is this possible, you ask? Because Google’s cloud services are highly elastic — something one cannot say about an on-premises infrastructure. The compute resources initially allocated to an Actian cluster (measured in Actian Units, or AUs) are sized to support daily operational workloads, so by themselves they will not deliver the desired performance during demand peaks. The elasticity of the Google cloud infrastructure means additional AUs can be added to the cluster when they’re needed. All you need to do is scale the AUs to match the desired performance level, and the Google compute infrastructure takes care of the rest: as the AU count grows or shrinks, cores are added or removed accordingly. Yes, you’ll pay more for the extra compute power you use during those peak periods, but one big advantage of the cloud is that you ultimately pay only for the compute resources you actually use. Once the peak has passed, the extra AUs can be removed, and your costs drop back to the levels associated with your day-to-day processing demands.
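The pay-only-for-what-you-use math is easy to sketch. The hourly rate, AU counts, and peak hours below are hypothetical placeholders (not Actian pricing), but they show how elastic scaling changes the bill compared with provisioning for the peak around the clock:

```python
# Hypothetical comparison of fixed peak provisioning vs. elastic scaling.
# All rates and AU counts are illustrative assumptions, not Actian pricing.

RATE_PER_AU_HOUR = 2.00   # assumed cost of one Actian Unit per hour
BASELINE_AUS = 4          # AUs needed for day-to-day workloads
PEAK_AUS = 16             # AUs needed during end-of-quarter peaks
HOURS_IN_MONTH = 730
PEAK_HOURS = 48           # hours per month spent at peak demand

# On-prem style: provision for the peak and pay for it all month long.
fixed_cost = PEAK_AUS * RATE_PER_AU_HOUR * HOURS_IN_MONTH

# Elastic: pay the baseline rate most of the time, peak rate only at peaks.
elastic_cost = (BASELINE_AUS * RATE_PER_AU_HOUR * (HOURS_IN_MONTH - PEAK_HOURS)
                + PEAK_AUS * RATE_PER_AU_HOUR * PEAK_HOURS)

print(f"fixed:   ${fixed_cost:,.2f}")
print(f"elastic: ${elastic_cost:,.2f}")
print(f"savings: {1 - elastic_cost / fixed_cost:.0%}")
```

With these made-up numbers, the elastic configuration costs roughly 70% less per month — the entire difference comes from not paying peak-level rates during the hours when peak capacity sits idle.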

Similarly, with storage, the Google cloud infrastructure will allocate as much storage space as your data requires. If you add or remove data from the system, Google increases or decreases the amount of storage allocated for your needs — instantly and automatically.

Serving Up Satisfaction

This storage elasticity becomes an even more obvious benefit when you realize that you don’t need to deploy additional HDFS worker nodes just to manage this data — even if you’re expanding your database by an extra 4, 40, or 400 TB. As with added compute cores, you’ll pay more for more storage space (it’s the same pay-for-what-you-use model), but because the storage and compute components have been separated, you are not required to add a dedicated server to manage every terabyte of storage you add. GKE will always ensure that Actian has the compute resources to deliver the performance you need, and you can increase or decrease the number of AUs based on your performance expectations, not the limitations of a runtime architecture built with on-prem constraints in mind.
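The compute-for-storage tax of on-prem HDFS can be made concrete with a toy sizing calculation. The per-node capacity and replication factor below are assumptions for illustration (12 TB of raw disk per worker and HDFS’s common default of 3 replicas), not figures from any particular deployment:

```python
import math

# Hypothetical HDFS sizing: assume each worker node contributes 12 TB of
# raw disk, and HDFS keeps 3 replicas of every block (a common default).
RAW_TB_PER_NODE = 12
REPLICATION = 3

def hdfs_nodes_needed(extra_data_tb: float) -> int:
    """Worker nodes to add so the cluster can hold extra_data_tb more data."""
    raw_needed = extra_data_tb * REPLICATION
    return math.ceil(raw_needed / RAW_TB_PER_NODE)

for tb in (4, 40, 400):
    print(f"+{tb} TB of data -> {hdfs_nodes_needed(tb)} new HDFS worker nodes, "
          f"vs. 0 new nodes with separated compute and storage")
```

Under these assumptions, a 400 TB expansion drags a hundred new worker nodes (and their CPUs, RAM, and management overhead) into the cluster, whereas separated storage simply grows the storage bill.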

In the end, separation of compute and storage offers a huge advantage to anyone interested in serious analytics. Large companies can reduce their costs by not having to overbuild their on-prem infrastructures to accommodate the performance demands that they know will be arriving. Smaller companies can build out an analytics infrastructure that might have been unaffordable before because they don’t have to configure for peak performance demands either. For both large and small companies, Google delivers the resources that your analytics require — no more and no less — enabling Actian on Google Cloud Platform to deliver the analytical insights you require without breaking the bank.


About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Data Analytics

5 Tips for Extracting More ROI From Your CRM and Marketing Tech Stacks

Actian Corporation

March 28, 2021


Tech stacks are getting more complicated by the day. Marketing Operations, Revenue Operations, Sales Operations, IT, Analytics, and Executives—we are all doing business using digital automations, including integrations, that allow us to target and interact with our customers and prospects in more meaningful and rewarding ways than ever before.

Take a moment to consider. Are you using a single unified app or platform to strategically grow your revenue? Or are your teams still operating in silos and spreadsheets while losing out on opportunities to make a bigger impact?

If you are like many strategic marketing and revenue leaders, your various specialized sales and marketing technology platforms (MarTech)—Salesforce.com, Marketo, ZenDesk, Sales Loft, and so many more—generate a lot of last-mile data analytics. You have more data and insights, but it can be a struggle to unify it into one big picture. Teams spend so much time doing their best to get that last mile of data to load or to maintain expected week-over-week growth. This leads to more IT projects, longer lead times, more business resources doing integrations or manually inputting spreadsheets, and ultimately burnout or growth slowdown.

You already know bad data can sabotage your business. But good data buried under layers of apps and reports can be just as damaging. Time and resources spent compiling and reporting last-mile data can prevent your business from reaching its full potential, keeping your most talented people focused on poorly identified and prioritized opportunities instead of driving real new revenue channels and targeting the right accounts, roles, and decision-makers.

Here are five tips to find revenue hidden in your tech stacks.

1. Do Not Buy That New Point App or CRM Module Before Getting Your House in Order

Make sure you can adequately answer the following questions before purchasing a new sales or marketing application:

  • Is your data squeaky clean, validated, and in the right marketing campaign?
  • Are sales teams able to prioritize real leads?
  • What is your Ideal Customer Profile?
  • Which Job Titles are responding to marketing and sales outreach, then taking meetings?
  • Which Job Titles are converting to opportunities? Can you see this in real time across marketing and sales data?
  • Do you have a single view of your customer and prospects, including the ability to see customer journey and experience, as a combined view of marketing and sales outreach, and engagement?
  • How frequently are you communicating with your top prospects across all channels—email, phone, chat, social media, etc.? Can you analyze that data by touchpoint and across nurture tracks?
  • Is the customer and prospect data in your CRM and MarTech systems well-understood, clean, and optimized to match your go-to-market (GTM)?
  • Can you measure your KPIs? Are they accurate? And are they monitored automatically, easily visualized, and reportable to all revenue and marketing leaders, so that they can focus on decision-making and you can focus on actions?

If your analysts and operations teams are spending a large percentage of their time on manual workloads and entries, such as updating standalone spreadsheets in Microsoft Excel, it is a sure sign that there are opportunities you should pursue to improve your operations before investing in more point applications — such as automating manual work and optimizing existing processes inside your CRM and MarTech platforms. That said, optimizing your CRM and MarTech stacks can only take you so far. Undoubtedly, some data will never be unified, and there will always be a requirement for an outside view. But there is a huge opportunity for revenue leaders to unify customer data in a modern cloud data analytics platform — mapped to your KPIs and GTM — to deliver more revenue.

2. See If You Can Save on CRM or Marketing Automation Platform Fees

Once your operational house is in order, look for opportunities to remove unnecessary services and products, such as:

  • CRM storage fees for older data, or data you do not need. Offload to your unified analytics platform, where storage is typically much less expensive in self-service cloud-based utilities.
  • CRM platform consulting fees and platform fees. Avoid these costs with self-service analytics, using a unified analytics platform.
  • MarTech platform and other app cost reduction or avoidance due to optimized automation and management of customer data.

3. Double Down on One Big Thing

Focus on one big thing that will have the largest impact across your people, your processes, and how you go to market using your MarTech stack. For example, you may be able to make a larger impact with an end-to-end program which includes data cleansing, data validation, tight personas, and a customer journey mapped for the new program/sales experience.

4. Feed Your CRM and MarTech Properly

That means good data, real-time leads, and integrated information, so frontline sales and customer engagement teams have a prioritized daily list of activities, including lead and account scores that allow simple sorting in CRM reports. Share persona-mapped leads with a program priority, or ‘Sales Play,’ category assigned for easy handling. A centralized Revenue Operations or Marketing Operations analyst or team running automations can eliminate duplicated effort and ensure the best data is routed to the correct territory and the appropriate sales representative.
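As a sketch of what “lead and account scores that allow simple sorting” might look like in practice, here is a hypothetical scoring function. The field names, weights, and caps are all invented for illustration and don’t correspond to any specific CRM:

```python
# Hypothetical lead-scoring sketch. Field names, weights, and caps are
# invented for illustration and do not reflect any particular CRM schema.
from dataclasses import dataclass

@dataclass
class Lead:
    name: str
    title_match: bool      # does the job title fit the ideal customer profile?
    recent_touches: int    # engagements in the last 30 days
    account_score: int     # 0-100 fit score for the account

def lead_score(lead: Lead) -> int:
    score = lead.account_score
    score += 25 if lead.title_match else 0     # bonus for persona fit
    score += min(lead.recent_touches, 5) * 5   # capped engagement bonus
    return score

leads = [
    Lead("Ada", title_match=True, recent_touches=3, account_score=60),
    Lead("Ben", title_match=False, recent_touches=8, account_score=80),
    Lead("Cy", title_match=True, recent_touches=1, account_score=40),
]

# Frontline reps see the highest-priority leads first.
for lead in sorted(leads, key=lead_score, reverse=True):
    print(lead.name, lead_score(lead))
```

Even a simple weighted sum like this turns a flat list of contacts into a prioritized daily queue, and the weights give Revenue Operations a single, auditable place to encode what the ideal customer profile actually rewards.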

5. Redirect Your Resources

Now that you know your ideal customer and are saving time, money, and effort by streamlining CRM, MarTech platforms, tech services, data gathering, and analytics, it is time to redirect your resources to future revenue generation. Secure strategic funding by presenting your new revenue operations plan based on what is working in the market, supported by your enhanced command of 360-degree data. Continue to measure, improve, and act upon what is most important to your current and prospective customers.

Tackling all these can seem like a huge task. However, it is well worth the effort to ensure your business is ready to take advantage of future opportunities. In the next blog entry in this series, I’ll give you a detailed prescription of how best to address these issues and to streamline your ability to acquire, retain, and expand your customer base in pursuit of revenue optimization. However, if you’re short on patience or time, take a look at our Customer360 Revenue Optimization solution.
