Data Integration

Data Migration: A Roadmap to Success

Traci Curran

August 13, 2021


Organizations initiate data migration projects for several reasons. They might need to overhaul an entire system, upgrade databases, establish a new data warehouse, support a cloud migration roadmap, or merge new data from an acquisition or other source. Data migration is also necessary when deploying another system that resides next to incumbent applications. Regardless of the exact purpose for migration, the goal is generally to enhance business performance and competitiveness.

Achieving your migration goals can be difficult. But with a solid migration strategy, a sound implementation approach, and the right set of tools, you will be well positioned for data migration success.

Why You Should Have a Data Migration Plan and Strategy

A strategic data migration plan should include consideration of these critical factors:

Know Your Data

Before migrating your data, you must know (and understand) what you’re migrating, as well as how it fits within the intended destination system. Understand how much data is moving and what that data looks like. Ask yourself what needs to migrate, what can be left behind, and what might be missing. If an organization skips this “source review” step and assumes an understanding of the data, the result could be wasted time and money. Worse, an organization could encounter a critical flaw in the data mapping that stops progress in its tracks.
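As part of that source review, a quick profile of fill rates and distinct values per field often surfaces surprises early. Here is a minimal sketch in plain Python over row dictionaries; the field names and sample rows are purely illustrative:

```python
def profile(records):
    """Summarize a list of row dicts: per-field fill rate and distinct count."""
    total = len(records)
    fields = {key for row in records for key in row}
    report = {}
    for field in sorted(fields):
        values = [row.get(field) for row in records]
        filled = [v for v in values if v not in (None, "")]  # treat empty strings as missing
        report[field] = {
            "fill_rate": len(filled) / total if total else 0.0,
            "distinct": len(set(filled)),
        }
    return report

# Toy extract standing in for a real source table
rows = [
    {"id": 1, "email": "a@example.com", "region": "EMEA"},
    {"id": 2, "email": "", "region": "EMEA"},
    {"id": 3, "email": "c@example.com", "region": None},
]
print(profile(rows))
```

Running a report like this against every source table, before any mapping work begins, is a cheap way to find the missing and malformed data the paragraph above warns about.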

Ensure Data Quality

Once you identify issues with your source data, you must resolve them; otherwise, they could fatally disrupt your migration plans. Because of the scale of the work, this may require additional software tools and third-party resources.
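One common way to make such issues resolvable is to capture quality expectations as small, named checks run against every source row. A hedched sketch, with rule names and fields that are assumptions for illustration:

```python
def validate(row, rules):
    """Return the names of the rules a row violates."""
    return [name for name, check in rules.items() if not check(row)]

# Illustrative rules for a hypothetical customer extract
rules = {
    "id_present": lambda r: r.get("id") is not None,
    "email_has_at": lambda r: "@" in (r.get("email") or ""),
    "age_in_range": lambda r: 0 <= (r.get("age") or -1) <= 130,
}

bad = validate({"id": 7, "email": "not-an-email", "age": 200}, rules)
print(bad)
```

Rows that fail a check can be routed to a remediation queue rather than silently loaded, which keeps the quality problem visible and measurable.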

Define and Design the Migration

The design phase is where organizations define the data migration strategy and implementation approach – “Big Bang” (all at once) or “trickle” (a bit at a time). Your data migration plans should also include details of the technical architecture of your chosen solution and of the migration processes. By first considering the design of the strategy, the data to be moved and the destination system, you can begin to define timelines and unearth any project concerns. By the end of this step, your whole project should be documented.
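The “trickle” approach described above can be sketched as a loop that drains the source in small, throttled batches. Here `fetch_batch` and `write_batch` are hypothetical stand-ins for whatever extract and load mechanisms your tooling provides:

```python
import time

def trickle_migrate(fetch_batch, write_batch, batch_size=500, pause=1.0):
    """'Trickle' strategy: move rows in small batches until the source is drained.
    fetch_batch(limit) returns up to `limit` unmigrated rows; write_batch loads them."""
    moved = 0
    while True:
        batch = fetch_batch(batch_size)
        if not batch:
            return moved
        write_batch(batch)
        moved += len(batch)
        time.sleep(pause)  # throttle so the live source system stays responsive

# In-memory demonstration: 1,200 rows moved in batches of 500
source = list(range(1200))
target = []
moved = trickle_migrate(
    lambda n: [source.pop(0) for _ in range(min(n, len(source)))],
    target.extend,
    batch_size=500,
    pause=0,
)
print(moved)
```

A “Big Bang” migration is the degenerate case where the batch is the whole dataset and the loop runs once; the trickle shape trades a longer elapsed time for a system that stays usable throughout.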

Maintain and Protect Your Data

Data degrades over time and can often become unreliable. This means there must be controls in place to maintain data quality before, during, and after any integration or migration project is undertaken.

Don’t Forget Security

During planning, it’s important to consider security plans for the data. Any data that requires protection should have safeguards built into every phase of the plan.

Build the Migration Solution

It can be tempting to approach migration with a “just-enough” development mindset. However, since you will only implement once, it’s crucial to do it correctly. A common tactic is to break the data into subsets and build one category at a time, testing each before moving on. If an organization is working on a particularly large migration, it might make sense to build and test in parallel.
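The build-one-category-at-a-time tactic might look like the following sketch, where `migrate` and `verify` are placeholders for your actual load and test steps:

```python
def migrate_by_category(rows, categories, migrate, verify):
    """Build and test one data subset at a time, stopping at the first failure."""
    for category in categories:
        subset = [r for r in rows if r["category"] == category]
        migrate(subset)
        if not verify(category, len(subset)):
            raise RuntimeError(f"verification failed for {category!r}")

# Demonstration with an in-memory target
target = []
rows = [{"category": "orders"}, {"category": "customers"}, {"category": "orders"}]
migrate_by_category(
    rows,
    ["orders", "customers"],
    target.extend,
    lambda cat, n: sum(r["category"] == cat for r in target) == n,
)
print(len(target))
```

Failing fast at the first bad category keeps a mapping error from contaminating every subset that follows it.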

Conduct a Live Test

Testing isn’t finished once the code has been verified during the build phase. It’s important to test the data migration design with real data to ensure the accuracy of the implementation and the completeness of the solution.
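One practical way to run such a live test is to reconcile source and destination with an order-insensitive fingerprint (row count plus a digest over every serialized row). A minimal sketch, assuming both sides can be read as row dictionaries:

```python
import hashlib
import json

def table_fingerprint(rows):
    """Order-insensitive fingerprint of a table: (row count, digest of all rows)."""
    digests = sorted(
        hashlib.sha256(json.dumps(r, sort_keys=True).encode()).hexdigest()
        for r in rows
    )
    return len(rows), hashlib.sha256("".join(digests).encode()).hexdigest()

# Rows arrive in a different order after migration, but content is identical
source = [{"id": 1, "name": "Ada"}, {"id": 2, "name": "Grace"}]
migrated = [{"id": 2, "name": "Grace"}, {"id": 1, "name": "Ada"}]
print(table_fingerprint(source) == table_fingerprint(migrated))
```

A mismatch tells you *that* a table diverged; drilling into per-row digests then tells you *where*.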

Deploy, Then Audit

After final testing, proceed with implementation as defined in the plan. Once the implementation is live, establish a system to audit the data and confirm the accuracy of the migration.

Govern Your Data

Tracking and reporting on data quality is important because it enables a better understanding of data integrity. Clear, easy-to-follow processes and highly automated tools can greatly ease data governance and ensure greater data integrity after any successful data migration.

How a Hybrid Integration Platform Can Aid Your Migration

A hybrid data integration platform, such as Actian DataConnect, can make the process of data migration much easier and lower the risk of business-disrupting connectivity issues. With DataConnect, instead of managing numerous point-to-point interactions between applications, connections to source systems are managed through a central platform, much like a telephone switchboard. The number of connections to source systems is reduced because all consuming services and applications share the same connection managed by DataConnect. That means fewer sets of credentials to manage, fewer points of potential failure or disruption, and the ability to consolidate management of the flow of data across your organization and IT environment.
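The switchboard idea, many consumers sharing one managed connection per source, is a general pattern independent of any specific product, so the following is a toy illustration rather than DataConnect’s actual API:

```python
class ConnectionHub:
    """Toy switchboard: consumers share one lazily opened connection per source."""

    def __init__(self):
        self._connections = {}

    def register(self, name, factory):
        # factory is whatever opens the real connection (one set of credentials)
        self._connections[name] = {"factory": factory, "conn": None}

    def get(self, name):
        entry = self._connections[name]
        if entry["conn"] is None:  # open on first use, then reuse
            entry["conn"] = entry["factory"]()
        return entry["conn"]

hub = ConnectionHub()
opened = []
hub.register("crm", lambda: opened.append("crm") or "crm-conn")
a, b = hub.get("crm"), hub.get("crm")
print(a is b, opened)
```

Two consumers ask for the CRM source, but the factory runs once: one connection, one credential set, one place to monitor.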

As a hybrid integration platform, Actian DataConnect can manage connections with systems both on-premises and in the cloud. By connecting your applications through DataConnect before you start your migration or cloud migration project, you can shift configuration tasks out of the migration window and test them ahead of time to make sure they work.

During the data migration event, data connections for migrated apps can be updated to point to whatever application environment is active by simply updating the configuration in the DataConnect dashboard. If connection issues are encountered, Actian DataConnect enables you to see where the problems are and correct them quickly. After the data migration is completed, DataConnect can help you verify that dependencies on on-premises applications have been eliminated and the applications can be safely decommissioned.

Before you plan your next data migration project, consider how Actian DataConnect can help you accelerate migration timelines and improve the likelihood of a successful and effective data migration project. To learn more, visit www.actian.com/dataconnect.


About Traci Curran

Traci Curran is Director of Product Marketing at Actian, focusing on the Actian Data Platform. With 20+ years in tech marketing, Traci has led launches at startups and established enterprises like CloudBolt Software. She specializes in communicating how digital transformation and cloud technologies drive competitive advantage. Traci's articles on the Actian blog demonstrate how to leverage the Data Platform for agile innovation. Explore her posts to accelerate your data initiatives.
Data Analytics

Data: The Beating Heart of Healthcare

Actian Corporation

August 4, 2021


Payers, health plans, and integrated health systems run on data. Integration, processing, and analytics are integral to their business processes and to managing costs and risk.

The role of payers in the healthcare industry has been expanding for years. As care models have shifted from fee-for-service to coordinated and value-based care, payers and providers alike have been generating—and relying upon—more and more data to run operations smoothly. This evolution had been underway for some time, but the Covid-19 pandemic abruptly put new pressure on the system, heightening the need and requirements for better interoperability and data sharing between payers, providers, members, and patients.

Once, payers focused primarily on claims processing, payments, and plan membership management. But that was in a simpler time. Now, payers are focused on managing the relationship between members and providers and on delivering better patient health outcomes for members. They’re also focused on delivering better experiences for members and providers alike as new competitors are appearing on the horizon from unexpected sectors. Large organizations like Walmart, Amazon, and Google are expanding their own operational profiles and starting to compete as payers in their own right. In some cases, these organizations are even taking on the role of healthcare providers and competing with traditional providers.

All these changes are putting considerable strain on payers. Add to this the erosion of commercial plan revenues, accelerated by Covid-19 through an uptick in self-care as well as increases in Medicaid enrollment and in consumer direct purchasing of plans through the Affordable Care Act exchanges, which has reduced revenues and disrupted financial forecasting. During 2020, between 35% and 45% of Americans delayed or opted not to receive care, citing numerous concerns, including the pandemic-driven layoffs, both temporary and long-term, that hit many workers whose healthcare was tied to their employment.

Driving Better Outcomes With Data

To meet these challenges and gain more predictability, lowering costs and risk while improving member and patient outcomes, organizations throughout the healthcare industry are turning to data and analytics. According to a Society of Actuaries (SOA) survey, more than 90% of payer and provider executives say predictive analytics is key to effective healthcare management.

However, the goal of integrating the systems housing the relevant data and then deriving actionable insight from an analysis of this consolidated pool of data can be costly and elusive. IT is often overburdened, already tasked with integrating cumbersome legacy systems and data sources, including claims systems, walled-in provider data (EMR/EHR), operational systems (ERP), member and patient engagement (CRM) systems, external data such as SDOH (social determinants of health), and new technologies such as wearables and virtual care systems. Nor are these long-needed integration projects the only demands weighing down IT. CMS and ONC are pressing ahead with industry mandates to open up data access through newly finalized rules for data sharing, not only for patients but across the healthcare ecosystem. Introduced under the 21st Century Cures Act, these new rules require the industry to enable member access to data and facilitate interoperability between payers and health systems. Deadlines for compliance begin in 2021.

While these rules will, in theory, benefit payers, members, and providers by ensuring greater and more standardized access to data across the healthcare ecosystem, the work to achieve compliance can almost completely tie up an organization’s IT resources. This can make it even more difficult for data analysts, actuaries, and fraud and claims analysts to dive into these deep pools of data and extract actionable insights, because consolidating and accessing all this data in advance of analysis still requires the assistance of IT, which now has even less bandwidth to support users.

But there is a solution that can accelerate the integration and data sharing goals facing IT and enable an organization’s data scientists and analysts to consolidate and analyze critical healthcare data themselves, without having to rely heavily on the support of the IT team.

Enter the Healthcare Data Analytics Hub

The Actian Healthcare Data Analytics Hub enables payers, providers, and others in the healthcare ecosystem to gain greater insights and drive better outcomes with data. This SaaS-based hub includes native integration with the powerful data extraction, transformation, and load (ETL) features of Actian DataConnect, which enable an organization to automate and accelerate the process of accessing, pulling, and qualifying relevant data from a wide range of internal and external systems. Because connectors have already been built to link many of the systems and data repositories that healthcare organizations use, the integration work that IT has already begun becomes dramatically easier to complete. Similarly, the IT team can build the data access APIs described by CMS and ONC right in the Data Analytics Hub, ensuring that data can be shared securely and that member data sharing preferences can be both tracked and enforced across the healthcare data fabric, regardless of source or destination. The Actian Healthcare Data Analytics Hub also helps the IT organization keep the systems and networks that share and protect personal data compliant with constantly evolving HIPAA and other regulatory requirements.

Beyond helping healthcare IT organizations to accelerate the completion of their integration and data sharing projects, the Data Analytics Hub can all but eliminate the need for an organization’s end users to petition IT for support when striving to perform their analytics. Self-service, native integration tools in the Actian Data Platform make it easy for healthcare data analysts, actuaries, provider network, revenue cycle management, and fraud and claims analysts to access and integrate data sets on their own, without help from IT—and without impacting the existing systems of record. The Actian Data Platform supports all popular analytic and visualization tools, so users can use the tools with which they are already familiar to gain the insights needed to automate and optimize processes, improve outcomes, and drive a more satisfying experience for all parties involved.


The result? Having an intelligent, end-to-end healthcare data analytics hub enables IT organizations to accelerate completion of the critical integration and data access projects they are already committed to completing. It also empowers an organization’s data scientists and analysts to perform the critical work that they have been tasked to perform by removing the roadblocks preventing them from gaining the insights they are trying to discover and act upon. An Actian Healthcare Data Analytics Hub enables an organization to shift from siloed models of business and operations to models that are forward-looking and collaborative. An organization can better manage the move to value-based care, find ways to compete more effectively against new competitors (many of whom are less burdened with a legacy IT infrastructure and can move nimbly in seizing new opportunities), improve the experiences of and alignment with members and partners alike, and, ultimately, improve outcomes, save time, and reduce costs.

About Actian

Actian has helped hundreds of payers, providers, clearinghouses, and healthcare technology organizations automate data integration, processing, and analysis, enabling them to capitalize on real-time analytics to derive essential insights and automate processes in an ever-evolving healthcare data landscape. Actian helps these organizations apply analytics and data processing to address a breadth of use cases from claims processing to population health insights and compliance.


About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.
Data Management

The Top 10 Benefits of an Operational Data Warehouse for 2021

Actian Corporation

August 1, 2021

operational data warehouse

The previous blogs in this series discussed the top 5 pitfalls of traditional operational data warehouses and defined the Operational Data Warehouse (ODW) as a potential solution. Below is a list of my Top 10 desirable benefits of an effective ODW:

Current. Continuous data updates via “micro-batches” or streamed singleton updates throughout the day provide the most current information for analytics-based decision-making.

Fast. Changes to ODW data need to be made with the lowest possible performance penalty. Columnar data blocks that maintain min-max value metadata eliminate the overhead of maintaining the indexes that traditional row-based databases must update with every change. The ability to make better business decisions faster can translate into multiple data warehouse benefits.

Scalable. An effective enterprise data warehouse must be scalable in two dimensions. Vertical scalability enables workloads to take advantage of more CPU and storage capacity on a single system. When you have saturated the hardware capacity of a single system, the ability to scale horizontally to a cluster of systems provides the ability to grow the ODW to handle larger databases and more users. The ability to increase capacity as demand grows is a key advantage of a modern data warehouse.

Secure. The explosive growth of cybercrime and increased regulation of data privacy means that even “internal” systems must be secured. A good ODW must offer built-in support for advanced encryption, auditing, role-based security and data masking.

Flexible. The days when an organization could standardize on a single computing platform are over. The ODW needs to offer the flexibility to be deployed on-premises (on Linux, Windows, or Hadoop Clusters) or in the cloud (on AWS, Microsoft Azure and beyond).

Consistent. Some databases sacrifice query integrity for speed. A good ODW needs to provide row-level locking and full read consistency for running queries even as the underlying data changes.

Robust. A key advantage of a modern data warehouse is the ability to deliver enterprise-level resiliency and manageability. This translates to having an ODW with solid back-up, recovery, failover and replication capabilities.

Economical. Several factors can affect the total cost of ownership (TCO) for a specific database technology being used to support a particular business case. One is the ability to run on standard servers, avoiding esoteric appliances. Others include offering flexible deployment models to match different business needs, flexibility to scale up and down according to performance requirements, and the option to use different sized components (compute, storage) to optimize operating efficiencies.

Interoperable. A good ODW needs to provide open application programming interfaces (APIs) such as those that support Open Database Connectivity (ODBC) and American National Standards Institute Structured Query Language (ANSI SQL). These are necessary to enable the data warehouse to work with the multitude of query tools an organization might use. Many organizations use more than 20 different visualization and query tools.

Connected. The ability to ingest data at high speed is a critical ODW requirement. If you cannot load your data in a reasonable time, the result is having to work with summary data or, worse, stale data.
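The min-max block metadata mentioned under “Fast” above is what lets a columnar engine skip whole blocks during a range scan instead of maintaining per-row indexes. A simplified sketch of the idea:

```python
def build_blocks(values, block_size=4):
    """Partition a column into blocks, recording each block's min and max
    (the metadata a columnar store keeps instead of a row-level index)."""
    blocks = []
    for i in range(0, len(values), block_size):
        chunk = values[i:i + block_size]
        blocks.append({"min": min(chunk), "max": max(chunk), "rows": chunk})
    return blocks

def scan(blocks, lo, hi):
    """Read only blocks whose [min, max] range overlaps [lo, hi]; skip the rest."""
    hits, skipped = [], 0
    for b in blocks:
        if b["max"] < lo or b["min"] > hi:
            skipped += 1  # block provably contains no matches; never read its rows
            continue
        hits += [v for v in b["rows"] if lo <= v <= hi]
    return hits, skipped

blocks = build_blocks([1, 3, 2, 4, 10, 12, 11, 13, 20, 22, 21, 23])
hits, skipped = scan(blocks, 10, 13)
print(hits, skipped)
```

Because the metadata is updated only when a block changes, writes stay cheap, yet range queries still avoid touching most of the data.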

I would be very interested to hear which benefits you value most, and whether there are others I should have included. Email me at Pradeep.bhanot@actian.com if you would like to share your views.
