
It’s Time for Data Historians to Become … History


Data Historians… History?

Why a modern time-series capable database can both simplify and enhance time-series data analysis.

Despite the professorial image the term suggests, a data historian is not an instructor or researcher but a purpose-built software solution. And evolution in how operational data is used and managed has eclipsed the need for dedicated data historian software solutions.

What is a data historian?

There are many Operational Technology (OT) environments within manufacturing, oil and gas, engineering research, and countless other industries. In these environments, complex equipment, machinery, and networks of sensors and devices generate time-series data. These time-series streams range from sensor data representing pressure, volume, and temperature to video streams for machine vision and surveillance.

Initially, these streams were ignored or sampled only at low periodic rates. As time-series streams increased in volume and local data processing began to reconcile multiple feeds, OT engineers built systems for data collection, aggregation, and minimal processing to better handle these streams. Eventually, these proprietary, bespoke systems were collectively labeled data historians.
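The aggregation step described above can be sketched in miniature: down-sample a raw sensor stream into fixed time windows before storage. The readings, window size, and names below are illustrative, not drawn from any particular historian product.

```python
from statistics import mean

# Hypothetical raw sensor readings: (seconds_since_start, pressure_psi) pairs.
readings = [(0, 101.2), (1, 101.4), (2, 101.1), (3, 102.0),
            (4, 101.8), (5, 101.9), (6, 103.5), (7, 103.2)]

def downsample(samples, window_seconds):
    """Aggregate raw samples into fixed windows, keeping each window's mean."""
    buckets = {}
    for t, value in samples:
        buckets.setdefault(t // window_seconds, []).append(value)
    return [(w * window_seconds, mean(vals))
            for w, vals in sorted(buckets.items())]

# Collapse eight one-second samples into two four-second windows.
print(downsample(readings, 4))
```

Early historians performed essentially this kind of reduction so that resource-constrained local systems could keep up with the stream.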

The data historian process gap

The use and users of OT data have both changed considerably in recent years. Increasingly, OT data is leveraged by a host of players within an organization beyond OT professionals. These newer users include developers, business analysts, and data scientists supporting the OT, and product and service managers driving the business.

However, no data historian software solution was ever designed for use with a range of external systems or by users who were not OT professionals. Instead, the typical data historian platform was little more than a library of data collected by and intended only for the use of OT professionals. OT teams typically built each data historian solution from the ground up, either directly or by proxy through vendors of manufacturing or other specialized equipment. In essence, data historian solutions are libraries built only for the librarians.

In addition, much data historian software was implemented on expensive legacy hardware. Resource constraints and lack of standards meant that functionality was pared down and focused only on the localized and immediate requirements of the OT infrastructure and process at hand. The result is that data historian software solutions are not easily extended for functions such as localized analytics and visualization or sharing data across local systems. It is also difficult or impossible for the typical data historian platform to easily and securely exchange data with modern backend systems for further analytics and visualization.

Technology that empowers historical data to shape the future

As in every other part of the business and IT industries, data management technology is continuously evolving, with new capabilities emerging every day. Currently, three primary technology shifts are combining to move beyond the capabilities and expected outcomes of data historian software.

Modern time-series databases:  beyond the data historian

Outside of the OT domain, the rest of your company data is likely stored in traditional relational databases and data warehouses. Data historian solutions were focused on capturing largely structured data in time-series formats. Today’s data is a vast superset of the data captured by these legacy systems.

Modern time-series databases include traditional time-series data capabilities. However, those modern solutions are also designed and optimized for capturing data chronologically and for ingesting unstructured and multi-variate streaming data sources. These can range from binary large objects (BLOBs) and data compliant with the JavaScript Object Notation (JSON) open standard to the latest in Internet of Things (IoT) connectivity.
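As an illustration of that kind of ingestion, the sketch below flattens a JSON device payload into time-stamped rows, the shape a time-series database typically stores. The payload structure, device name, and metric names are hypothetical.

```python
import json

# Hypothetical JSON payload from an IoT gateway; field names are illustrative.
payload = '''{
  "device": "pump-7",
  "ts": 1700000000,
  "metrics": {"pressure_psi": 101.4, "temp_c": 58.2, "volume_l": 12.7}
}'''

def to_rows(raw):
    """Flatten one JSON document into (timestamp, series, value) rows."""
    doc = json.loads(raw)
    return [(doc["ts"], doc["device"] + "." + name, value)
            for name, value in doc["metrics"].items()]

for row in to_rows(payload):
    print(row)
```

The point is that the payload's nested, semi-structured shape never has to be forced into a rigid schema up front; the database handles the mapping at ingest time.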

Ad-hoc analysis and reporting: the right data for everyone

Data historians tend to rely upon NoSQL application programming interfaces (APIs). These store and access data as key-value pairs rather than in the rows and columns of traditional databases. NoSQL APIs are great for data collection and local data management. However, they are not readily accessible for post-collection ad hoc analysis and reporting – particularly by business analysts and data scientists outside the OT domain.
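A minimal sketch makes the key-value limitation concrete; the store layout and key format here are illustrative, not any vendor's API.

```python
# Conceptual key-value store of the kind a NoSQL historian API exposes;
# keys and values below are illustrative.
store = {
    "pump-7:pressure:1700000000": 101.4,
    "pump-7:pressure:1700000060": 101.9,
}

# Fast when you know the exact key...
print(store["pump-7:pressure:1700000000"])

# ...but an ad hoc question ("average pressure for pump-7?") forces a full
# scan plus hand-written filtering logic, which is what SQL access avoids.
values = [v for k, v in store.items() if k.startswith("pump-7:pressure:")]
print(sum(values) / len(values))
```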

Modern time-series databases provide both a NoSQL API and APIs compliant with the American National Standards Institute (ANSI) Structured Query Language (SQL) standard. The latter feature enables easy extraction of data to support remote ad-hoc analysis, reporting and visualization through widely used business intelligence and reporting tools that rely on standard IT connectivity mechanisms such as Java Database Connectivity (JDBC) and Open Database Connectivity (ODBC).
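To illustrate what SQL access buys, the sketch below runs an ad hoc aggregation in plain ANSI SQL. Python's built-in sqlite3 stands in for any SQL-capable time-series database, and the table and column names are hypothetical.

```python
import sqlite3

# sqlite3 here is a stand-in for any database exposing an ANSI SQL interface;
# the schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts INTEGER, device TEXT, pressure REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", [
    (1700000000, "pump-7", 101.4),
    (1700000060, "pump-7", 103.9),
    (1700000000, "pump-8", 99.1),
])

# An ad hoc question a business analyst might ask, expressed in plain SQL
# rather than through a vendor-specific collection API:
for device, avg_p in conn.execute(
        "SELECT device, AVG(pressure) FROM readings GROUP BY device"):
    print(device, round(avg_p, 2))
```

Any BI or reporting tool that speaks JDBC or ODBC can issue the same query without knowing anything about how the data was collected.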

Artificial intelligence (AI): enabling history to support predicting the future

Traditional data historian solutions can enable operations managers in the field to catch problems with their infrastructure, such as when pressure is too high or a part has failed. But these alerts always come after the fact. The collection and processing speed of the specific data historian solution determines how quickly afterwards, but hindsight is always the default.

AI, powered by modern machine learning (ML) capabilities, can deliver alerts that are more insightful. Depending on the combinations of data, past patterns, and the ability to analyze them, AI-driven successors to data historian solutions can even deliver predictive guidance about when a part is likely to fail.  Modern, integrated time-series databases can support AI and ML capabilities locally at the point of action within the OT domain by integrating OT with backend IT. The result is that data scientists and engineers can craft AI and ML capabilities for backend IT systems. Developers and front-end OT engineers can then invoke those capabilities in the OT environment. This approach provides a new and modern way of interacting with your company’s data to generate more useful insights and improved outcomes.

Respect the legacy, but move into the future

Data historian solutions have been crucial to the evolution of OT and the IT industry since the 1980s and earlier, and their contributions should be acknowledged and respected. Their time has passed, however, and modern technology solutions are replacing them. These allow you to better manage the data your company needs today and have faster, more complete, and more accurate information insights for the future.

Actian is the industry leader in operational data warehouse and edge data management solutions for modern businesses. With a complete set of solutions to help you manage data on-premises, in the cloud, and at the edge, including mobile and IoT devices, Actian can help you develop the technical foundation you need to support true business agility. To learn more, visit www.actian.com.

About Lewis Carr

Senior product marketing, product management, and business development professional with strategic experience across vertical industries and horizontal solutions, focused on enterprise software, including data management and analytics, mobile and IoT, and distributed cloud computing.