How a modern time-series database can simplify and enhance time-series data analysis
There are many Operational Technology (OT) environments within manufacturing, oil and gas, engineering research, and countless other industries where complex equipment, machinery, and networks of sensors and devices generate time-series data. These time-series streams range from sensor readings for pressure, volume, and temperature to video streams for machine vision and surveillance. Initially, these streams were discarded or sampled at low, periodic rates. As time-series streams grew in volume and local data processing came to incorporate reconciliation of multiple feeds, OT engineers began to build systems for data collection, aggregation, and minimal processing to better handle these streams. Eventually, these proprietary and bespoke systems were collectively labeled Data Historians.
A library for more than just the librarians?
How OT data is used, and who must use it, has changed considerably in the past few years. Increasingly, OT data is leveraged by a host of other players within an organization, including the developers, business analysts, and data scientists supporting the OT, and the product/service managers driving the business. However, Data Historians were never designed for use with a range of external systems; they were, quite literally, libraries of data collected by and solely for the use of OT professionals. Why? Because OT professionals built them from the ground up, directly or by proxy through vendors of manufacturing or other specialized equipment. In essence, they're libraries built only for the librarians.
Many of these systems were built from the ground up on expensive legacy hardware, where resource constraints and a lack of standards meant that functionality was pared down and focused only on the localized, immediate requirements of the OT infrastructure and process at hand. The result is that these systems are not easily extended, whether for localized analytics and visualization, for sharing data across local systems, or for easily and securely exchanging data with modern backend systems for further analytics and visualization.
Technology that empowers historical data to shape the future
As with any other part of the business and IT industries, the technology for data management is continuously evolving, with new capabilities emerging every day. Currently, three primary technology shifts are combining to move beyond the capabilities and expected outcomes of Data Historian software.
- Modern Time-Series Databases capture multi-modal data
Outside of the OT domain, the rest of your company's data is stored in standard databases and data warehouses. Data Historians focused on capturing largely structured data in time-series formats; today's data is a vast superset of what those legacy systems captured. Modern time-series databases cover traditional time-series capabilities; however, they are also designed and optimized to capture data chronology and to ingest unstructured and multi-variate streaming data, including binary large objects and JSON documents, while adhering to standards for the latest in IoT connectivity.
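To make the multi-modal idea concrete, here is a minimal, hedged sketch in Python. SQLite stands in for a modern time-series database, and the table, column names, and sensor values are all invented for illustration; the point is simply that one time-indexed table can hold a scalar reading, a semi-structured JSON payload, and a binary blob side by side.

```python
import json
import sqlite3

# Illustrative sketch only: SQLite stands in for a modern time-series
# database; the schema and data below are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sensor_events (
        ts       TEXT NOT NULL,   -- ISO-8601 timestamp (the chronology)
        sensor   TEXT NOT NULL,
        reading  REAL,            -- structured scalar, e.g. pressure
        payload  TEXT,            -- semi-structured JSON document
        frame    BLOB             -- binary data, e.g. a video frame
    )
""")

# One event mixes all three kinds of data.
conn.execute(
    "INSERT INTO sensor_events VALUES (:ts, :sensor, :reading, :payload, :frame)",
    {
        "ts": "2024-01-01T00:00:00Z",
        "sensor": "pump-7",
        "reading": 101.3,
        "payload": json.dumps({"unit": "kPa", "status": "ok"}),
        "frame": b"\x00\x01\x02",
    },
)

# The JSON payload remains queryable after ingestion.
row = conn.execute(
    "SELECT reading, json_extract(payload, '$.unit') FROM sensor_events"
).fetchone()
print(row)  # (101.3, 'kPa')
```

A dedicated time-series engine would add chronology-aware storage and IoT ingestion protocols on top of this basic shape, but the co-location of structured, semi-structured, and binary data is the essential contrast with a classic Data Historian.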
- Ad-hoc Analysis and Reporting allows everyone to “checkout” the right data
Data Historians tend to have NoSQL APIs, which are great for data collection and local data management but not readily accessible for post-collection ad hoc analysis and reporting, particularly by business analysts and data scientists outside the OT domain. Modern time-series databases provide both a NoSQL API and a standard ANSI SQL-compliant API for extracting data to support remote ad hoc analysis, reporting, and visualization through widely used business intelligence and reporting tools that rely on standard IT connectivity mechanisms such as JDBC and ODBC.
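The value of a SQL-compliant API is that an analyst can ask an ad hoc question without any OT-specific tooling. A hedged sketch, again using SQLite as a stand-in for any SQL store reachable over JDBC/ODBC, with an invented schema and readings:

```python
import sqlite3

# SQLite stands in here for any ANSI SQL-compliant time-series store;
# the table and sample readings are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts TEXT, sensor TEXT, pressure REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [
        ("2024-01-01T00:05:00Z", "pump-7", 101.0),
        ("2024-01-01T00:25:00Z", "pump-7", 103.0),
        ("2024-01-01T01:10:00Z", "pump-7", 108.0),
        ("2024-01-01T01:40:00Z", "pump-7", 110.0),
    ],
)

# An analyst's ad hoc question: hourly average pressure per sensor.
# substr(ts, 1, 13) truncates the ISO timestamp to its hour bucket.
rows = conn.execute("""
    SELECT substr(ts, 1, 13) AS hour, sensor, AVG(pressure)
    FROM readings
    GROUP BY hour, sensor
    ORDER BY hour
""").fetchall()
for hour, sensor, avg_p in rows:
    print(hour, sensor, avg_p)
```

The same query, issued from a BI tool over ODBC or JDBC, is what turns a closed historian into a shared library of data.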
- Enabling history to support predicting the future with Artificial Intelligence
Data Historians gave operations managers in the field the ability to catch problems with their infrastructure: the pressure is too high, a part has failed, and so forth. But these alerts were always after the fact; collection and processing speed determines how quickly afterwards, but always in hindsight. Artificial Intelligence (AI) built on modern machine-learning (ML) capabilities can deliver alerts that are more insightful and, depending on the combinations of data, past patterns, and the ability to analyze them, increasingly predictive, so that you can determine when a part will fail before it does. Modern integrated time-series databases can support AI/ML capabilities locally, at the point of action within the OT domain, by integrating OT with backend IT. The result is that AI/ML models can be crafted on backend IT systems by data scientists and engineers, then transported and run in the OT environment by developers and front-end OT engineers, providing a new and modern way of interacting with your company's data to generate improved outcomes.
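The shift from hindsight to foresight can be illustrated with a deliberately simple sketch: fit a linear trend to recent readings and extrapolate when a failure threshold will be crossed. Real deployments would use trained ML models; the threshold, readings, and helper names below are hypothetical.

```python
# Hedged sketch of predictive alerting: ordinary least-squares trend
# extrapolation. All values and function names are invented for illustration.

def fit_trend(times, values):
    """Least-squares slope and intercept for values over times."""
    n = len(times)
    mean_t = sum(times) / n
    mean_v = sum(values) / n
    slope = (
        sum((t - mean_t) * (v - mean_v) for t, v in zip(times, values))
        / sum((t - mean_t) ** 2 for t in times)
    )
    return slope, mean_v - slope * mean_t

def time_at_threshold(threshold, times, values):
    """Extrapolate the fitted trend to the time `threshold` is reached."""
    slope, intercept = fit_trend(times, values)
    if slope <= 0:
        return None  # no upward drift: no predicted crossing
    return (threshold - intercept) / slope

# Hourly pressure readings (kPa) drifting steadily upward.
hours = [0, 1, 2, 3, 4]
pressure = [100.0, 102.0, 104.0, 106.0, 108.0]
eta = time_at_threshold(120.0, hours, pressure)
print(f"threshold predicted at t = {eta:.1f} h")  # t = 10.0 h
```

A historian would raise its alert at the moment pressure exceeded 120 kPa; the predictive version warns hours in advance, which is the practical difference the paragraph above describes.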
Respect the legacy, but move towards the future
Data historian solutions have been crucial to the evolution of OT and the IT industry since the 1980s and earlier, and their contributions should be acknowledged and respected. Their time has passed, however, and modern technology solutions are replacing them. These allow you to better manage the data your company needs today and have faster, more complete, and more accurate information insights for the future.
Actian is the industry leader in operational data warehouse and Edge data management solutions for modern businesses. With a complete set of solutions to help you manage data on-premises, in the cloud, and at the Edge with mobile and IoT, Actian can help you develop the technical foundation needed to support true business agility. To learn more, visit www.actian.com.