
The Data Challenges of Telemetry

Actian Corporation

September 7, 2022


Telemetry is the automated communications process by which measurements are made and data collected at remote points. The data is then transmitted to receiving equipment for monitoring. The word ‘telemetry’ is derived from Greek roots: tele = remote, and metron = measure.

Telemetry is not a new concept, that's for sure. We've been watching telemetry at work for decades: transmitters strapped onto migrating animals, weather buoys, seismic monitoring stations, and so on. However, the use of telemetry continues to accelerate, and this technology poses huge challenges for those of us responsible for data collection, data integration, and data analysis.

The most recent rise of telemetry centers on the new, inexpensive devices we now employ to gather all kinds of data. These range from the Fitbits that seem to be attached to everyone these days to count the steps we take, to smart thermostats that monitor temperature and humidity, to the engine-health information our automobiles report.

The rise of the “Internet of Things” is part of this as well. This is a buzzword invention of an industry looking to put a name to the rapid appearance of many devices that can produce data, as well as the ability of these devices to self-analyze and thus self-correct. MRI machines in hospitals, robots on factory floors, as well as motion sensors that record employee activity, are just a few of the things that are now spinning off megabytes of data each day.

Typically, this type of information flows out of devices as streams of unstructured data. In some cases, the data is persisted at the device, and in some cases not. In any event, the information needs to be collected, put into an appropriate structure for storage, perhaps combined with other data, and stored in a transactional database. From there, the data can be further transferred to an analytics-oriented database, or analyzed in place.
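As a minimal sketch of that collect-and-structure step: the function below turns one raw reading into a record ready for a transactional store. The `device_id|metric|value` wire format is purely hypothetical; every real device has its own payload layout.

```python
import json
from datetime import datetime, timezone

def structure_reading(raw: str) -> dict:
    """Parse one raw, device-formatted reading into a structured record.

    The 'device_id|metric|value' format here is a made-up example of an
    unstructured device payload, not any particular protocol.
    """
    device_id, metric, value = raw.strip().split("|")
    return {
        "device_id": device_id,
        "metric": metric,
        "value": float(value),
        # Stamp arrival time so downstream analytics can order readings.
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

# A raw stream line becomes a row ready for storage:
record = structure_reading("thermo-042|humidity|48.5")
print(json.dumps(record, indent=2))
```

From here, the structured record could be inserted into a transactional table or forwarded to an analytics store, as described above.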

Problems arise when it comes time to deal with that information. Obviously, data integration is critical to most telemetry operations. The information must be managed from point to point and then persisted within transactional or analytics databases. While this is certainly something we've done for some time, the volume of information these remote devices spin off is new, and so is the need to manage that growing volume effectively.

Take the case of the new health telemetry devices that are coming onto the market. They can monitor most of our vitals, including blood pressure, respiration, oxygen saturation, and heart rate, at sub-second intervals. These sensors typically transmit the data to a smartphone, where the information is formatted for transfer to a remote database, typically in the cloud.

The value of this data is very high. By gathering this data over time, and running analytics against known data patterns, we can determine the true path of our health. Perhaps we will be able to spot a heart attack or other major health issues before they actually happen. Or, this information could lead to better treatment and outcome data, considering that the symptoms, treatment, and outcomes will now be closely monitored over a span of years.

While the amount of data was relatively reasonable in the past, the number of data points and the frequency of collection are exploding. It’s imperative that we figure out the best path to data integration for the expanding use of telemetry. A few needs are certain:

  • The need to gather information from hundreds, perhaps thousands, of data points/devices at the same time. Thus, we have to identify the source of the data, as well as how it should be managed in flight and once stored at a target.
  • The need to deal with megabytes, perhaps gigabytes of data per hour coming off a single device, where once it was only a few kilobytes. Given the expanding number of devices (our previous point), the math is easy. The amount of data that needs to be transmitted and processed is exploding.
  • The massive amounts of data will drive some data governance and data quality issues that must be addressed at the data integration layer. Data is typically not validated when it’s generated by a device, but it must be checked at some point. Moreover, the complexity of these systems means that the use of data governance approaches and technology is an imperative.
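On that last point, a data-quality check at the integration layer can be as simple as the sketch below. The field names and the plausibility window are illustrative assumptions; each deployment defines its own validation rules.

```python
def validate_reading(record: dict) -> list:
    """Return the data-quality problems found in one telemetry record.

    The checks below are examples only -- real rules come from the
    governance policies of the specific deployment.
    """
    problems = []
    if not record.get("device_id"):
        problems.append("missing device_id")
    value = record.get("value")
    if value is None:
        problems.append("missing value")
    elif not -100.0 <= value <= 1000.0:  # hypothetical plausibility window
        problems.append("value outside plausible range")
    return problems

# A clean record passes; malformed ones are flagged before storage.
print(validate_reading({"device_id": "hr-07", "value": 72.0}))  # -> []
print(validate_reading({"device_id": "", "value": 99999.0}))
```

Since devices rarely validate their own output, running checks like these in flight keeps bad readings from polluting the transactional or analytics store.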

This is exciting stuff, if you ask me. We’re learning to gather the right data, at greater volumes, and leverage that data for more valuable outcomes. This data state has been the objective for years, but it was never really obtainable. Today’s telemetry advances mean we have a great opportunity in front of us.


About Actian Corporation

Actian empowers enterprises to confidently manage and govern data at scale. Actian data intelligence solutions help streamline complex data environments and accelerate the delivery of AI-ready data. Designed to be flexible, Actian solutions integrate seamlessly and perform reliably across on-premises, cloud, and hybrid environments. Learn more about Actian, the data division of HCLSoftware, at actian.com.

The Benefits and Challenges of Data for the Banking Industry

Actian Corporation

September 6, 2022


Banks and insurance companies have one thing in common: they collect massive amounts of customer data. Due to rising customer expectations and increased competition from Fintech players, the financial services industry simply cannot afford to let the data it collects go unused. Here's why.

The financial services industry has invested heavily in data collection and processing technologies for over a decade. This reality is expected to grow even more with the growth of digital consumer habits and the emergence of new forms of competition. Banks and insurers must leverage existing and future datasets to maximize their understanding of customers and gain a competitive advantage.

The Impact of Digital Transformation in the Banking Industry

Disciplines such as Finance Data or Finance Data Analytics are widely employed in the banking industry to calculate risks, detect fraud, limit losses, and maximize gains. According to a study conducted by IDC, by 2025, the volume of data to be analyzed in the banking sector could reach 163 billion terabytes. Payment card use, the proliferation of banking services, online transactions, digitized salary payments, online access to personal accounts, insight into customers' spending habits: banks hold a considerable amount of information about their customers. The banking world is undergoing massive digital transformation efforts.

According to a study conducted by the Autorité de contrôle prudentiel et de résolution (ACPR), there are several reasons for these efforts. First, there is a need for easy-to-access, multi-channel digital tools that allow for seamless and perfectly secure customer journeys. Secondly, there is a need for immediacy and flexibility in customer relations. And finally, there is a need to personalize the service delivered to customers, allowing them to become autonomous. All of these needs can be met through the use of data.

The Benefits of Data for Banks

The urgency to carry out this digital transformation, however, is amplified by various structural elements that affect the banking market. Between the evolution of cryptocurrencies, the emergence of NFTs, and the appearance of new models favoring new forms of competition with open banking and Fintech, traditional banks must not only rethink their offers, methods, and organization but also their territorial network, to maximize proximity with customers, minimize operating costs, and offer a differentiating customer experience.

However, the use of data in the banking sector is not limited to customer relations.

Through data science, the financial services industry is undergoing a genuine disruption. By analyzing data, companies can extract valuable information through mathematical and statistical techniques. Finance Data not only allows companies to better understand their customers in order to deliver better service offers but also optimizes the profits of the banking sector – at a time when its business model is being challenged by the emergence of new players such as digital banks and Fintech. Data is also used in activities such as high-frequency trading.
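As a flavor of the statistical techniques mentioned here, the sketch below flags transaction amounts that sit far from the rest of the distribution. It is a deliberately simple stand-in for the fraud-detection models banks actually run, not a description of any bank's method.

```python
import statistics

def flag_outliers(amounts, threshold=3.0):
    """Flag amounts more than `threshold` population standard deviations
    from the mean -- a toy example of statistical anomaly detection.
    """
    mean = statistics.mean(amounts)
    spread = statistics.pstdev(amounts)
    if spread == 0:
        return []  # all amounts identical: nothing stands out
    return [a for a in amounts if abs(a - mean) / spread > threshold]

# Fifty-one ordinary card payments and one wildly atypical transfer:
transactions = [12.50, 40.00, 9.99] * 17 + [25_000.0]
print(flag_outliers(transactions))  # -> [25000.0]
```

Real systems layer far richer features (merchant, geography, timing) onto this basic idea, but the principle of scoring each observation against the population is the same.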

The Stakes of Data for Banks

According to Verizon's 2022 Data Breach Investigation Report, 95% of the data compromises identified in the report are financially motivated. In this context, the financial services industry is, by its very nature, a target for criminal organizations. As a designated target for cyber threats, the banking sector is also the object of particular vigilance on the part of the authorities, particularly with regard to compliance with the provisions of the GDPR. The first challenge of the banking sector in relation to data is indeed that of security and compliance. But it is not the only one.

Indeed, the very nature of banking activity allows the collection and aggregation of large volumes of data that can be of a heterogeneous nature. The banking world is therefore faced with major challenges in terms of data governance, data quality, and the continuous optimization of data assets.


Scalability is All About Getting a Smart Start

Actian Corporation

September 6, 2022


Scalability is, for the most part, a peek into the future. That's why organizations often bring it up when discussing growth and expansion opportunities – whether it's scaling up operations to keep pace with increased customer demand or adapting to a new industry-disrupting technology.

Scalability is particularly important to business leaders right now, as the technology landscape is evolving at breakneck speed. A recent McKinsey study found that over 50% of companies think that scaling their business is a top priority, but just 22% have scaled successfully within the past ten years.

What's often not discussed, however, is the difficulty of growing new projects from the ground up and the pain points encountered when integrating new datasets. This requires careful consideration and proactive planning for growth early on, as opposed to reactively re-tooling systems and applications to meet unexpected shifts.

Business leaders are faced with many challenges, from internal business complexities to external market forces. It’s possible to offset these hurdles by understanding data’s role in enabling scalability and integrating technology into enterprise stacks.

Scalability and What it Means Today

What does scalability mean for an enterprise? Scalability is defined as a measure of how performance changes as available resources or volume of input data changes. A truly scalable solution exhibits a proportional change in performance in response to a change in either of those two variables.

When adding resources to a system, there are two directions to consider:

  • Scaling up refers to increasing the available resources on a single node of the system. This is the traditional model of adding more CPUs, more disk, or more memory to a machine to increase performance.
  • Scaling out refers to increasing available system resources by adding additional nodes. This is the typical distributed cluster model: enlarging the cluster to increase performance.
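Whichever direction you scale, the definition above suggests a simple way to measure how close a system comes to ideal, proportional scaling. The function and the throughput figures below are illustrative, not an Actian-specific formula.

```python
def scaling_efficiency(base_throughput: float,
                       scaled_throughput: float,
                       resource_factor: float) -> float:
    """Ratio of observed gain to ideal (linear) gain.

    1.0 means performance grew in exact proportion to the added
    resources; values below 1.0 indicate sub-linear scaling.
    """
    return scaled_throughput / (base_throughput * resource_factor)

# Hypothetical example: doubling the cluster (scale-out) took a system
# from 10,000 to 18,000 queries/sec -- 90% of the ideal linear gain.
print(scaling_efficiency(10_000, 18_000, 2))  # -> 0.9
```

Tracking this ratio as nodes or resources are added shows early whether a solution is truly scalable or merely bigger.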

Be it scaling up or scaling out, successful business leaders have an intrinsic understanding of what they can gain through strategic planning and preparation for future growth. Once a clear understanding of the direction of their scalability goals has been established, organizations can move forward and build an enterprise with a solid foundation that can grow as the business does.

Typically, when scaling a new initiative, enterprises can struggle with securing IT and cloud operations, and getting broad buy-in from leadership. To overcome these challenges, ambitious organizations and IT leaders should move their operational workloads to the cloud and offer support for cloud, multi-cloud, or hybrid deployments. This takes a technical backbone that’s built on a foundation of streamlined and democratized data for analysis across business stakeholders.

Building for Success From the Ground Up

Building this foundation requires having systems in place that make data easily accessible across departments. Gartner recently identified data-sharing and data analysis for business use as one of the top trends in data management, signaling that most enterprises have accepted that data is the lifeblood of their business.

Oftentimes, this data is stored in silos that are not only hard to break down but are also disparate from each other, which makes them difficult to access across the enterprise. The Actian Data Platform gives data managers the ability to knock down these silos and free the data within them to help drive their business forward.

With this platform, business technologists can leverage the power of the cloud to scale to larger projects and a greater number of concurrent ones, enabling them to iterate and innovate faster. Cloud connectivity helps businesses avoid the incomplete and inaccurate insights that come from spreadsheet sprawl and from data stranded in legacy warehouses or unmanaged data lakes.

This platform empowers business stakeholders to harness real-time data and analytics and get the maximum value out of their data pipelines. Faster innovation and agility ultimately make scaling easier – all with minimal risk or upfront investment into IT resources, CAPEX, and additional training.

The successful scaling of a project or initiative relies on the foundation it’s built on. Actian helps enterprises better understand their business, reduce costs, demonstrate value, and make decisions faster.
