Some exciting technology trends are emerging that are projected to hit the mainstream over the next few years and will have a significant impact on your data management systems. Will your integration platform be ready to support these new trends? If not, now is the time to act so you will be prepared to support a new wave of business capabilities your company will need to succeed.

The IT industry is changing rapidly, and there are four key emerging technology trends that data management and IT professionals should be monitoring closely.

  1. Cloud-Native Architectures – Companies are rapidly shifting from home-grown systems to cloud services, both platforms and SaaS. These cloud services leverage cloud-native architectures that are often highly distributed, leverage parallel processing, involve non-relational data models, and can be spun up or shut down in a matter of seconds. Integrating data from these systems can be challenging for legacy data integration systems that require manual configuration of each data connection.

Your integration platform needs to be able to recognize and adapt to these cloud-native architectures and enable your business and IT teams to make frequent changes to the application landscape while maintaining the integrity and security of underlying enterprise data assets.

  2. Event-Driven Applications – Traditional IT applications were built around structured workflows that were well-defined, much like a novel. Modern “event-driven” applications are more like a “choose your own adventure” book, where the end-to-end transaction flow may not be pre-defined at all. Events and data are evaluated, leading to dynamic workflows emerging based on the needs of the individual transaction. Many cloud-based container apps and functions are being used to deploy capabilities this way.

The challenge event-driven applications pose to data management is that they lack the data context that traditional application workflows provide. Context is derived from the series of events and actions that led to the current point in time. Your integration platform will need to understand and be able to support the unique nuances of these event-driven applications and contextualize the data they produce differently.
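The contrast with a fixed workflow can be sketched in a few lines of code. Below is a minimal, hypothetical event dispatcher: handlers are registered per event type, and the transaction path emerges from whichever events actually arrive. All names and event types are invented for illustration.

```python
# Minimal event-driven dispatch: the workflow is not pre-defined;
# it emerges from the sequence of events that actually occur.
handlers = {}

def on(event_type):
    """Register a handler for a given event type."""
    def register(fn):
        handlers[event_type] = fn
        return fn
    return register

@on("order_placed")
def reserve_inventory(event):
    return f"reserved {event['sku']}"

@on("payment_failed")
def retry_payment(event):
    return f"retrying payment for {event['order_id']}"

def dispatch(event):
    # The next step depends entirely on the incoming event,
    # not on a predetermined end-to-end flow.
    handler = handlers.get(event["type"])
    return handler(event) if handler else "ignored"

print(dispatch({"type": "order_placed", "sku": "A-1"}))  # reserved A-1
```

Because there is no master workflow, the context of a transaction lives only in the sequence of events themselves, which is exactly why integration platforms must reconstruct that context.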

  3. API-Led Integration – Similar to event-driven applications, API-led integration is a new model for bringing IT capabilities together. Applications are treated as pseudo-black boxes, and what is managed in a structured way is the interfaces between them. From a data management perspective, this raises the need to manage data in motion (traveling between apps over APIs) as well as data at rest (within individual applications). Your integration platform will need to understand the differences between these two types of data and be able to ingest, transform, and load them together in your data warehouse for further processing.
  4. Streaming Data – Companies in all industries are now being inundated with streaming data coming from a variety of data sources – IoT, mobile apps, deployed sensors, cloud services, and digital subscriptions are a few examples. The data these systems generate is significant, and in even a small organization, the number of data sources can be extensive. When you multiply large data streams across many data sources, the streaming data volume that a company needs to manage can be massive.

Most legacy integration platforms were designed for batch data processing, not the scale challenges of streaming data. Cloud-based integration platforms are often better suited to address streaming data needs than on-premises systems because of the underlying capacity of the cloud environments where they operate.
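The difference can be illustrated with a toy consumer. A batch design produces nothing until a full set of records has been collected; a streaming design updates its state incrementally, so results are available the moment each record arrives. This is a simplified sketch, not tied to any particular platform.

```python
from collections import deque

def process_batch(records):
    """Batch style: nothing happens until the whole batch is collected."""
    return sum(records)

class StreamingAggregator:
    """Streaming style: state is updated incrementally per record,
    so a result is available after every arrival."""
    def __init__(self, window=3):
        self.window = deque(maxlen=window)

    def ingest(self, record):
        self.window.append(record)
        return sum(self.window)  # rolling sum over the last `window` records

agg = StreamingAggregator(window=3)
results = [agg.ingest(x) for x in [10, 20, 30, 40]]
print(results)  # [10, 30, 60, 90]
```

The batch function only ever sees completed windows, which is the latency legacy platforms impose; the streaming aggregator answers after every record.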

Is Your Integration Platform Ready?

If you aren’t sure whether your integration platform is up to the task of supporting these emerging technologies, it probably isn’t. Actian DataConnect is a modern hybrid integration platform that leverages cloud-scale and performance to deliver the capabilities you need to connect anything, anytime, anywhere, and integrate it into your enterprise data landscape. To learn more, visit DataConnect.


If you want your business to be agile, you need to be leveraging real-time data. The environments your business operates in are changing faster than ever – new competition in the marketplace, regulatory changes, operational issues, and new technology advancements are only the tip of the iceberg. If you want to survive and thrive in this fast-paced business environment, you need to be agile.

To be agile, you need to understand what is going on in your environment, make quick and informed decisions, and then rapidly respond to exploit opportunities and mitigate risks. Every moment of delay is a lost opportunity. If you can learn to manage streaming data and turn it into actionable insights effectively, you will be able to expand your company’s capabilities for real-time responsiveness and, by doing so, achieve the grand ambition of business agility.

Data Blind Spots Lead to Bad Decisions

The first step in achieving business agility is to collect the right amount and types of data about what is going on in your environment. Recent technology trends, such as IoT, embedded sensors, data subscriptions, and mobile apps, have greatly expanded your data collection options. These new data collection technologies enable you to monitor what is going on both inside your operations as well as in the broader business environment in real-time. They produce continuous streams of data that serve as your eyes and ears to understand what is happening and, more importantly, what is changing.

If you aren’t collecting enough data, the result is blind spots. It is like driving a car; if you only look out the front window, you have a limited view of what is going on around you. Mirrors, cameras, sensors, and the habit of looking around give you a broader perspective that reduces blind spots.

It’s the unknown unknowns that cause companies to fail. If you don’t recognize your blind spots, you will naively make decisions that you think are informed by data but are nothing more than assumptions and guesses. That can lead to disastrous consequences. The good news is current capabilities give you the ability to eliminate most of your blind spots and give you the insights needed to develop a holistic view of your business environment.

Converting Raw Data into Actionable Insights

Collecting raw data isn’t enough; you need a way to manage it and transform it into actionable insights. Streaming data is great; it provides you with broad visibility into what is going on across your business. But if you don’t have the right tools and processes to manage streaming data, you will quickly be overwhelmed.

The meaningful signals in the data get drowned out by the noise, and before long, decision-makers stop using the data entirely. This is what happens when big data isn’t managed – it becomes clutter. To avoid this, you need a data management process for turning streaming data into actionable insights about your operations.

Converting streaming data into actionable insights is a process of incremental refinement – a value chain. Inputs are collected from many different data sources – there are the new technologies mentioned above, and there are also things like transactional workflows, event logs, social media feeds, and website interactions.

The first step in refinement is to connect to all these data sources and aggregate the data streams in a common place where they can be further processed. Because the data sources you need are so diverse, many companies are leveraging an integration platform as a service (IPaaS) to help them do this.
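As a toy illustration of that first step, the sketch below pulls records from several heterogeneous sources and merges them, tagged by origin, into a single queue for downstream processing. The source names and record shapes are invented for the example.

```python
import queue

def aggregate(sources):
    """Pull records from each named source and merge them,
    tagged with their origin, into a single queue."""
    merged = queue.Queue()
    for name, records in sources.items():
        for rec in records:
            merged.put({"source": name, "payload": rec})
    return merged

sources = {
    "iot_sensors": [{"temp": 21.5}],
    "mobile_app": [{"event": "login"}],
    "saas_crm": [{"lead": "acme"}],
}
merged = aggregate(sources)
print(merged.qsize())  # 3
```

Tagging each record with its origin preserves the context that later refinement stages need when relating streams to one another.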

Once aggregated, the data must be integrated and organized to understand how the different streams relate to each other. This typically happens in an operational data warehouse. Modern cloud data warehouses are designed for high-performance, massively scalable data processing that is ideal for working with streaming data.

After the streaming data is organized, it can then be analyzed to separate the meaningful signals from the noise. These signals may be indicators of something deviating from what is expected or a change occurring in the environment. The signals are analyzed in the context of your operations, systems, and business processes to assess their relevance and importance. Applying this information is how you build actionable insights.
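One common way to separate signal from noise is a simple statistical threshold: flag readings that deviate more than a chosen number of standard deviations from the mean. The sketch below uses a z-score test; the data and threshold are illustrative, not a prescription.

```python
import statistics

def find_signals(readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations
    from the mean -- likely signals rather than noise."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # a perfectly flat stream has no outliers
    return [x for x in readings if abs(x - mean) / stdev > threshold]

# Mostly steady sensor values with one obvious deviation.
readings = [10.1, 9.9, 10.0, 10.2, 9.8, 30.0]
print(find_signals(readings, threshold=2.0))  # [30.0]
```

In practice the threshold, window, and statistic would be tuned per data source, and the flagged values would be interpreted against operational context before becoming an actionable insight.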

Once an actionable insight has been identified, it then needs to be converted to action. It does no good to identify an issue or opportunity if you aren’t going to act on it.  Real-time responsiveness is achieved by getting actionable insights into the hands of the decision-makers who can use them to drive change and action within the organization.

This may be strategic decision-makers in management or equipment operators who can implement tactical changes. Analytics and reporting tools, real-time operational dashboards, and alerts (texts, alarms, email, and audible messages) are universal tools for letting decision-makers know there is an actionable insight requiring their attention.

The Cost of Delay

Business agility comes from real-time responsiveness. You can’t respond in real-time if there is a delay in learning about the problem or opportunity. To leverage streaming data effectively, you need a set of systems and processes that enable you to transform raw streaming data through the full data value-stream in real-time. You can’t wait for batch data updates, latency in analytics processing, or manual integration of data. You need the entire process to be automated and optimized.

Actian can help. Actian, the hybrid data management, analytics, and integration company, delivers data as a competitive advantage to thousands of organizations worldwide. Actian Data Platform is a fully managed hybrid cloud data warehouse service designed from the ground up to deliver high performance at a fraction of the cost of alternative solutions. It is the first and only data warehouse to provide comprehensive integration capabilities, including connecting to on-premises and SaaS applications as well as managing those integrations.

For more information, visit https://www.actian.com.


Robotic process automation (or RPA for short) is the term used to describe the next generation of technology-enabled workflow management capabilities. RPA solutions come in various shapes and sizes from a myriad of vendors. RPA is the current buzzword and has enormous growth opportunities, but integration is complex and requires expensive skills and resources.

Still, the underlying premise is the same – in a post-digital transformation environment, companies need the ability to leverage a wide variety of technology components to support their business: IoT, cloud services, mobile devices, SaaS software, and traditional IT systems.

To leverage and integrate this diverse technology footprint effectively, they need a new set of workflow management and rules-based orchestration capabilities to help them transform specialized components into end-to-end processes that meet business goals.

The Challenges That RPA Vendors Face

This is the market opportunity that RPA vendors are seeking to address for their customers through their RPA solutions. What this picture doesn’t address is the set of challenges that RPA vendors themselves face in building RPA solutions, and what tools are available to help them succeed in developing the products customers want, deploying them quickly, and operating them effectively. RPA vendors often resort to using or embedding open-source integration tools, which is risky: these tools may lack enterprise-level support, may not be secure, and risk being acquired by a larger vendor. RPA vendors should partner with or embed industry-leading integration tools to mitigate such risks.

RPA vendors also have a data challenge. It’s both a big data challenge and a distributed data challenge. RPA solutions are designed to connect and orchestrate a large number of IT components from different manufacturers – each doing different things. Components have differing capabilities in terms of processing, connectivity, data storage, and management capabilities, which the RPA vendor must figure out how to understand, manage and exploit in pursuit of the customer’s business objectives.

These components (whether they be IoT, mobile apps, or traditional IT systems) also produce vast amounts of data – some of it streaming, some of it static, some of it processed in batches. So the question for RPA vendors is: “How do I get all of this data connected, organized, and processed at the scale of modern IT while meeting the performance expectations of modern businesses?” This is where Actian comes into the picture. Actian DataConnect provides a set of integration tools to support RPA vendors in managing and integrating the diverse and large volumes of data that their customers want to leverage, enabling RPA solutions to focus on things like managing business rules, workflows, and processes.

Connecting Diverse Components Together

The first challenge in managing data in RPA solutions is connecting the various parts of the IT ecosystem together so they can be managed in a consistent and centralized way that ensures the free flow of data, security, and manageability.

Actian DataConnect is a hybrid integration platform ideally suited to this challenge – integrate anything, anywhere, anytime – while enabling centralized governance, management, and control.

Whether it’s IoT devices deployed within customer operations, mobile apps in use by employees, partners, and customers, SaaS solutions hosted by third parties, or traditional IT systems deployed in a company’s data center, Actian DataConnect is an enterprise-grade solution to manage all of the RPA vendor’s (and customers’) connectivity needs.

Embedded and Edge Processing of Streaming Data

Some of the most powerful and specialized technology components that customers want to leverage in their digitally transformed business processes aren’t very powerful in reality. They are IoT devices and mobile apps with limited storage and data processing capabilities. The things they do, they do very well, but to maximize their value in an enterprise context, the data that these devices produce needs to be processed and aggregated with other data. Streaming raw data consumes a lot of network bandwidth (which is expensive) and introduces latency that slows down real-time business processes.

Actian Zen provides a small footprint, distributed database capabilities that can be deployed either as embedded capabilities or in edge devices to enable processing of streaming data close to where it is produced. In the context of an RPA solution, this means better performance, lower total cost of ownership (TCO), and the ability to distribute rules execution and workflow orchestration around the globe.
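The bandwidth argument is easy to see in a sketch. Instead of streaming every raw reading upstream, an edge node can summarize locally and forward only a compact aggregate. This is a simplified illustration of the idea, not Actian Zen’s actual API.

```python
def summarize_at_edge(raw_readings):
    """Aggregate raw sensor readings locally; only this summary
    travels over the network instead of every reading."""
    return {
        "count": len(raw_readings),
        "min": min(raw_readings),
        "max": max(raw_readings),
        "avg": round(sum(raw_readings) / len(raw_readings), 2),
    }

# 1,000 raw readings reduced to a 4-field summary before transmission.
raw = [20 + (i % 10) for i in range(1000)]
summary = summarize_at_edge(raw)
print(summary)  # {'count': 1000, 'min': 20, 'max': 29, 'avg': 24.5}
```

Shipping the four-field summary instead of 1,000 readings is the kind of reduction that lowers bandwidth cost and latency in an edge deployment.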

Aggregating Data for Enterprise Insights

Continuous improvement, be it through process analysis and optimization or supported by machine learning and artificial intelligence, requires RPA vendors to aggregate distributed data in a centralized location for analysis and the harvesting of enterprise insights. The Actian Data Platform is a modern cloud data warehouse solution optimized for the high-performance processing, enterprise scalability, and robust analytics that RPA solutions demand.

Robotic process automation is poised to have a disruptive impact on the IT industry over the next few years. As RPA vendors race to get their products to market – balancing speed, quality, features, and scalability – Actian offers a set of capabilities in DataConnect, Zen, and Actian Data Platform that can give you a leg-up in the marketplace.

To learn more, visit www.actian.com.



Databook: How Uber Turns Data into Exploitable Knowledge With Metadata


Uber is one of the most fascinating companies to emerge over the past decade. Founded in 2009, Uber grew to become one of the highest valued startup companies in the world. There is even a term for their success: “uberization”, which refers to changing the market for a service by introducing a different way of buying or using it, especially using mobile technology.

From peer-to-peer ride services to restaurant orders, it is clear that Uber’s platform is data-driven. Data is the center of Uber’s global marketplace, creating better user experiences across their services for their customers, as well as empowering their employees to be more efficient at their jobs.

However, Big Data by itself wasn’t enough; the amount of data generated at Uber requires context to inform business decisions. So, like many other unicorn companies, such as Airbnb with its Data Portal, Uber’s engineering team built Databook. This internal platform scans, collects, and aggregates metadata to map where data resides in Uber’s information systems and what that data refers to. In short, it is a platform that transforms raw data into contextualized data.

How Uber’s Business (and Data) Grew

Since 2016, Uber has added new lines of business to its platform, including Uber Eats and Jump Bikes. Some statistics on Uber include:

  • 15 million trips a day.
  • Over 75 million active riders.
  • 18,000 employees since its creation in 2009.

As the firm grew, so did its data and metadata. To ensure that their data & analytics could keep up with their rapid pace of growth, they needed a more powerful system for discovering their relevant datasets. This led to the creation of Databook and its metadata curation.

The Coming of Databook

The Databook platform manages rich metadata about Uber’s datasets and enables employees across the company to explore, discover, and efficiently use their data. The platform also ensures their data’s context isn’t lost among the hundreds of thousands of people trying to analyze it. All in all, Databook’s metadata empowers engineers, data scientists, and IT teams to go from just visualizing their data to turning it into exploitable knowledge.

Databook enables employees to leverage automated metadata collection, gathering a wide variety of frequently refreshed metadata from Hive, MySQL, Cassandra, and other internal storage systems. To make this metadata accessible and searchable, Databook offers its consumers a search-driven user interface as well as a RESTful API.
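The access pattern can be sketched generically. The snippet below models a tiny in-memory metadata catalog with a search call similar in spirit to what a catalog UI or REST API exposes; the dataset names, fields, and search behavior are invented for illustration and are not Uber’s actual API.

```python
# A toy metadata catalog: datasets registered with metadata,
# searchable the way a catalog's UI or REST API would allow.
catalog = [
    {"name": "trips", "store": "Hive", "owner": "marketplace"},
    {"name": "riders", "store": "MySQL", "owner": "growth"},
    {"name": "sessions", "store": "Cassandra", "owner": "growth"},
]

def search(term):
    """Return dataset names whose name, store, or owner matches the term."""
    term = term.lower()
    return [d["name"] for d in catalog
            if term in d["name"].lower()
            or term in d["store"].lower()
            or term in d["owner"].lower()]

print(search("growth"))  # ['riders', 'sessions']
```

The point is that the same metadata, once centralized, can answer discovery queries across every storage system it was harvested from.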

Databook’s Architecture

Databook’s architecture can be broken down into three parts: how metadata is collected, how it is stored, and how it is surfaced.

Conceptually, the Databook architecture was designed to enable four key capabilities:

  • Extensibility: New metadata, storage, and entities are easy to add.
  • Accessibility: Services can access all metadata programmatically.
  • Scalability: Support business user needs and technology novelty.
  • Execution: Power & speed.

To go further into Databook’s architecture, read Uber’s engineering article: https://eng.uber.com/databook/

What’s Next for Databook?

With Databook, metadata at Uber is now more useful than ever.

But they still hope to develop other functionalities, such as the ability to generate data insights with machine learning models and to create advanced issue detection, prevention, and mitigation mechanisms.




Actian Data Intelligence Platform Data Catalog Now Connects to Snowflake


Actian Data Intelligence Platform announces the launch of its Snowflake connector, already up and running for its clients.

In order to deliver a relevant, enterprise-grade data catalog, we provide many connectors that automate the collection and curation of metadata – covering cloud and big data storage, relational databases, non-relational databases, and more – with new connectors constantly being added.

By using our connectors, the platform provides a reality-proof data catalog for data-driven companies, ensuring that your data consumers access up-to-date information stored within our platform.

Founded in 2012, Snowflake developed a new data platform from the ground up for the cloud. Its patented architecture was designed to be the centerpiece for data pipelines, data warehousing, data lakes, data application development, and building data exchanges to easily and securely share governed data. Their promise? To deliver a data warehouse as a service that’s powerful but simple to use.

To achieve this, Snowflake also works with a wide array of industry-leading tools and technologies, enabling access to Snowflake through an extensive network of connectors, drivers, programming languages, and utilities. 

It is therefore no surprise that we are announcing the integration of the Actian Data Intelligence Platform Data Catalog with one of the most popular data management and cloud data platforms of the moment. Welcome, Snowflake.

What Does Our Snowflake Connector Do?

Our connector helps you better manage data in your Snowflake environment. It can automatically connect to your database and retrieve a broad range of metadata. Our Scanner process, which uses this new connector, can periodically collect information and detect modifications to your schema. The connector collects technical metadata, such as technical names, data types, and constraints related to your tables and their columns, as well as any documentation you may have provided at the table or field level.
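The change-detection step can be sketched as a diff between two metadata snapshots: compare the columns seen on the previous scan with those seen now, and report additions and removals. The table and column names below are made up, and this is a conceptual sketch rather than the connector’s actual implementation.

```python
def diff_schema(previous, current):
    """Compare two column-metadata snapshots and report changes."""
    prev_cols, curr_cols = set(previous), set(current)
    return {
        "added": sorted(curr_cols - prev_cols),
        "removed": sorted(prev_cols - curr_cols),
    }

# Columns of a hypothetical CUSTOMERS table from two successive scans.
last_scan = {"ID", "NAME", "EMAIL"}
this_scan = {"ID", "NAME", "EMAIL", "SIGNUP_DATE"}
print(diff_schema(last_scan, this_scan))
# {'added': ['SIGNUP_DATE'], 'removed': []}
```

Running this comparison on every periodic scan is what lets a catalog surface schema drift without anyone filing a ticket.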

This is our first version; rest assured we will improve it with other great features in the coming weeks.

The more information the connector collects, the less your Data Stewards have to fill in your Data Catalog manually. This is why we consider automated metadata ingestion a key success factor.


Does your company have a real-time connected data warehouse where you can aggregate data flowing in from all of your IT systems together with streaming data from IoT, mobile, and SaaS services? Can you easily connect your on-premises, cloud, and multi-cloud systems to enable centralized analytics? If so, that’s awesome! Now for the big question, what are you doing with those capabilities? A real-time connected data warehouse is a powerful platform, but the value to your company doesn’t come from having a connected data warehouse; it comes from connecting and using the data (at rest and in motion) captured in the siloed systems.

This article is the second in a series exploring where in your company you can leverage Actian’s real-time connected data warehouse solution to generate business value. The first article looked at manufacturing operations and integrating data across your supply chain. This installment discusses using a real-time connected data warehouse within your IT organization to bring together data from your ITSM, Operations Management (OM), and cloud management platforms, giving you a 360-degree view of your IT systems, and then shows how to leverage this connectivity to enable informed decision-making about your IT portfolio and investments.

Do You Need Better IT Management Capabilities?

Companies spend large sums of money on IT systems. But do they know where that money is going, what value they are getting for it, and how to optimize resource spending for maximum benefit? Most don’t. What about your company? You’ve invested in 360-degree views of your customers, your products, and your operations; what about your IT environment? A somewhat prophetic IT thought leader observed a few years ago, “IT organizations tend to put their business first and often neglect their own needs. It’s like the cobbler’s children walking around with no shoes.” With digital transformation creating inextricable links between business processes and IT systems, the time is right to invest in a greater understanding of your IT assets so that you can leverage them for maximum results.

Data About Your IT Systems Is All Over the Place

Most IT organizations have multiple systems that they use to manage their IT environments.

  • An ITSM system for defining configuration items, managing incidents, and orchestrating change.
  • Engineering and agile requirements management systems for executing IT projects.
  • Enterprise architecture systems for mapping out the connective tissue of the enterprise.
  • Operations management systems for managing physical infrastructure and data centers.
  • Cloud management systems for orchestrating the deployment and management of cloud resources and subscriptions.

No wonder you are having problems optimizing your IT investments: data is fragmented across a bunch of disconnected IT management systems.

Bringing the Big Picture of IT Together in One Place

Developing a 360-degree view of your IT environment requires bringing data from each of these IT management systems together and assembling the “big picture” of your IT ecosystem, including things like technology components, data flows, policy and governance compliance, security controls, and IT spend. A real-time connected data warehouse enables you to aggregate all your IT management data in one place and apply the same types of analytics that you use to optimize your business operations to your IT environment.

You Need a Real-Time Connected Data Warehouse

The real-time connected data warehouse solution from Actian has two main components. Actian Data Platform is the cloud data warehouse solution designed to handle enterprise-scale data, whether streaming or batch, and process analytics in near-real-time to drive informed decision-making. Actian DataConnect is the Integration Platform as a Service (iPaaS) solution that enables you to manage all of the data connections in your company from a centralized location, ensuring security, compliance, and controlled access to sensitive information.

DataConnect and Actian Data Platform from Actian are the tools your company needs to integrate your disparate IT management systems. Together they can bring in streaming data sources from things like network devices, door sensors, and access control systems to compile a holistic 360-degree view of your IT environment. It’s great to know what systems you have, but understanding how they are used, what they cost you, and how they are supporting your business is powerful information that modern IT organizations and decision-makers need to be successful.

To learn more about connected data warehouse solutions from Actian and how to leverage them to create value for your company, visit www.actian.com.



Thinking of Buying an Integration Solution?


Summary

  • Successful integration requires aligning business goals with technical needs to ensure long-term scalability.
  • A hybrid approach is essential for connecting legacy on-premises systems with modern cloud-based applications.
  • Low-code tools empower non-technical users to build integrations, reducing the burden on IT departments.
  • Evaluating total cost of ownership helps avoid hidden fees associated with data volume and connector limits.

How to Break Free From Pay “Per-Connector Pricing Model”

According to Gartner: “Many integration providers base pricing of the components of their enterprise integration platform as a service (eiPaaS) suite on the range of functionality and variable items — such as the number of applications connected via the platform, a number of users or the volume of application programming interface (API) traffic per month. When application leaders procure iPaaS to support a potentially large range of still partially unknown use cases, they do not fully know upfront the functionality set they will actually need, or the quantification of the variable items. This may lead to over-provisioning and overspending.”

Buying an Integration Solution Does Not Have to Be Complicated Anymore

Most iPaaS vendors have very complex pricing and packaging structures. On the one hand, some vendors have a use-case-based pricing model, meaning they charge a different price if you want to purchase B2B integration, data integration, or hybrid integration. On the other hand, some vendors charge based on per-user or consumption-based pricing. Most vendors have a “per connector” or “per connection” charge associated with the purchase – meaning if I need a Salesforce connector today, and tomorrow I need to connect to NetSuite or Marketo, I will be required to pay for each connector separately. So, in reality, if today I need connectors for two applications and tomorrow I need five, my total price goes up significantly. For most organizations, it isn’t easy to predict how many applications they will need to connect to in six months or a year.

Companies considering the purchase of an iPaaS solution are usually mid-size organizations with limited resources to invest in an integration strategy empowerment team (ISET), an Integration Competency Center (ICC), or an Integration Center of Excellence (ICoE). The sole responsibility of these teams is to evaluate vendors and their portfolios, use cases, and scenarios, and make buying recommendations.

The Need for Transparent, Predictable Value-Based Pricing

In most cases, customers buy iPaaS for ease of use and a lower total cost of ownership (TCO). These buying criteria should ideally lend themselves to ease of purchase as well. It should be transparent to customers what they are buying, they should be able to quantify that value easily, and they should be able to predict the cost going forward.

Keeping all these customer pain points in mind, we at Actian took a very close look at all the competitive iPaaS offerings, and we wanted to simplify the packaging and pricing for Actian DataConnect. The pricing model is simple, transparent, and predictable, with no hidden fees or surprises.

Pricing on Your Terms – Based on the Number of Engines You Need to Run Concurrently

Procured as an annual subscription, Actian DataConnect has predictable pricing based on your integration needs today. Actian DataConnect’s pricing model is based solely on the number of concurrently running integration engines you need to support your use cases and processing needs. This means you can add more engines as you grow and scale. Everything else is included: unlimited users, development seats, and unlimited connectors and connections. At Actian, we do not charge customers on a “per connector” basis. We believe purchasing iPaaS using a “consumption-based” or “connection-based” pricing model can quickly become very expensive with unanticipated growth in usage and throughput.

Predictable Pricing with No Hidden Fees or Surprises

You only pay per runtime/processing engine, not connectors, the number of users, data used, or CPUs. We make it easy for you to predict what your costs will be as you grow.
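A quick back-of-the-envelope comparison shows why per-connector pricing becomes unpredictable as an application portfolio grows. All prices below are invented for illustration and are not actual vendor rates.

```python
def per_connector_cost(connectors, price_per_connector=5000):
    """Hypothetical per-connector model: cost grows with every new app."""
    return connectors * price_per_connector

def per_engine_cost(engines, price_per_engine=20000):
    """Hypothetical per-engine model: cost tracks processing capacity only."""
    return engines * price_per_engine

# Starting with 2 connected apps and growing to 10 over the year,
# while a single integration engine handles the workload throughout.
for connectors in (2, 5, 10):
    print(connectors, per_connector_cost(connectors), per_engine_cost(1))
# 2 10000 20000
# 5 25000 20000
# 10 50000 20000
```

Under the hypothetical per-connector model, cost quintuples as the app count grows; under the engine-based model it stays flat until actual processing needs increase.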

Integrate Anything

UniversalConnect™ patented technology enables you to quickly connect to virtually any data source, format, location, protocol, any cloud or SaaS application.

One Price that Enables You to Deploy Anywhere

Supports any use case or integration pattern – in-cloud, multiple clouds, on-premises, or hybrid cloud.

Simple, Transparent, and Predictable

Actian DataConnect offers pricing with no hidden fees or surprises. Cloud integration pricing does not have to be complicated. Finally, it’s time to break free from “per connector” pricing.

Learn more here https://www.actian.com/data-integration/dataconnect/


In the next few years, more than half of major new business systems are projected to use real-time connected data and continuous intelligence to improve decision-making. You can’t have continuous intelligence without continuous data ingestion, and you can’t have real-time connected data for decision-making if you’re working off data that is processed in batches – you need streaming data. Integrating streaming data into your enterprise data landscape requires rethinking your delivery methods for transporting streaming data from the source to the target consumer. Stream data integration is how you do that.

The Exciting Challenge of Big Data

For nearly a decade, analysts and industry experts have been talking about Big Data and the impact that it is going to have on organizations. Big data isn’t an “emerging trend” anymore – it’s a business reality. What makes big data challenging (and exciting) is that it isn’t just big, it’s also fast. Organizations are being engulfed in a growing volume of data from a variety of sources. Some of the data is transactional, but most of it is what is called event streams – digital records of things taking place in the applications and devices that make up the IT ecosystem. This event stream data is where companies can identify fascinating trends, behaviors, and relationships that can enable them to understand their operations, their environment, and their customers better.

The amount of information in event streams can be enormous, but with the proper analytics, they can lead to valuable business insights in areas like fraud detection, supply chain optimization, customer support, resource scheduling, dynamic pricing, preventative maintenance, and achieving high availability in IT systems and services.  Streaming data can help you identify events, opportunities, and threats faster, so you can respond quickly to minimize risk and maximize opportunity.

What is Stream Data Integration?

A new generation of Business Intelligence is on the horizon.

Historical transaction data has been the foundation for business intelligence (BI) and analytics for decades – analyzing past trends and behaviors to predict future events. That’s great in an environment where the data isn’t changing, but in modern IT environments, systems, processes, and data sources are changing continuously. As they say in the financial industry, “past results are not an indicator of future performance.” In a rapidly changing environment, business leaders need to make decisions based on near real-time data. Modern business intelligence systems are being built for this type of analytics, and the prospects are exciting!

Can’t My Reporting Tools Handle Streaming Data Already?

The answer to that question is “maybe” – it depends on what system you’re using for analytics and reporting. The challenge for most organizations is that there is no single reporting tool. There are many reporting and analytics platforms, apps, and systems in use across the enterprise, with widely varying levels of sophistication when it comes to handling streaming data. You can’t depend on your reporting tools to handle this for you. You need an integration platform like Actian DataConnect to help you manage your data streams and normalize your data before loading it into the data warehouse that your reporting and BI tools query.
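Normalizing before loading typically means mapping each source’s record shape onto a common schema so the warehouse sees uniform rows. A minimal sketch, with source names and field names invented for the example:

```python
def normalize(record, source):
    """Map differently-shaped source records onto one common schema
    before they are loaded into the warehouse."""
    if source == "iot":
        return {"ts": record["timestamp"], "metric": "temp", "value": record["t"]}
    if source == "webapp":
        return {"ts": record["time"], "metric": record["event"], "value": 1}
    raise ValueError(f"unknown source: {source}")

rows = [
    normalize({"timestamp": 1700000000, "t": 21.5}, "iot"),
    normalize({"time": 1700000005, "event": "click"}, "webapp"),
]
print([r["metric"] for r in rows])  # ['temp', 'click']
```

Once every stream arrives in the same `ts`/`metric`/`value` shape, any downstream reporting tool can query the warehouse without knowing which source produced which row.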

A data warehouse like the Actian Data Platform can handle large volumes of streamed data, giving you affordable cloud-scale capacity and levels of processing performance that traditional data warehouses can’t match. Combined with the integrated connectivity of DataConnect, it puts you on the path to managing more streams.

Visit www.actian.com/data-platform to learn more.