Big Data

The term Big Data describes data sets that are too large or complex for traditional data processing methods. It is also used to describe data sets that must be processed in their entirety to yield reliable business insight, because analyzing only a subset of the data could lead to false conclusions.

Three key attributes characterize Big Data – volume, velocity, and variety – as explained below:

  • Volume can vary by application and business. Many businesses consider any dataset larger than ten terabytes to be Big Data, while others reserve the term for petabyte-scale data sets. Web logs, financial systems, social media feeds, and IoT sensors can all generate vast volumes of data, making such scales increasingly common.
  • The Velocity of data creation can demand real-time in-memory processing in use cases such as fraud detection or IoT sensor processing in manufacturing. Edge processing and smart devices can help throttle data velocity by pre-processing a high volume of data before it overruns central server resources.
  • Variety refers to data types. Big Data is not limited to structured data alone; its datasets also encompass semi-structured and unstructured types, such as JSON, audio, text, and video (see the sketch after this list).
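
To make the variety point concrete, here is a minimal sketch that reads semi-structured JSON with PySpark. The file path and the device_id and readings fields are hypothetical placeholders, not taken from any specific system.

  from pyspark.sql import SparkSession

  spark = SparkSession.builder.appName("variety-example").getOrCreate()

  # Spark infers a schema from nested JSON, so semi-structured records
  # can be queried alongside structured tables.
  events = spark.read.json("s3://example-bucket/iot/events/*.json")
  events.printSchema()

  # Flatten one nested field into a tabular view for analysis.
  events.select("device_id", "readings.temperature").show(5)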

Big Data Storage

Early data storage systems used for decision support relied on data warehousing technology for structured data storage and retrieval. This became a limiting factor as businesses began to see value in semi-structured and unstructured data. Open-source, scalable distributed file systems evolved to store thousands of files economically and make them accessible from clustered servers. In the early days, Apache Hadoop software stacks running on server clusters managed Big Data files.

SQL Access to Big Data

Apache Hive provided a SQL API that made file-based data available to applications. Spark SQL provides an API layer that supports over 50 file formats, including ORC and Parquet. Modern cloud-based and hybrid-cloud software, such as the Actian Data Platform, provides a high-performance analytics data warehouse with the ability to access Hadoop file formats as external tables using a built-in Spark SQL connector. Because popular semi-structured data formats, including JSON and website logs, are supported in addition to Spark SQL and standard SQL, application builders and data analysts can gain easy access to Big Data stores in the cloud and on-prem.
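
As a rough illustration of this pattern, the sketch below uses Spark SQL (via PySpark) to expose Parquet files to standard SQL. The file path, view name, and page column are hypothetical placeholders and are not part of any specific product.

  from pyspark.sql import SparkSession

  spark = SparkSession.builder.appName("sql-access-example").getOrCreate()

  # Read columnar Parquet files and register them as a temporary view,
  # conceptually similar to exposing file-based data as an external table.
  clicks = spark.read.parquet("s3://example-bucket/weblogs/clicks/")
  clicks.createOrReplaceTempView("clicks")

  # Standard SQL against the file-backed view.
  top_pages = spark.sql(
      "SELECT page, COUNT(*) AS hits FROM clicks "
      "GROUP BY page ORDER BY hits DESC LIMIT 10"
  )
  top_pages.show()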

Processing

Processing systems employing Massively Parallel Processing (MPP) capabilities across hundreds of compute nodes make it possible to analyze large and complex datasets. Low storage costs and the on-demand availability of massive compute resources make cloud computing services a good fit for processing data at this scale. Subscription pricing and elastic provisioning also make cloud computing an economical choice, as you only pay for the resources you use. On-premises alternatives often use clustered or GPU-based systems, which can be harnessed for highly parallelized query processing.
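
The sketch below shows the general idea in PySpark: a grouped aggregation that the engine distributes across many partitions and executors, analogous to how an MPP warehouse splits a query across compute nodes. The input path and column names are hypothetical.

  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("mpp-style-aggregation").getOrCreate()

  transactions = spark.read.parquet("s3://example-bucket/transactions/")

  # Repartitioning spreads the rows so each executor aggregates its own
  # slice; Spark then merges the partial results into the final totals.
  daily_totals = (
      transactions.repartition(200, "account_id")
      .groupBy("account_id", "txn_date")
      .agg(F.sum("amount").alias("total_amount"))
  )
  daily_totals.write.mode("overwrite").parquet("s3://example-bucket/daily_totals/")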

Why is Big Data Used?

Big Data analytics became popular because it provided a new source of empirical data to support business decision-making. Organizations generate and collect vast amounts of data that contain valuable insights, which only become evident when the data is processed and analyzed. Technology has enabled businesses to efficiently mine large datasets for fresh insights that help them remain competitive and increase successful customer interactions. Making decisions based on actual consumer data reduces the risks and costs associated with uninformed decision-making, ultimately making the business more effective.

Big Data Use Cases

Below are some examples of real-world Big Data use cases:

  • The Healthcare industry uses Big Data to improve patient care, using telemetry from smart wearable devices to monitor patient health – blood pressure, glucose levels, and heart rate, for example. Clinical trials also collect huge amounts of data that must be analyzed to help manage and prevent diseases.
  • The Telecoms industry uses data collected from mobile service subscribers to improve network reliability and customer experience.
  • The Media industry leverages user data to personalize content to match the viewer’s interests. This increases satisfaction with the service and improves customer loyalty.
  • The Retail industry uses Big Data analytics to promote the goods most relevant to each buyer. By tracking customers’ e-commerce activity and making appropriate recommendations, retailers can also increase foot traffic to their physical stores.
  • Banking and Insurance companies use it to detect potentially fraudulent transactions and prevent money laundering.
  • Government organizations use it to improve policing and fight cybercrime. Cities use traffic cameras to manage accidents and improve traffic flow on roads.
  • Marketing departments use it to inform targeted social media and digital advertising campaigns and to provide their sales teams with contacts who are likely to be interested in the products or services the business provides.

Big Data and Actian

The simplest on-ramp to the Actian Data Platform is to sign up for a free trial. Some of the benefits of the Actian Data Platform include the following:

  • Blazing performance for your most complex workloads.
  • Built-in data integration for quickly loading and accessing data, as well as for transformation and data quality.
  • Real-time scaling of your data warehouse to match your computing and storage needs.
  • SOC 2 Type 2 compliance for your most sensitive data deployments.

Register for the trial at https://www.actian.com/free-trial/