One petabyte (PB) is 1,024 Terabytes or 1,048,576 Gigabytes.
How Big Is a Petabyte?
Starting with the smallest unit, the following progression puts a Petabyte in context:
- In a computer system, storage sizes are expressed in Bytes. One Byte consists of 8 bits, and a single bit represents a 0 or a 1. When writing applications in Assembler or machine code, programmers use hexadecimal values to express byte values. In hexadecimal, a byte of all zeros is written as x’00’ and a byte of all ones as x’FF’.
- Bytes are commonly used to represent key values on a keyboard. How these are displayed as characters depends on the keyboard’s geographic and language region. Early personal computers using the Intel 8080 and Zilog Z80 used an 8-bit data bus, and the CPU processed 8-bit instructions. With their 16-bit address bus, these systems could only address 64KB of RAM.
- A Kilobyte (KB) is 1,024 bytes. The size of a small file might be expressed in KB.
- Megabytes (MB) are the next unit of storage in a computer system. One MB is 1,024 KB. Megabyte is used when describing the size of a large file, such as an image file. File size is important when attaching an image to an email or sharing a link to a larger file.
- A Gigabyte (GB) is 1,024 Megabytes. Larger capacity storage devices, such as memory card sizes, are usually defined in Gigabytes.
- A Terabyte (TB) is made up of 1,024 Gigabytes. Nowadays, computer hard drive capacity is expressed in Terabytes. Cloud storage usage is typically provided in Terabytes per month.
- One Petabyte (PB) is 1,024 TB. It’s hard to imagine 1,125,899,906,842,624 Bytes. The size of a server farm might be described in Petabytes.
- An Exabyte (EB) is 1,024 Petabytes. An Exabyte is so big that even a large public cloud data center holds only a fraction of an Exabyte.
- A Zettabyte (ZB) is 1,024 Exabytes. All the data in the world adds up to no more than a few hundred Zettabytes.
- A Yottabyte (YB) is 1,024 Zettabytes.
- A Brontobyte (BB) is 1,024 Yottabytes.
- A Geopbyte is 1,024 Brontobytes. These numbers become useful in astrophysics, DNA research and nuclear physics.
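The progression above is easy to verify in a few lines of code. This sketch walks the same ladder of binary units, confirms the petabyte figure quoted earlier, and shows the hexadecimal byte notation from the first bullet:

```python
# Binary storage units: each step up the ladder is a factor of 1,024 (2**10).
UNITS = ["KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

for power, name in enumerate(UNITS, start=1):
    print(f"1 {name} = {1024 ** power:,} bytes")

# One Petabyte, written out in full bytes:
assert 1024 ** 5 == 1_125_899_906_842_624

# A byte of all ones (binary 11111111) is x'FF'; all zeros is x'00'.
assert format(0b11111111, "02X") == "FF"
assert format(0b00000000, "02X") == "00"
```

Running the loop prints each unit from 1 KB = 1,024 bytes up to 1 YB, making it clear how quickly the numbers outgrow everyday intuition.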
Before Hadoop, large enterprises used commercial databases on-premises. Centralized data warehouses from Oracle, IBM and Teradata claimed the largest enterprise data warehouses (EDWs). The Apache Hadoop open-source project brought low-cost clustered databases within reach of smaller businesses, and the Big Data movement began. Early Hadoop implementations ran database engines such as Hive, which provided an SQL interface to interrogate data stored on the cluster but was very slow. Actian Vector quickly became the fastest database technology on Hadoop clusters through its ability to store data in columnar format and use vector processing to parallelize queries across all the server cores and server nodes.
In the 1990s, a Terabyte database was called a Very Large Database (VLDB) and was often used to set transaction processing benchmark records. In 2010, Facebook claimed the largest classic Big Data Hadoop cluster at a size of 21 Petabytes and growing by half a PB per day.
Today, the database market has evolved to store, query and mine structured and non-structured data. Artificial intelligence (AI) and Machine Learning (ML) can find correlations in vast quantities of raw data.
Actian and Big Databases
The Actian Data Platform uses a highly parallel query capability provided by the built-in vector processing database engine.
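To illustrate the idea behind columnar storage and vectorized queries in general terms (this is a toy sketch, not Actian's implementation), compare scanning a row-oriented table with scanning a single column:

```python
# A toy table stored row-wise: each row bundles all of its fields together.
rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": 20.0},
    {"id": 3, "amount": 30.0},
]

# Row store: every whole row is touched even though the query
# only needs one field.
total_rows = sum(r["amount"] for r in rows)

# Column store: the "amount" column is one contiguous sequence,
# so a scan reads only the data the query needs - and a vectorized
# engine can process the whole column in SIMD-sized chunks rather
# than one value at a time.
columns = {"id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]}
total_cols = sum(columns["amount"])

assert total_rows == total_cols == 60.0
```

The same answer comes out either way; the difference is how much data the engine must read and how well the work maps onto parallel hardware, which is why columnar, vectorized engines excel at analytic queries.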
Actian is a pioneer in the database industry. The Ingres database was among the first to support distributed queries, delivered through Ingres-Star. Today, Ingres Next delivers a rock-solid transaction processing database using row storage along with an integrated extension using columnar storage optimized for data warehousing queries.