Embrace the future of data warehouses
Data platforms, infrastructure, architectures, and applications constantly evolve to take advantage of technological advances. As new versions emerge, existing technologies become dated, yet IT often skips upgrades, which can cause support to lapse and maintenance costs to rise. Modernization enables organizations to embrace new technologies that extend the useful life and reduce the maintenance costs of existing deployments.
Unmodernized applications can stop working as their technology stack evolves or application programming interfaces (APIs) change. Organizations cannot assume hardware and software vendors will support their platforms forever. As vendors introduce new versions, support for older versions can begin to degrade. The software industry norm is to provide the highest levels of support for the current version and one previous generation. After that, vendors may not retrofit bug fixes, may offer only best-effort support, and may hike maintenance costs to encourage businesses to migrate to new versions. Modernizing applications to be more portable and use newer APIs protects existing investments, adds flexibility, and lowers maintenance costs.
Innovation helps vendors differentiate their offerings to keep them competitive and justifies license or subscription renewals, support, and maintenance costs. Ongoing revenue from customers keeps software providers in business for the long term. If customers don’t modernize, they may have to rewrite applications, and IT administration overhead can become overwhelming, hindering the business’s ability to launch new initiatives that meet customer demands.
Employee skills are often tied to specific technologies, such as programming languages and operating systems. Finding and retaining employees with the knowledge needed to maintain legacy systems and applications gets harder over time, creating an additional major reason to modernize.
Modernization and data platforms
Today’s most prevalent form of modernization is the pervasive adoption of cloud services. The cloud is very compelling as it eliminates the need for IT to manage hardware and operating systems in the data center. Cloud providers manage the hardware, operating systems, and many services, such as application servers, on behalf of subscribers, freeing up IT resources for more strategic initiatives.
Older hardware, such as DEC Alpha systems, is hard to virtualize, but newer Intel-based systems are easy to virtualize, thanks to vendors such as VMware. Virtual machines isolate operating systems from physical hardware constraints by offering virtual CPUs, memory, and disk volumes. Hardware emulators for vintage processors, such as the MOS 6502 and the Motorola 6800, can run on modern Intel processors but may be too slow for production workloads.
Docker was one of the first tools to containerize applications with software stack dependencies, making them portable and platform agnostic.
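As a minimal sketch of what containerizing an application with its stack dependencies looks like, the Dockerfile below packages a hypothetical Python application together with its libraries (the file names, base image, and entry point are illustrative assumptions, not a specific product's setup):

```dockerfile
# Start from a published base image that pins the language runtime
FROM python:3.12-slim

# Copy the application and install its declared dependencies
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# The container always starts the same entry point, regardless of host
CMD ["python", "app.py"]
```

Building this once (`docker build -t myapp .`) yields an image that runs unchanged on any host with a container runtime, which is what makes the application portable and platform agnostic.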
Database platforms such as the Ingres Database can easily be re-platformed to the cloud thanks to Actian. The Actian Vector columnar database is now at the heart of the Actian Data Platform, making it available on-premises and as a cloud service across cloud providers.
Application programs used to be written in procedural assembler and 3GL code. More than 220 billion lines of COBOL are in use today, with 43% of banking systems still relying on it. Early applications consisted of mainline code that called subroutines and functions. When modernizing such applications, developers refactor them into smaller components, such as check and deposit transactions in banking applications. This component-based approach helps organizations build new applications faster by connecting pre-built, unit-tested components. Developers can wrap application components written in legacy code or encapsulate them in modern languages, re-engineering them to use self-describing data formats such as XML and JSON that are more future-proof.
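The wrapping step above can be sketched in a few lines. In this illustration, a hypothetical legacy routine returns a fixed-width banking record (the record layout and function names are invented for the example), and a thin wrapper re-exposes it as a self-describing JSON document:

```python
import json

def legacy_get_deposit_record() -> str:
    # Stand-in for a legacy routine: returns a fixed-width record,
    # as a COBOL-era program might. Layout (invented for illustration):
    # account number (8 chars), transaction type (3), amount in cents (10)
    return "00012345DEP0000150000"

def deposit_record_to_json(record: str) -> str:
    """Wrap the positional legacy record in a self-describing JSON document."""
    fields = {
        "account": record[0:8].lstrip("0"),
        "type": record[8:11],
        "amount": int(record[11:21]) / 100,  # cents -> currency units
    }
    return json.dumps(fields)

print(deposit_record_to_json(legacy_get_deposit_record()))
# {"account": "12345", "type": "DEP", "amount": 1500.0}
```

Callers of the wrapper see named, typed fields instead of byte offsets, so the legacy routine keeps running unchanged while new applications integrate with it through a modern interface.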
Further, component-based web applications can be containerized and deployed on modern cloud-based serverless platforms that remove the need for IT to maintain the full software and hardware stack, reducing maintenance costs. This microservices deployment model offers the most flexibility, allowing applications to run on-premises and in the cloud. Developers can optimize performance by keeping application code close to data to minimize network latency.
Larger, long-established enterprises carry lots of legacy application baggage. Programmers who built aging applications are reaching retirement age, leaving a skills gap. Software providers like IBM have established classes to train software engineers and developers in less popular languages. Many vendors have created centers of competence for legacy platforms in Eastern Europe and India to fill skill gaps. Ultimately, these applications must be modernized or retired as developers prefer to learn new languages on new platforms.
Y2K was a great example of the IT industry stepping up to update legacy code. Vendors such as SAP have encapsulated code from the mainframe R2 days into web services that are part of the current versions of their products. In this way, even if the original programmers are long gone, the routines they wrote continue to provide value.
Actian reduces modernization overhead
The Actian Data Platform was designed from its inception to operate in hybrid cloud configurations, allowing customers to put their analytics processing wherever the data resides.
The Ingres Database provides a proven and highly available transaction-oriented database server. The Ingres NeXt program preserves existing investments by helping customers move their on-premises instances to the cloud. Actian provides the infrastructure, services, and tooling customers need to modernize their Ingres Database investments easily.
Learn more about the Ingres NeXt initiative here.