For 30 years, vendors armed with a variety of alternative architectures predicted that mainframes had arrived at death's doorstep. While these competitors chipped away at the venerable system's market share, very few heavily invested enterprises tossed out their mainframes in favor of the latest bright and shiny platform.
In fact, a research report released late last year by an enterprise software maker stated there are now some 800 billion lines of active COBOL code running on mainframe platforms, a figure significantly higher than the vendor anticipated. The report went on to say that many enterprises fully expect their COBOL applications to remain in use for at least the next decade.
Chief among the applications sustaining the viability of mainframes are databases, which serve as both applications and application development platforms.
"Mainframe databases are one of the primary reasons mainframes still exist," said Jack Gold, president and principal analyst at J. Gold Associates. "But the effort involved to move a jam-packed, on-premises database to a cloud environment or other distributed platforms is enormous. In many cases, it's easier to maintain it in place and/or just upgrade the hardware it currently runs on. It can be a huge problem."
Challenges in moving mainframe databases to the cloud and distributed platforms
One of the more perplexing challenges in moving a proprietary, on-premises database to the cloud is that enterprises lose a degree of control over how reliably users can access their data. Businesses must put their trust and faith in cloud providers' ability to avoid costly cloud outages.
"A critical issue many users face in moving their databases to the cloud is avoiding outages, which can cost not just users but vendors millions of dollars," Gold said. "If the cloud goes down, they are out of business."
Another reason IT operations are reluctant to move databases off the mainframe is their tight ties to the instruction set of mainframe hardware. IT administrators must also deal with hundreds of undocumented, internally developed applications that some enterprises have written over the decades, many of which contain mission-critical data.
"The mainframe's instruction set has been geared over the years to specifically take advantage of a database's core capabilities," said Francis Dzubeck, president of information infrastructure design firm Communications Network Architects. "Database vendors have enhanced their offerings largely based on their users' wishes, which has served to keep them loyal to those products."
Unique challenge for managing mainframe databases: Missing documentation
The lack of documentation available on decades-old database applications can sometimes delay mainframe projects by months, placing IT staff under undue pressure from impatient CEOs.
"In today's environments, good documentation is vital. It shouldn't be treated as a trivial issue," said one systems architect with a Fortune 100 bank who preferred to remain anonymous. "I was involved in one project migrating a mainframe database to another [distributed] system without proper documentation. It took well over a year to complete with management screaming and yelling about why we bothered taking on a project like this."
A couple of decades ago, migrating mainframe databases to a distributed platform was easier than it is today. Databases and their internally developed applications back then were smaller and crafted to do a specific task. The process has grown increasingly complicated over the years as corporate users have looked to migrate databases to a variety of other platforms.
"Converting the code [of a proprietary database] over to another platform takes time because they have to get it right," Dzubeck said. "Applications that sit on top of a mainframe database are not simple things that might exist in a mom and pop shop. Often, they turn out to be mission-critical applications that enterprises use companywide."
Best practices for mainframe database management
Besides having the discipline to produce and save detailed documentation on older database applications, it's important that IT operations teams be aware of the latest administrative and monitoring tools that can bolster database security. Some recommended practices for better security include implementing role-based security, regularly "de-cluttering" a security database and identifying the most critical data.
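The role-based security practice mentioned above can be sketched in a few lines: permissions attach to roles rather than to individual users, so access reviews and "de-cluttering" only have to reason about a handful of roles. This is a minimal, generic illustration in Python; the class and permission names are invented for the example and don't correspond to any particular database product's API.

```python
# Minimal sketch of role-based access control (RBAC):
# permissions are granted to roles, and users acquire
# permissions only through the roles assigned to them.
# All names here are illustrative assumptions.

class Role:
    def __init__(self, name, permissions):
        self.name = name
        self.permissions = set(permissions)

class User:
    def __init__(self, name):
        self.name = name
        self.roles = []

    def can(self, permission):
        # A user holds a permission if any assigned role grants it.
        return any(permission in r.permissions for r in self.roles)

# Hypothetical roles for a database environment.
auditor = Role("auditor", {"read"})
dba = Role("dba", {"read", "write", "drop"})

alice = User("alice")
alice.roles.append(auditor)

print(alice.can("read"))   # True  -- granted via the auditor role
print(alice.can("drop"))   # False -- only the dba role grants drop
```

Revoking the auditor role from `alice` removes all of its permissions at once, which is the administrative advantage over per-user grants.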
Another important practice is automating critical business processes. Chief among the automation tools that analysts find promising is the emerging class of AI offerings, such as ChatGPT and Bard. However, some analysts and consultants cautioned that many of these AI-based tools are still in the early stages of development and won't be safe to use on production platforms for some time.
"Database vendors are making automation and orchestration a priority as they move to the cloud," Dzubeck said. "There's been an increasing migration among IT shops to a more tools-centric and AI-centric management of database environments. But it is early days to introduce AI to a production platform."
What's in store for the future of mainframe databases
One advantage for IT administrators keeping their mainframes and databases on premises is the arrival of faster, lower-cost mainframes and storage devices. The latest generation of mainframe hardware is better equipped to run increasingly large data sets and large language models, extending the lifecycle of existing mainframe databases into the burgeoning age of AI development.
Along with the successful integration of AI technologies into mainframe software, the life of mainframes could be further extended as AI opens the door for younger IT workers to step in and replace retiring IT veterans. "AI can do a lot of automation. But, as importantly, it means you don't have to sweat about finding qualified personnel to take over for retiring mainframe veterans," said Judith Hurwitz, chief causality evangelist at AI software startup Geminos. "You can now tap into the large pool of younger workers who are hip to AI but don't have 40 years of mainframe experience."