Mainframes endure as a technology stalwart some 60 years after their invention, shouldering critical, transaction-intensive workloads.
Nearly three-quarters of the U.S. IT decision-makers Forrester Research polled for a 2020 study said mainframes have long-term viability as a strategic platform -- quite a vote of confidence. That said, some customers might be looking to part with big iron. Mainframes can prove costly to maintain, and businesses struggle to find people who understand legacy languages and databases. In addition, the appeal of cloud computing as a flexible, modern IT environment means many mainframe adherents will move at least some of their applications and data to as-a-service alternatives. Getting there won't be easy.
Mainframe migration is a complicated chore that demands careful planning and an assessment of options. The choices include moving the entire mainframe environment to a hosting services provider or refactoring a mainframe application for deployment in a public cloud or another platform.
Whatever the method, nailing that transition is critical for IT shops still counting on the mainframe's compute power.
"The mainframe is the backbone of their organizations -- it's typically driving the transactions that make these companies run," said Ken Marr, CTO at FNTS, a managed IT services provider based in Omaha, Neb. "It's a big deal to move these applications and rewrite them."
Mainframe migration in five steps
Businesses planning a mainframe migration must consider myriad factors, including their time frame for the project, budget, the expertise required to rewrite mainframe applications and the criticality of the applications to the business.
Here are the key steps in the mainframe migration journey.
Step 1. Consider mainframe rehosting options
An organization determined to abandon mainframes but with limited time to do so might investigate mainframe rehosting. The approach comes in several flavors -- mainframe as a service, outsourcing and emulation, to name a few -- but they all boil down to moving the mainframe application from a customer's on-premises data center to a service provider's cloud.
Rehosting, as a lift-and-shift migration, doesn't involve an extensive software rewrite. That makes it faster and less expensive than other migration methods.
Hosting appeals to "people trying to find a cloud environment for their mainframe workloads without rewriting them," said Juan Orlandini, CTO for the North America branch of Insight Enterprises, an IT services provider based in Chandler, Ariz. "What you are doing is getting rid of the on-premises burden that you typically have with the mainframe and shifting that to a different location."
Rehosting also helps customers meet deadlines. When an organization decides it's time to get off the mainframe, it might have a year to 18 months to do so before its on-premises hardware and software come up for contract renewal, Marr noted.
Those time frames, however, are generally insufficient for sorting a customer's mainframe estate and transforming applications. But mainframe hosting buys companies some time and provides a steppingstone toward modernization.
"It gives them the runway they need for application rationalization and transformation," Marr said.
FNTS, which offers its own cloud for hosting IBM Z-series mainframes, is among the service providers offering mainframe outsourcing. Others include Ensono, an MSP and technology advisory firm in Downers Grove, Ill., and IBM, the mainframe maker itself. IBM hosts mainframes through its IBM Z compute Virtual Server Instance that runs in an IBM virtual private cloud.
Step 2. Refactoring? Start with an application assessment
Customers might instead opt to refactor mainframe workloads for the cloud. This method means rewriting all or part of an application to make the best use of cloud services. Refactoring is more time-consuming and expensive than rehosting but has greater transformative potential.
"The refactoring model gives the application a new lease of life," said Oliver Presland, vice president of the global consulting services portfolio at Ensono. A refactored application can more easily integrate into cloud-native services and eventually evolve into a microservices architecture, he added.
This option, however, should begin with an assessment of the mainframe application or set of applications to be migrated.
"There's a technical element to that, which is understanding the codebase, the dependencies and the interactions with external interfaces and systems," Presland said. "If you don't spend time on the assessment phase, understanding how you are going to transform that application, then you're running the risk that the project will hit some challenges along the way."
Orlandini also cited the importance of starting with a fundamental grasp of the mainframe system to be migrated.
"It boils down to truly understanding what your application is doing for the business and what business logic it is encoding and how to more efficiently recode that," he said.
Marr said this application rationalization step helps organizations determine the art of the possible. With that knowledge, a business can devise its strategy for moving workloads off the mainframe or, alternatively, decide either to retain its application on the mainframe or retire it.
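One concrete piece of the assessment phase Presland describes is tracing an application's dependencies through its source. As a minimal sketch -- the program names and the scope are invented, and real assessment tooling also traces dynamic calls, JCL and transaction definitions -- a scan for static COBOL CALL statements might look like this:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal sketch of one assessment-phase task: scanning COBOL source for
// static CALL statements to list the subprograms an application depends on.
public class CallScanner {
    // Matches statements such as:  CALL 'PAYCALC' USING WS-EMP-REC.
    private static final Pattern CALL_STMT =
            Pattern.compile("\\bCALL\\s+'([A-Z0-9-]+)'");

    public static List<String> findCallees(List<String> sourceLines) {
        List<String> callees = new ArrayList<>();
        for (String line : sourceLines) {
            Matcher m = CALL_STMT.matcher(line);
            while (m.find()) {
                String callee = m.group(1);
                if (!callees.contains(callee)) {
                    callees.add(callee);  // record each subprogram once
                }
            }
        }
        return callees;
    }
}
```

A dependency list like this is only a starting point; the point of the assessment phase is to combine it with business knowledge of what each called program actually does.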
Sticking with the mainframe
Mainframe migration isn't always the best answer.
"Mainframes are perfectly fine for the purpose they were originally built," Orlandini said. "It's hard to make a move away from something that is materially good to something that might not be better when you're done with it."
IBM's Z-series mainframe advances, which include the ability to crunch AI workloads and perform fraud detection, provide an argument for staying on the platform. Businesses that can handle such functions natively on the mainframe -- without having to rewrite the application stack -- might delay migration, Orlandini noted.
"We continue to see people invest in the Z series," he added.
Presland said businesses often have a core set of applications best placed on mainframes due to transactional volume or data gravity. In those cases, API-based offerings and data virtualization interfaces make it easy for cloud developers to access, consume and interact with mainframe data, he said.
Step 3. Watch out for the tough bits
Customers opting for refactoring should consider a complexity analysis to identify potential migration hurdles.
Automated tools help with the conversion of common mainframe technologies -- such as COBOL, PL/1 and VSAM data files -- to modern languages and data formats. But such offerings might not exist for the more obscure code residing on the mainframe.
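To make the conversion idea concrete: a hypothetical illustration of what automated tooling produces when it maps a COBOL copybook record onto a modern language. The record layout, field names and generated class below are invented for illustration, not the output of any specific vendor's tool:

```java
import java.math.BigDecimal;

// Invented COBOL copybook this class corresponds to:
//
//   01 CUSTOMER-RECORD.
//      05 CUST-ID      PIC 9(8).
//      05 CUST-NAME    PIC X(30).
//      05 CUST-BALANCE PIC 9(7)V99.
public class CustomerRecord {
    private long custId;            // PIC 9(8): numeric field
    private String custName;        // PIC X(30): fixed-width text
    private BigDecimal custBalance; // PIC 9(7)V99: implied two decimals

    // Parse one 47-character fixed-width record, column by column.
    public static CustomerRecord parse(String line) {
        CustomerRecord r = new CustomerRecord();
        r.custId = Long.parseLong(line.substring(0, 8).trim());
        r.custName = line.substring(8, 38).trim();
        // V99 means an implied decimal point before the last two digits.
        r.custBalance = new BigDecimal(line.substring(38, 47)).movePointLeft(2);
        return r;
    }

    public long getCustId() { return custId; }
    public String getCustName() { return custName; }
    public BigDecimal getCustBalance() { return custBalance; }
}
```

Straightforward layouts like this are what conversion tools handle well; the trouble starts with the exotic constructs and obscure technologies the experts describe below.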
"Whilst you could move the bulk of your applications and data with a variety of solutions, finding a way to deal with some of the more exotic [technologies] can become quite challenging," Presland said.
An application might contain, for example, very specific code for a particular business purpose or a specialized third-party plugin, he noted. And the backward compatibility of IBM mainframe environments means that old code and archaic systems can still run on newer hardware.
Presland cited the example of Model 204, a 1970s database management system. He estimated Ensono has conducted three sizeable Model 204 transformations. "It's very difficult to find the skills and staff for those," he added.
Staffing is indeed a problem, since many technicians adept in the arcana of mainframe systems have retired.
"These can be 40-, 50-year-old applications," Marr said. "Having somebody who knows the code well may or may not be the case."
In general, applications involved in lower-volume transaction processing are the most problematic. "The high-volume pieces are not a problem because [they] tend to be written in common languages," Presland said. "The low-volume-but-critical pieces of the application portfolio are really key. They were written a long time ago and you need to be able to transform those."
Identifying the toughest code conversion problems upfront helps with migration project planning.
Step 4. Plan the project
A thorough application assessment and complexity analysis feed into the migration project plan.
"That lays the foundation to say, 'I understand the business case and I understand the technology path,'" Presland said.
The result is predictability when it comes to the project timeline, price, the method for dealing with complex components and a requirements/resource plan, he added.
Ensono's project planning offers customers a couple of ways forward: an immediate transition to migrating mainframe workloads, or a proof of concept with full migration to follow. The latter involves converting the first section of the application to be migrated. Converting a section gives the customer something to test and run -- which provides a confidence boost, Presland said.
Other migration approaches might start with a few code modules to show the process generates good quality code. But this demonstration doesn't provide a working system that the customer can test, he noted.
Step 5. Convert the code
Refactoring calls for converting code written in old languages to modern ones, such as Java, C# or COBOL 6.0. Data maintained in flat files or nonrelational formats is converted to a relational database format, often a PaaS database, Presland said.
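The data side of that conversion amounts to mapping each fixed-width field onto a relational column type. A hedged sketch of that mapping, reusing an invented customer record (real tools must also handle constructs such as REDEFINES, OCCURS and packed decimal):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of mapping a fixed-width mainframe record onto relational DDL.
// The record, table and type choices here are invented for illustration.
public class DdlSketch {
    // Build a CREATE TABLE statement from an ordered column -> type map.
    public static String toCreateTable(String table, Map<String, String> columns) {
        StringBuilder sb = new StringBuilder("CREATE TABLE " + table + " (");
        boolean first = true;
        for (Map.Entry<String, String> col : columns.entrySet()) {
            if (!first) sb.append(", ");
            sb.append(col.getKey()).append(' ').append(col.getValue());
            first = false;
        }
        return sb.append(")").toString();
    }

    public static String customerTable() {
        // PIC 9(8) -> BIGINT, PIC X(30) -> VARCHAR(30),
        // PIC 9(7)V99 -> DECIMAL(9,2)
        Map<String, String> cols = new LinkedHashMap<>();
        cols.put("cust_id", "BIGINT PRIMARY KEY");
        cols.put("cust_name", "VARCHAR(30)");
        cols.put("cust_balance", "DECIMAL(9,2)");
        return toCreateTable("customer", cols);
    }
}
```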
IT service providers specializing in mainframe conversion use tool sets to automate the conversion task.
Marr estimated mainframe code tools achieve a conversion accuracy percentage in the low 90s. Converting the first 92% or 93% of the code can go rather quickly, but converting the remaining code to make the application work takes time, he added.
"It's still a big project, but it's far faster than totally rewriting an application in C#, C++ or Java," he said.
Businesses can extract individual transactions from an application to reduce the conversion burden. A bank, for instance, might focus solely on the check-balance function of a customer-facing mainframe application and migrate that piece, rather than the entire system.
"Most companies look at it on a transaction-by-transaction basis," Marr said.
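The transaction-by-transaction approach can be sketched in code. Here, a single check-balance inquiry is carved out of a larger banking application as a standalone service class; the in-memory account store is a stand-in, and in a real migration this class would front the system of record, whether that remains the mainframe or becomes its replacement database:

```java
import java.math.BigDecimal;
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of extracting one transaction -- a check-balance
// inquiry -- from a larger mainframe application and migrating only it.
public class BalanceInquiryService {
    // Stand-in for the system of record backing this one transaction.
    private final Map<String, BigDecimal> accounts = new HashMap<>();

    public void loadAccount(String accountId, BigDecimal balance) {
        accounts.put(accountId, balance);
    }

    // The one migrated transaction: look up a balance by account ID.
    public BigDecimal checkBalance(String accountId) {
        BigDecimal balance = accounts.get(accountId);
        if (balance == null) {
            throw new IllegalArgumentException("Unknown account: " + accountId);
        }
        return balance;
    }
}
```

Keeping the migrated piece this narrow is what lets the rest of the application stay put while the transaction is tested and cut over on its own.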