AI's emergence in productivity suites and other mainstream software products is well underway. But some IT service providers see an integration opportunity beyond the latest vendor offerings.
The AI-inside trend continued last week with Google bringing its rebranded Gemini large language models to Gmail, Docs and other Workspace applications. Google's move follows Copilot for Microsoft 365, which entered the enterprise market last November. That offering embeds Microsoft's Copilot generative AI platform into Word, Excel and other office applications.
Partners are hardly ignoring such developments, as companies train thousands of consultants on generative AI technology from Google, Microsoft and AWS, among other providers. But they see a further opportunity: embedding AI into customers' client-server applications and even mainframe systems. Such systems operate outside the limelight, but they often handle core business functions. That status makes them targets for AI investment as enterprises seek reinvention projects with high-return potential.
Integration of AI on tap for 2024
David McCurdy, CTO at Insight Enterprises, a solutions integrator based in Chandler, Ariz., anticipates the merging of generative AI with more traditional client-server systems.
"I call 2024 the year of integration," he said. "I think a lot of companies, especially companies that have a development arm, are trying to integrate [AI] into different applications."
Insight counts itself among those companies. McCurdy said the integrator plans to meld AI into its own back-office applications this year. For example, the company's AI-based Knowledge Hub, expected to roll out this month, will bring information to employees within the applications they use, rather than having them search for it in a separate system.
McCurdy said he believes Insight's customers will also rethink their internal systems and embark on back-office AI integration.
"This will be a huge market for us in the future," he said. "Every company has some type of application or service that's specific to them. And that's where you're going to really see the ROI -- when you get to the core."
Mainframes ripe for AI integration
Mainframes, with their ample processing power and memory capacity, are also candidates for AI integration, according to Gordon McKenna, cloud evangelist and vice president of alliances at Ensono, a technology adviser and MSP based in Downers Grove, Ill.
"The mainframe is an obvious platform to run AI," he said.
Ensono's customers already do so. McKenna said clients run IBM's Watsonx generative AI cloud services on their mainframes. Those services include Watsonx Code Assistant for Z, which aims to speed up the translation of COBOL code to Java and generally boost developer productivity on the IBM Z line of mainframes.
McKenna said Watsonx gives IBM's "old Watson," which dates to 2007, a new lease on life. "I see IBM leaning very heavily on that," he added.
But while some customers bring AI to the mainframe, others are moving in the opposite direction.
"We are seeing users want to extend the mainframe into the cloud and take advantage of OpenAI, [Amazon] Bedrock and Gemini," he said, noting that Ensono helps customers tap such AI cloud services.
A survey from Advanced, a company that offers mainframe modernization services, reflects the same contrast. Fifty-two percent of 400 IT leaders polled said AI innovation has accelerated their migration off mainframes, while 29% of respondents said they are exploring the integration of AI on mainframes. Advanced, based in Atlanta, released its annual "Mainframe Modernization Business Barometer Report" last week.
IBM in January agreed to acquire the application modernization assets of Advanced.
Application, data stack modernization
The current interest in AI integration is the latest phase of a broader -- and longer -- application modernization initiative, according to Erik Duffield, CEO at Hakkoda, a Snowflake services partner that offers data modernization and managed services.
"We've been on application modernization [for at least 10 years], because organizations had to address this," he said. "But now, generative AI, and AI in general, has put a different lens on application modernization, a further horizon and a bigger capability that you're able to apply to it. There's a lot of conversations around, 'What do we do with our application stack and how is AI a part of it?'"
Duffield said those customer talks involve front-office as well as back-office systems.
A Hakkoda study published today found two-thirds of the 500 data leaders it surveyed believe generative AI will be "very" or "critically" important to their success by 2027. In addition, 85% of organizations are expected to use some type of generative AI tool this year, according to Hakkoda's "State of Data" report.
But another form of modernization will influence whether AI adopters fully take advantage of the technology: updating companies' data stacks. Ninety-four percent of the organizations Hakkoda polled said they need to modernize their data stack in 2024, a move the report said will help them "harness the power of GenAI."
Regarding specific modernization tactics, the Hakkoda survey said 45% of respondents plan to centralize on a primary cloud platform this year, while 23% expect to do so in 2025. Such cloud data management platforms are considered a key IT infrastructure component for assembling the data necessary to fuel AI models.
John Moore is a writer for TechTarget Editorial covering the CIO role, economic trends and the IT services industry.