
IT leaders share enterprise AI change management tips

An Insight Enterprises CTO, an ex-Google AI lead, and SRE leaders from Telus offer tips to enterprise IT leaders struggling to cut through AI noise and realize business value.

Despite years of hype and technical development, most enterprise AI initiatives are stuck at the starting gate. But some large companies have made headway in deploying AI productively, starting with a foundation of organizational change.

The growing consensus among Big Tech leaders and market research reports this year is that generative AI (GenAI) and AI agents have yet to fulfill vendors' heady promises or to be widely deployed at scale by enterprises.

In January, PwC’s 29th Global CEO Survey report, based on responses from 4,454 chief executives, found that 56% of respondents have not realized revenue or cost benefits from AI. Just 12% reported realizing benefits from AI in both categories.

During the Cisco AI Summit earlier this month, executives from Cisco, AWS, Google and OpenAI said that AI is moving faster than enterprise customers can absorb it.

At Insight Enterprises, a global systems integrator based in Chandler, Ariz., the disconnect between the breakneck pace of AI development in the industry and typical enterprise change management processes has been most evident so far in software development. This was true both internally during Insight's initial phases of adopting GenAI and among clients, according to the company's CTO for North America, Juan Orlandini.

Juan Orlandini, CTO North America, Insight Enterprises

"That's actually a work in progress as an industry, and I'd be remiss telling you that we have it all figured out -- anybody that tells you that they've got it figured out, they're [wrong]," Orlandini said in an interview with Informa TechTarget. "Because we can generate code so quickly, we've forgotten the very front end of the application development cycle. … Part of the guidance that we give to customers is that, yeah, there are some parts of the workflow that have gotten significantly faster, but some of those structures that we've developed over 50, 60 years of running it properly -- don't forget those."

Developers keen on shipping code faster with AI sometimes chafe at the enterprise change management processes that remain in place, Orlandini said. But these processes are crucial to weeding through a mass of prototypes to identify the projects that will have a significant impact on a business when deployed at scale.

Take, for example, Insight's development of an AI agent for its website. The minimum viable product for that was developed using AI in three weeks, but it took three months for the rest of the change management process to vet the agent, Orlandini said.

"The scaling and the continual verification and all that … security, cost controls and governance … developers typically tend to struggle against, but there's a reason why you have a security team and a governance team and a FinOps team," he said. "They're not there to prevent innovation. They're there to make sure that you're doing it fiscally responsibly."

People and process as an AI foundation

Friction remains between coding agents and the rest of the software development process, but an even bigger problem for enterprise AI lies in broader organizational change management issues. In a Feb. 4 Process Optimization Report by data processing firm Celonis, among 1,649 surveyed businesses, the top three hurdles to AI in production were people and process problems: a lack of expertise, cited by 47% of respondents; misalignment between departments by 45%; and difficulties getting AI to understand business context by 45%. Difficulties driving automation across disjointed systems were also cited as a blocker by 34% of respondents.

A comprehensive change management process for people in the organization that started early was even more important for AI adoption at Insight Enterprises than software delivery checks and guardrails. When ChatGPT launched in 2022, evaluating how people in the organization responded to it was a key part of how Insight assessed the potential benefits of the technology, Orlandini said. Before the company developed any AI apps, a "walled garden" for internal experimentation by 14,000 employees served as the setting for an organizational approach to AI change management.

The ones that were [saying] 'Burn it with fire' are finding out, 'This is a tool for me. I'd better use it, or I'm not going to be as useful to our company.'
Juan Orlandini, CTO, North America, Insight Enterprises

During this early experimentation, Insight observed three categories of responses: highly -- sometimes overly -- enthusiastic early adopters; reluctant opponents of the technology; and, for the majority, "'Hey, this sounds really cool, but I don't know how to use it,'" Orlandini recalled. "'Where's the manual, where's the training?'"

In response, the company created a training program and platform called Flight Academy, where users start from very basic questions, such as "What is a prompt?", progress to more advanced prompting, and then connect the results of those prompts to their work. As users progressed, they competed individually and as teams. Flight Academy was initially an internal tool at Insight, but the company now sells it to clients.

"That lowered the barrier of entry for that broad middle group," Orlandini said. "The ones that were [saying] 'Burn it with fire' are finding out, 'This is a tool for me. I'd better use it, or I'm not going to be as useful to our company.' And the overly enthusiastic ones became some of the leaders."

Evaluating 'a zillion good ideas'

Flight Academy helped prepare Insight Enterprises for AI, but there was more to AI change management than employee training. Next, the company had to whittle down "a zillion good ideas" to focus on the ones that would deliver an ROI for the business. To do that, Insight created a platform called Insight Prism, which it also now sells to clients.

"We created an onboarding process for these ideas to be brought forth. Then [Prism] runs those ideas through an engine that spits out a business case and says, 'This idea is going to be amazing because it's going to generate this much more revenue, or it's going to save us this much more money,' or both," Orlandini said. "It gives you a business justification for whether this thing is good or not. And for some of those ideas, the numbers are actually not so good, so we don't invest in those."
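Insight has not published how Prism's engine works internally. As a purely hypothetical illustration of the kind of triage Orlandini describes, a back-of-the-envelope business case might weigh projected revenue gains and cost savings against build and run costs over a planning horizon; all field names and figures below are illustrative assumptions, not Insight Prism's actual model:

```python
# Hypothetical sketch of business-case triage for AI ideas, loosely
# inspired by the process Orlandini describes. Field names, horizon and
# example figures are illustrative assumptions, not Insight Prism's API.
from dataclasses import dataclass

@dataclass
class IdeaCase:
    name: str
    annual_revenue_gain: float   # projected new revenue per year
    annual_cost_savings: float   # projected savings per year
    build_cost: float            # one-time development cost
    annual_run_cost: float       # hosting, licenses, support

    def net_benefit(self, years: int = 3) -> float:
        """Net benefit over the planning horizon."""
        gains = (self.annual_revenue_gain + self.annual_cost_savings) * years
        costs = self.build_cost + self.annual_run_cost * years
        return gains - costs

def triage(ideas: list[IdeaCase], years: int = 3) -> list[IdeaCase]:
    """Keep only ideas with a positive net benefit, best first."""
    funded = [i for i in ideas if i.net_benefit(years) > 0]
    return sorted(funded, key=lambda i: i.net_benefit(years), reverse=True)

ideas = [
    IdeaCase("website agent", 250_000, 100_000, 300_000, 80_000),
    IdeaCase("niche chatbot", 10_000, 5_000, 120_000, 30_000),
]
for idea in triage(ideas):
    print(idea.name, idea.net_benefit())  # the chatbot's numbers don't clear the bar
```

The point of the sketch is the filter: ideas whose projected numbers are "actually not so good" never reach the investment list.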

There are other tools companies can use to evaluate ideas for AI apps, ranging from hosted cloud services to open source AI risk assessment tools. Ducker Carlisle, a global consulting and M&A firm, uses StackAI to host a similar platform for its citizen developers to build and evaluate apps created using AI agents. This decentralized approach to AI application development and evaluation emerged because the initial phase of adoption generated an overwhelming number of niche requests for the company's centralized engineering team, raising concerns that employees would resort to shadow AI, according to Fabien Cros, chief data and AI officer at Ducker Carlisle.

In response, Cros drew on previous experience as data & AI country lead for manufacturing at Google Cloud in France to create a citizen developer and tool discovery program.

When we see something that is rising fast, we look at it … and we ask, 'Can we do it better? Can we do it faster? Should we move it in-house?'
Fabien Cros, chief data and AI officer, Ducker Carlisle

"You let the users come up with ideas, build some stuff, even if it's limited, and then when you see adoption, you say, 'Is it core DNA for [the] organization, or is it not?'" Cros said. "When it's not, you let it run through the [SaaS] platform and the [citizen developer] program. When it's core DNA you want deep monitoring. You want to control everything, end to end. It's like a pyramid, where you have a lot of use cases at the bottom, and then you bubble up the core use cases, and your central team [takes] over."

Ducker Carlisle also uses gamification techniques to assess the popularity of users' apps, which is reflected in a leaderboard and a rating system akin to GitHub stars.

"When we see something that is rising fast, we look at it and we say, 'That's a good use case, and apparently people like it,'" Cros said. "And we ask, 'Can we do it better? Can we do it faster? Should we move it in-house?'" 
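Ducker Carlisle has not described the mechanics of its leaderboard. As a hypothetical sketch of the "rising fast" signal Cros mentions, one simple approach is to rank apps by rating velocity, akin to how fast a GitHub repository accumulates stars; the function names, sample data and threshold here are all illustrative assumptions:

```python
# Hypothetical sketch of a "rising fast" check for citizen-developer apps,
# ranking them by star velocity (new stars per day). All names, sample
# data and the threshold are illustrative, not Ducker Carlisle's system.
def star_velocity(history: list[tuple[int, int]]) -> float:
    """Average new stars per day.

    history: (day_number, cumulative_stars) samples, oldest first.
    """
    (d0, s0), (d1, s1) = history[0], history[-1]
    if d1 == d0:
        return 0.0
    return (s1 - s0) / (d1 - d0)

def rising_fast(apps: dict[str, list[tuple[int, int]]],
                threshold: float = 5.0) -> list[str]:
    """Apps gaining stars faster than the threshold, fastest first."""
    scored = [(name, star_velocity(h)) for name, h in apps.items()]
    hot = [(n, v) for n, v in scored if v >= threshold]
    return [n for n, _ in sorted(hot, key=lambda x: x[1], reverse=True)]

apps = {
    "invoice-summarizer": [(0, 0), (10, 120)],  # 12 stars/day
    "meeting-notes-bot": [(0, 0), (10, 30)],    # 3 stars/day
}
print(rising_fast(apps))  # candidates to review for moving in-house
```

Apps that clear the velocity threshold become the candidates the central team reviews for doing "better", "faster" or in-house.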

'The hackathon mentality'

Telus, a Canadian telecommunications and technology company, used its internal developer platform (IDP) to host a hackathon where users tested AI tools, including AI infrastructure utilities, to determine which of the many choices in a teeming market would be most useful.

"We really adopted the hackathon mentality, especially last year," said Kulvir Gahunia, site reliability office director at Telus. "It's [done in] a controlled environment, but at the same time, we didn't put guardrails on what users wanted to hack on. The tool they build might or might not help, but sometimes the technology to get to that point is a game-changer."

One example is n8n, an AI workflow automation platform from the German company of the same name. Its source-available, self-hostable free version lent itself to use during the Telus hackathon.

Dana Harrison, principal site reliability engineer, Telus

"In [about] 100 ideas that were submitted, something like 14 or 15 of them used n8n, so there were tons of little n8n instances running around," said Dana Harrison, principal site reliability engineer at Telus. "And we went, 'Oh, before this goes completely off the rails, this is clearly a need.' So we met that need, got licensed, and we now have an agreement with n8n."

Given how fast AI tools are emerging and changing, following those indications of user need is a good way for an enterprise platform team to keep up with what's important to secure and support, Harrison said. Three days into setting up n8n as part of the internal platform, it had 1,300 users.

The fact that the company had already taken steps to centralize on an IDP based on CNCF's Backstage gave it a strong foundation for AI adoption. It had also consolidated its IT and business data using Dynatrace and used the Fuel iX platform to control and moderate corporate access to large language models. "We are trusted in what we do, which is a privileged place to be in," Harrison said of the Telus platform team. "What it also means is that when we develop, people listen."

One of the early internal AI adoption wins for Telus was a Slackbot, combined with an open source search engine tool called turbopuffer, that gave users an easy way to search Dynatrace data.
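Telus has not published the bot's code. The underlying pattern, though, is familiar: index documents by embedding, find the nearest stored entries to a user's question, and return them in Slack. The toy sketch below stands in for that pattern with an in-memory cosine-similarity index; it does not reproduce turbopuffer's real client API, and the bag-of-words "embedding" is a stand-in for an actual embedding model:

```python
# Toy sketch of the embed-and-search pattern behind a Slackbot like the
# one described. The in-memory index stands in for turbopuffer, and the
# bag-of-words "embedding" stands in for a real embedding model.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Fake embedding: a bag-of-words vector (stand-in for a real model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ToyIndex:
    """In-memory nearest-neighbor index (stand-in for a vector store)."""
    def __init__(self):
        self.docs: list[tuple[str, Counter]] = []

    def upsert(self, text: str) -> None:
        self.docs.append((text, embed(text)))

    def query(self, question: str, top_k: int = 1) -> list[str]:
        q = embed(question)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[1]), reverse=True)
        return [text for text, _ in ranked[:top_k]]

index = ToyIndex()
index.upsert("checkout service latency alert runbook")
index.upsert("quarterly billing report schema")
print(index.query("why is checkout latency high"))
```

In a production version, the embedding model and vector store would be real services, and the query/response loop would be wired to Slack's events API.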

It's a natural position for enterprise platform engineers to be in, according to Gahunia -- AI was created to remove toil and offload repeated tasks, which is also the mission of platform teams, he said.

"We really embrace that mantra," he said. "A lot of the stuff that we started coming out with was, 'Hey, we do this all the time. Let's just automate this piece. Can we now leverage this somewhere else? Oh, yes, we can.' And it just organically started growing into this path that we're on now."

Final thoughts: lessons learned for IT leaders

Finding opportunities to automate and remove toil from existing workflows using AI helps surface the most useful tools and can also be a good starting point for people fearful of being replaced by the technology, Gahunia said.

Kulvir Gahunia, site reliability office director, Telus

"It helps remove that fear of, 'AI is taking over my job,' because [users can see how to] use AI to enhance [their] work," he said. "It's not going to replace you, but you can leverage it to enhance your work and the outcomes you deliver. … That's a very key message for any organization to drive adoption."

For some workers, however, disruption is already undeniably taking place, Insight's Orlandini said, especially among software developers. AI is now performing the simple, entry-level tasks that used to help junior developers learn to build larger, more complex systems. Senior developers sometimes find themselves in a role more like a product manager, without the craft of developing code they've spent years honing and have come to enjoy.

"We need to be very mindful as leaders of understanding that this isn't just a technology thing," Orlandini said. "We have to manage the people and manage the expectations as much as we have to be able to consume this new capability."

For junior developers, managers should encourage them to start thinking about "the whys, rather than the hows, of building applications," he said. For senior developers, if a product manager role doesn't suit them, leaders should find other ways to put their expertise to use for the organization.

"People are wary of change because change implies the unknown -- help them through that unknown," Orlandini said. "Any amount of education that you put into your organization is going to pay dividends down the road.

"The other thing I tell IT leaders is, don't forget your roots. All the things that we've learned over the decades that we've been doing in IT still apply. Some of the things might happen faster, but not all of them, and some of the fundamentals are still there."

Beth Pariseau, a senior news writer for Informa TechTarget, is an award-winning veteran of IT journalism.
