
Slow Ansible Lightspeed adoption might reflect AI qualms

Customers with Ansible Lightspeed in pilots or production remain rare, as concerns about the risks of AI and the need to shore up a solid IT automation foundation prompt caution.

DENVER -- As new Red Hat AI products took center stage at the company's annual conference this week, discussions about a tool it has already made available might reflect enterprises' hesitancy about diving into generative AI.

Some keynote and breakout session speakers, along with Ansible Automation Platform (AAP) users attending Red Hat Summit and the co-located AnsibleFest, expressed interest in evaluating Ansible Lightspeed and the early-preview versions of the AI-driven code assistant for OpenShift and Red Hat Enterprise Linux launched Tuesday. AAP customers have had free access to Ansible Lightspeed since it reached general availability in November 2023.

The Ansible community at JPMorgan Chase will conduct a proof-of-concept evaluation of Ansible Lightspeed and Red Hat OpenShift Dev Spaces in the next year, said Beth Boy, executive director of global technology at the financial services company, during a session presentation about AAP adoption Wednesday.

"My team doesn't spend a majority of their time writing playbooks. … They're busy building that [platform]," Boy said. "But I have a couple people on my team and at JPMorgan, all they do is write playbooks all day long. They've probably authored hundreds of playbooks. Those are the people we want feedback from."

But sessions showcasing hands-on Ansible Lightspeed pilots or detailing production use were hard to come by during the main conference programs on Tuesday, Wednesday and Thursday. A breakout session on Wednesday titled "Tales from the field: How early adopters are putting Red Hat Ansible Lightspeed to work" featured two reps from IBM and IBM Consulting. Ansible Lightspeed was created using IBM's watsonx Code Assistant and was tested by internal IBM users before its release.

That amounts to an anecdotal accounting of conference sessions. When asked at a press session on Tuesday for a more precise estimate of the number or percentage of customers using its existing AI products, Red Hat executives said only that those products remain in an early adoption phase.

It's unusual for Red Hat to promote a product as strongly as Ansible Lightspeed and not have more user testimonials at its annual summit, said Rob Strechay, lead analyst at enterprise tech media company TheCube. But for Ansible Lightspeed, the reasons for slow adoption might be complicated.

"When [Red Hat] brought it out, I wondered about that, because the community is so robust," Strechay said. "The LLM [large language model] was trained on [code from] the community and people were like, 'OK, well, if I go and use Lightspeed, I don't get credit even though I built that [code] out and put it into the general community.' … There was a lot of hesitation from people about contributing into the system."

Ansible reps, led by Matthew Jones, chief architect of Ansible automation at Red Hat (center), present product updates at Wednesday's AnsibleFest keynote.

'Copiloted to death?'

There are other potential reasons behind Ansible Lightspeed's slow adoption that aren't specific to the product itself. Lingering concerns about the risks of generative AI, especially for production and customer-facing use, range from copyright and IP legal issues to data privacy, security vulnerabilities, and the quality, accuracy and bias of model results.

One AnsibleFest keynote speaker on Wednesday issued a caveat about tech and data preparation when companies consider AI tools.

A team at MAPFRE Insurance tried using an unnamed LLM with retrieval-augmented generation to evaluate 20 years of IT incident data, said Mat Jovanovic, corporate cloud strategy director at the company in Madrid.

"The operator can play with the tool and say, 'Hey, these are the symptoms. What's the most likely root cause?'" Jovanovic said. "So what do you think about how it went? It went horribly wrong. Nothing worked. We had accuracy of only 3%."

This was because the company's systems weren't automated, making the data inconsistent, Jovanovic said.

"If you don't automate and you try to put AI on top of nothing, it won't work," he said. "You will get 3%. We proved this."


Another factor for some IT pros is sorting out the differences among the deluge of code assistants that practically every one of their IT vendors now offers.

"We're trying to look at the value proposition between Lightspeed and GitHub Copilot," said Nick Cassidy, lead innovation engineer at Blue Shield of California, in an interview this week. "Just from today, in my conversations, what I've understood is that Lightspeed is trained on Ansible best practices and then you can retrain the model on your playbook implementations."

This is potentially appealing, Cassidy said, but that's as far as his evaluation has gone.

The glut of tools might be paralyzing the market somewhat at this point, according to Strechay.

"We've seen in [TheCube research] data that code development has gone down as a use case that people were evaluating over the last quarter by about 5% or so," he said. "It used to be a lot higher, and I think that people are … copiloted to death. It's like, 'Which one am I going to use? And where do I go?' That's part of it."

Event-Driven Ansible cleared for takeoff

Multiple session speakers said they have plans to deploy Event-Driven Ansible -- a non-AI-driven IT automation tool that reached general availability in June 2023 -- in their environments over the next year.

Speakers said they were particularly interested in Event-Driven Ansible after this week's launch of automated policy-as-code support that puts company rules written with Open Policy Agent into infrastructure automation workflows. These can include a variety of policies invoked in response to events, from a developer attempting to request resources above a pre-set quota to automated responses to potential security events.
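For context, Event-Driven Ansible works by matching incoming events against conditions in a YAML rulebook and triggering an action, such as running a remediation playbook. The minimal sketch below illustrates that pattern for the quota scenario described above; the webhook source, event payload fields and playbook path are illustrative assumptions, and it does not depict the newly previewed Open Policy Agent integration.

```yaml
---
# Minimal, hypothetical Event-Driven Ansible rulebook: listen for
# resource-request events on a webhook and run a remediation playbook
# when a request exceeds a pre-set CPU quota.
# The ansible.eda.webhook source plugin ships with the ansible.eda
# collection; the event payload fields and playbook name are assumptions
# made for illustration, not part of Red Hat's policy-as-code feature.
- name: Enforce a simple CPU quota on incoming resource requests
  hosts: localhost
  sources:
    - ansible.eda.webhook:
        host: 0.0.0.0
        port: 5000
  rules:
    - name: Deny requests above the pre-set quota
      condition: event.payload.requested_cpu > 8
      action:
        run_playbook:
          name: playbooks/deny_resource_request.yml
```

Started with a command such as `ansible-rulebook --rulebook quota_rulebook.yml -i inventory.yml`, a rulebook like this sits idle until a matching event arrives, at which point the referenced playbook applies whatever response the team has automated.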

Jovanovic said his team plans to quickly adopt these features.

"Extreme automation and AI are awesome. But you need to focus on security," he said. "We had an incident [where a] developer in Brazil in a pre-production environment [got] compromised data, and we [had] to manually disconnect their VPNs. So we need something that can implement that automation based on a set of different attributes [than] just RBAC [role-based access control] -- something really attribute-based, and it needs to be in the code. It needs to be fast."

Beth Pariseau, senior news writer for TechTarget Editorial, is an award-winning veteran of IT journalism covering DevOps. Have a tip? Email her or reach out @PariseauTT.
