Cloud computing vendor ServiceNow is taking a different approach from other vendors entering the AI race, which are marketing their large language models against tech giants such as Google and Microsoft.
ServiceNow is working on new generative AI models that are not trained on public data available on the internet, but are instead tailored to understanding -- and addressing -- the specific problems that enterprises face.
The LLM war
"We're not trying to sell the language models," said Jeremy Barnes, vice president for AI product at ServiceNow. "We're trying to sell productivity, and we're selling the ability to leverage the ServiceNow platform to create end-to-end digital transformations which involve generative AI."
As part of selling that productivity, ServiceNow on July 26 introduced two generative AI capabilities for its customers: case summarization and text-to-code. Both features are powered by the vendor's LLMs and are built for the vendor's Now platform. The cloud-based platform is for IT operations, business management and process automation.
Case summarization uses generative AI to read and distill case information across different use cases in the IT, HR and customer service sectors.
One application for case summarization is in the contact center. If an agent has been speaking with a customer, and the agent needs to leave and pass on the information, the model can automatically summarize what the next agent needs to know about the customer based on the previous conversation.
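ServiceNow has not published how its summarization feature is implemented, but the general pattern behind such a handoff summary can be sketched: gather the prior conversation turns into a transcript and wrap them in a summarization prompt for an LLM. A minimal, purely illustrative sketch (the model call itself is omitted, and all names here are hypothetical):

```python
# Illustrative sketch only -- not ServiceNow's actual implementation.
# Shows the generic pattern of assembling a handoff-summary prompt
# from a contact-center transcript; the LLM call is left out.

def build_handoff_prompt(turns):
    """Assemble a summarization prompt from (speaker, text) conversation turns."""
    transcript = "\n".join(f"{speaker}: {text}" for speaker, text in turns)
    return (
        "Summarize the conversation below so the next support agent "
        "knows the customer's issue, what has been tried, and the next steps.\n\n"
        + transcript
    )

turns = [
    ("Customer", "My VPN client disconnects every few minutes."),
    ("Agent", "I reset your token; please reinstall the client and retest."),
]
prompt = build_handoff_prompt(turns)
print(prompt)
```

The prompt text would then be sent to whichever model powers the feature; the value of the product is that this assembly, the model, and the resulting summary are all embedded in the agent's existing workflow.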
Meanwhile, text-to-code enables developers to write descriptions of the type of code they want using natural language.
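As a generic, hypothetical illustration of the text-to-code idea (not ServiceNow's actual feature or output), a developer might type a plain-language description and receive working code such as:

```python
# Hypothetical prompt a developer might type into a text-to-code tool:
#   "Return the number of open records grouped by priority."
# A text-to-code model could generate something like the following:

from collections import Counter

def count_open_by_priority(records):
    """Count records whose state is 'open', grouped by priority."""
    return Counter(r["priority"] for r in records if r["state"] == "open")

records = [
    {"state": "open", "priority": "high"},
    {"state": "closed", "priority": "high"},
    {"state": "open", "priority": "low"},
]
print(count_open_by_priority(records))
```

The point of such a feature is less the snippet itself than that routine code like this no longer has to be written by hand.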
ServiceNow also revealed a partnership with AI hardware and software vendor Nvidia and IT services and consulting firm Accenture, aimed at helping enterprises quickly adapt and develop their own generative AI capabilities. The program, called AI Lighthouse, uses the ServiceNow enterprise automation platform and engine, Nvidia software, and Accenture AI transformation services.
This partnership is ServiceNow's attempt to show that its path is not necessarily to engage in LLMs, but to help its customers derive value from their own LLMs.
"There's this whole kind of LLM war going on -- we're not in that," Barnes said. "We will win no matter who wins the LLM wars. Our customers will get value from the way it's embedded in their platform."
Examining ServiceNow's strategy
ServiceNow's generative AI strategy reflects both governance concerns surrounding generative AI models and the vendor's plan to stay relevant in the AI market over the long term, IDC analyst Lara Greden said.
By using its own LLMs, ServiceNow reassures enterprises that there are walls preventing their proprietary data from leaking out, Greden said.
Moreover, ServiceNow is showing that it is keeping up in this new era of generative AI, she said. Instead of relying on a third-party API, the vendor retains more control by providing its own set of generative AI models.
"That strategy that they're playing puts them in a position of strategic advantage as they continue like others to navigate what the ultimate opportunity [and] business margin will be, both near term and long term," Greden said, referring to the vendor's generative AI strategy.
That advantage could be providing different options for enterprises in the market.
"They're giving customers the options to really push generative AI into deeper or more customized use cases," said Keith Kirkpatrick, an analyst at Futurum Group.
This is where new capabilities such as text-to-code come in, because they let developers offload simpler tasks to the models and focus their expertise where it matters most, Kirkpatrick added.
A choice for enterprises
For many enterprises, ServiceNow's strategy could be appealing, depending on what they need.
Some enterprises might need an LLM from OpenAI and the larger data set it offers, while others need something more targeted, such as what ServiceNow is offering.
"It's just like most things," Gartner analyst Cameron Haight said. "You can either go wide with a much larger kind of net, but then the filtering process, the buying process, becomes more difficult and requires more resources, or you go narrow, which provides you a smaller corpus of information that makes potentially the development of models more easy and perhaps more pertinent."
Some large enterprises could even opt to avoid vendors such as ServiceNow or OpenAI and instead create their own models without a third party. For example, Bloomberg created BloombergGPT, a large language model for finance. Wayfair is also considering building its own LLM.
However, whatever approach enterprises take, a main challenge is cost. That is true whether the user chooses a broad LLM such as the one OpenAI offers, an application-specific one such as ServiceNow's, or an industry-targeted model such as Bloomberg's.
LLMs, by nature, are expensive because of the significant computing power they require, among other factors. So big enterprise software vendors such as ServiceNow or Salesforce will have to find a way to get customers to pay the upcharge for the capabilities they create using their own LLMs.
"It really comes down to ROI," Kirkpatrick said. "Are organizations going to see another value created by these use cases? My sense is that over time, sure -- but initially, I think it's going to be a bit of, 'Hey, let's throw out an XYZ price and see what the uptake is.'"
Esther Ajao is a news writer covering artificial intelligence software and systems.