IBM Consulting workers can now use GenAI bots to help implement their clients' generative AI projects. In time, the bots may also be made available to Watsonx users.
The company rolled out role-based chatbots on its Consulting Advantage platform. The AI assistants have been trained to find answers for project managers, developers, business analysts, salespeople, software testers and others among IBM Consulting's 160,000 employees. The toolbox is powered by Watsonx, IBM's line of generative AI cloud services released last year.
The idea behind the bots is to enable IBM consultants to complete work for Watsonx users faster, said Matt Candy, a managing partner for generative AI at IBM Consulting. That work typically involves digital business transformation use cases built on AI -- generative and otherwise. The bots can dig into data from IBM and third-party models embedded in the Watsonx platform, as well as uploaded documents and data specific to a project.
The bots have the potential not only to help consultants complete work more quickly, but also to be deployed throughout Watsonx enterprise teams, said Gartner analyst Arun Chandrasekaran.
Candy said the bots could potentially help mold AI use cases for customers after the consulting engagement ends.
"You're going to end up in a world where you're blending human and AI-based workers and labor together in order to deliver those outcomes and tasks more effectively," Candy said. "Today, a client might [engage me] as a consultant to come and work with them … [then] the human may finish working with that client, and we leave behind various kinds of AI assistants they can continue to use."
Security, governance major priorities
Watsonx is an open platform that can accommodate whatever large language model (LLM) the user wants to deploy, Chandrasekaran said. Amazon and Google offer AI platforms as well, but he said he doesn't see IBM as competing directly with the cloud hyperscalers for that business. Rather, IBM Consulting will use generative AI to keep its massive customer base -- in verticals such as healthcare, consumer banking, telecommunications and government -- current.
The platform model enables this because IBM delivers security and data governance that is trusted by regulated industries, he added. Knowing that's in place is crucial for open platforms that enable users to access third-party tools and LLMs.
To that end, IBM, along with a number of AI heavy hitters such as Amazon, AMD, Nvidia, Intel and Salesforce, invested in Hugging Face, an open source, community-driven platform that hosts machine learning models and tools. IBM so far has contributed 200 open models and data sets to the Hugging Face community, which has collaborated with IBM to build, deploy and customize Watsonx foundation models.
Enterprises have begun to embrace generative AI in their business workflows, albeit slowly, Chandrasekaran said. Many use cases he's seen are internal, not customer-facing, and are in early stages of development. Some of the more common examples include AI code generation for developers, document translation and content summarization.
"Customers are not sitting still and doing nothing with GenAI," he said. "They are actively running pilots and exploring the possibilities."
The potential of generative AI to do real work will likely not be overlooked by IBM Chairman and CEO Arvind Krishna. He has been a strong advocate of AI within IBM, telling Bloomberg last year that he "could easily see" a third of the 7,800 IBM non-customer-facing positions currently affected by a hiring pause being filled with AI "employees" between now and 2028.
Don Fluckinger covers digital experience management, end-user computing, CPUs and assorted other topics for TechTarget Editorial. Got a tip? Email him.