Nearly a week after introducing a new AI chip, IBM expanded its embeddable AI software portfolio with three new libraries.
IBM's embeddable AI enables enterprises to embed different AI applications within their own products and tools. The new libraries, revealed on Oct. 25, are the IBM Watson Natural Language Processing library, IBM Watson Speech to Text library and IBM Watson Text to Speech library.
The natural language processing library helps developers build tools that can extract meaning and intent from human language. The speech-to-text library supports speech transcription in settings such as customer service. The text-to-speech library helps developers convert text to audio in different languages.
Previous IBM embeddable AI products include IBM Watson Assistant, IBM Watson Discovery and IBM Maximo Visual Inspection.
By making Watson embeddable in other applications, IBM follows other large hyperscale AI vendors such as Google Cloud and Microsoft in commoditizing their products for developers, said Dan Miller, analyst at Opus Research.
Vendors are making their tools into better building blocks that developers can incorporate into their own products.
Dan Miller, analyst, Opus Research
"They're making elements of Watson more easily consumable," Miller said. "It's at long last."
This will also help developers who might not want to build their own machine learning models, said Arun Chandrasekaran, vice president analyst at Gartner.
"It is a way for developers to introduce machine learning into their workflow without having to learn about it," Chandrasekaran said.
IBM is also competing against OpenAI, an AI research vendor known for putting its tools directly into the hands of developers, who then resell them. A few weeks ago, OpenAI released Whisper, a speech recognition model trained on a large data set of diverse audio.
Moreover, IBM aims to get Watson more deeply embedded in the market and to use its ecosystem partners as multipliers, Miller said. Ecosystem partners already using embeddable AI products include SingleStore, EquBot, CrushBank and Sherlok, according to IBM.
While making Watson embeddable is a natural step for IBM, it's unclear which parts of the newly introduced libraries were scaled down to make them easy for developers to use, Miller said.
In addition, it's important for these machine learning models to have responsible AI guardrails.
"With natural language in particular, you don't want out-of-the-box kind of status suddenly cropping up," Chandrasekaran said, adding that vendors must make sure models aren't biased or racist. "I think the ability to deliver these [models in a] trustworthy manner is super important, apart from the accuracy of the model and the performance."
Last week, IBM launched the Artificial Intelligence Unit (AIU), an application-specific integrated circuit for deep learning applications. AIU is built with 32 processing cores and contains 23 billion transistors -- about the same number as an IBM z16 chip. It can be plugged into any computer or server with a PCIe slot and programmed to run any deep learning task.