Cerebras launches Alibaba model, forms key AI partnerships

The vendor expanded Alibaba’s Qwen3-235B context window to 131K tokens. It also formed strategic partnerships with DataRobot, Notion and other companies.

AI hardware and software vendor Cerebras Systems on Tuesday launched Alibaba's Qwen3-235B reasoning model adapted for Cerebras’ inference platform.

The vendor also revealed partnerships with other vendors including Notion and DataRobot.

Cerebras introduced the Chinese tech giant's reasoning model, expanded with a larger 131K context window, on its Cerebras Inference Cloud platform at the RAISE Summit conference in Paris.

Alibaba and other partnerships

According to the AI vendor, its Wafer Scale Engine speeds up Qwen3-235B and reduces response times from one to two minutes to one to two seconds. Cerebras also expanded the Qwen3-235B context window from 32K to 131K tokens, enabling the model to simultaneously process dozens of files and tens of thousands of lines of code.

Cerebras also offers the model at $0.60 per million input tokens and $1.20 per million output tokens, which the vendor said is less than a tenth the cost of comparable closed-source models. By comparison, OpenAI's o3 reasoning model costs $2 per million input tokens and $8 per million output tokens.
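At the published per-token rates, the cost difference can be sketched with a quick back-of-the-envelope calculation. The workload sizes below are illustrative assumptions, not figures from either vendor:

```python
# Illustrative cost comparison using the per-token rates cited above.
# Rates are USD per million tokens; the workload size is an assumed example.

RATES = {
    "Cerebras Qwen3-235B": {"input": 0.60, "output": 1.20},
    "OpenAI o3": {"input": 2.00, "output": 8.00},
}

def workload_cost(rates, input_tokens, output_tokens):
    """Return the USD cost of a workload at the given per-million-token rates."""
    return (input_tokens / 1e6) * rates["input"] + (output_tokens / 1e6) * rates["output"]

# Assumed workload: 10 million input tokens and 2 million output tokens.
for name, rates in RATES.items():
    print(f"{name}: ${workload_cost(rates, 10_000_000, 2_000_000):.2f}")
```

At these rates, the assumed workload comes to $8.40 on Cerebras versus $36.00 on o3; the exact ratio depends on the mix of input and output tokens.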

Besides this offering, Cerebras introduced several partnerships, including with Docker, vendor of an open source platform that helps developers build and run applications. With Docker Compose and Cerebras, developers can deploy multi-agent AI stacks within seconds, Cerebras said.

Cerebras also revealed that DataRobot's new open source AI/machine learning framework for automating agentic workflows, Syftr, now uses Cerebras Inference. With the integration, DataRobot customers can build high-quality, low-latency agentic applications. Cerebras Inference also powers Hugging Face's SmolAgents library, enabling developers to create agents that reason, use tools and run code. In addition, Notion, a workspace platform, now uses Cerebras' AI inference technology to power its AI offering, Notion AI for Work.

Extending its presence

Through its partnerships, Cerebras is expanding its reach as it seeks to compete against vendors like Groq, AMD and SambaNova.

"It seems like Cerebras is doing a good job of extending its ecosystem," said Karl Freund, founder of Cambrian AI. "[It] makes their product more available and credible to the market."

He added that Cerebras, which started small in 2015 and now offers an ultra-fast wafer-scale engine system for training generative AI models that costs $2 million to $3 million, initially found it hard to gain acceptance in the market because it didn't have much software to accompany its hardware.

"They're known as being the fastest AI machine out there, but they're also known as the most expensive machine system out there," Freund said. He added that Cerebras needs to demonstrate its price performance, and the partnerships help with that.

The vendor also faces a challenge in gaining more awareness and consideration for its processors compared to Nvidia, the dominant AI chipmaker, said Addison Snell, CEO of Intersect360 Research.

"It's a very high cost of sales and establishing that level of interest, there's a lot of cost," Snell said. Moreover, Nvidia provides hardware accompanied by software components, an approach that vendors like Cerebras are pursuing as they try to catch up, Snell said.

Qwen3-235B

Cerebras' deployment of Qwen3-235B stands out because of the expanded context window.

"It takes it from a toy to a real enterprise platform," Freund said. He added that the larger context window will help those trying to write code and improve their productivity.

"If you can do a large context window, which is important for coding and agentic AI, if you can do that for one-tenth the price, I think you've got something that's going to make a difference," Freund continued.

The deployment of Qwen3-235B also shows how Cerebras thinks about the international market.

"Any major AI processor provider is looking at the global market for its processors and considering how it competes internationally," Snell said. "In the long term, national sovereignty efforts, particularly in government-funded or controlled AI data centers, might make that more difficult, but for the time being, it seems to be fair play."

Meanwhile, OpenAI, Microsoft and Anthropic joined with the American Federation of Teachers to create the National Academy for AI Instruction. The new $23 million initiative is geared toward training 400,000 educators in the next five years. The academy will include hands-on workshops for educators. Its headquarters are in Manhattan.

Microsoft also introduced Deep Research in Azure AI Foundry on July 7. This API and SDK enable developers to build agents that can plan, analyze and synthesize information across the web.

Esther Shittu is an Informa TechTarget news writer and podcast host covering artificial intelligence software and systems.
