Smarter robots: Agentic and physical AI converge in business

The robotics industry is entering a new era where, thanks to AI, robots learn, optimize and solve the world's most complex supply chain, logistics and labor challenges in real time.

Robots have long handled the jobs that workers consider dull, dirty and dangerous -- think cold storage warehouses where workers lift heavy packages in subzero temperatures all day. But with advances in AI and robotics, physical AI systems have reached a tipping point where they're capable of more.

During CES 2026 in January, Nvidia CEO Jensen Huang predicted that the "ChatGPT moment for physical AI" is arriving.

Several factors have converged to bring the industry to this point: AI models that provide cognitive intelligence and situational awareness, mechanical systems such as actuators and sensors that enable more precise physical movements, and the availability of high-density, lower-cost battery systems that power mobility, according to the January 2026 Barclays "AI Gets Physical" report.

"We are at that tipping point where physical AI is making improvements, leaps and bounds, and will really enable a lot of operations for robots that couldn't be done in the past," said Thomas Ryden, executive director of MassRobotics.

In light of these advancements, Barclays predicted the humanoid market will grow from between $2 billion and $3 billion today to $40 billion, and possibly as much as $200 billion, by 2035.

Advancements in AI and robotics

James Kuffner, CTO at Symbotic, a robotics company focused on the supply chain, and co-founder of Google's robotics division, has worked in robotics for more than 30 years and has seen its advancements first-hand. "Back then, we didn't have the memory; we didn't have the compute power and the ability to manage the data needed to build anything beyond a toy," he said.

Robots would break down before scientists even had enough data to train their algorithms, Kuffner said. A major milestone came when computer vision began working, enabling robots to detect objects, navigate and do 3D mapping.

Those early robots did what they were programmed to do, whether or not it was the most efficient approach. AI changed all that. "Robots are getting smarter in the sense that they can go, 'Oh, wait a minute, if I change my path plan, I can do this more efficiently,'" Ryden said. "They can replan on the fly, and that's going to help make the entire operation more efficient." If there are multiple robots from different vendors on the factory floor, he surmised, and they all try to go down aisle 13 because it's the shortest and quickest way, they can autonomously choose a different path to avoid a traffic jam.
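The replanning Ryden describes boils down to re-running a shortest-path search whenever the map changes. A minimal sketch, assuming a toy grid warehouse where row 1 stands in for "aisle 13" (the grid, coordinates and function names are invented purely for illustration, not any vendor's actual planner):

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a warehouse grid; 0 = free cell, 1 = blocked."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no route exists

# A 3x5 floor: row 1 is "aisle 13", the shortest route across.
floor = [
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0],
]
original = shortest_path(floor, (1, 0), (1, 4))

# Another robot reports congestion in aisle 13, so those cells are
# marked blocked and the route is replanned on the fly.
for col in range(1, 4):
    floor[1][col] = 1
replanned = shortest_path(floor, (1, 0), (1, 4))
```

A real multi-robot system would share the congestion information over a fleet manager rather than a local array, but the core behavior is the same: update the world model, search again.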

Training ground

Physical AI systems are being trained in simulated environments, where real-time operational data is integrated to test complex what-if scenarios and validate autonomous workflows.

"Today, we see companies generating vast, structured data sets entirely in simulation to bridge the Sim2Real gap," said Jeff Wilhelm, CEO of Infused Innovations, a consultancy focused on emerging technologies. "By deliberately injecting extreme variables, such as erratic lighting, unexpected visual noise or rare equipment failures, we can stress test AI agents in ways that would be impossible in the physical world."

Ryden emphasized the need for realism in simulations to ensure they match unpredictable, dirty and dangerous conditions. "There's oil, there are bits of machining parts. Where's that in your simulation?" he said. "Can you translate what's simulated to actually working in a messy environment?"

Symbotic relies on digital-twin simulation technology it acquired from OhmniLabs in 2024. The simulation system allows production code to run on a virtual version of a robot, or digital twin, in a synthetic environment, so the team can conduct deterministic and repeatable testing, Kuffner said. "When you find a bug," he explained, "you can model it synthetically, create data to address it and behaviors that can be learned to keep it working better, and then that becomes part of a long collection of regression tests that you build as a library to keep the system ever improving."
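The regression library Kuffner describes can be sketched as follows; the toy controller, scenario names and numbers here are hypothetical, invented purely to illustrate the pattern of replaying recorded failures as deterministic tests:

```python
def controller(grip_force, payload_kg):
    """Toy pick controller: succeeds if grip force covers the payload's
    weight (payload_kg * 9.8 N) plus a 20% safety margin."""
    return grip_force >= payload_kg * 9.8 * 1.2

# Scenarios captured from past field failures, with the exact inputs
# recorded at bug time so each replay is deterministic.
REGRESSION_SUITE = [
    {"name": "slippery-tote-2024-03", "grip_force": 60.0, "payload_kg": 5.0},
    {"name": "overweight-case-2024-07", "grip_force": 120.0, "payload_kg": 10.0},
]

def run_suite():
    """Replay every recorded scenario; return the names that regress."""
    return [s["name"] for s in REGRESSION_SUITE
            if not controller(s["grip_force"], s["payload_kg"])]
```

Every new bug found on the digital twin adds one more entry to the suite, so the library only grows and the system, as Kuffner puts it, keeps "ever improving."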

Physical AI applications

While videos of humanoid robots "doing weird and dangerous things" tend to go viral, the robots making an impact are less entertaining, Ryden said. "We see a number of use cases in warehouses, in factories, where humanoids are really adding value. And that's going to continue to grow," he said. "Those aren't the fun ones to watch, though. They're really repetitive. They just move the same part hundreds of times."


The most common applications for robots today are in manufacturing, heavy industry and logistics. Boston Dynamics' electric Atlas robot uses reinforcement learning (RL) and large behavior models that let the robot perform tasks autonomously, such as sequencing parts in automotive manufacturing environments, and react to unexpected variables in real time, a company spokesperson said.

This AI-driven approach has drastically increased the speed of training and the complexity of the tasks the robot can handle. "Much like how RL allowed [the robot] Spot to master slippery surfaces through thousands of simulations," the spokesperson said, "Atlas uses balancing controllers and computer vision to perceive surroundings and identify objects without teleoperation."

The electric Atlas, under development for Hyundai facilities and Google DeepMind, is more reliable, as the focus shifts toward dexterous innovation and generalizability, according to Boston Dynamics. The company expects its robots to have a major effect in challenging environments, like nuclear decommissioning and disaster response.

Amazon has famously developed its own fleet of robots and built a generative AI foundation model to run them across its fulfillment network. The company reportedly operates over 1 million robots, including Hercules for lifting heavy inventory, Pegasus robots for precision sorting, and a fully autonomous robot called Proteus capable of navigating around employees in open and unrestricted areas to move heavy carts full of customer orders.

In the energy sector, quadruped robots equipped with acoustic and thermal sensors conduct autonomous inspections of hazardous facilities. "Instead of just recording video, the onboard edge AI processes anomalies locally and determines whether a safety threshold has been breached before alerting a human operator," Wilhelm said.
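The local thresholding Wilhelm describes can be sketched as a rolling-baseline check; the sensor values, window size and z-score cutoff below are illustrative assumptions, not details of any deployed system:

```python
from statistics import mean, stdev

def check_reading(history, reading, z_threshold=3.0):
    """Flag a thermal reading as anomalous if it deviates more than
    z_threshold standard deviations from the rolling baseline.
    Returns True only when a human operator should be alerted."""
    if len(history) < 10:  # not enough data for a baseline yet
        return False
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return reading != mu
    return abs(reading - mu) / sigma > z_threshold

# Ten recent readings (degrees C) from a healthy piece of equipment.
baseline = [70.0, 70.5, 69.8, 70.2, 70.1, 69.9, 70.3, 70.0, 70.4, 69.7]
```

The point of running this on the edge is that only the rare breach, not the raw video or sensor stream, ever leaves the robot.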

Robots are even beginning to help address nursing shortages. One example is Mirokai, a 4-foot-tall, AI-powered humanoid developed by French company Enchanted Tools that's being used in California to support elderly residents, particularly those with dementia.

Physical AI's limitations

There are still a number of challenges that need to be overcome to make physical AI more widely useful and accessible -- among them is battery life. "Everybody wants the robot to do tasks that are similar to humans in many respects, but none of them can operate at the same capacity that [humans] can," Ryden said. "You can go on a factory floor and do some task for eight hours. Most of these robots can't. … Compared to humans, robots still have a long way to go in terms of their energy efficiency."

The battery life of Boston Dynamics' Atlas humanoid, for example, is four hours during typical use, but it can autonomously swap its own batteries in a few minutes.

Symbotic recently started using battery technology from Nyobolt for its Symbot autonomous mobile robots. The new battery holds six times the energy capacity and is 40% lighter than the ultracapacitors the company used previously, enabling the robots to do more work for longer periods.

Another shortcoming is manual dexterity, limited by mechanical, sensory and computational constraints. Robotic hands must coordinate more than 20 joints, which makes real-time motion planning, force control and collision avoidance complicated to achieve. But there has been progress in 3D spatial intelligence to support dexterity, according to a World Economic Forum white paper on physical AI published in September 2025.


"What's holding back a lot of humanoids is the fact that the human hand has an incredible power-to-rate ratio," Kuffner said. "It's powerful, but it's also very dexterous."

Kuffner also pointed out the goal of mimicking the capabilities of human skin, "the biggest sensor on the body," which senses temperature, force and the wind, and has the ability to self-heal. "We don't have anything like that today," Kuffner acknowledged. "At some point, I think we will have artificial skin. We will have proprioceptive technology for touch, and that will allow a lot of robots to do fine motor manipulation, dexterous manipulation and unlock a lot of tasks that humans can do."

High costs limit access

The cost of deploying autonomous robots remains prohibitive for all but the largest companies. Humanoid robots reportedly cost upward of $150,000 each.

One alternative approach is Robotic Logistics Platform as a service, an offering through Exol (formerly Greenbox) that Kuffner likened to paying for compute resources via the cloud. "[The service] provides the same cost efficiencies that the big guys who can afford to own and operate one of the Symbotic systems get," Kuffner said. "That really allows more competition, which, in turn, creates opportunities for lower cost of goods all over."

Long-term, AI could help lower the cost of robots through the development of a multipurpose robot, Ryden said. "Physical AI can help with flexibility, so that one [robot] can do multiple tasks, and you don't have to spend time reprogramming it," he explained. "The robot is smart enough to know 'Yesterday I was picking cans of soda. Today, I need to pick packages of something else.' It can automatically adapt with limited reprogramming requirements."

Future robot uses

Boston Dynamics' products are designed for industrial and commercial use, but as the technology matures, consumer-level products might be available, according to the company. Looking ahead, the path from industrial use cases leads next to service industries, where humanoids like Atlas could appear in retail stores, restaurants, hospitals, schools and commercial offices -- and eventually even homes, a Boston Dynamics spokesperson said.

Before that happens, autonomous robots must meet safety guidelines. Deploying physical AI requires a governed, responsible architecture, with specific goals and guardrails well-defined from the outset, because choices made early about sensor data capture and models matter, Wilhelm said. "When we give physical machines the autonomy to reason and act, explainability becomes paramount," he added. "Ensuring that agentic decisions are transparent, governed by clear safety constraints and anchored in rigorous human validation is just as critical as the hardware itself."

The ultimate goal, Kuffner said, is to safely deploy robots in unstructured environments like our homes and offices, "where robots are going to be reasoning at a level of uncertainty that we still can't dream about. In the meantime, for the next 10, 15 years, we're going to see physical AI, embedded AI, creating lots of value in solving useful problems."

Bridget Botelho is a technology journalist covering artificial intelligence and emerging IT industry trends. 
