The Physical AI Revolution: What You Need to Know
As AI continues to reshape digital work, a new frontier is emerging: Physical AI. This evolution blends artificial intelligence, robotics, and IoT to bring intelligence into the physical world, revolutionizing how tasks are performed across industries.
In a recent SSON webinar, produced in collaboration with Formant, industry leaders from the AI and robotics spaces detailed what is on the horizon for Physical AI. Panelists for Physical AI: Understanding the Next AI Evolution & How it Applies to Your Business included:
- Jeff Linnell, Founder & CEO, Formant
- Kate Kidd, Vice President of Product, SoftBank Robotics America
- James Turnshek, Co-founder & Chief Architect, Formant
What is the current reality of Physical AI?
“The future is not exactly evenly distributed yet,” said James Turnshek, highlighting that while generative AI has reshaped digital workflows, its application in the physical world is only beginning to mature.
He explained how AI entered the physical realm around 2012 with advances in deep learning and image recognition. In 2025, we are entering a phase where AI doesn’t just perceive; it acts. This paves the way for machines that adapt and collaborate in real-world environments.
However, the panelists were keen to differentiate between science fiction fantasies and today’s reality. While AI-powered humanoid robots may be a long-term vision, functional, task-specific robots are already generating business value at scale.
“Humanoid form and fully general, super-intelligent models are not completely necessary to run robots in the real world, and to do useful things.”
What are the three domains of Physical AI?
Turnshek then categorized Physical AI into three core domains:
1. Robotics: Machines that interact with the world through sensors and actuators.
“If you have a fully intelligent AI model that can run a robot, and it can perform tasks and listen to instructions, you can very easily imagine how that impacts the world.”
2. Simulation: AI used to model real-world outcomes for planning and testing.
3. Data Systems: AI tools analyzing sensor and camera data to optimize decisions.
“It’s really AI systems that can act on the data generated by the physical world in some way that actually improves our ability to operate in it.”
These systems, Turnshek noted, are “essentially big cameras running around the world collecting data about the business environment.”
How are robotics and AI already being used?
Kate Kidd shared how SoftBank Robotics has deployed over 35,000 robots globally for commercial use, from cleaning to logistics.
“We look for dull, dangerous, repetitive, or expensive tasks […] That’s where automation is not just feasible, it’s necessary.”
But SoftBank’s focus isn’t just on deploying robots; it’s on creating the ecosystem that allows them to succeed at scale. That includes integration with enterprise systems, continuous support, and real-time analytics.
“The companies that don’t invest in the surrounding systems find out they’re not actually able to scale.”
How is Formant building the intelligence layer for Physical AI?
Jeff Linnell outlined how Formant is enabling physical AI by connecting robots, AI models, and business systems through a unified intelligence platform.
He described three waves of robotics:
1. Industrial Robotics: Pre-programmed, fixed machines (such as automotive welders).
2. Cloud Robotics: Mobile, connected robots in dynamic environments.
3. Physical AI: Today’s emerging stage, with intelligent, adaptive, and networked machines.
Formant’s vision is a platform that offers a “single pane of glass,” a unified interface for managing disparate robotic systems.
“Think about this as connected systems tied together by an intelligence layer. They’re sharing information with us and amongst themselves […] We’re far more interested in robots speaking to humans, robots speaking to each other, and sharing information so they can better do their jobs.”
What is the human role in the Physical AI revolution?
Rather than replacing people, Physical AI is designed to augment human capability. Both Kidd and Linnell emphasized the need for tools that are intuitive and integrate into existing human workflows, particularly in industries such as cleaning and facilities management, where staff turnover is high and on-the-ground workers are under constant pressure. “If you have the turnover, it has to be simple,” Kidd noted. “You have to have a process in place that's lights out […] easy for the employee to start using the technology.”
SoftBank is already deploying chat-based assistants and mobile tools that frontline workers use to interact with robots through voice or text.
“Operators can call an AI assistant, ask questions to a robot, get photo validation; things like that are already starting to go to market.”
Similarly, Linnell shared how Formant is making human-robot interaction conversational:
“I was dubious at first. Do I want to be talking to my fleet? But I find myself more often than not talking to the fleet. I feel like Tony Stark.”
And it’s not just humans initiating the interaction; robots themselves can now communicate their needs. “There’s something sort of magical about that,” Linnell said. “Where the machines are doing their own preventative maintenance… and they can actually ask for what they need.”
The Physical AI Revolution is Here
Overall, the Physical AI revolution isn’t something that’s coming; it’s already here, quietly transforming how work is done in facilities, logistics, agriculture, and more. As Linnell concluded:
“We’re at the point where any of us, not just engineers, can now broadcast attention to the system and say, ‘Here’s what I need you to do.’”
To learn more, you can watch the webinar recording on demand above. For more insights from the SSON community, please join us for our upcoming Process Mining and Intelligence Virtual Summit.