Physical AI is the next frontier - and it's already all around you

3 hours ago
RayNeo Air 4 Pro at CES 2026
Kerry Wan/ZDNET

ZDNET's key takeaways

  • Physical AI is the technology industry's latest trending frontier.
  • It leverages real-world data to make robots more autonomous.
  • Its early stages could be on your face right now.

ChatGPT's release more than three years ago triggered an AI frenzy. AI models continue to grow more capable, but to be truly helpful in people's everyday lives, they need access to everyday tasks. That's only possible if they can live outside a chatbot on your laptop screen and take a more present role in your environment. 

Enter the industry's latest buzzword: physical AI. The term was on full display at the Consumer Electronics Show (CES) last week, with nearly every company, including Nvidia, touting a new model or piece of hardware meant to advance the space. During Nvidia's keynote, CEO Jensen Huang even compared the significance of physical AI to that of ChatGPT's release. 

 "The ChatGPT moment for physical AI is here -- when machines begin to understand, reason, and act in the real world," he said. 

What is physical AI?

Physical AI can be broadly defined as AI implemented in hardware that can perceive the world around it and then reason in order to perform or orchestrate actions. Popular examples include autonomous vehicles and robots -- but robots that use AI to perform tasks have existed for decades. So what's the difference?
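
That perceive-reason-act loop is the crux of the definition. As a rough, hypothetical sketch in Python (the sensor readings and function names below are invented for illustration, not any vendor's SDK), closing the loop looks something like this:

    from dataclasses import dataclass

    @dataclass
    class Observation:
        obstacle_distance_m: float  # stand-in for a depth-sensor reading
        target_visible: bool        # stand-in for a camera-based detection

    def perceive() -> Observation:
        # Stand-in for reading real sensors (cameras, lidar, microphones).
        return Observation(obstacle_distance_m=3.2, target_visible=True)

    def reason(obs: Observation) -> str:
        # The "brain": choose an action from context rather than
        # replaying a fixed, pre-programmed script.
        if obs.obstacle_distance_m < 1.0:
            return "stop"
        if obs.target_visible:
            return "approach_target"
        return "search"

    def act(action: str) -> None:
        # Stand-in for sending motor commands to actuators.
        print(f"executing: {action}")

    # A traditional robot replays fixed steps; physical AI closes this loop.
    for _ in range(3):
        act(reason(perceive()))

The difference from decades-old industrial robots sits in reason(): the action is chosen from perceived context instead of a fixed program.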

Also: Can Google save Apple AI? Gemini to power a new, personalized Siri

According to Anshuman Saxena, VP and GM of automated driving and robotics at Qualcomm, the distinction lies in the robot's ability to reason, take action, and interact with the world around it. 

"The whole idea of a chain of thoughts, a reasoning, a brain, which will work in a context and take some actions as humans would -- that's the real definition of physical AI," said Saxena. 

For instance, a humanoid robot could go a step beyond moving materials or packages as directed; it could perceive its environment and perform the task intuitively. 

Also: Nvidia's Rubin may transform AI computing as we know it

However, examples don't have to be that elaborate; in fact, according to Ziad Asghar, SVP and GM of XR, wearables, and personal AI at Qualcomm, you may already own a prime example of physical AI. 

"Smartglasses are the best representation already of physical AI," said Asghar. "They are a device that basically are present and are able to see what you are seeing; they're able to hear what you're hearing, so they're in your physical world."

A symbiotic data relationship 

Saxena added that while humanoid robots will be useful in instances where humans don't want to perform a task, either because it is too tedious or too risky, they will not replace humans. That's where AI wearables, such as smart glasses, play an important role, as they can augment human capabilities. 

Also: CES 2026: These 7 smart glasses caught our eye - and you can buy this pair now

But beyond that, AI wearables might actually be able to feed back into other physical AI devices, such as robots, by providing a high-quality dataset based on real-life perspectives and examples. 

"Why are LLMs so great? Because there is a ton of data on the internet,  for a lot of the contextual information and whatnot, but physical data does not exist," said Saxena. 

The problem he describes is one that often hinders physical AI development. Because it is too risky to train robots in the real world, such as by putting autonomous cars on the road, companies must create synthetic data simulations to train and test these models. Many companies attempted to tackle this issue at CES. 
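
To make the idea concrete, here is a deliberately toy sketch of what scenario-based synthetic testing can look like. The scenario fields and the braking rule are invented for illustration and are not drawn from Nvidia's or Qualcomm's actual tools:

    import random

    def make_scenario(rng: random.Random) -> dict:
        # Build one synthetic driving scene with randomized actors and
        # conditions, so a policy can be tested without a real vehicle.
        return {
            "weather": rng.choice(["clear", "rain", "fog", "snow"]),
            "time_of_day": rng.choice(["day", "dusk", "night"]),
            "pedestrians": [
                {"distance_m": rng.uniform(2.0, 50.0),
                 "crossing": rng.random() < 0.3}
                for _ in range(rng.randint(0, 5))
            ],
            "ego_speed_mps": rng.uniform(0.0, 30.0),
        }

    def should_brake(scene: dict) -> bool:
        # Toy stand-in for a trained policy: brake if a pedestrian is
        # crossing nearby, or visibility is poor at speed.
        near_crossing = any(
            p["crossing"] and p["distance_m"] < 15.0
            for p in scene["pedestrians"]
        )
        poor_visibility = (scene["weather"] in ("fog", "snow")
                           and scene["ego_speed_mps"] > 20.0)
        return near_crossing or poor_visibility

    rng = random.Random(42)  # fixed seed so test runs are reproducible
    scenarios = [make_scenario(rng) for _ in range(10_000)]
    brake_rate = sum(should_brake(s) for s in scenarios) / len(scenarios)
    print(f"Policy braked in {brake_rate:.1%} of synthetic scenarios")

Real simulation stacks generate photorealistic sensor streams rather than dictionaries, but the principle is the same: vary the scenario, not the road.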

Also: I'm an AI expert, and this note-taking pin is the most convincing hardware I've tried at CES

Nvidia released new models that understand the physical world and can be used to create synthetic data and simulations that emulate real-life scenarios. Qualcomm offers a comprehensive physical AI stack that combines its new Qualcomm Dragonwing IQ10 Series processor, released at CES, with the necessary tools for AI data collection and training. 

Creating datasets for this training is often a time-consuming and costly process. However, robots could instead learn from the data generated by the wearables people already use every day -- effectively physical AI data that is true to human experience. 

"Think about these sensors, the glasses, so many things that are there, which, if I have the glasses on, and I take an action based on, 'Oh, I saw something here,' so much information is immediately generated, which can help the robots as well, creating a new set of information today," said Saxena. 

Also: I tried Gemini's 'scheduled actions' to automate my AI - the potential is enormous (but Google has work to do)

Given the privacy concerns that come with having your everyday data used to train robots, Saxena stressed that data from your wearables should always be handled with the highest level of privacy. Once anonymized by the wearable company, that data could be very helpful in training robots, and those robots can then generate more data of their own, resulting in a healthy ecosystem. 
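
In highly simplified form, that anonymization step might involve dropping direct identifiers and replacing the user ID with a salted hash before a record could ever leave the device. The field names here are invented for this sketch, and a salted hash is strictly pseudonymization; a production system would go further:

    import hashlib

    def anonymize(event: dict, salt: str) -> dict:
        # Drop direct identifiers (who you are, where you are, raw audio)
        # and replace the user ID with a salted hash so records can't be
        # traced back to a person.
        cleaned = {k: v for k, v in event.items()
                   if k not in ("user_id", "gps", "audio_raw")}
        cleaned["session"] = hashlib.sha256(
            (salt + event["user_id"]).encode()
        ).hexdigest()[:16]
        return cleaned

    event = {
        "user_id": "alice@example.com",
        "gps": (37.42, -122.08),
        "audio_raw": b"...",
        "action": "picked_up_mug",
        "object_seen": "coffee mug",
    }
    print(anonymize(event, salt="per-deployment-secret"))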

"This sharing of context, this sharing of AI between that robot and the wearable AI devices that you have around you is, I think, the benefit that you are going to be able to accrue," added Asghar. 
