Meta is Advancing Robot Touch, Dexterity, and Human-Robot Collaboration with AI

Meta is stepping into the future of robotics with an ambitious focus on "embodied AI": technology that brings robots closer to human-like interaction through touch, dexterity, and reasoning. By partnering with US-based GelSight and South Korean firm Wonik Robotics, Meta is developing tactile sensors that will let robots sense and respond to physical contact, making them more effective in real-world environments.

Meta’s new research initiative includes three groundbreaking projects: Sparsh, Digit 360, and Digit Plexus. These innovations are designed to enhance robot dexterity and human-robot interaction through advanced touch perception and reasoning.

Sparsh, developed in collaboration with the University of Washington and Carnegie Mellon University, focuses on enabling robots to "feel" their surroundings. Using vision-based tactile sensing, Sparsh helps a robot judge how much pressure it can apply to an object without causing damage, an essential skill for delicate manipulation tasks, as the sketch below illustrates.
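
To make the pressure-regulation idea concrete, here is a minimal sketch of how a tactile pressure signal could drive a gripper's force. All names, thresholds, and units are illustrative assumptions, not Meta's Sparsh API, which the article does not detail:

```python
import numpy as np

# Hypothetical helper: reduce a vision-based tactile frame (per-taxel
# pressure estimates, in kPa) to a single peak-pressure value.
def peak_pressure(tactile_frame: np.ndarray) -> float:
    return float(tactile_frame.max())

def adjust_grip_force(current_force: float,
                      tactile_frame: np.ndarray,
                      damage_threshold_kpa: float = 50.0,
                      step: float = 0.1) -> float:
    """Simple grip regulation: squeeze harder while well below the damage
    threshold, back off when above it, hold otherwise. The threshold and
    step size are invented for illustration, not values from Sparsh."""
    p = peak_pressure(tactile_frame)
    if p > damage_threshold_kpa:
        return max(0.0, current_force - step)   # risk of damage: ease off
    if p < 0.8 * damage_threshold_kpa:
        return current_force + step             # grip may be too loose: tighten
    return current_force                        # inside the safe band: hold
```

A real controller would likely act on learned slip and deformation features rather than raw peak pressure, but the control loop has the same shape.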

Digit 360 takes tactile sensing a step further with a finger-shaped sensor featuring over 18 sensing capabilities and more than 8 million taxels (TActile piXELs). This allows the robot to detect and respond to omnidirectional deformations with great precision. What sets Digit 360 apart is its onboard AI models, enabling faster, localized processing of touch data—similar to how humans react quickly to stimuli.
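
As an illustration of what such localized processing might look like, the sketch below summarizes a raw taxel frame into a compact contact event on the sensor itself, so only a few numbers, rather than millions of readings, travel to the host. The pipeline, thresholds, and event format are hypothetical, not Digit 360's actual firmware:

```python
import numpy as np

def onboard_contact_event(taxel_frame: np.ndarray,
                          contact_threshold: float = 0.5) -> dict:
    """Reduce one frame of taxel readings to a compact contact event.

    Running this on the sensor is the kind of low-latency, localized
    processing the article attributes to Digit 360's onboard AI models.
    """
    contact = taxel_frame > contact_threshold   # taxels currently in contact
    if not contact.any():
        return {"event": "no_contact"}
    rows, cols = np.nonzero(contact)
    return {
        "event": "contact",
        "centroid": (float(cols.mean()), float(rows.mean())),  # where on the fingertip
        "peak": float(taxel_frame.max()),                      # how hard
        "area": int(contact.sum()),                            # how broad
    }
```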

Digit Plexus is a versatile hardware-software platform that integrates various tactile sensors into a single robotic hand. Meta intends to share the design and code for Digit Plexus, aiming to accelerate research into robot dexterity.
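
One way to picture such a platform is a common interface that hides per-device details from control code. The abstraction below is a guess at what that might look like; none of these names come from Meta's released design or code:

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class TouchReading:
    sensor_id: str
    timestamp_ns: int
    pressure_map: list[list[float]]   # per-taxel pressures for this sensor

class TactileSensor(Protocol):
    """Anything that can produce a TouchReading, regardless of hardware."""
    def read(self) -> TouchReading: ...

class RobotHand:
    """Aggregates heterogeneous fingertip and palm sensors behind one API,
    so control code never deals with per-device protocols directly."""
    def __init__(self, sensors: dict[str, TactileSensor]):
        self.sensors = sensors

    def poll(self) -> dict[str, TouchReading]:
        # One call gathers a reading from every attached sensor.
        return {name: s.read() for name, s in self.sensors.items()}
```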

In addition to these sensors, Meta is unveiling PARTNR, a new benchmark designed to test AI’s ability to assist in human-robot collaboration. Built on Meta’s Habitat platform, PARTNR evaluates how well AI models help humans carry out household tasks. With 100,000 tasks and 5,800 objects, this benchmark helps measure how effectively AI can follow natural language instructions and reason through complex tasks.
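
In outline, evaluating a model on such a benchmark reduces to a loop over simulated episodes. The sketch below shows the general shape; the simulation, task, and model interfaces are placeholders, not Habitat's or PARTNR's actual API:

```python
def evaluate(model, tasks):
    """Score a model on collaborative household tasks.

    Each task pairs a natural-language instruction (e.g. "put the dishes
    in the sink") with a simulated scene containing a human partner.
    """
    successes = 0
    for task in tasks:
        sim = task.make_simulation()                    # scene + human partner
        while not sim.done():
            obs = sim.observation()
            action = model.plan(task.instruction, obs)  # language-conditioned plan
            sim.step(action)                            # robot acts; partner acts too
        successes += int(sim.task_completed())
    return successes / len(tasks)                       # fraction of tasks completed
```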

With these innovations, Meta is positioning itself as a leader in the next phase of robotics, in which AI acts not just as a tool but as a genuine collaborator.