Facial Expressions: A New Way to Control VR and AR Technology



A recent study offers promising insights into how facial expressions could be used to control virtual reality (VR) and augmented reality (AR) systems, potentially revolutionizing accessibility for users with disabilities. Conducted by researchers from the University of Glasgow and the University of St. Gallen in Switzerland, the study demonstrates how Meta’s Quest Pro headset can recognize seven simple facial movements, offering a hands-free way to navigate virtual environments.

The study, set to be presented at the CHI 2025 conference in Japan later this month, focused on testing the effectiveness of facial expressions as an input method for VR and AR. According to TechXplore, participants were asked to perform 53 facial expressions recognized by the Quest Pro’s onboard software. The goal was to identify expressions that were both easily recognized by the system and comfortable enough for users to repeat regularly.

The researchers found that seven Facial Action Units (FAUs)—such as squinting the eyes, puffing the cheeks, and pulling the corners of the mouth outward—offered the best balance of recognition accuracy and comfort. These expressions could reliably control basic tasks in a VR game and an AR web environment.

The results suggest that facial expression control could become a valuable alternative to traditional hand controllers, particularly for individuals with motor impairments or disabilities that prevent the use of standard input devices. The study’s authors noted that VR and AR technologies have often been considered inaccessible due to their reliance on dexterous hand movements. This new approach offers the potential to overcome those barriers.

To validate the facial control method, the team created a neural network model that achieved 97% accuracy in recognizing the selected facial movements. In trials, volunteers used their faces to perform tasks like turning, selecting options, and navigating web pages. While controllers still provided more precise control for gaming, participants found the facial input method intuitive and easy to use for browsing, according to TechXplore.
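To give a rough idea of how expression-based input like this can drive an interface, here is a minimal sketch in Python. It assumes a face-tracking system that reports per-expression activation strengths between 0.0 and 1.0 each frame, and maps a few hypothetical FAU names to navigation commands with edge-triggered thresholding (the actual Quest Pro API and the study's model work differently; all names and values here are illustrative).

```python
# Hypothetical sketch: turning facial action unit (FAU) activations into
# hands-free commands via thresholding with edge detection, so a held
# expression fires its command only once, like a button press.
# FAU names and the 0.0-1.0 activation format are assumptions for
# illustration, not the headset's real schema.

FAU_COMMANDS = {
    "eye_squint": "select",
    "cheek_puff": "go_back",
    "mouth_pull_left": "turn_left",
    "mouth_pull_right": "turn_right",
}

ACTIVATION_THRESHOLD = 0.8  # how strongly the expression must register


class FauController:
    """Fires each command once per expression 'press' (rising edge)."""

    def __init__(self):
        self._active = set()  # FAUs currently held above threshold

    def update(self, activations):
        """Process one frame of {FAU name: strength} readings.

        Returns the list of commands triggered this frame.
        """
        commands = []
        for fau, command in FAU_COMMANDS.items():
            strength = activations.get(fau, 0.0)
            if strength >= ACTIVATION_THRESHOLD:
                if fau not in self._active:  # rising edge only
                    self._active.add(fau)
                    commands.append(command)
            else:
                self._active.discard(fau)
        return commands


controller = FauController()
print(controller.update({"eye_squint": 0.9}))   # squint starts -> ['select']
print(controller.update({"eye_squint": 0.95}))  # still squinting -> []
print(controller.update({"cheek_puff": 0.85}))  # relaxed, then puff -> ['go_back']
```

The edge-triggered design matters for comfort: without it, holding a squint for half a second would spam the "select" command on every frame, which is one reason repeatability and ease of the chosen expressions were central to the study.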

The researchers believe that facial control could be a game-changer not only for people with disabilities but for everyday tasks as well, such as controlling devices when hands are occupied. They plan to expand the research to include individuals with disabilities, offering a new avenue for accessibility in XR technologies.

This study highlights the untapped potential of facial expressions in user interfaces and opens the door for more inclusive and efficient interactions in both VR and AR environments.