VR Gaming Systems Are Not Safe From These Cyber Threats


Virtual Reality. Image by Pixabay


Voice-command features on virtual reality headsets could lead to major privacy leaks through so-called "eavesdropping attacks." The built-in motion sensors in common VR headsets such as the Oculus Quest 2, HTC Vive Pro and PlayStation VR can be accessed without any permission, a vulnerability that malicious actors can exploit. Researchers at Rutgers University-New Brunswick have discovered that hackers could use popular AR/VR headsets with built-in motion sensors to record the subtle, speech-associated facial dynamics of the wearer and steal sensitive information communicated via voice command, including credit card data and passwords.
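To illustrate why permissionless sensor access matters, the sketch below shows how an ordinary app could quietly sample a headset's accelerometer and gyroscope without triggering any permission prompt. It is a minimal illustration, not the researchers' Face-Mic implementation: it assumes the headset runs Android and exposes the standard SensorManager API (as Quest-class devices do), and the class name SensorSnoopActivity is hypothetical.

```kotlin
// Minimal sketch (not the Face-Mic implementation): an ordinary Android app
// sampling a headset's motion sensors. Names such as SensorSnoopActivity are
// hypothetical; the point is that no permission is declared or requested.
import android.app.Activity
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Bundle

class SensorSnoopActivity : Activity(), SensorEventListener {

    private lateinit var sensorManager: SensorManager
    private val samples = mutableListOf<FloatArray>()   // raw accelerometer/gyroscope readings

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(Context.SENSOR_SERVICE) as SensorManager

        // Accelerometer and gyroscope are "normal" sensors on Android:
        // registering for them never shows the user a permission prompt.
        // (Recent Android versions may cap very high sampling rates.)
        listOf(Sensor.TYPE_ACCELEROMETER, Sensor.TYPE_GYROSCOPE)
            .mapNotNull { type -> sensorManager.getDefaultSensor(type) }
            .forEach { sensor ->
                sensorManager.registerListener(this, sensor, SensorManager.SENSOR_DELAY_FASTEST)
            }
    }

    override fun onSensorChanged(event: SensorEvent) {
        // Speech-associated facial and head vibrations leave traces in these
        // readings; an attacker would feed the buffered samples to an
        // inference model to recover digits, words, or the speaker's identity.
        samples.add(event.values.copyOf())
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit

    override fun onPause() {
        super.onPause()
        sensorManager.unregisterListener(this)
    }
}
```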

To demonstrate these vulnerabilities, lead researcher Yingying "Jennifer" Chen, associate director of WINLAB and graduate director of Electrical and Computer Engineering at Rutgers University-New Brunswick, and her fellow WINLAB researchers developed an eavesdropping attack targeting AR/VR headsets, known as "Face-Mic."

“Face-Mic is the first work that infers private and sensitive information by leveraging the facial dynamics associated with live human speech while using face-mounted AR/VR devices,” said Chen. “Our research demonstrates that Face-Mic can derive the headset wearer’s sensitive information” with some of the most popular AR/VR headsets. 

Eavesdropping attackers can derive simple speech content, including digits and words, to infer sensitive information such as credit card numbers, Social Security numbers, phone numbers, PINs, transactions, birth dates and passwords. Exposing such information could lead to identity theft, credit card fraud and leakage of confidential healthcare information. Once a hacker has identified a user, an eavesdropping attack can further expose the user's sensitive information and lifestyle, such as AR/VR travel histories, game/video preferences and shopping preferences. Such tracking compromises users' privacy and can be lucrative for advertising companies.

The researchers hope these findings will raise public awareness of AR/VR security vulnerabilities and encourage manufacturers to develop safer models. According to the university's announcement, the research team is now examining how facial vibration information could be used to authenticate users and improve security, and how AR/VR headsets could unobtrusively capture a user's breathing and heart rate to measure well-being and mood states.
