
Privacy-preserving methods are becoming increasingly important to consumer privacy and security as voice technologies become part of daily life. Consumers use voice assistants like Amazon Alexa or Google Assistant to shop online, make phone calls, send messages, control smart home appliances and access banking services. However, these devices have security flaws, and hackers can manipulate voice assistant systems.

Researchers have developed a new technique to protect consumers from voice spoofing attacks. The Void (Voice liveness detection) system can be embedded in a smartphone or voice assistant software and works by identifying the differences in spectral power between a live human voice and a voice replayed through a speaker, in order to detect when hackers are attempting to spoof a system.

“Although voice spoofing is known as one of the easiest attacks to perform as it simply involves a recording of the victim’s voice, it is incredibly difficult to detect because the recorded voice has similar characteristics to the victim’s live voice. Void is game-changing technology that allows for more efficient and accurate detection helping to prevent people’s voice commands from being misused,” said Muhammad Ejaz Ahmed, Cybersecurity Research Scientist at CSIRO’s Data61. 

Unlike existing voice spoofing detection techniques, which typically rely on deep learning models, Void was designed using insights from spectrograms (visual representations of a signal's spectrum of frequencies as it varies with time) to detect the 'liveness' of a voice.

The technique is highly accurate, detecting attacks eight times faster than deep learning methods while using 153 times less memory, making it a viable, lightweight solution that could be incorporated into smart devices.
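The core idea of spectrogram-based liveness detection can be illustrated with a minimal sketch. The example below is a hypothetical simplification, not CSIRO's actual Void implementation: it computes a power spectrogram and a single low-frequency power ratio, on the assumption that audio replayed through a loudspeaker distributes spectral power differently from a live voice. The function name, frame sizes and the 1 kHz cutoff are all illustrative choices.

```python
import numpy as np

def low_freq_power_ratio(signal, sample_rate, frame_len=512, hop=256):
    """Compute a power spectrogram and the fraction of average power
    below 1 kHz. A hypothetical liveness feature: real systems such as
    Void derive far richer features from the spectral power distribution."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    # Slice the signal into overlapping windowed frames.
    frames = np.stack([
        signal[i * hop : i * hop + frame_len] * window
        for i in range(n_frames)
    ])
    # Power spectrogram: one spectrum per frame.
    spectrum = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    mean_power = spectrum.mean(axis=0)  # average power per frequency bin

    freqs = np.fft.rfftfreq(frame_len, d=1.0 / sample_rate)
    low = mean_power[freqs < 1000.0].sum()
    return low / mean_power.sum()

# Toy usage: a pure 440 Hz tone concentrates nearly all power below 1 kHz.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
ratio = low_freq_power_ratio(np.sin(2 * np.pi * 440 * t), sr)
```

A real classifier would compare many such features against thresholds learned from labelled live and replayed recordings, rather than using a single ratio.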

How can consumers protect their data when using voice assistants? Dr Adnene Guabtni, Senior Research Scientist at CSIRO's Data61, shares the following tips:

Always change your voice assistant settings to only activate the assistant using a physical action, such as pressing a button.

On mobile devices, make sure the voice assistant can only activate when the device is unlocked.

Turn off all home voice assistants before you leave your house, to reduce the risk of successful voice spoofing while you are out of the house.

Voice spoofing requires hackers to get samples of your voice. Make sure you regularly delete any voice data that Google, Apple or Amazon store.

Try to limit the use of voice assistants to commands that do not involve online purchases or authorizations – hackers or people around you might record you issuing payment commands and replay them at a later stage.

Prepared to dive into the world of futuristic technology? Attend INNOTECH 2023, the international convention and exhibition for cyber, HLS and innovation at Expo, Tel Aviv, on March 29th-30th.

Interested in sponsoring / a display booth at the 2023 INNOTECH exhibition? Click here for details!