The New Chip That Can Protect You from Data Theft

Researchers from the MIT-IBM Watson AI Lab developed new machine-learning accelerator chips that aim to enhance data security for health trackers, fitness apps, and other AI-powered devices.

While health-monitoring apps are extremely useful and help people manage their wellness and even chronic conditions, they require constant data exchanges between your phone and a central server. Unfortunately, this constant communication drains the device's battery and makes these apps sluggish. To combat this, engineers often use machine-learning accelerators (specialized hardware that speeds up the process). However, these accelerators can leave devices vulnerable to the theft of sensitive information.

MIT researchers explain that their new machine-learning accelerator is designed to resist the most common types of attacks, and it incorporates several clever optimizations that maximize security while minimizing the impact on speed and accuracy.

The accelerator aims to maintain the privacy of sensitive user data while allowing large AI models to run seamlessly on devices. “It is important to design with security in mind from the ground up. If you are trying to add even a minimal amount of security after a system has been designed, it is prohibitively expensive,” explains Maitreyi Ashok, an EECS graduate student at MIT.

According to Interesting Engineering, the enhanced security of these machine-learning accelerators lies in a three-part approach. First, the chip splits data into random fragments so that hackers cannot reconstruct meaningful information through side-channel attacks. Second, it uses a lightweight cipher to encrypt the AI model stored in off-chip memory, rendering “bus-probing attacks” ineffective. Finally, it generates a unique decryption key directly on the chip itself, one that is nearly impossible for hackers to duplicate.
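The first step, splitting data into random fragments, can be illustrated with a minimal software sketch of Boolean masking, a standard side-channel countermeasure. This is a simplified stand-in for the idea, not the chip's actual design; the function names and the two-share scheme here are illustrative assumptions:

```python
import secrets

def split_into_shares(data: bytes) -> tuple[bytes, bytes]:
    """Split data into two random shares (Boolean masking).

    Each share on its own is uniformly random, so observing one
    fragment (e.g., via a power side channel) reveals nothing;
    XORing both shares reconstructs the original bytes.
    """
    mask = secrets.token_bytes(len(data))  # fresh randomness per split
    masked = bytes(d ^ m for d, m in zip(data, mask))
    return mask, masked

def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the two shares back together to recover the data."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))

# Illustrative use: a piece of sensitive health data.
secret = b"heart-rate: 72 bpm"
s1, s2 = split_into_shares(secret)
assert recombine(s1, s2) == secret
```

In a hardware masking scheme the shares are processed separately throughout the computation, so no intermediate value ever correlates with the full secret; the sketch above shows only the split-and-recombine idea.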

MIT’s Chief Innovation Officer Anantha Chandrakasan said: “As security has become a critical issue in the design of edge devices, there is a need to develop a complete system stack focusing on secure operation.” He explained that the device generates “unique codes” through randomization and variability, and that it secures data access between the processor and memory to prevent side-channel attacks. He concluded that these kinds of designs are going to be critical in future mobile devices.
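Generating “unique codes” from variability is the idea behind a physically unclonable function (PUF), where each chip derives a key from its own uncontrollable manufacturing variation. The toy model below stands in for that variation with a random bitstring fixed at construction time; the class and method names are illustrative assumptions, not MIT's implementation:

```python
import hashlib
import secrets

class SimulatedPUF:
    """Toy software model of a physically unclonable function.

    A real PUF derives its response from physical variation in the
    silicon, so the key never needs to be stored anywhere. Here we
    simulate that variation with a random value fixed when the
    "chip" is made.
    """
    def __init__(self):
        # Stand-in for per-chip manufacturing variation.
        self._variation = secrets.token_bytes(32)

    def derive_key(self, challenge: bytes) -> bytes:
        # The key exists only when computed on-chip; it is never
        # written to external memory, so there is nothing to steal.
        return hashlib.sha256(self._variation + challenge).digest()

# Two "chips" from the same design still produce different keys.
chip_a, chip_b = SimulatedPUF(), SimulatedPUF()
challenge = b"model-decryption"
assert chip_a.derive_key(challenge) == chip_a.derive_key(challenge)
assert chip_a.derive_key(challenge) != chip_b.derive_key(challenge)
```

The design point this illustrates is why such a key is “nearly impossible to duplicate”: an attacker would have to reproduce the physical variation itself, not just copy stored bits.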

When testing this innovation, the researchers ran millions of simulated real-world hacking attempts against the chip and were unable to recover any private information; by contrast, they were able to steal data from an unprotected chip after only a few thousand samples.

The implications of this advancement are immense, and while it was initially created for health apps, secure machine-learning accelerators could power demanding AI applications like virtual reality or autonomous driving while prioritizing the secure handling of user data.