Securing This Network is Imperative

Neural networks are a core artificial intelligence technology. Inspired by the nerve cells (neurons) that make up the human brain, a neural network is built from layers of artificial neurons, with each layer connected to its adjacent layers. The more layers, the "deeper" the network. For a deep learning network to recognize photos of objects, people, etc., we must compile a training set of images – thousands of examples of each object, labeled accordingly.
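
To make the idea concrete, here is a minimal, hypothetical training sketch in PyTorch (unrelated to the SPNN effort): synthetic tensors stand in for a labeled image set, and the loop adjusts the network's weights so its predictions match the labels.

```python
import torch
import torch.nn as nn

# Stand-in for a labeled training set: 1,000 synthetic grayscale
# 28x28 "images", each tagged with one of 10 class labels.
images = torch.randn(1000, 1, 28, 28)
labels = torch.randint(0, 10, (1000,))

# A small network; stacking more Linear/ReLU layers makes it "deeper".
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128), nn.ReLU(),
    nn.Linear(128, 64), nn.ReLU(),
    nn.Linear(64, 10),
)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Training: repeatedly nudge the weights to reduce the mismatch
# between the network's predictions and the human-assigned labels.
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss={loss.item():.3f}")
```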

The U.S. government relies on deep neural networks for critical machine learning tasks, and a new effort aims to secure the networks that its analysts train.

The Strategic Capabilities Office within the Office of the Secretary of Defense (OSD) has awarded funding to Charles River Analytics to develop a Secure Private Neural Network (SPNN) that hardens deep neural networks against adversarial attacks.

SPNN is intended to provide privacy and security for analysts training deep neural networks to perform inference on big data. These networks learn using training datasets that may contain sensitive data; adversaries can exploit these networks, causing data breaches or misclassification of sensitive information.
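
One well-known example of such exploitation is membership inference: an overfit model tends to be more confident on records it was trained on, letting an adversary guess whether a specific (possibly sensitive) record was in the training set. The self-contained sketch below is purely illustrative; the announcement does not detail which attacks SPNN defends against, and all names and values here are assumed.

```python
import torch
import torch.nn as nn

# Toy setup: a model deliberately overfit to a tiny "sensitive" dataset.
train_x = torch.randn(32, 8)
train_y = torch.randint(0, 2, (32,))
model = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for _ in range(500):  # many passes over few samples -> overfitting
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(train_x), train_y)
    loss.backward()
    opt.step()

def confidence(x):
    """Highest softmax probability the model assigns to x."""
    with torch.no_grad():
        return torch.softmax(model(x), dim=1).max(dim=1).values

# Membership-inference heuristic: overfit models are typically more
# confident on training records, so an adversary can threshold on
# confidence to guess membership -- a privacy breach even without
# ever seeing the underlying data.
member, outsider = train_x[:1], torch.randn(1, 8)
print("member confidence:   ", confidence(member).item())
print("outsider confidence: ", confidence(outsider).item())
```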

The OSD Strategic Capabilities Office is concerned with both black-box attacks, in which an adversary can only query the network, and white-box attacks, in which an adversary has full access to its internals. According to mil-embedded.com, the company says SPNN will preserve the privacy of training and testing data against white-box attacks through efficient end-to-end encryption, among other tools.
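
For contrast with the black-box case, the fast gradient sign method (FGSM) is a standard white-box attack from the research literature: because the adversary can read the model's gradients, it can perturb an input just enough to flip the prediction. Again, this is a generic illustration, not a description of the attacks SPNN targets; the toy model, input, and epsilon value are all assumed.

```python
import torch
import torch.nn as nn

def fgsm(model, x, y, epsilon=0.1):
    """White-box FGSM: uses the model's own gradients (visible to a
    white-box adversary) to perturb x toward misclassification."""
    x = x.clone().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    return (x + epsilon * x.grad.sign()).detach()

# Toy model and input; epsilon controls the perturbation size.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
x = torch.randn(1, 8)
y = model(x).argmax(dim=1)  # treat the current prediction as the label
x_adv = fgsm(model, x, y, epsilon=0.5)
print("clean prediction:      ", model(x).argmax(dim=1).item())
print("adversarial prediction:", model(x_adv).argmax(dim=1).item())
```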

The company conducts R&D in AI, robotics, and human-machine interfaces, and leverages that work to create custom solutions.