Artificial Intelligence is Taking Surveillance Technologies to Next Level

New deep learning techniques make it possible to analyze video footage more quickly and cheaply than ever before, and a growing number of companies in Japan, the United States, and China are developing products with similar capabilities. Comparable features are making their way into home security cameras, with companies like Amazon and Nest offering rudimentary AI analysis.
This sort of automated surveillance is only going to become more common in the future, with researchers working on advanced analysis like spotting violent behavior in crowds, and tech companies selling tools like facial recognition to law enforcement.
A new AI security camera is designed to help shop owners in Japan spot potential shoplifters. It uses open-source technology developed by Carnegie Mellon University to scan live video streams and estimate the poses of any bodies it sees.
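That description amounts to a standard pose-estimation loop: pull frames off a live stream and extract body keypoints from each one. The sketch below is a minimal illustration of that step only, using MediaPipe Pose as an assumed stand-in for the CMU open-source software (the article does not name the exact library); the stream_poses helper and the video source are hypothetical.

```python
# Illustrative sketch: read a live video stream and estimate a body pose
# per frame. MediaPipe Pose is assumed here as a freely available
# substitute for the CMU pose-estimation software mentioned in the article.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def stream_poses(source=0):
    """Yield pose landmarks for each frame of a live video stream."""
    cap = cv2.VideoCapture(source)  # 0 = default camera; could be an RTSP URL
    with mp_pose.Pose(static_image_mode=False) as pose:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB images; OpenCV delivers BGR frames
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:
                yield result.pose_landmarks.landmark
    cap.release()
```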
The security camera, called the “AI Guardman”, was built by Japanese telecom giant NTT East and startup Earth Eyes Corp.
The system tries to match this pose data against predefined "suspicious" behaviors. If it sees something noteworthy, it alerts shopkeepers via a connected app, according to theverge.com.
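The matching-and-alert step might look roughly like the sketch below, which consumes the poses from the previous snippet. The actual "suspicious behavior" definitions and the shopkeeper-app API are not public, so the lingering-wrist rule, the distance thresholds, and the notify_shopkeeper endpoint are invented purely for illustration.

```python
# Illustrative sketch of matching poses to a simple rule and alerting an app.
import time
import requests

ALERT_URL = "https://example.com/guardman/alert"  # hypothetical shopkeeper-app endpoint

_near_since = None  # timestamp when the wrist first came close to the hip

def looks_suspicious(landmarks, hold_seconds=3.0):
    """Toy rule: flag a pose if the right wrist lingers near the right hip."""
    global _near_since
    wrist, hip = landmarks[16], landmarks[24]  # MediaPipe indices: right wrist, right hip
    near = abs(wrist.x - hip.x) < 0.05 and abs(wrist.y - hip.y) < 0.05
    if not near:
        _near_since = None
        return False
    if _near_since is None:
        _near_since = time.time()
    return time.time() - _near_since > hold_seconds

def notify_shopkeeper(camera_id):
    """Push an alert (a real system would attach a snapshot) to the connected app."""
    requests.post(ALERT_URL, json={"camera": camera_id, "event": "suspicious_pose"}, timeout=5)

for landmarks in stream_poses():  # stream_poses() from the sketch above
    if looks_suspicious(landmarks):
        notify_shopkeeper("store-entrance-1")
```

Even this toy rule hints at the difficulty the next paragraph raises: a wrist near a hip could just as easily be a customer reaching for their wallet.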
But there are a number of potential problems with automated surveillance, including privacy, accuracy, and discrimination. Although AI can reliably map out where a person's body parts are, it is much harder to match that information to "suspicious" behavior, which tends to be context-dependent.
NTT East said that “common errors” by the AI Guardman included misidentifying both indecisive customers (who might pick up an item, put it back, then pick it up again) and salesclerks restocking shelves as potential shoplifters.
It's also possible that the training data might be biased towards certain groups, or that the technology might be used as a pretext for discrimination. NTT East denied that the technology could be discriminatory, saying it "does not find pre-registered individuals."