In an attempt to control the crowds arriving for the 2024 Paris Olympics next summer, the authorities plan to use real-time camera feeds and artificial intelligence to detect suspicious activity, but civil rights groups say the technology is a threat to civil liberties.
According to BBC News, a recent law allows police to use CCTV algorithms to pick up abnormalities such as crowd rushes, fights, or unattended bags, and explicitly rules out the use of facial recognition technology of the kind adopted by China, for example, to trace “suspicious” individuals.
Despite this disclaimer, opponents claim that this is a very thin line that is easily crossed, and fear that the French government’s real intention is to make the new security provisions permanent.
“We’ve seen this before at previous Olympic Games like in Japan, Brazil, and Greece. What were supposed to be special security arrangements for the special circumstances of the games, ended up being normalized,” says Noémie Levain, of the digital rights campaign group La Quadrature du Net (Squaring the Web).
According to French officials, the AI system monitors all the cameras and raises an alert when it detects something it has been told to look out for. It is then up to human police officers to examine the situation and decide how to respond.
The AI algorithm was trained on a huge bank of images of lone bags on the street, but unattended luggage is easy to detect — a person with malicious intentions is far harder to spot in a crowd.
This is where the XXII group comes in: a French start-up specializing in computer vision software, it is currently waiting for further specifications from the French government before fine-tuning its bid for part of the Olympics video surveillance contract.
The XXII group and other developers are aware of the criticism that this amounts to an unacceptable level of state surveillance, but they insist they have safeguards in place, noting that the law bars them from providing facial recognition.
Nevertheless, according to the digital rights activist Noémie Levain, this is only a “narrative”.
“They say it makes all the difference that here there will be no facial recognition. We say it is essentially the same,” she says. “AI video monitoring is a surveillance tool which allows the state to analyze our bodies, our behavior, and decide whether it is normal or suspicious. Even without facial recognition, it enables mass control.
“We see it as just as scary as what is happening in China. It’s the same principle of losing the right to be anonymous, the right to act how we want to act in public, the right not to be watched.”
This information was provided by BBC News.