Intriguing Collaboration in Public Safety

Photo illustration: Wikimedia

Facebook wants to improve its automatic image detection systems so that it can immediately remove violent content and increase security. The social media giant has recently partnered with law enforcement in the UK to obtain footage for training its automated content moderation tools.

Starting in October, the UK’s Metropolitan Police Service will provide bodycam footage taken during its firearms training exercises, which Facebook will use to train its video recognition AI, according to theverge.com. 

The aim is to automatically identify footage of an attack, remove it, and notify the police. Facebook is currently exploring similar partnerships with law enforcement agencies in the US, according to ft.com.

The new initiative comes in the wake of Facebook’s inability to prevent a mass shooting from being live-streamed on its platform. Facebook said that the Christchurch, New Zealand, shooting in March was viewed 200 times during its live broadcast, and 4,000 times in total before it was removed. In the 24 hours following the incident, Facebook said it removed 1.5 million videos of the attack from its platform. Of these, 1.2 million were blocked “at upload,” meaning 300,000 of them slipped through Facebook’s automated systems.

According to the company’s press release, “the video of the attack in Christchurch did not prompt our automatic detection systems because we did not have enough content depicting first-person footage of violent events to effectively train our machine learning technology.” Getting more footage from law enforcement should improve these detection systems, Facebook says, as well as cut down on footage from video games or movies being incorrectly flagged.

The footage from the Metropolitan Police will include training drills of terrorist incidents and hostage situations across land, public transport, and water-based locations. 

The Metropolitan Police will also pass the footage on to the UK’s Home Office to share with other technology firms. 

Back in May, Facebook imposed new restrictions on live-streaming with a “one strike” policy that bars users from its live-streaming service for a set period of time after just a single violation of the platform’s community standards. The company says it is also using automated techniques to attempt to remove terrorist and hate organizations from its platform.