
Emotion recognition technology – can it be considered biometrics 3.0? The technology tracks traits such as facial muscle movements, vocal tone and body movements in order to infer a person's feelings. China has been ramping up such technology in order to monitor human emotions and assist law enforcement.

With the global industry forecast to be worth nearly $36bn by 2023 and growing at nearly 30% a year, rights groups say action needs to be taken now, according to theguardian.com.

The tool is increasingly being used in China in various fields including health, anti-terrorism, and urban security, sources told the state-run Global Times, and some Chinese experts boast that the new technology is up to 95 percent accurate at detecting people’s emotions.

For example, the artificial intelligence system can monitor occupants of cars passing through a busy intersection. Security officers then might stop a vehicle in which a passenger appears nervous to search the vehicle for drugs, the report said.

The tool may also have predictive value. According to Taigusys, a company specializing in the technology, the tool can predict dangerous behavior by prisoners, students in schools and elderly people experiencing dementia in nursing homes. Taigusys' systems have already been installed in about 300 prisons and detention facilities around China, connecting 60,000 cameras, reports nypost.com. While the use of emotion-recognition technology in schools in China has sparked some criticism, there has been very little discussion of its use by authorities on citizens.

Critics say the technology is based on a pseudo-science of stereotypes, and a growing number of researchers, lawyers and rights activists believe it has serious implications for human rights, privacy and freedom of expression.

A recent report by the organization Article 19 on the development of these surveillance technologies by 27 companies in China found that their growth, absent safeguards and public deliberation, was especially problematic, particularly in the public security and education sectors.