Think You Could Lie To Artificial Intelligence?

Artificial Intelligence (AI) is improving year by year. It can answer questions, find patterns in data, and even drive cars. Now, researchers at the University of Michigan are teaching it to detect lies.

Rada Mihalcea, professor of computer science and engineering, and Mihai Burzo, assistant professor of mechanical engineering, are leading a team that is developing a unique lie-detection system built on real-world data: videos from actual court cases.

Their software prototype works by analysing a speaker’s words and gestures, unlike a polygraph, which requires physical contact with a subject. What’s more, the system has produced impressive results. In experiments, the software accurately detected deception in up to 75% of cases, as defined by trial outcomes. Humans, in contrast, score slightly more than 50% — about as good as a coin flip.

“People are poor lie detectors,” Mihalcea said. “This isn’t the kind of task we’re naturally good at. There are clues that humans give naturally when they are being deceptive, but we’re not paying close enough attention to pick them up. We’re not counting how many times a person says ‘I’ or looks up. We’re focusing on a higher level of communication.”

The researchers say their software identified several revealing signs of lying. Liars moved their hands more; tried to sound more certain; and, surprisingly, looked questioners in the eye more than truth tellers did.

The researchers used machine-learning techniques to develop the software and “trained” it on a set of 120 video clips from actual trials that include testimony from defendants and witnesses. To determine who was lying and who was telling the truth, the team compared the testimonies with trial verdicts.
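The article does not say which learning algorithm the team used, so as a purely illustrative sketch, here is how clips could be labeled from verdicts and fed to a minimal nearest-centroid classifier. The feature vectors, labels, and classifier choice here are all hypothetical, not the researchers’ actual method.

```python
# Hypothetical sketch: label each clip deceptive/truthful from the trial
# verdict, then fit a toy nearest-centroid classifier on feature vectors.

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def fit(features, labels):
    """labels: True = deceptive (per verdict), False = truthful."""
    lie = centroid([f for f, y in zip(features, labels) if y])
    truth = centroid([f for f, y in zip(features, labels) if not y])
    return lie, truth

def predict(model, f):
    """Classify a new clip by which class centroid is closer."""
    lie, truth = model
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return dist(f, lie) < dist(f, truth)

# Toy feature vectors, e.g. [filler_rate, gesture_rate] per clip.
X = [[0.10, 0.8], [0.12, 0.9], [0.02, 0.1], [0.03, 0.2]]
y = [True, True, False, False]
model = fit(X, y)
```

A real system would use far richer features and a stronger classifier; the point is only that verdict-derived labels turn the clips into ordinary supervised training data.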

The team transcribed the audio, including all instances of “um,” “ah,” “uh,” and other vocal fillers. They then analysed how often subjects used certain words and categories of words, as well as various gestures, using a standard coding scheme for interpersonal interactions.
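To make the word-frequency step concrete, here is a minimal sketch of turning a transcript into per-category rates. The specific word categories (fillers, first-person words) and the function name are assumptions for illustration, not the team’s actual coding scheme.

```python
from collections import Counter

FILLERS = {"um", "ah", "uh"}       # vocal fillers kept in the transcripts
FIRST_PERSON = {"i", "me", "my"}   # illustrative word category

def text_features(transcript: str) -> dict:
    """Turn one transcript into simple per-category word rates."""
    words = transcript.lower().split()
    total = len(words) or 1  # avoid division by zero on empty input
    counts = Counter(words)
    return {
        "filler_rate": sum(counts[w] for w in FILLERS) / total,
        "first_person_rate": sum(counts[w] for w in FIRST_PERSON) / total,
        "word_count": total,
    }

feats = text_features("um I did not uh take it I swear")
```

Each transcript becomes a small vector of rates, which is the kind of input a classifier can then learn from.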

The team is now planning to integrate “physiological parameters such as heart rate, respiration rate and body temperature fluctuations, all gathered with non-invasive thermal imaging,” Burzo said.