Former Google CEO Warns of AI Effects on the 2024 U.S. Elections



Eric Schmidt, former CEO of Google, claims that misinformation around the 2024 election is one of the biggest short-term dangers of artificial intelligence. He argues that social media companies have not yet learned how to deal with AI-spread misinformation, and adds that social media should allow for “free speech for humans, not computers.”

According to CNBC News, misinformation around the 2024 U.S. elections will be rampant as advanced artificial intelligence becomes increasingly accessible. Schmidt stated that the campaigns will be “full, full of false information that anyone can generate,” and added that “every side, every grassroots group and every politician will use generative AI to do harm to their opponents.”

“The 2024 elections are going to be a mess because social media is not protecting us from false generated AI,” Schmidt told CNBC. “They’re working on it, but they haven’t solved it yet. And in fact, the trust and safety groups are getting made smaller, not larger.”

And indeed, Meta, Amazon, Alphabet, and Twitter have all drastically reduced the size of their teams focused on internet trust, safety, and ethics. Meta in particular ended a fact-checking project that had taken half a year to build and was meant to combat online misinformation and hate speech.

While there are overall broad concerns regarding the long-term impacts of AI on society, including the potential for the technology to gain human-like abilities, Schmidt said that “the short-term danger is misinformation.”

CNBC reports that Google recently decided to stop removing false claims about widespread fraud in the 2020 U.S. election from YouTube. The company said the decision sought to balance its goals of protecting the community and serving as a forum for open discussion. When asked about the policy change, Schmidt reiterated that social media should allow for “free speech for humans, not computers.”