A troubling new trend is emerging in the world of artificial intelligence (AI): one in which your deepest intentions and desires could soon become the currency of a new digital economy. Researchers from the University of Cambridge are raising alarms about the rise of an "intention economy," a marketplace driven by AI models that can predict, manipulate, and sell your intentions to the highest bidder.
While targeted advertising based on digital preferences is already a well-established practice, the use of AI is taking this a step further. Today’s popular AI assistants, chatbots, and virtual companions are gathering and analyzing vast amounts of personal data, offering services that extend beyond just answering questions. According to the Cambridge researchers, these models could soon monetize your psychological state, political beliefs, and even your language choices, creating a new type of digital transaction where your intentions are sold to corporations.
As reported by Cybernews, the researchers at Cambridge's Leverhulme Centre for the Future of Intelligence (LCFI) highlight how these AI systems, such as large language models (LLMs), can extract far more sensitive information than traditional digital footprints reveal. They argue that AI's ability to analyze the nuances of your conversations, including tone, cadence, and the underlying motivations behind your words, gives tech companies an unprecedented level of insight into your personal life. This opens the door to highly targeted and even manipulative advertising, with far-reaching implications for privacy, politics, and consumer behavior.
This new AI-driven marketplace, which could encompass everything from product purchases to voting preferences, poses a risk of social manipulation on an industrial scale. As AI continues to develop, large tech firms are already positioning themselves to capitalize on these insights. For instance, Apple's "App Intents" framework and OpenAI's calls for data that expresses human intention both signal a shift toward predicting and influencing user behavior.
With tech giants like Nvidia, Meta, and Shopify actively exploring AI's potential to predict human intent, the question arises: whose interests are these AI systems really serving? The researchers warn that as companies gather data on your unspoken desires, the next frontier will be selling those intentions before you have even fully formed them yourself. The intention economy may soon be here, but at what cost to personal autonomy and privacy?