Tech Beetle briefing US

ChatGPT Conversations Are Being Harvested and Sold, Experts Warn Millions at Risk

Essential brief

Key facts

Security researchers found that Urban VPN Proxy and related extensions have been harvesting private AI chat logs since July 2025.
Millions of users’ personal conversations with AI platforms like ChatGPT have been collected and sold on the open market.
This data breach exposes users to privacy risks and potential misuse of sensitive information.
Users should be cautious about browser extensions and review permissions to protect their data.
There is a pressing need for stronger regulations and transparency in AI data handling practices.

Recent investigations by security researchers at Koi Security, led by Idan Dardikman, have uncovered a significant privacy breach involving AI chat platforms.

Since July 2025, Urban VPN Proxy and related browser extensions have been secretly collecting millions of private chat logs from users interacting with AI services like ChatGPT.

These logs contain sensitive and personal information shared during conversations, raising serious concerns about user privacy and data security.

The harvested data is reportedly being sold on the open market, exposing millions of users to potential misuse of their private information.

This practice highlights a growing threat as AI chat platforms become more integrated into daily life, often without users fully understanding the risks involved.

Experts warn that users should be cautious about the extensions and tools they install, especially those that claim to enhance privacy or security but instead exploit user data.

The incident underscores the need for stricter regulations and better transparency from developers regarding data handling practices.

Users are encouraged to review permissions granted to browser extensions and to stay informed about the security measures of the AI platforms they use.
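One way to start such a review is to inspect each installed extension's manifest for broad permissions. The sketch below, a minimal illustration rather than a complete audit, scans Chrome's extension directory for manifests requesting host access broad enough to read any page, including AI chat sites. The profile path shown is Chrome's default on Linux and the list of "broad" permissions is an illustrative assumption; adjust both for your browser, OS, and threat model.

```python
import json
from pathlib import Path

# Assumed default Chrome profile path on Linux; adjust for your OS/profile.
EXT_DIR = Path.home() / ".config/google-chrome/Default/Extensions"

# Illustrative set of permissions broad enough to read content on any site,
# including AI chat pages. Not an exhaustive list of risky permissions.
BROAD = {"<all_urls>", "*://*/*", "http://*/*", "https://*/*", "tabs", "webRequest"}

def audit(ext_dir: Path = EXT_DIR):
    """Yield (extension name, matched broad permissions) for each installed extension."""
    # Chrome lays extensions out as <extension-id>/<version>/manifest.json.
    for manifest in ext_dir.glob("*/*/manifest.json"):
        data = json.loads(manifest.read_text(encoding="utf-8"))
        perms = set(data.get("permissions", [])) | set(data.get("host_permissions", []))
        hits = perms & BROAD
        if hits:
            # Some names are localization keys (e.g. "__MSG_appName__"); fall back to the ID.
            yield data.get("name", manifest.parent.parent.name), sorted(hits)

if __name__ == "__main__":
    for name, hits in audit():
        print(f"{name}: {', '.join(hits)}")
```

A flagged permission is not proof of abuse, since many legitimate extensions need wide host access, but anything on the list deserves a closer look at what the extension actually does with that access.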

As AI technologies evolve, safeguarding personal data remains a critical challenge that requires coordinated efforts from both industry and regulators to protect millions of users worldwide.