A ChatGPT user tells Ars Technica that he was shown other users’ private conversations.
According to Ars reader Chase Whiteside, ChatGPT allegedly leaked private conversations containing login credentials and other personal information belonging to unconnected users. Whiteside discovered this after making an unrelated query of his own and finding additional conversations in his chat history.
The screenshots Whiteside provided show private conversations containing usernames and passwords that appear to be related to a support system for employees of a prescription drug portal. Other leaked conversations include the name of a presentation, details of an unpublished research proposal, and a script in the PHP programming language.
The users involved appear to be distinct and unrelated. It is not yet known when the conversations took place; only one of them includes a date, the year 2020.
OpenAI is investigating the incident
Whiteside noticed the additional conversations in his chat history on Monday morning, shortly after using ChatGPT for an unrelated request of his own. The conversations in question were not previously included in his history.
A representative from OpenAI told Ars Technica that the report is being investigated. There have been similar incidents in the past: in March 2023, OpenAI took ChatGPT offline after a bug caused titles from one user’s chat history to be displayed to unrelated users.
The case is a reminder that it remains advisable to remove sensitive data from requests to ChatGPT and other AI services wherever possible.
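For developers who pass user input to an AI service programmatically, a minimal sketch of that advice might look like the following. This is not a complete redaction solution, and the patterns shown (a `password:`-style credential and an email address) are illustrative assumptions only; real-world scrubbing needs far broader coverage.

```python
import re

# Illustrative patterns only -- real redaction needs far more coverage
# (API keys, phone numbers, names, addresses, etc.).
PATTERNS = [
    # "password: hunter2", "pwd=secret", etc.
    (re.compile(r"(?i)(password|passwd|pwd)\s*[:=]\s*\S+"), r"\1: [REDACTED]"),
    # simple email addresses
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL REDACTED]"),
]

def redact(prompt: str) -> str:
    """Strip obvious credentials and contact details from a prompt
    before it is sent to an external AI service."""
    for pattern, replacement in PATTERNS:
        prompt = pattern.sub(replacement, prompt)
    return prompt

print(redact("My login is alice@example.com, password: hunter2"))
# -> My login is [EMAIL REDACTED], password: [REDACTED]
```

Running redaction client-side, before anything leaves the machine, means the sensitive values never reach the provider at all, regardless of how the service handles chat histories.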