
macOS ChatGPT Users Shocked to Learn Chats Were Stored Unencrypted

The issue has since been resolved, but it raises questions about how such an oversight occurred in the first place.

The partnership between Apple and OpenAI has had a rocky start, as ChatGPT users on macOS recently discovered that their conversations were being stored in plain-text files.

Apple prides itself on prioritizing privacy, especially in a market where competitors often profit from user data. However, data and electronics engineer Pedro José Pereira Vieito revealed on Meta’s Threads that OpenAI’s ChatGPT app for macOS had a significant privacy flaw.

Privacy Concerns
ChatGPT was released on macOS in May for subscribers, with general access for non-subscriber accounts available on June 25. Until Friday, July 5, the app stored all chat logs in unencrypted plain-text files on users’ hard drives. This meant that anyone with access to the computer, either physically or through remote attacks such as malware or phishing, could access all conversations users had with ChatGPT.
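To make the exposure concrete, here is a minimal Swift sketch of what “anyone with access to the computer” means in practice. The directory name is a placeholder rather than the app’s actual storage path; the point is simply that a plain-text file stored outside a sandbox container can be read by any process running as the logged-in user.

import Foundation

// Illustrative sketch only. The directory below is a placeholder, not the
// ChatGPT app's actual storage path; the point is that plain-text files
// outside a sandbox container are readable by any process running as the
// logged-in user, with no special privileges required.
let chatDir = FileManager.default
    .homeDirectoryForCurrentUser
    .appendingPathComponent("Library")
    .appendingPathComponent("Application Support")
    .appendingPathComponent("com.example.chatapp")

if let files = try? FileManager.default.contentsOfDirectory(
    at: chatDir, includingPropertiesForKeys: nil) {
    for file in files {
        if let contents = try? String(contentsOf: file, encoding: .utf8) {
            // Any local process (including malware running as the user) can do this.
            print("\(file.lastPathComponent):\n\(contents.prefix(200))")
        }
    }
}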

Sandboxing
Apple’s macOS includes a protection mechanism called “sandboxing,” which restricts an application’s access to the rest of the system at the kernel level. Apps distributed through the Mac App Store must be sandboxed, which confines each app’s data to a protected container that other applications cannot read. Pereira Vieito attributed the issue to the fact that the ChatGPT app for macOS is distributed solely through OpenAI’s website, outside the App Store and its sandboxing requirement:

“OpenAI chose to opt out of the sandbox and store the conversations in plain text in a non-protected location, disabling all of these built-in defenses.”
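For readers unfamiliar with the mechanism, the following Swift sketch (illustrative only, not drawn from either company’s code) shows the practical difference: the same API call resolves to a protected per-app container when the App Sandbox entitlement is enabled, and to the shared, unprotected ~/Library/Application Support folder when it is not.

import Foundation

// Illustrative sketch only: where an app's Application Support directory
// resolves depends on whether the App Sandbox entitlement is enabled.
//
//   Sandboxed app:     ~/Library/Containers/<bundle-id>/Data/Library/Application Support
//   Non-sandboxed app: ~/Library/Application Support
//
// macOS isolates each sandboxed container; the shared path has no such shield.
let supportDir = FileManager.default.urls(
    for: .applicationSupportDirectory,
    in: .userDomainMask
).first!

print("This process writes its support data to:\n\(supportDir.path)")

// Rough heuristic: sandbox container paths live under ~/Library/Containers/.
print("Running sandboxed:", supportDir.path.contains("/Library/Containers/"))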

It is unclear whether any users were directly affected by this oversight, but the reaction on social media and from commentators was one of shock. For example, in the comments section of an article published by The Verge, a user named GeneralLex described finding the chat log sitting unencrypted in the app’s memory:

“I used Activity Monitor to dump the ChatGPT executable from memory and found that, horror of horrors, chat log is in plain text, unencrypted in memory!”

A Simple Mistake?
The real question is why this happened. We know how it happened, and it’s clear the issue has been resolved, but the underlying reasons remain unknown.

Presumably, the chats were kept in plain text so OpenAI could easily access the logs for further development of ChatGPT. According to the app’s terms of use, users have to explicitly opt out of sharing their data with OpenAI. But why didn’t Apple intercede on behalf of users before the app went live, and why didn’t OpenAI recognize that it was writing sensitive, unencrypted data to users’ machines?
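Whatever the motivation, the technical remedy for this class of problem is routine: encrypt the log before it is written to disk. OpenAI has not described its patch, so the following Swift sketch is purely illustrative of the general approach, using Apple’s CryptoKit with deliberately simplified key handling.

import Foundation
import CryptoKit

// Illustrative sketch only, not OpenAI's actual fix. A real app would keep
// the key in the Keychain or Secure Enclave instead of generating it ad hoc.
let key = SymmetricKey(size: .bits256)

func writeEncrypted(_ text: String, to url: URL, using key: SymmetricKey) throws {
    // AES-GCM seals the plaintext; `combined` bundles nonce, ciphertext, and tag.
    let sealed = try AES.GCM.seal(Data(text.utf8), using: key)
    try sealed.combined!.write(to: url, options: .atomic)
}

func readEncrypted(from url: URL, using key: SymmetricKey) throws -> String {
    let box = try AES.GCM.SealedBox(combined: try Data(contentsOf: url))
    return String(decoding: try AES.GCM.open(box, using: key), as: UTF8.self)
}

// Usage: the file on disk is ciphertext, so dumping it reveals no chat text.
do {
    let url = FileManager.default.temporaryDirectory.appendingPathComponent("chat.bin")
    try writeEncrypted("user: hello\nassistant: hi there", to: url, using: key)
    print(try readEncrypted(from: url, using: key))
} catch {
    print("Encryption demo failed:", error)
}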

Both OpenAI and Apple have been contacted for more information but did not immediately respond.

Summary Review: The revelation that ChatGPT stored conversations in unencrypted plain-text files on macOS has raised significant privacy concerns, especially given Apple’s reputation for prioritizing user privacy. While the issue has been resolved, it highlights the need for stringent security measures and thorough vetting processes for third-party applications. The incident underscores the importance of transparency and accountability from tech companies to ensure user data is protected. OpenAI and Apple have yet to provide detailed explanations for how this oversight occurred, leaving users and privacy advocates eager for answers and reassurances that such lapses will be prevented in the future.

