https://arstechnica.com/ai/2024/07/chatgpts-much-heralded-mac-app-was-storing-conversations-as-plain-text/

ChatGPT's much-heralded Mac app was storing conversations as plain text
The app was updated to address the issue after it gained public attention.

Samuel Axon - Jul 5, 2024 8:27 pm UTC

[Image: The app lets you invoke ChatGPT from anywhere in the system with a keyboard shortcut, Spotlight-style. Credit: Samuel Axon]

OpenAI announced its Mac desktop app for ChatGPT with a lot of fanfare a few weeks ago, but it turns out it had a rather serious security issue: user chats were stored in plain text, where any bad actor could find them if they gained access to your machine.

As Threads user Pedro Jose Pereira Vieito noted earlier this week, "the OpenAI ChatGPT app on macOS is not sandboxed and stores all the conversations in plain-text in a non-protected location," meaning "any other running app / process / malware can read all your ChatGPT conversations without any permission prompt." He added:

    macOS has blocked access to any user private data since macOS Mojave 10.14 (6 years ago!). Any app accessing private user data (Calendar, Contacts, Mail, Photos, any third-party app sandbox, etc.) now requires explicit user access.

    OpenAI chose to opt out of the sandbox and store the conversations in plain text in a non-protected location, disabling all of these built-in defenses.

OpenAI has now updated the app, and the local chats are now encrypted, though they are still not sandboxed. (The app is only available as a direct download from OpenAI's website and is not available through Apple's App Store, where more stringent security is required.)

Further reading: Apple integrates ChatGPT into Siri, iOS, and macOS

Many people now use ChatGPT like they might use Google: to ask important questions, sort through issues, and so on. Often, sensitive personal data could be shared in those conversations.

It's not a great look for OpenAI, which recently entered into a partnership with Apple to offer chatbot services built into Siri queries in Apple operating systems. Apple detailed some of the security around those queries at WWDC last month, though, and those measures are more stringent than what OpenAI did (or, to be more precise, didn't do) with its Mac app, which is a separate initiative from the partnership.

If you've been using the app recently, be sure to update it as soon as possible.

Samuel Axon is a senior editor at Ars Technica. He primarily covers software development, gaming, Apple, consumer technology, and mixed reality. He has been writing about gaming and technology for 15 years, and is a Chicago-based game developer.
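To make the quoted claim concrete, here is a minimal Swift sketch of what "no permission prompt" means in practice. The directory path below is a placeholder (the article does not name the location the ChatGPT app actually used); the point is that any plain-text file sitting outside macOS's TCC-protected areas can be read like this by any process running as the same user.

```swift
import Foundation

// Placeholder path -- the article does not name the actual folder the ChatGPT app used.
let conversationsDir = FileManager.default
    .homeDirectoryForCurrentUser
    .appendingPathComponent("Library/Application Support/ChatGPTExample", isDirectory: true)

// A directory like this is not covered by the TCC prompts that guard Contacts,
// Photos, Mail, and other protected data, so enumerating and reading it
// produces no permission dialog for the user.
let fm = FileManager.default
if let files = try? fm.contentsOfDirectory(at: conversationsDir,
                                           includingPropertiesForKeys: nil) {
    for file in files {
        // Plain-text storage means the raw conversation is immediately readable.
        if let text = try? String(contentsOf: file, encoding: .utf8) {
            print("Readable without any prompt: \(file.lastPathComponent)")
            print(text.prefix(200))
        }
    }
}
```

Encrypting the files at rest, which is what OpenAI's update did, means a snippet like this would only see ciphertext; sandboxing the app would additionally restrict which processes can reach its container in the first place, and per the article the app still does not do that.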