The OpenAI ChatGPT app on macOS is not sandboxed and stores all conversations in **plain text** in an unprotected location:
~/Library/Application\ Support/com.openai.chat/conversations-{uuid}/
So basically any other running app / process / malware can read all your ChatGPT conversations without any permission prompt:
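A minimal sketch of what such a read could look like, assuming the path layout above (the `{uuid}` component varies per install, and the file names and format inside the directory are assumptions):

```swift
import Foundation

// Hedged sketch: enumerate the ChatGPT conversation store from any other,
// unrelated process. No permission prompt is involved because the directory
// is not a TCC-protected location.
let fm = FileManager.default
let store = fm.homeDirectoryForCurrentUser
    .appendingPathComponent("Library/Application Support/com.openai.chat")

if let top = try? fm.contentsOfDirectory(at: store, includingPropertiesForKeys: nil) {
    for dir in top where dir.lastPathComponent.hasPrefix("conversations-") {
        let files = (try? fm.contentsOfDirectory(at: dir, includingPropertiesForKeys: nil)) ?? []
        for file in files {
            // Before the fix these files were stored unencrypted, so a plain
            // read returns readable conversation data.
            if let data = try? Data(contentsOf: file),
               let text = String(data: data, encoding: .utf8) {
                print(file.lastPathComponent, text.prefix(120))
            }
        }
    }
}
```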
macOS has blocked access to private user data since macOS Mojave 10.14 (6 years ago!). Any app accessing private user data (Calendar, Contacts, Mail, Photos, any third-party app sandbox, etc.) now requires explicit user consent.
OpenAI chose to opt out of the sandbox and store the conversations in plain text in an unprotected location, disabling all of these built-in defenses.
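For contrast, a hedged sketch of the same kind of read against a TCC-protected location such as Mail's data store, assuming default protections and no prior grant; macOS blocks it unless the user has explicitly approved access for the calling app:

```swift
import Foundation

// Hedged sketch: reading a TCC-protected location (Mail's data store) from
// a process that has not been granted access fails instead of returning data.
let mailDir = FileManager.default
    .homeDirectoryForCurrentUser
    .appendingPathComponent("Library/Mail")

do {
    let entries = try FileManager.default.contentsOfDirectory(
        at: mailDir, includingPropertiesForKeys: nil)
    print("Access previously granted; \(entries.count) entries visible")
} catch {
    // Without user consent this typically fails with "Operation not permitted"
    // rather than silently exposing the data.
    print("Blocked by macOS privacy protections:", error.localizedDescription)
}
```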
Just for reference: Good news! The new ChatGPT version now encrypts the local chats: https://www.theverge.com/2024/7/3/24191636/openai-chatgpt-mac-app-conversations-plain-text
I still hope they sandbox the app in the future for improved protection.
New post recapping last week's ChatGPT saga with extra details: https://pvieito.com/2024/07/chatgpt-unprotected-conversations
@pvieito worse, if (when) the ChatGPT app gets popped, no sandbox means it can attack the rest of your system
@pvieito@mastodon.social is this built into the new version of macOS or something?
@pvieito
Isn't that just the case for all your personal data? Any app can read your Word and Excel files and so on as well. That's the security model for PCs, that data isn't sandboxed between apps for a single user.
We do sandbox critical data (cf. password managers). But is a chat transcript really any more critical than, say, a spreadsheet containing your personal finances, or unpublished fanfic, or whatever other personal data you keep unencrypted?
@negative12dollarbill @pvieito
Isn't that open for you (and by extension, any app you run) to read, though?
@jannem @negative12dollarbill Apple has blocked access to any private data (including Mail data) since macOS Mojave 10.14 (6 years ago!).
Any app accessing private user data (Calendar, Contacts, Mail, Photos, any third-party app sandbox, etc.) now requires explicit user consent.
OpenAI chose to opt out of the sandbox and store the conversations in plain text in an unprotected location, disabling all of these built-in defenses.
@jannem No, you have to explicitly give permission to folders (Documents, Downloads, etc.) per app, when the app first attempts to access these folders. This is different from Windows and Linux.
@pvieito well, your email files are probably stored the same way... and needless to say, ChatGPT conversations are not important unless the user is a complete idiot and mentions sensitive info.
@specktator_ No, Apple has blocked access to any private data (including Mail data) since macOS Mojave 10.14 (6 years ago!).
Any app accessing private user data (Calendar, Contacts, Mail, Photos, any third-party app sandbox, etc.) now requires explicit user consent.
OpenAI chose to opt out of the sandbox and store the conversations in plain text in an unprotected location, disabling all of these built-in defenses.
@pvieito @specktator_ My sneaking suspicion is that they opted out of the sandbox to avoid people sharing screenshots of permissions dialogues saying things like “ChatGPT is requesting access to your documents”. Stupid and short-sighted — but what else is new?
@riotnrrd @specktator_ Those permissions prompts are presented for any process (including non-sandboxed apps).
@pvieito @specktator_ Yes, sorry, of course you and I know this. My implication was that someone within OpenAI wanted to avoid having those dialogues presented to their users, and sandbox avoidance was the method they landed on.
@riotnrrd @pvieito @specktator_ When you access files in your own sandbox (like the history would be in this case) no pop-up is shown. So that’s not the reason they opted out.
@melgu @riotnrrd @specktator_ Yep, AFAIK, there is no sensible reason why they opted out. ChatGPT does not require (nor request) Accessibility access, nor does it execute third-party code (like an IDE would), and those are the two main reasons to disable the app sandbox.
In fact, the app shares a huge part of its code with the iOS app, so it could certainly work perfectly well within the app sandbox.
@melgu @pvieito @specktator_ I was assuming that at some point somebody would try to drag & drop something from the desktop or Documents or whatever and trigger the permission dialogue — “please summarise SUPER-SECRET-PLANS.PDF”, shortly followed by “oh noes teh evil robot haz all my planzzz”.
@riotnrrd @specktator_ @pvieito Is drag and drop not exempted, since only the specific file is shared?
@melgu @riotnrrd @specktator_ Yes, there is no difference between sandboxed and non-sandboxed apps in that regard. Any explicit user action (drag-and-drop, open dialog, etc.) will automatically grant access to the files.
@specktator_ @pvieito good security should ideally assume that users make mistakes and don’t know stuff, I think
@saagar They will add support for that in macOS Sequoia when using Group Containers: https://developer.apple.com/wwdc24/10123
@saagar No. OpenAI could perfectly well have sandboxed its app, and they chose not to. They could also have encrypted the chats, and they did not.
And it is clear they were not doing things right, because they fixed this in the latest version: https://www.theverge.com/2024/7/3/24191636/openai-chatgpt-mac-app-conversations-plain-text
@saagar While I understand your point of view, chat apps are special and have to treat user data with extra care.
WhatsApp also had to roll out encryption (7 years ago!: https://www.wired.com/story/whatsapp-encryption-end-to-end-turned-on/) because macOS malware could access the plain-text backup stored on iCloud.
@pvieito I've never understood why anyone would install an app for a service that can be used via a browser. Every app potentially undermines the security of the system.
@pvieito This looks like it might be bad.
@pvieito 50 R374RD3D!
@pvieito I mean... yeah. The two apps are un-sandboxed. That's sort of what that means.
I'm not sure what the point of this demo is?
@pvieito Oh. Well. That's nice.
@pvieito glad I chat with humans
@pvieito
Maybe we can stay off that crap altogether?
There are enough reasons.