Pedro José Pereira Vieito

The OpenAI ChatGPT app on macOS is not sandboxed and stores all conversations in **plain text** in an unprotected location:

~/Library/Application\ Support/com.openai.chat/conversations-{uuid}/

So basically any other running app / process / malware can read all your ChatGPT conversations without any permission prompt:
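A minimal sketch of that claim (the path pattern is from the post above; the file names and contents inside those directories are assumptions): any process running as the same user can simply glob the directory and read whatever it finds, with no TCC permission dialog involved.

```python
import glob
import os
from pathlib import Path

# Directory pattern quoted in the post; not covered by any macOS
# privacy (TCC) protection, so no permission prompt is triggered.
CHAT_DIR_PATTERN = "~/Library/Application Support/com.openai.chat/conversations-*/"

def readable_conversations(pattern: str = CHAT_DIR_PATTERN) -> list[str]:
    """Return every file under the matching directories that this
    process can open and read without any permission dialog."""
    found = []
    for directory in glob.glob(os.path.expanduser(pattern)):
        for path in Path(directory).rglob("*"):
            if path.is_file() and os.access(path, os.R_OK):
                found.append(str(path))
    return found

if __name__ == "__main__":
    for f in readable_conversations():
        print(f)
```

Nothing here is ChatGPT-specific: the same few lines work against any non-sandboxed app that keeps plain-text data under `~/Library/Application Support`.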

macOS has blocked access to user private data since macOS Mojave 10.14 (6 years ago!). Any app accessing private user data (Calendar, Contacts, Mail, Photos, any third-party app's sandbox, etc.) now requires explicit user approval.

OpenAI chose to opt-out of the sandbox and store the conversations in plain text in a non-protected location, disabling all of these built-in defenses.

For reference: good news! The new ChatGPT version now encrypts the local chats: theverge.com/2024/7/3/24191636
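The Verge article doesn't describe the scheme OpenAI actually shipped, so purely to illustrate what "encrypting the local chats" buys you, here is a toy at-rest encryption sketch. It uses HMAC-SHA256 in counter mode as a keystream so it stays stdlib-only; this is NOT production crypto — a real app would use a vetted AEAD (e.g. AES-GCM) with the key held in the macOS Keychain.

```python
import hashlib
import hmac
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # PRF-based counter mode: HMAC-SHA256(key, nonce || counter) blocks.
    out = b""
    counter = 0
    while len(out) < length:
        block = nonce + counter.to_bytes(8, "big")
        out += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Fresh random nonce per file, prepended to the ciphertext.
    nonce = secrets.token_bytes(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, body = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(body))
    return bytes(a ^ b for a, b in zip(body, stream))
```

The point of the exercise: after a change like this, the glob-and-read attack above only yields ciphertext, and the secret lives somewhere an arbitrary process can't trivially reach.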

I still hope they sandbox the app in the future for improved protection.

The Verge · OpenAI’s ChatGPT Mac app was storing conversations in plain text

@pvieito worse, if (when) the ChatGPT app gets popped, no sandbox means it can attack the rest of your system

@pvieito@mastodon.social is this built into the new version of macOS or something?

@pvieito
Isn't that just the case for all your personal data? Any app can read your Word and Excel files and so on as well. That's the security model for PCs, that data isn't sandboxed between apps for a single user.

We do sandbox critical data (cf. password managers). But is a chat transcript really any more critical than, say, a spreadsheet containing your personal economy, or unpublished fanfic, or whatever other personal data you keep unencrypted?

@jannem @pvieito
Your regular data files like images, documents and spreadsheets aren't kept in `~/Library/Application\ Support/some.application.name/`. That's for files needed for the application to work and for the use of that application only.

@negative12dollarbill @pvieito
Isn't that open for you (and by extension, any app you run) to read, though?

@jannem @negative12dollarbill Apple has blocked access to any private data (including Mail data) since macOS Mojave 10.14 (6 years ago!).

Any app accessing private user data (Calendar, Contacts, Mail, Photos, any third-party app's sandbox, etc.) now requires explicit user approval.

OpenAI chose to opt-out of the sandbox and store the conversations in plain text in a non-protected location, disabling all of these built-in defenses.

@jannem No, you have to explicitly grant access to those folders (Documents, Downloads, etc.) per app, when the app first attempts to access them. This is different from Windows and Linux.

@pvieito well, your email files are probably stored the same way... no need to say that ChatGPT conversations are not important, unless the user is a complete idiot and mentions sensitive info.

@specktator_ No, Apple has blocked access to any private data (including Mail data) since macOS Mojave 10.14 (6 years ago!).

Any app accessing private user data (Calendar, Contacts, Mail, Photos, any third-party app's sandbox, etc.) now requires explicit user approval.

OpenAI chose to opt-out of the sandbox and store the conversations in plain text in a non-protected location, disabling all of these built-in defenses.

@pvieito @specktator_ My sneaking suspicion is that they opted out of the sandbox to avoid people sharing screenshots of permissions dialogues saying things like “ChatGPT is requesting access to your documents”. Stupid and short-sighted — but what else is new?

@riotnrrd @specktator_ Those permissions prompts are presented for any process (including non-sandboxed apps).

@pvieito @specktator_ Yes, sorry, of course you and I know this. My implication was that someone within OpenAI wanted to avoid having those dialogues presented to their users, and sandbox avoidance was the method they landed on.

@riotnrrd @pvieito @specktator_ When you access files in your own sandbox (like the history would be in this case) no pop-up is shown. So that’s not the reason they opted out.

@melgu @riotnrrd @specktator_ Yep, AFAIK, there is no sensible reason why they opted out. ChatGPT does not require (nor request) Accessibility access, nor does it execute third-party code (like an IDE would), and those are the 2 main reasons to disable the app sandbox.

In fact, the app shares a huge part of the code with the iOS app, so for sure it can work perfectly in the app sandbox.

@melgu @pvieito @specktator_ I was assuming that at some point somebody would try to drag & drop something from the desktop or Documents or whatever and trigger the permission dialogue — “please summarise SUPER-SECRET-PLANS.PDF”, shortly followed by “oh noes teh evil robot haz all my planzzz”.

@riotnrrd @specktator_ @pvieito Is drag and drop not exempted, since only the specific file is shared?

@melgu @riotnrrd @specktator_ Yes, there is no difference between sandboxed and non-sandboxed apps in that regard. Any explicit user action (drag-and-drop, an open dialog, etc.) will automatically grant access to the files.

@specktator_ @pvieito good security should ideally assume that users make mistakes and don’t know stuff, I think

@pvieito This is really a failure on Apple’s part, not OpenAI’s. The behavior you’re describing is a side effect of Apple being lazy and self-serving in their implementation. They should provide a proper API to extend a data vault to third-party unsandboxed apps.
@pvieito Yes but you do see how presenting this as something OpenAI has been doing “wrong” because the APIs needed to do this haven’t shipped on macOS yet (flipping your claim on its head, haven’t even shipped 6 years later!) is the wrong way to look at the problem?

@saagar No. OpenAI could perfectly sandbox its app, and they chose to not do it. They could also encrypt the chats, and they did not.

And it is clear they were not doing things right, because they fixed this in the latest version: theverge.com/2024/7/3/24191636

@pvieito They did that because the Verge reporters were in your mentions trying to catch a scoop on the next Recall AI disaster and they wanted to get out ahead of that. The solution they went with was deeply unsatisfying and terrible for the platform as a whole
@pvieito I’m not even going to discuss whether OpenAI “could” sandbox their app; I don’t have enough context to discuss that. But it doesn’t seem unreasonable that they use APIs that are not available from the app sandbox to e.g. know what’s going on when you invoke it
@pvieito Whatever the reasons, the result is that any non-sandboxed app until now had no protections against other apps peeking at their data. Apple made a solution for *their* apps and then later added sandboxed apps in, but ChatGPT (or Chrome, or Photoshop, or …) can’t benefit
@pvieito And the new “solution” means that every app in this situation has to hand-roll its own file encryption code. Plus, users cannot actually access these files directly anymore because there is no way to grant access to a bespoke encrypted file
@pvieito To be clear, I have no issue with users understanding the limitations of the current situation, and adjusting their threat model in response. If you want to bring attention to that more power to you. But you specifically assigned blame and formed a narrative on top
@pvieito And I think the end result of that was the company was forced into a non-optimal security outcome because of concerns everyone would see “plaintext” and freak out. The real issue is most apps people use are in the same boat so the solution needs to come from Apple

@saagar While I understand your point of view, chat apps are special and have to treat user data with extra care.

WhatsApp also had to roll out encryption (7 years ago!: wired.com/story/whatsapp-encry) because macOS malware could access the plain-text backups stored on iCloud.

WIRED · WhatsApp encryption: Facebook's messaging app now encrypts iCloud back-ups · By Matt Burgess
@pvieito WhatsApp is actually a very interesting usecase! I can double check but I don’t think they actually encrypt data at rest; this is solely for backups to cloud services (which lets you upgrade an insecure service into something that is strongly encrypted)
@pvieito And you can turn this off if you want. I think this demonstrates a decent amount of thought put into balancing security with practicality and accessibility. I don’t feel that OpenAI was given the chance to put in the same amount of thought here
@pvieito Despite all I said if OpenAI decided that ChatGPT is targeted by Mac malware and they should do something special about it, I would be ok with that. But I think doing that work takes time. And the actual work they need to do is more extensive than what they’ve done here
@pvieito Maybe they create a sandboxed service that handles chats only for example, if they want to take advantage of what the OS has right now. Maybe they pick to only do protection in Sequoia and later, or fall back to manual encryption for older OSes.
@pvieito And there’s more extensive work that needs to be done, too: this protects chats, but what’s not protected? Maybe there’s an API key somewhere on disk that gives access to this anyway?
@pvieito Really, what I would have preferred is they go “ok here is the threat model we have, this is the functionality we would like to address this, this is what we have rolled out given the limitations we have”.
@pvieito And it would potentially include an analysis of what they are choosing *not* to do. Otherwise you turn into a banking app that doesn’t allow you to take screenshots because “what if someone was recording your screen and saw your bank balance”

@pvieito I've never understood why anyone would install an app for a service that can be used via a browser. Every app potentially undermines the security of the system.

@pvieito I mean.... yeah. The 2 apps are un-sandboxed. That's sort of what that means.
I'm not sure what the point of this demo is?

@pvieito
Maybe we can stay off that crap altogether?
There are enough reasons:

social.saarland/@fedithom/1124

social.saarland · fedithom (@fedithom@social.saarland): “AI can fuck off.” (a post collecting links critical of AI)