The lesson of this story (a Catholic blog using commercially resold Grindr data to out a gay priest) is *either* that anonymised data can always be de-anonymised (pretty much the intuition of most experts I know), or, less sweepingly, that you can't expect an organisation that profits from selling other people's data to calibrate how much it should spend on anonymising it.

It's not just companies, mind you. The Census Bureau faces similar challenges: it works hard to inject noise into its public data so you can't draw conclusions about individuals. But the more noise you introduce, the less useful the data becomes for, e.g., measuring racial injustice.
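For the curious, the standard formalisation of this noise-vs-utility trade-off is differential privacy, which is the broad approach the US Census Bureau adopted for the 2020 census. Below is a minimal sketch of its simplest instrument, the Laplace mechanism, for releasing a single count; the function names and parameter choices are illustrative, not any agency's actual implementation.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a Laplace(0, scale) distribution via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(true_count: int, epsilon: float) -> float:
    """Release a count with Laplace noise calibrated to sensitivity 1
    (one person joining or leaving changes the count by at most 1).
    Smaller epsilon means stronger privacy but noisier, less useful data.
    """
    return true_count + laplace_noise(1.0 / epsilon)

# The trade-off in action: the noise scale is 1/epsilon, so a strict
# privacy budget can swamp a small count entirely.
loose = private_count(1000, epsilon=1.0)    # noise scale 1: usually off by a few
strict = private_count(1000, epsilon=0.01)  # noise scale 100: can be off by hundreds
```

The same tension the paragraph above describes shows up directly in `epsilon`: dial it down far enough to protect individuals and the published counts become too fuzzy to detect, say, a small racial disparity between neighbouring tracts.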

And if you think this is bad in the US: how many Côte d'Ivoire mobile phone users do you think consented to having their anonymised call data released to academics, a year or so after the end of a civil war?


Given the qualities digital information has, you really need to treat it like toxic waste. When we say "Information wants to be free", that's not a rallying cry; it's a description of an (often dangerous) property of digital data. It's like saying "²³⁵U wants to be fissile."
