The lesson of this story (a Catholic blog using commercially resold Grindr data to out a gay priest) is *either* that anonymised data can always be de-anonymised (pretty much the intuition of most experts I know), *or*, less sweepingly, that you can't expect an organisation that profits from selling other people's data to calibrate how much it should spend on anonymising it.
It's not just companies, mind you. The census faces similar challenges -- it works hard to inject noise into its public data so you can't draw conclusions about individuals. But the more noise you introduce, the less useful the data is for, e.g., measuring racial injustice. https://www.census.gov/programs-surveys/decennial-census/decade/2020/planning-management/process/disclosure-avoidance.html
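A minimal sketch of that tradeoff, using the classic Laplace mechanism from differential privacy (an illustration of the general idea, not the Census Bureau's actual TopDown algorithm): each published count gets random noise whose scale is set by a privacy parameter epsilon, so stronger privacy (smaller epsilon) directly means noisier, less useful statistics.

```python
import math
import random

def noisy_count(true_count: int, epsilon: float) -> float:
    """Add Laplace(scale=1/epsilon) noise to a count query.

    A count has sensitivity 1 (one person changes it by at most 1),
    so scale 1/epsilon gives epsilon-differential privacy.
    Smaller epsilon => stronger privacy => noisier answer.
    """
    # Sample Laplace noise via the inverse-CDF method:
    # u uniform on (-0.5, 0.5), X = -scale * sgn(u) * ln(1 - 2|u|)
    u = random.random() - 0.5
    scale = 1.0 / epsilon
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

# The same true count, released at two privacy levels:
print(noisy_count(100, epsilon=1.0))   # modest noise
print(noisy_count(100, epsilon=0.1))   # 10x noisier, much stronger privacy
```

Averaged over many releases the noise cancels out, which is why aggregate statistics stay usable while any single small-area count becomes unreliable.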
And if you think this is bad in the US: how many Cote d'Ivoire mobile phone users do you think consented to this anonymized data release to academics, a year or so after the end of a civil war? https://arxiv.org/abs/1210.0137
The other lesson of this is that hookup apps should be under the control of the community, not private interests.
@emma or possibly different communities? I know folks were finding similar problems when hookup apps were leaking info, which was relatively (relatively!) harmless in, say, a US context, but deadly in countries with harsh anti-gay laws. It's hard to generalise threat models even when you think there's a commonality.
One of the advantages of a more federated model, I guess.
Yes, federation would help, but there's no amount of tech that will fix a systemic problem like homophobia and its underlying causes.
@mala .@josephfcox on twitter writes: "New: the inevitable weaponization of app data is here. Grindr gives location data to third parties, broker gives it to Catholic publication, outlet uses that to track and out priest as potentially gay without consent. This is not theoretical; real threat"
Sorry for paraphrasing:
Many people, even in tech, underestimate how statistical correlation across datasets can be used to identify a specific profile with high accuracy
When I see "don't worry, the data is anonymised", all I think is: that means nothing in itself
@mala at both my current job and my last one, we have a yearly review to fill out on how the company is doing. It’s anonymous. They ask for department, age range, employment-length range, and position in the department. I don’t understand why they don’t just have us sign our names. Of course, it still doesn’t stop me from being brutally honest.
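That combination of fields is exactly what the literature calls quasi-identifiers. A toy sketch with invented survey data shows why asking for department, age range, tenure, and position is often as good as a signature: most combinations occur exactly once.

```python
from collections import Counter

# Hypothetical "anonymous" survey metadata: no names, only the
# demographic fields such surveys typically ask for.
responses = [
    ("Engineering", "30-39", "5-10y", "senior"),
    ("Engineering", "30-39", "5-10y", "senior"),
    ("Engineering", "20-29", "0-2y",  "junior"),
    ("Marketing",   "40-49", "5-10y", "manager"),
    ("Marketing",   "30-39", "2-5y",  "senior"),
]

# Count how often each quasi-identifier combination appears.
counts = Counter(responses)
unique = [combo for combo, n in counts.items() if n == 1]

print(f"{len(unique)} of {len(responses)} respondents are uniquely "
      f"identifiable from their quasi-identifiers alone")
# In this toy roster, 3 of the 5 responses are singletons: anyone with
# the org chart can attach a name to them.
```

In small departments the real numbers are usually worse, which is the same mechanism the de-anonymisation papers above exploit at scale.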