
@Shamar I remember doing that thing a while back and it's pretty fucked up because, among other things, it assumes a self-driving car would somehow be able to determine a person's social standing and therefore "know" whether someone is a doctor vs a criminal vs a homeless person (not that I think we should be using those criteria to value human life).

@Shamar The way they choose to interpret the data is, as you say, completely dependent on their own moral frameworks. I just did it such that the car always kills itself & its passengers when given the option. When forced to choose between killing 2 groups of people, I made it always just go straight, regardless of who/how many people that would kill. The results make it appear that I really care about saving people of high social standing, despite that not factoring into my decisions.
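A minimal sketch of the fixed policy described above, purely for illustration: the scenario fields and function name are hypothetical and have nothing to do with the Moral Machine's actual data model. The rule is simply "sacrifice the passengers whenever that option exists, otherwise never swerve."

```python
# Hypothetical illustration of the fixed policy described above.
# The scenario structure is invented, not the Moral Machine's real format.

def choose_action(scenario: dict) -> str:
    """Pick 'swerve' or 'straight' for a two-option dilemma."""
    # If swerving sacrifices only the car's own passengers, always do that.
    if scenario["swerve_kills_passengers_only"]:
        return "swerve"
    # Otherwise never swerve: stay in lane regardless of who is in the path.
    return "straight"

# Dilemma where swerving kills the passengers -> the car swerves.
print(choose_action({"swerve_kills_passengers_only": True}))   # swerve
# Dilemma where both options kill pedestrians -> the car just goes straight.
print(choose_action({"swerve_kills_passengers_only": False}))  # straight
```

A fixed rule like this ignores every attribute of the people in the road, yet the experiment's aggregate analysis still reports apparent preferences about who gets saved.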

Shamar @Shamar

@ink_slinger

Indeed, my own personal feeling is that the […]'s experiment is not […], but […].
