
hey everyone writing code: you can't get around this. code does not exist in a noble mathematical void which absolves you of accountability. any system which people touch is tied into the fabric of society, and should be scrutinized for potential to do harm.

How to make a racist AI without really trying blog.conceptnet.io/posts/2017/

"... the sentiment is generally more positive for…
twitter.com/Abebab/status/1042

@lorenschmidt wow, besides the obvious, even the vast difference between Emily and Heather is astounding

@lorenschmidt hey everyone writing mathematics: math does not exist in a noble mathematical void which absolves you of any accountability.

No kidding, my complex analysis prof taught us conformal maps, and then showed us how the Russians designed shaped charges with them before the war.

@arxivfever @lorenschmidt If you do a thing with no obvious application, and somebody else figures out a bad application for it, do you deserve any blame? Is Marie Curie partially responsible for the atom bomb?

@lorenschmidt AI is always trained off of data we put into it.

It's basically garbage-in-garbage-out amplified.
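
A minimal, fully synthetic sketch of what that amplification can look like in practice, loosely in the spirit of the word-embeddings-plus-sentiment-classifier setup described in the linked blog post. Everything below (the toy embeddings, the mini-lexicon, the placeholder names) is invented for illustration; it is not the blog post's actual data or code:

```python
# Toy illustration of "garbage in, garbage out, amplified":
# a sentiment classifier is trained only on labelled sentiment words,
# yet it still assigns different scores to names, because the
# (synthetic) embeddings place one group of names closer to the
# negative words -- the kind of slant real corpora can bake into
# word2vec/GloVe vectors. All data here is made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
DIM = 20

# Two latent directions standing in for corpus co-occurrence patterns.
pleasant_dir = rng.normal(size=DIM)
unpleasant_dir = rng.normal(size=DIM)

def embed(base, noise=0.3):
    """Hypothetical word vector: a base direction plus noise."""
    return base + noise * rng.normal(size=DIM)

# Labelled training data: sentiment words only, no names.
pos_words = {w: embed(pleasant_dir) for w in ["good", "great", "lovely", "happy"]}
neg_words = {w: embed(unpleasant_dir) for w in ["bad", "awful", "terrible", "sad"]}

# Names are never labelled, but in this toy corpus one group of names
# "happened" to appear in more negative contexts, so their vectors
# lean toward the unpleasant direction.
names_group_a = {n: embed(pleasant_dir, noise=0.5) for n in ["name_a1", "name_a2"]}
names_group_b = {n: embed(unpleasant_dir, noise=0.5) for n in ["name_b1", "name_b2"]}

X = np.array(list(pos_words.values()) + list(neg_words.values()))
y = np.array([1] * len(pos_words) + [0] * len(neg_words))
clf = LogisticRegression().fit(X, y)

# The names were never part of the training labels, yet their scores
# tend to differ systematically, mirroring the slant in the embeddings.
for name, vec in {**names_group_a, **names_group_b}.items():
    print(f"{name}: P(positive) = {clf.predict_proba(vec.reshape(1, -1))[0, 1]:.2f}")
```

The classifier never sees a labelled name; the names simply inherit whatever slant the embeddings carry, which is the sense in which bias in the training data gets amplified by an otherwise neutral pipeline.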

@lorenschmidt "You can have data that’s better because it’s less racist. There was never anything “accurate” about the overt racism that word2vec and GloVe learned." — my favourite part I guess.

@lorenschmidt so i’m accountable for what other people do with my software? hmm 🤔 i don’t think so, p much every software license includes this nice disclaimer:

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
@lorenschmidt obviously, we want good outcomes but it isn’t always possible to foresee the implications of novel technology. ethical software is good, but don’t go head hunting people for inventing something that was later used by another party to do a bad thing. not fair or just in any way.

@lorenschmidt I think we need to go about this the other way: we need to really change the source of the problems in actual society. We need to normalize this diversity much more than emphasize the need for acceptance of something that people perceive as strange.

@lorenschmidt One part of that is having an international language that actually works well. This bias is something we impose on people who want to join international culture. English is not culturally neutral.

@lorenschmidt This is because of slants in the training sets though, not in the code?
