my seq2seq network for predicting phonetic features from character strings is at 99% accuracy on the validation set after 10 epochs and pronounces (e.g.) "fediverse" (not in the training set) almost flawlessly (I'll transcribe the features as "fidiverz") but seems to consistently mess up on interdental fricatives ("theorizing" comes out as "feruhzing," "lathe" comes out as "lat-tee," "this" comes out as "sis")
similar problems with /ʒ/ ("genre" comes out as ?ehnuh where <?> is a consonant described as a "voiced alveolar fricative stop" with a hint of velar thrown in). probably because these sounds combined make up less than 1% of all sounds and might not be present more than a handful of times in the training set. I might have to think about partitioning differently or augmenting the data set to even out the distribution
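One way to confirm the imbalance hunch about /θ/, /ð/, and /ʒ/ is to tally phone frequencies over the training transcriptions and flag symbols below some rarity threshold. A minimal sketch — the phone sequences here are invented for illustration, not the actual training set:

```python
from collections import Counter

# hypothetical toy training data: each item is one word's phone sequence;
# a real check would load the actual pronunciation training set
training_phones = [
    ["f", "ih", "d", "ih", "v", "er", "z"],
    ["th", "ih", "s"],
    ["zh", "aa", "n", "r", "ah"],
    ["s", "ih", "s"],
]

counts = Counter(p for word in training_phones for p in word)
total = sum(counts.values())

# phones making up a tiny share of all tokens are candidates for
# oversampling or data augmentation before re-partitioning
for phone, n in counts.most_common():
    print(f"{phone}\t{n}\t{n / total:.3f}")
```

On real data, anything under ~1% of tokens would go on the "augment or re-partition" list.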
ran across this very good overview of text generation techniques with neural networks https://blog.usejournal.com/generating-natural-language-text-with-neural-networks-e983bb48caad though it's notable mostly for its very odd/hilarious illustration of "meaning space"
odd because it's an example of the map becoming the territory. the whole point of contemporary nlproc is to work with the meaning of a sentence as something other than the sum of the semantics of its component words, but in this case, the sentences on the left are shown as "similar" to one another *simply because of their lexical similarities*. it's hard to think of an actual human speech context where "I love tacos" means anything remotely close to "we are all tacos"
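The "I love tacos" / "we are all tacos" point can be shown with a toy bag-of-words cosine — my own illustration, not whatever embedding the linked post uses: sentences that share even one token come out "similar" regardless of meaning.

```python
import math
from collections import Counter

def bow_cosine(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words count vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm_a = math.sqrt(sum(n * n for n in va.values()))
    norm_b = math.sqrt(sum(n * n for n in vb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# sharing just "tacos" yields nonzero similarity despite unrelated meanings
print(bow_cosine("I love tacos", "we are all tacos"))             # ~0.29
print(bow_cosine("I love tacos", "the map is not the territory"))  # 0.0
```

Any representation built from surface tokens inherits some version of this: lexical overlap leaks into "meaning" similarity.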
For the holidays, you could say thank you to some of the people who write free software you use, especially software that isn't hugely popular.
Those of us who write little-known software may go for months without hearing from a user, and it can be a little de-motivating.
Hearing from someone who actually uses one's software gives an energising jolt that can carry one through several weeks of darkness and cold and wet.
I love these hand-drawn abstract geometric gifs -- so much care and so many interesting decisions (e.g. the strokes of the markers) in translating these shapes and motions to "physical" form https://www.thisiscolossal.com/2018/11/hand-drawn-gifs-by-benjamin-zimmerman/ (via https://mltshp.com/p/1F5H2)
psycholinguistic shitpost
Bouba in the streets, Kiki in the sheets
Chris Pressey's #nanogenmo project ETERLAN SEPTEBMER is written in Befunge, his own language (and is just about the easiest-to-read Befunge program I've seen), celebrating 25 years of the language https://github.com/NaNoGenMo/2018/issues/91
I like Befunge's potential for generative texts -- the visual nature of the language plays to its advantage
i hope someone has written an extensive fanfic that intrafictionally explains the terrible kerning in destiny 2 https://mastodon.social/media/cs0GjOCNGCtmmwsoivI
poet, programmer, game designer, computational creativity researcher. assistant arts professor at NYU ITP. she/her.