exploring the phonetic space by just setting large chunks of the dimensions to arbitrary values
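
(a minimal sketch of what that manipulation looks like, assuming a 128-dimensional latent like the ones mentioned below; the `decoder` call is a hypothetical stand-in for the trained VAE decoder)

```python
import numpy as np

rng = np.random.default_rng(0)

# stand-in for a latent vector inferred for some word; the real ones
# come out of the orthography -> phoneme features -> VAE pipeline
z = rng.standard_normal(128)

# clobber a big contiguous chunk of dimensions with one arbitrary value
z_weird = z.copy()
z_weird[32:96] = 3.0

# a real run would decode z_weird back to phoneme features and respell it:
# spelling = decoder.predict(z_weird[np.newaxis, np.newaxis, :])  # hypothetical
print(z_weird[:8])
```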

visualizing the vector in the latent phonetic space while interpolating between "abacus" and "mastodon." (this is after inferring the latent vectors via orthography->phoneme features->VAE). I just arbitrarily reshaped the vectors from (1, 1, 128) to (8, 16), so the 2d patterns are arbitrary. still interesting to see what it's actually learning!
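
(roughly what the interpolate-then-reshape visualization looks like, as a sketch—random vectors standing in for the inferred latents)

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
# random stand-ins for the (1, 1, 128) latents the encoder infers
# for "abacus" and "mastodon"
z_abacus = rng.standard_normal((1, 1, 128))
z_mastodon = rng.standard_normal((1, 1, 128))

steps = 8
fig, axes = plt.subplots(1, steps, figsize=(16, 2))
for ax, t in zip(axes, np.linspace(0, 1, steps)):
    z = (1 - t) * z_abacus + t * z_mastodon  # linear interpolation
    # the (8, 16) reshape is arbitrary; it just makes the values visible
    ax.imshow(z.reshape(8, 16), cmap="viridis")
    ax.axis("off")
plt.show()
```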

I extracted 45k noun phrases from the wikipedia pages for every unicode character in the 'punctuation' and 'symbol' categories. here are 24 sampled at random
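
(a sketch of how a pipeline like this could work, assuming the `wikipedia` package for fetching pages and spaCy's noun chunks for the phrases—both assumptions, not necessarily the actual pipeline)

```python
import random
import sys
import unicodedata

import spacy      # noun phrase extraction via doc.noun_chunks
import wikipedia  # assumed client; any MediaWiki API wrapper would do

nlp = spacy.load("en_core_web_sm")

# every character whose unicode category is punctuation (P*) or symbol (S*)
chars = [chr(cp) for cp in range(sys.maxunicode + 1)
         if unicodedata.category(chr(cp))[0] in "PS"]

phrases = []
for ch in chars:
    try:
        # look the character's page up by its unicode name (a guess at
        # the lookup strategy; many characters won't resolve cleanly)
        text = wikipedia.page(unicodedata.name(ch)).content
    except Exception:
        continue  # unnamed characters, missing pages, disambiguation
    phrases.extend(chunk.text for chunk in nlp(text).noun_chunks)

print(random.sample(phrases, 24))
```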

decoding the same underlying vectors from the VAE using the french spelling model, for some reason, sure, whatever

using the phonetic VAE to interpolate between US state names in a grid
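
(a sketch of the grid itself: bilinear blending between four corner latents, with random vectors standing in for the ones inferred from four state names; each cell would then be decoded and respelled)

```python
import numpy as np

rng = np.random.default_rng(2)

# four corner latents, standing in for vectors the VAE encoder inferred
# for four state names (e.g. "ohio", "utah", "texas", "maine")
corners = {k: rng.standard_normal(128) for k in ("tl", "tr", "bl", "br")}

rows, cols = 8, 8
grid = np.empty((rows, cols, 128))
for i in range(rows):
    for j in range(cols):
        u, v = i / (rows - 1), j / (cols - 1)
        top = (1 - v) * corners["tl"] + v * corners["tr"]
        bottom = (1 - v) * corners["bl"] + v * corners["br"]
        grid[i, j] = (1 - u) * top + u * bottom  # bilinear blend

# each grid[i, j] would be decoded back to phoneme features and respelled;
# decoder.predict(...) is a hypothetical stand-in for that step
print(grid.shape)
```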

chart of the day: voiced obstruents (i.e., phonemes like /b/, /g/, /z/) in pokémon names, by evolution level
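
(the tally behind a chart like this might look something like the sketch below; the transcriptions and levels are made-up placeholders, not real data)

```python
from collections import defaultdict

# the voiced obstruents in question (stops, fricatives, affricates)
VOICED_OBSTRUENTS = {"b", "d", "g", "v", "z", "ð", "ʒ", "dʒ"}

# (name, evolution level, phonemes) -- illustrative stand-in data only
pokemon = [
    ("bulbasaur", 1, ["b", "ʌ", "l", "b", "ə", "s", "ɔ", "r"]),
    ("ivysaur",   2, ["aɪ", "v", "i", "s", "ɔ", "r"]),
    ("venusaur",  3, ["v", "i", "n", "ə", "s", "ɔ", "r"]),
]

totals = defaultdict(int)
names = defaultdict(int)
for name, level, phonemes in pokemon:
    totals[level] += sum(p in VOICED_OBSTRUENTS for p in phonemes)
    names[level] += 1

for level in sorted(totals):
    print(level, totals[level] / names[level])  # mean count per name, by level
```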

going back to the regular seq2seq networks, I'm trying to do some quantitative evaluation. the phoneme features to orthography model gets... ~60% of words wrong, and ~12% of letters wrong (working on a sample of a few thousand words from cmudict), but its guesses seem... reasonable? not sure how to talk about this
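
(for what it's worth, the two error rates are easy to pin down precisely: exact-match at the word level, Levenshtein distance at the letter level. a sketch—the word pairs are placeholders for the cmudict sample)

```python
def levenshtein(a, b):
    """edit distance via the standard dynamic-programming recurrence."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (ca != cb))) # substitution
        prev = cur
    return prev[-1]

# (reference spelling, model's guess) pairs -- illustrative stand-ins
pairs = [("typewriter", "tipewriter"), ("abacus", "abacus"),
         ("mastodon", "mastadon")]

word_err = sum(ref != hyp for ref, hyp in pairs) / len(pairs)
letter_err = (sum(levenshtein(ref, hyp) for ref, hyp in pairs)
              / sum(len(ref) for ref, _ in pairs))
print(f"word error rate: {word_err:.0%}, letter error rate: {letter_err:.0%}")
```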

(a) minimalist definition of narrative (b) name of a hit new YA series (c) phrase from a movie review that damns with faint praise (d) something rad to memorize & recite as your last words ("well, that sure was...") (e) all of the above

exploring the latent phonetic nonsense space around "typewriter"—using the best model I've managed to train yet (100 epochs on HPC, managed to keep the reconstruction loss fairly low while also getting some semblance of a low KL loss)
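
(concretely, "exploring the space around a word" can be as simple as perturbing the inferred latent with gaussian noise at a few scales and decoding each neighbor—a sketch, with a random stand-in for the "typewriter" latent and a hypothetical `decoder`)

```python
import numpy as np

rng = np.random.default_rng(3)

# stand-in for the latent vector the encoder infers for "typewriter"
z_typewriter = rng.standard_normal(128)

# the smaller the noise scale, the closer the decoded nonsense
# should stay to the original word
for scale in (0.1, 0.5, 1.0):
    neighbors = z_typewriter + rng.normal(0, scale, size=(5, 128))
    # each row would be decoded to phoneme features and then respelled;
    # decoder.predict(...) is a hypothetical stand-in for the model call
    print(scale, neighbors.shape)
```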

thinking about this formulation from Ursula K. Le Guin (discussing her aspirations for _The Dispossessed_) and wondering how it applies to procedurally-generated works—which, affording a kind of encounter with the infinite, appear to fit into this rubric. but that encounter w/the infinite happens only in the context of asserting an understanding of it—a program promises to keep doing the same thing forever, which is a kind of reassurance, and the opposite of "a permanent source of renewal"

inferring spelling from phonetic feature sequences zoomed along the timeseries axis. (basically, smooshing and stretching the sound of the word and getting the neural network to try to spell out the sound)
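
(the smooshing/stretching itself is just resampling along the time axis; a sketch using scipy's `zoom`, with random stand-in features where the real input would be a word's phonetic feature sequence)

```python
import numpy as np
from scipy.ndimage import zoom

# a phonetic feature sequence: (timesteps, feature dims); random stand-in
features = np.random.default_rng(4).random((12, 24))

# rescale only the time axis, leaving the feature axis untouched
for factor in (0.5, 1.0, 2.0):
    zoomed = zoom(features, (factor, 1), order=1)  # linear interpolation
    # the seq2seq spelling model would then try to spell each version;
    # model.predict(...) is a hypothetical stand-in for that inference step
    print(factor, zoomed.shape)
```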

I've seen things you people wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate.

when reference grammars (unintentionally?) come off as new weird fiction
