visualizing the vector in the latent phonetic space while interpolating between "abacus" and "mastodon." (this is after inferring the latent vectors via orthography->phoneme features->VAE). I just arbitrarily reshaped the vectors from (1, 1, 128) to (8, 16), so the 2d patterns are arbitrary. still interesting to see what it's actually learning!
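(for anyone curious, the interpolate-and-reshape step looks roughly like this — a minimal sketch, not the actual code; the latent vectors here are random placeholders standing in for the ones inferred via orthography->phoneme features->VAE, and all names are made up:)

```python
import numpy as np

# placeholders for the latents the VAE would actually infer
z_abacus = np.random.randn(1, 1, 128)
z_mastodon = np.random.randn(1, 1, 128)

frames = []
for t in np.linspace(0.0, 1.0, num=16):
    # linear interpolation between the two points in latent space
    z = (1 - t) * z_abacus + t * z_mastodon
    # arbitrary reshape from (1, 1, 128) to (8, 16) just for display --
    # the resulting 2d patterns don't mean anything spatially
    frames.append(z.reshape(8, 16))

print(len(frames), frames[0].shape)
```

each frame can then be rendered as a little grayscale image to watch the vector morph.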
going back to the regular seq2seq networks, I'm trying to do some quantitative evaluation. the phoneme features to orthography model gets... ~60% of words wrong, and ~12% of letters wrong (evaluating on a sample of a few thousand words from cmudict), but its guesses seem... reasonable? not sure how to talk about this
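(the two numbers come from two different granularities — exact-match at the word level vs. edit distance at the letter level. a sketch of how percentages like these could be computed, with made-up example pairs; not the actual eval code:)

```python
def edit_distance(a, b):
    # standard Levenshtein distance via dynamic programming
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

# hypothetical (gold spelling, model's guess) pairs
pairs = [("phoneme", "foneme"), ("cat", "cat"), ("mastodon", "mastadon")]

# word error rate: fraction of words not reproduced exactly
word_error = sum(gold != pred for gold, pred in pairs) / len(pairs)

# letter error rate: total edit distance over total gold letters
char_error = (sum(edit_distance(g, p) for g, p in pairs)
              / sum(len(g) for g, _ in pairs))

print(word_error, char_error)
```

a high word error rate can coexist with a low letter error rate exactly when the wrong guesses are near-misses — which is maybe the quantitative version of "its guesses seem reasonable."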
thinking about this formulation from Ursula K. Le Guin (discussing her aspirations for _The Dispossessed_) and wondering how it applies to procedurally-generated works—which, affording a kind of encounter with the infinite, appear to fit into this rubric. but that encounter w/the infinite happens only in the context of asserting an understanding of it—a program promises to keep doing the same thing forever, which is a kind of reassurance, and the opposite of "a permanent source of renewal"
welp discovering the shape of this particular venn diagram
nice one pentametron
poet, programmer, game designer, computational creativity researcher. assistant arts professor at NYU ITP. she/her.
Server run by the main developers of the project. It is not focused on any particular niche interest - everyone is welcome as long as you follow our code of conduct!