
Allison Parrish

Pinned toot

since the dawn of the written word people have made amulets and talismans from holy texts and in this sense a pinned toot is like a horseshoe you hang over a barn door

working toward an answer to this question, I offer this recording of me reading "The Road Not Taken" overdubbed with me reading "The Road Not Taken" but with each word replaced with a random similar word (similar in sound and meaning)

these datacamp tutorial videos seem like guided meditations when you watch them with ambient drone music playing in a different window

I took that combined phonetic/semantic similarity vector space I was playing around with and ran t-SNE on it. this is the result of just picking a random line segment in the 2d t-SNE space and finding the word closest to evenly-spaced points on the line; it's weirdly unnerving
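the line-segment walk could be sketched like this (a toy example: the words and 2D coordinates below are made up, standing in for the real t-SNE projection of the combined space):

```python
import numpy as np

# made-up words and 2D coordinates standing in for the real t-SNE output
words = ["cat", "cot", "cut", "bat", "bit"]
coords = np.array([[0.0, 0.0], [1.0, 0.2], [2.0, 0.1], [0.2, 1.0], [1.5, 1.2]])

def words_along_segment(start, end, n_points):
    """Walk evenly-spaced points along the segment from start to end and
    return the word whose coordinates are nearest each point."""
    out = []
    for t in np.linspace(0.0, 1.0, n_points):
        p = start + t * (end - start)
        out.append(words[int(np.argmin(np.linalg.norm(coords - p, axis=1)))])
    return out

print(words_along_segment(np.array([0.0, 0.0]), np.array([2.0, 0.0]), 4))
```

with a real projection you'd pick the two endpoints at random and use many more sample points, which is what produces the eerie gradual drift from one word to the next.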

here's the jupyter notebook for this btw. the flask thing at the bottom apparently only works with 0.12.1(?), I need to track down the bug in flask and do a PR

still had this open in textedit, an artifact from when I was making an animation for a presentation about predictive typing

I made a clone of ios quicktype that you can train on arbitrary texts... here's an example trained on pride and prejudice
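the core of a quicktype-style predictive keyboard is just next-word frequency counts over the training text — a minimal sketch (bigram counts only; the sentence fragment below is the opening of pride and prejudice):

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count which word follows which -- the whole 'model'."""
    model = defaultdict(Counter)
    tokens = text.lower().split()
    for a, b in zip(tokens, tokens[1:]):
        model[a][b] += 1
    return model

def suggest(model, word, k=3):
    """Return up to k most likely next words, like the three quicktype buttons."""
    return [w for w, _ in model[word.lower()].most_common(k)]

model = train_bigrams(
    "it is a truth universally acknowledged that a single man in possession "
    "of a good fortune must be in want of a wife")
print(suggest(model, "a"))
```

training on the full novel (and on trigrams instead of bigrams) gives you suggestions that sound unmistakably like austen.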

javascript, the language where you have to do a web search whenever you want to write a for loop because there's already like fifty ways to write a for loop and you're pretty sure they've probably added a new one since you looked last

(could improve this by also concatenating an average vector of all of the context leading up to the n-gram, maybe? although at that point you're basically just hard-coding what an LSTM is supposed to learn to do on its own, more or less)

realized you could make a sort of markov chain text generator using concatenated word vectors instead of the tokens themselves, which has the benefit of being able to cope pretty well with out-of-vocab strings. anyway, here's word-vector-markov Jane Austen elaborating on what the Internet is
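the word-vector markov idea can be sketched in a few lines. everything below is a toy: the character-bigram hashing is a made-up stand-in for real pretrained word vectors, but it shows the trick — because any string gets a vector, an out-of-vocab context still snaps to the nearest training state:

```python
import random
import numpy as np

def char_vec(word, dim=8):
    """Toy embedding: hash each character bigram into a small vector, so
    ANY string gets a vector. (A made-up stand-in for real word vectors.)"""
    v = np.zeros(dim)
    for a, b in zip(word, word[1:]):
        v[(ord(a) * 31 + ord(b)) % dim] += 1.0
    return v

def build_chain(tokens, n=1):
    """Markov model whose states are (concatenated) word vectors rather
    than the tokens themselves."""
    states, nexts = [], []
    for i in range(len(tokens) - n):
        states.append(np.concatenate([char_vec(t) for t in tokens[i:i + n]]))
        nexts.append(tokens[i + n])
    return np.array(states), nexts

def step(states, nexts, context):
    """Find the training state nearest the context vector and return its
    successor; ties are sampled at random, markov-style."""
    v = np.concatenate([char_vec(t) for t in context])
    d = np.linalg.norm(states - v, axis=1)
    best = np.flatnonzero(d == d.min())
    return nexts[int(random.choice(list(best)))]

tokens = "the internet is a vast network of networks".split()
states, nexts = build_chain(tokens, n=1)
print(step(states, nexts, ["internet"]))   # in-vocab context
print(step(states, nexts, ["internets"]))  # out-of-vocab, still lands nearby
```

a token-keyed markov chain would simply fail on "internets"; here the nearest-state lookup degrades gracefully instead.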

[I want to add that for the record I'm not a fan of Colab—it really does feel like an attempt on Google's part to embrace/extend/extinguish jupyter notebook, up to and including parallel but incompatible feature implementations (like widgets). I hope they prove me wrong. the collaborative authoring is probably a killer feature for some folks but overall it feels clunky and weird to me—I'd probably only use it for things like this where I'm hoping a non-programmer will dive in and run the cells]

the code makes use of this little python library that I just released for doing nearest-neighbor lookups in a clean and easy way (it's just a wrapper around Annoy but takes care of some of the little details for you)
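the "little details" a wrapper like that handles are mostly bookkeeping: mapping your items to the integer ids an index wants and back again. a minimal sketch of the idea (brute-force distance here; the real library delegates the lookup to Annoy's approximate index):

```python
import numpy as np

class Neighbors:
    """Sketch of the convenience-wrapper idea: add (item, vector) pairs,
    query by vector, never juggle integer ids yourself. (Brute force;
    a real version would hand the vectors to Annoy.)"""
    def __init__(self):
        self.items, self.vecs = [], []

    def add(self, item, vec):
        self.items.append(item)
        self.vecs.append(np.asarray(vec, dtype=float))

    def nearest(self, vec, n=3):
        d = np.linalg.norm(np.array(self.vecs) - np.asarray(vec, dtype=float), axis=1)
        return [self.items[i] for i in np.argsort(d)[:n]]

nn = Neighbors()
nn.add("red", (1, 0, 0))
nn.add("green", (0, 1, 0))
nn.add("dark red", (0.6, 0, 0))
print(nn.nearest((0.9, 0.1, 0), n=2))
```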

I wrote up this Colab notebook that shows you how to make a corpus-based chatbot using semantic similarity. the benefit of the colab notebook is that you can run it on a server that Google lends you for free, but you can also download the plain jupyter notebook here
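the corpus-based recipe boils down to: embed every prompt in your corpus, embed the user's utterance, and reply with the response attached to the nearest prompt. a toy sketch (the three prompt/reply pairs are invented, and bag-of-words vectors stand in for real semantic embeddings):

```python
import numpy as np

# invented (prompt, reply) pairs; a real bot mines these from a corpus
pairs = [
    ("hello there", "hi, how are you?"),
    ("what is the weather like", "gray, with a chance of drizzle"),
    ("tell me a story", "once upon a time..."),
]

vocab = sorted({w for prompt, _ in pairs for w in prompt.split()})

def embed(text):
    """Bag-of-words vector: a crude stand-in for semantic embeddings."""
    words = text.lower().split()
    return np.array([words.count(w) for w in vocab], dtype=float)

matrix = np.array([embed(prompt) for prompt, _ in pairs])

def respond(utterance):
    """Reply with the response whose prompt is nearest the input."""
    d = np.linalg.norm(matrix - embed(utterance), axis=1)
    return pairs[int(np.argmin(d))][1]

print(respond("hello"))
```

swapping the bag-of-words vectors for real sentence embeddings (and an approximate index for the brute-force search) is what makes this work at corpus scale.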