@phildini prequel jukebox musical called "Mamma Mia: Calvary" that finally puts the "Abba" in "Barabbas"
@phildini we'll A/B test it obvs
@acb I think I vaguely recollected the death of Prince Albert* and Victoria's life-long mourning and was like "which queen was that?" and ran with it until further notice
*they should have left him out of that can, poor thing
GUITAR PICK 1: i really wanna fuck that sock
GUITAR PICK 2: it's huge, you'd die
GP1: there's gotta be a way
GP3: it's hopeless
GP4: just physically impossible
GP5: ...unless
GP1-4: unless?
GP5: what if a whole bunch of us worked together
GP1: like a voltr--
GP5: like a voltron sock yeah
GP1-99: ...let's work out a blueprint
~ later ~
ME: where's my other fucking sock
ME AGAIN: where are all my fucking guitar picks
@courtney as the burned in Spanish subtitles on the sketchy copy I'm watching would say:
Majestad.
like the fact that Prince Philip was hospitalized recently is really undercutting the potential drama of a young Philip insisting on taking up flying
like he ain't gonna go out like a Downton Abbey character now, obviously, so, so much for that
@selfsame looks like you hit a hole in everything
@drwho @phooky One of the odd little wrinkles of how this kind of "put some novel text on a page" stuff has evolved over time is that one of the bigger sources of easily available text to use is fanfic. People just absolutely churning out text in a casual narrative register that manages to vary from text to text without being bizarro technical text full of easy-to-Bayes jargon and a high density of unlikely collocations. Perfect sheep's clothing in a lot of ways.
@drwho @phooky Might not literally be a Markov model, mind; there are other similar stochastic methods for generating text that could produce similar kinds of strangeness. But it feels like a good bet, esp. since Markov chain implementations are easy to build, easier to just use as a library, and cheap and fast to run. And none of this text needs to be convincing, it just needs to look sort of like human writing on novel subjects to avoid the easiest sort of detection.
@drwho @phooky It has very much the look of a Markov chain model to it in this case; that recurring collocation of the unlikely string "pina colada colony" is a familiar artifact of that process, where the stochastic jump from one bit of source text to another (said source text presumably a large-ish corpus scraped from any number of sources on the web without permission) tends to happen across more common collocations (especially glue phrases like "is a", "but the", etc).
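A minimal sketch of that kind of word-level chain, in Python, just to show the mechanism; the function names and the little stand-in corpus here are hypothetical illustrations, not anything from the actual spam text:

import random
from collections import defaultdict

def build_chain(text, order=2):
    # Map each `order`-word prefix to the list of words seen following it.
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, n_words=30):
    # Random-walk the chain. Common "glue" prefixes like ("is", "a") show up
    # in many places in the corpus, so the walk tends to jump between
    # unrelated stretches of source text at exactly those points, producing
    # strings like "pina colada colony".
    key = random.choice(list(chain.keys()))
    out = list(key)
    for _ in range(n_words):
        followers = chain.get(key)
        if not followers:
            break
        out.append(random.choice(followers))
        key = tuple(out[-len(key):])
    return " ".join(out)

# Hypothetical stand-in for a big scraped corpus:
corpus = "a pina colada is a drink but the colony is a place and the drink is a classic"
print(generate(build_chain(corpus)))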
@phooky Cheerily Pooping Daddies
@pixouls I think those are both useful structures, totally. "Lean into discomfort" is a good framing for people who might find the idea of just sort of experiencing static discomfort overwhelming: treating it as a process, an exploration of self, instead of just "now think about what you've done" is a different framing that can be helpful.
Creating a brave space is a good process, but a much more narrow/specific one that requires more community and mutuality than the default situation.