ran across this very good overview of text generation techniques with neural networks blog.usejournal.com/generating though it's notable mostly for this very odd/hilarious illustration of "meaning space"

odd because it's an example of the map becoming the territory. the whole point of contemporary nlproc is to be able to work with the meaning of a sentence as something other than the sum of the semantics of its component words, but in this case, the sentences on the left are shown as being "similar" to one another *simply because of their lexical similarities*. it's hard to think of an actual human speech context where "I love tacos" means anything remotely close to "we are all tacos"
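(to make the complaint concrete — a minimal sketch, not from the linked post: under a purely lexical bag-of-words "embedding", sentences that happen to share a word score as similar, while a genuine paraphrase with no word overlap scores zero. `bow_cosine` is a hypothetical helper name.)

```python
from collections import Counter
import math

def bow_cosine(a, b):
    """Cosine similarity between bag-of-words vectors of two sentences."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm

# shares the word "tacos", so it looks "similar" despite meaning nothing alike
print(bow_cosine("I love tacos", "we are all tacos"))   # ≈ 0.29

# semantically much closer, but zero word overlap, so it scores 0
print(bow_cosine("I love tacos", "burritos delight me"))  # 0.0
```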

whoops, that should read "the sentences on the right"

@aparrish I’m imagining something terrible happening to tacos and politicians going on TV to say “today, we are all tacos”

(Please nothing bad happen to tacos)

@aparrish Going by the idea of "you are what you eat", it could be equivalent to "We all love tacos.", which is definitely close to "I love tacos." ...

@aparrish does tumblr shitposting count as a human speech context?
