I can't bring myself to believe that artificial intelligence will bring an end to creative labor: you can't reduce creative labor to the artifacts it produces, and the value of those artifacts doesn't inhere purely in their form. sure, I think a lot of the value in art comes from, like, visceral spectacle, but some amount also comes from being able to identify and appreciate the choices made in their production—and people are really good at identifying and distinguishing those choices

which isn't to say that new technology doesn't alter or displace creative labor, because it obviously does (and has for thousands of years). but there's a reason it takes twenty minutes for the credits of a Pixar movie to scroll past, and I think it's at least partially because the availability of automated tools opens up larger possibility spaces for sophisticated and interesting creative choices

here's an extreme example of what I mean: an AI paper about interpolating latent space in GANs (the technology lately touted as producing "the first piece of AI-generated art" christies.com/features/A-colla) has a lot of math but then ends with a sentence stating, essentially, "I want to make it pretty, how do I make it pretty???" (the paper: users.aalto.fi/~laines9/public)
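
(for anyone who hasn't seen it: "interpolating latent space" just means picking two of the random vectors the generator maps to images and walking between them, rendering each intermediate point. a rough sketch of the idea below, assuming numpy and a hypothetical `generator` standing in for whatever trained GAN you have; this is not the paper's actual code. even the choice of *how* to walk, linear vs. spherical, is one of those aesthetic choices)

import numpy as np

def slerp(z0, z1, t):
    # spherical interpolation between two latent vectors; a plain linear
    # mix tends to pass through low-probability regions of the latent space
    a, b = z0 / np.linalg.norm(z0), z1 / np.linalg.norm(z1)
    omega = np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))
    if np.isclose(omega, 0.0):
        return (1.0 - t) * z0 + t * z1
    return (np.sin((1.0 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)

# walk between two random latent points, one frame per step
z_start, z_end = np.random.randn(512), np.random.randn(512)
frames = [slerp(z_start, z_end, t) for t in np.linspace(0.0, 1.0, 30)]
# images = [generator(z) for z in frames]  # `generator` is a hypothetical trained model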

a glance at any AI research having to do with producing creative artifacts or augmenting creative processes will likely reveal many similar choices about the aesthetics of the output. [these are almost always couched in the language of empiricism ("here's how many people thought it was good") or obviousness ("of course this is good"), if the work pauses to examine its own ideas about aesthetics at all.]

and hey, guess what, if you're making choices about the aesthetics of an artifact, sorry: I know you told your mom you were a big fancy AI researcher, but you are doing creative labor. (turns out the two are not mutually exclusive)

i keep thinking of the popularity of speedrunning (and maybe just video game streaming in general?) as a touchpoint in this conversation. people like to watch other people accomplish virtuosic feats with their tools, even when "perfection" is possible through automation. in speedrunning, the tool-assisted work actually *feeds back into* the purely non-automated performances. the presence of automation opens up new, interesting expressive possibilities; it doesn't close them off.

(the metaphor here being that, say, a movie can partially be understood as the result of a "speed run" of [e.g.] Final Cut Pro, or an illustration as a "speed run" of Illustrator/Photoshop, etc, where "speed run" here is defined as an artifact created from a constrained performance with a certain tool or set of tools. the question of "wow how did they DO that?" is, i'm proposing here, an inseparable part of the value of media)

(i guess the secret motivation of all research into automating creative labor with AI is that *those researchers themselves* want to be the people of whom it is asked, "wow, how did they do that?"—a very artistic and human motivation imo)

(this is my own overt motivation for doing the work that i do btw)

@aparrish people who claim not to care about authorial intent will find out quite how much they mean it, soon enough

@LogicalDash @aparrish not caring about authorial intent could mean not needing to know what the author's intent was in order to appreciate the work, rather than not caring whether there was any intent at all. I often think of art as a form of connection/communication between people; whether or not the specific intent behind a work is apparent, it is usually apparent that there IS intent. I think that sort of communication/connection probably requires a thinking agent on both sides, and AI art will be lacking that. I think you can not care about (i.e., not need to definitively know) what the artist's intent is and still care whether there is someone there making decisions in an attempt to communicate something important to them.

@somem @LogicalDash yes, I was careful to avoid the word "intent"—I'm trying to cover a wider range of hermeneutics than just asking "what did the author meeeeean," one that includes choices made subconsciously or under other constraints, etc. I also disagree strenuously with the idea that AI art has no "thinking agent"—there is always someone (usually many people) who put together the system in question (and/or the data it operates on)

@aparrish @LogicalDash sure, I was mostly responding to @LogicalDash saying that people who don't care about authorial intent will be proved wrong by AI art (maybe that's not what they meant, but that's how I read it). I pretty much agree with all your takes; I think I was trying to articulate what didn't resonate with me about what they said. After reflecting on your statement about the idea that AI art has no "thinking agent," I definitely agree with what you are saying, so let me reframe my earlier statement to take that into account. I think I'm trying to say that I don't need to know what the artist's intent was, or be certain about the reasons for their decisions, but the process of thinking about why they made it the way they did, and the decisions they made along the way, is very important to me, probably equally or more important than the visceral spectacle of the artifact. Thinking about some examples, I think this holds: some neural net art I look at and think about the process and decisions that went into it, and it's a very rich experience for me; other pieces I look at and think, hmm, it looks like someone just plugged their picture into DeepDream and this is what came out. Of course there are still all the decisions that went into choosing or creating the image to process, but sometimes those become quite obscured and I have a hard time engaging with that thought process. I guess thinking about the multiple "thinking agents" behind the creation of a piece, and connecting to them on different levels based on what I can identify of their creative decisions in it, is what's important (to me). Probably the mistake I made in my earlier statement, or the distinction I accidentally spoke to, comes from how, after seeing many examples from a similar neural net, I start to dissociate the decisions that went into its training from any specific instance of what it produces; that leaves a much smaller set of creative decisions, about what the input was, for me to think about, and ends up with me not really connecting to what was produced or to the agents who made the decisions.

@aparrish I completely agree. I have yet to see a compelling argument for why "AI" is somehow qualitatively different from past automation in its economic effects. It's not magical. This is why I think we should focus on helping displaced folks rather than on trying to slow down the deployment of automation.

@aparrish The Pixar credits are a great analogy. I hope you’re right.

@chartier @aparrish Imagine how long the credits would be if animators had to draw each cel by hand. Machine tools accelerate development, and displacing human labor is a common goal. That doesn't make it a bad thing.

@aparrish Exactly. Mature "creative AI" will be a tool for creative people, not for researchers.

@aparrish By "creative," I take it you're restricting the term to the decorative arts?
I read that last toot first, and thought, "of _course_ you're doing creative labour! If you're building an AI, you're, well, creating stuff."

@aparrish cold take:
(auction) sales do not correlate to artistic merit

@aparrish Ha, I think you nailed it. Yet the terrifying (potential) tipping point for many is when we begin reading primary agency into the AIs themselves -- when even the researchers who got the ball rolling go "Wow, how did they do that?" But some AI researchers already seem to enjoy the mystery of not understanding the competent end products of their deep learning algorithms. Human/AI relations transition from engineers creating to scientists discovering, a different flavor of wonderment.

@aparrish in a speedrun you know the path, while in a creative process you know the tools but not the path ...
I would compare a speedrun to an acoustic/analog instrument performance, for example, where you interpret a given track while adding some bias of your own
in that sense, "Beat Saber" is not far from the "Concours Reine Elisabeth"

@aparrish This dates me maybe, but the segments on doing cool tricks or mini projects in Photoshop or Final Cut Pro were my favorite parts of MacAddict magazine. I think that part of the appeal of these and of speed runs is that in many cases the “final” process is transparent and appreciable even though it hides the “how did they figure that out?” Maybe we can see YouTube makeup tutorials with the same lens?

@aparrish As an indie creative laborer (gamedev) whose non-homelessness depends on people paying me for that work, I must agree generally with the statement. Although procedural generation is massively on the rise, and game/film corporations view 'creative work', including writing, as simply a 'content-filling afterthought'. Like, I've got colleagues still in the machine and they're like, "Yeah, they don't even bring the hundred writers in till 3 months before release."

@aparrish Like the most important thing in the most popular & profitable games is giant cowboy / fantasyland / cyberpunk vista-generators, then at the last minute, pay a hundred writers to 'sprinkle some quest / lore salt' on our character designs and combat. "We got molecule-accurate elf-cleavage physics, cyborg dragons, you can destroy entire cities, amazing reflections on these water textures... Ok, let's pay 500 unemployed English majors to try to make sense of this. Great, SHIP IT!!"

@aparrish

The fact that big tech companies are trying to replace humans recommending art to other humans with robots recommending art to humans isn't a good sign. It pretty blatantly violates the basic rule that "automation only replaces humans doing things they don't actually want to do."

The best version of the future is one where artists curate and grow neural networks carefully pruned and fed by their own art. The neural network is then a personal extension of the artist.

@aparrish

I think the conversation around fear of what artificial intelligence will and will not do for humans misses the point that artificial intelligence is terrifying because of how it allows centralization of power (in businesses and governments).

What artificial intelligence technically does is less important than the fact that it pushes technology further in the direction of supporting authoritarianism and away from supporting free societies.

@aparrish

If we deal with this tendency of artificial intelligence to enable a small number of people to have massive power over others (perhaps by distributing ownership of AI in some fashion, or through regulation), the actual technology will end up benefiting us. However, artificial intelligence and its effects on humans can't be considered separately from its tendency to centralize power.

This is a political issue disguised as a technological one.

@aparrish but will artificial intelligence appreciate art?

Also, I sometimes wonder if there is an algorithm that can, over a long period, keep producing meaningfully different things that are meaningful to us.

Something like a Minecraft/Minetest map generator produces something meaningful, i.e. maps of different styles, and it keeps producing different things, but eventually they're not _meaningfully_ different. I don't even know how to define that, though.
