An actually useful definition of 'slop'

I'm tired of all of this semanticslop.


If you have a functioning Internet connection, you've probably seen someone use the word "slop" to refer to something they don't like, especially if that something was produced by AI. I don't think this is useful. Arguing about definitions of fuzzy terms is inherently a fuzzy endeavor because by definition (heh) I can't be wrong about what other people mean by a term. So to be clear: my definition of 'slop' isn't "this is what other people mean by the term", it's "this is a definition that I think results in useful analysis".

'AI' vs 'machine learning'

I don't think it's worth trying to say that LLMs aren't "real AI". That framing doesn't align with my own experience of what was considered "AI" when I was working on related projects over a decade ago, and it strikes me as the equivalent of trying to distinguish between "computers" (which are good) vs. "automated arithmetic" (which is bad). But that's its own post.

With that out of the way, my definition is:

Slop is something that has all the surface value of high-effort production, but is actually extremely shallow and not thought-out.

It doesn't matter how much of the thing was made by humans and how much was made by a generative AI model. Slop is slop.

Not all slop is AI

One of my motivating examples for this is actually a satire, but it's so well-executed that it's perfect as an exemplar:

This video (YouTube original by Funkyzeit Games, with voice acting by RyanStewartVO and ArielHck) is parodying a specific trend in video games, but even if you aren't much for games yourself, there are a lot of things that stand out:

It's a perfect example of Nintendo, Hire This Man game development. It's slop (or it would be if it were serious). And it's all human-generated. Anyone who looks at a lot of digital art (especially anime-style) will be familiar with the sort of images that tend to get popular: very "rendered" (i.e., detailed, with effort put into lighting), often focusing on attractive women, but with very little individuality. If it's fanart, there will be an emphasis on visual signifiers and a de-emphasis of the character's personality.

SEO spam is also one of the quintessential examples of human-written slop; although a lot of pre-LLM SEO spam was certainly generated with mechanical assistance (thesauruses to rephrase content copied from other sites, fill-in-the-blank templates, and so on), the bulk of it still involved a lot of humans who were presumably paid poor wages to produce it. It was (and is) meant to trick you into thinking the content is worthwhile so you'd click around, generating ad impressions and making the creator(s) money.
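To make concrete just how mechanical that "assistance" could be, here's a toy sketch in Python. It's entirely made up (not any real spam tool), and the template and word lists are illustrative; the point is only that a fill-in-the-blank template plus crude synonym swapping is enough to churn out near-duplicate pages that look superficially distinct.

```python
# Toy illustration of template-plus-thesaurus "content spinning".
# Not a real tool; template, synonym lists, and keywords are made up.
import random

TEMPLATE = "Looking for the {adj} {product}? Our {adj2} guide covers everything you need."

SYNONYMS = {
    "adj": ["best", "top-rated", "leading", "ultimate"],
    "adj2": ["comprehensive", "complete", "in-depth", "definitive"],
}

def spin(product: str) -> str:
    """Generate one 'unique'-looking page intro for a product keyword."""
    return TEMPLATE.format(
        adj=random.choice(SYNONYMS["adj"]),
        adj2=random.choice(SYNONYMS["adj2"]),
        product=product,
    )

for keyword in ["air fryer", "standing desk", "VPN service"]:
    print(spin(keyword))
```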

Not all AI is slop

Back before Stable Diffusion and NovelAI and DALL-E, there was DeepDream: broadly, you take an image model trained to recognize something (in this case, probably dogs), take an image (in this case, the Mona Lisa), and iteratively adjust the image to maximize how 'dog-like' the model thinks it is. This image looks like a weird fucked-up experimental piece of generative computer art, and that's exactly what it is. A lot of early generative images had this quality, and a lot of early generative text did as well (going back all the way to Markov chains like Dissociated Press).
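If you want a concrete sense of that loop, here's a rough sketch of the same idea in PyTorch. The model choice, the "dog" class range, the step size, and the filename are all illustrative assumptions, and the original DeepDream maximized intermediate-layer activations rather than class scores; this is just gradient ascent on the pixels, as described above.

```python
# Rough sketch of the DeepDream-style loop; not the original implementation.
# Assumes PyTorch/torchvision; input normalization is omitted for brevity.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.googlenet(weights="DEFAULT").eval()  # ImageNet classifier (many dog classes)
for p in model.parameters():
    p.requires_grad_(False)

img = transforms.Compose([transforms.Resize(512), transforms.ToTensor()])(
    Image.open("mona_lisa.jpg")  # hypothetical local copy of the image
).unsqueeze(0)
img.requires_grad_(True)

for _ in range(50):
    logits = model(img)
    # ImageNet classes ~151-268 are dog breeds; push the pixels in whatever
    # direction makes those scores go up (gradient ascent on the image).
    loss = logits[0, 151:269].sum()
    loss.backward()
    with torch.no_grad():
        img += 0.01 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()
        img.clamp_(0, 1)
```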

Similarly, look at the video for Make Me Feel by the Chainsmokers. The video clearly makes heavy use of AI image generation with a human dancer (whose face you can occasionally see "breaking through") as a base, with almost no cohesion between frames and very little attempt to make the generated images consistent from one frame to the next. But the video is good! It uses the weird "morphing" of non-temporally-coherent video generation as an effect, like a very fancy image/video filter. You don't necessarily have to like the video or the music (I personally like both), but you do have to acknowledge that there's some very human intentionality going on here. I'm sure someone more conversant in the history of music videos and effects could draw all kinds of historical comparisons, but the only one that comes to mind for me is the scramble suits from A Scanner Darkly.

Steve Harvey being chased by a horror movie monster, as viewed by a trail camera

And finally, you have shitposts like "Steve Harvey being chased by a horror movie monster trailcam footage" (source unknown). There's no pretense here. It's just something that someone thought would be funny, typed into the funny image machine, and shared with their friends. For that matter, the low quality of it is almost the point: it's the sort of throwaway shitpost content that's the equivalent of shoving something into an image editor, writing "TFW WHEN YOU / BOTTOM TEXT" without even setting the text tool's background to transparent, pasting it directly into a #memes channel on Discord, and getting two Peter Griffin pogchamp reacts. I'm not trying to argue that this is the absolute peak of humor or that the novelty of any particular genre of AI-generated image won't burn through in roughly 30 seconds, just that I don't think there's much you can say about the effects of this kind of image that's also true of overly-polished GitHub projects with fancy HTML landing pages that fall apart as soon as you poke under the covers.


So what do we get out of this definition? The big one is that slop has clearly been with us since before generative AI models, but the cost to produce slop has changed drastically. Writing "decent-looking" SEO bait is far easier than learning to draw "front page of artstation" images, but generative models can handle the latter just as well as the former. And the flip side is that even if we were to somehow ban generative AI models, we wouldn't fix the slop problem. Google search's slow slide in quality started in the mid-2010s at the latest, well before generative text models were widespread enough and cheap enough to be blamed. If we want to stop the slop problem, we'll have to change the incentives. Figuring out how to do that is out of scope and therefore left as an exercise for the reader.

I know this post is just me typing into the void; I'm not under any illusions that I'm about to singlehandedly reform all of discourse online. And there's no way to know whether anyone using "slop" means it in any particular way. But I hope that at least thinking about this particular meaning will prevent you from falling into thoughtterminatingclicheslop.
