AI Slop: "It's Not Wrong"
A few months ago, a colleague asked me to review an AI-generated training strategy. If you work in the learning space, you know it's impossible to provide substantial feedback without full context, but I took a look anyway. The document was several pages long and touched on many solid practices, with examples. Nothing stood out as missing. It was just meh, with a few internal inconsistencies. I figured everything would wash out once the analysis was complete and they had real use cases, so my feedback was that "it contained many of the things I considered when developing a framework."
Today, I recognize it as "AI Slop."
AI-generated work requires a different set of evaluation standards than human-generated work. When a human produces something uninspired, they have still brought their education, experience, and skills together to make sense of a context and choose the strategy that seems most rational to them. They might need to do more problem definition or learn new techniques, but they own the work. It shares a perspective.
Human-created work is a starting point for an iterative collaboration process with other human beings. It documents what someone knows through a rational process that can provide insight. Other people can react to and build on the shared perspective and ideas. The communication sets the stage for co-creation of the context and problem space.
Worthwhile AI-generated work also provokes human collaboration. One easy evaluation heuristic is whether it elicits an emotional reaction. You love it. You hate it. You are intrigued. You are inspired. It has a clear point of view and makes you think differently. You share it and discuss it with other human beings.
Without human impact, AI-generated content is just bits and bytes produced by an automated Mad Libs machine. AI is very good at embedding indicators of professional polish, making its output look exactly like uninspired human content. But it is actually very different. It was produced by filling in the next most likely word in a sentence. It doesn't help the AI make sense of the world. It doesn't allow the AI to learn through sharing its perspective with others. It doesn't co-create.
It takes humans to bring value to machine-generated content.
