AI’s Getting Dumber Cause It’s Cannibalizing Itself

Photo by Alexey Demidov for Unsplash

Liberty Project Staff

May 28, 2025

Why generative models are drowning in a spiral of mediocrity

It’s true of life, and now it appears it’s true of AI: you get what you give. You only take out what you put in. And neither life nor AI is doing very well these days.

Let’s leave aside the catastrophic political scene and a dying ecosystem and focus for a while on what’s happening to AI.

It’s getting dumber. Having consumed enormous amounts of language written by humans, it’s now feeding on itself. What it’s churning out these days reminds one of a photocopy of a photocopy of a photocopy of a photocopy: the image only gets blurrier; nuance, detail, precision are lost.

The more we use AI, the faster its decline. Recent articles in Nature, Futurism, and Forbes detail how artificial intelligence systems, especially language models and image generators, are becoming increasingly subpar. Flat language becomes flatter; mistakes grow more prevalent; and the illusion that an actual person wrote what you’re reading is harder than ever to maintain.

When Machines Teach Machines

AI’s success is, ironically, the source of its downfall. (Luddites everywhere rejoice!)

GPT-4, Stable Diffusion, and other AI models draw on vast amounts of internet content – books, articles, blogs, social media posts, you name it. But AI-generated content has flooded the web, and the models treat AI-made text and images exactly as they treat content created by humans. They can’t tell the difference. As time passes, AI drifts toward statistical sameness. It becomes blander, less accurate.
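The drift toward sameness can be sketched with a toy simulation (my own illustration, not from the articles cited). Imagine a maximally naive "model" that memorizes the word frequencies of its training corpus and generates a new corpus by sampling from them. If each generation trains on the previous generation's output, rare words that happen not to be sampled vanish forever, so the vocabulary can only shrink:

```python
import random

def next_generation(corpus, size):
    # A maximally naive "model": memorize the empirical word frequencies
    # of the training corpus, then generate a new corpus by sampling
    # from them with replacement. Words it never saw cannot reappear.
    return random.choices(corpus, k=size)

random.seed(42)
# A "human" corpus: a few common words plus a long tail of rare ones.
corpus = ["the"] * 400 + ["model"] * 100 + [f"rare{i}" for i in range(100)]

vocab_sizes = [len(set(corpus))]
for generation in range(10):
    # Each generation trains only on the previous generation's output.
    corpus = next_generation(corpus, len(corpus))
    vocab_sizes.append(len(set(corpus)))

print(vocab_sizes)  # vocabulary size never grows; the long tail dies off
```

Real models are vastly more sophisticated, but the underlying dynamic researchers call "model collapse" is the same: sampling from a learned distribution underrepresents the tails, and retraining on those samples locks the loss in.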

The Mediocrity Spiral

Lars Danielsson of Forbes describes how basic critical thinking is degraded when AI output is treated as fact. We become accustomed to “information” that’s quick, clean, confident – and often wrong. The work image-generators produce warps anatomy, lighting, texture; faces become Halloween masks. Hands go haywire. The choices feel more like algorithms at work than an expression of individual aesthetic vision. Language seems ever-more hollow. AI is pumping out cardboard and doesn’t even know it. 

It’s Not Art – It’s Imitation 

Art celebrates ambiguity, complexity, originality. It isn’t “generated” – it’s created. It bears the marks of its time, the culture in which it was created; it reflects actual experience in the world. It might affirm. It might deny. It is – or should be – one person’s unique, inimitable voice.

None of the above applies to art assembled by AI. Some believe that AI will replace human creativity. It can’t – creativity can’t be replaced. The deeper problem lies in AI itself: the more it mimics itself, the further we get from what makes human expression so valuable. We are degraded along with the models.

AI learning from AI risks distorting the cultural, social, and historical record. Imagine future students – if there’s a world left for students to study in – gathering their knowledge from copies of copies of copies and never the original sources. How accurate would that ersatz education be?

Major platforms are struggling to filter AI-generated content. Spam sites are filled with low-effort machine writing. Stock photo databases include AI images labeled as real. And search engines are starting to serve up AI-created answers based on AI-written articles that were never fact-checked.

AI’s getting dumber. And so are we.

So What Do We Do?

There’s no simple fix.

Start with vigilance. Researchers need better tools to detect and filter synthetic data during training. Platforms must clearly label AI content. Stay alert – favor content made by real people, with real ideas and real intentions. Real information, accurate and correct. Real knowledge, born of experience and possessed of insight. 

Otherwise, AI is only putting makeup on a corpse.
