We are living through the greatest flood of content in human history, but sheer volume doesn’t change the fact that most of it is eye-glazing. Sure, AI-drafted articles, outreach emails and optimized posts are technically coherent, but most feel hollow and strangely soporific, because they’ve been engineered to satisfy an algorithm’s needs, not a human being’s. And as far as I’m aware, algorithms don’t have souls yet.
To understand why this so-called “AI slop” is so pervasive and so difficult to filter out, it helps to understand how it’s actually generated. For an explanation, I turned to product technology expert Craig Unsworth, who explains in his Substack that a Large Language Model, or LLM, isn’t a thinking machine or a knowledge receptacle. It’s a probability engine, trained on vast amounts of text to predict the most likely next word, then the next, then the next, optimizing for what sounds plausible rather than what is accurate. This means it can produce fluent content that passes spell-check, hits keyword targets, and makes approximately the right points in approximately the right order, all without any underlying understanding of whether what it’s spewing is true. And therein lies the problem. For, as Unsworth notes, fluency is precisely the danger: we instinctively trust confident language, even when it has been generated by a system that has no idea whether it’s correct. In other words – and these are mine, not Unsworth’s – AI is conning us.
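Unsworth’s “probability engine” description can be made concrete with a toy sketch. What follows is my illustration, not his, and a drastic simplification of a real LLM: a tiny bigram model that counts which word most often follows which in a made-up training text, then chains those predictions together. The corpus and every name in the code are invented for the example.

```python
from collections import Counter, defaultdict

# Toy illustration (not a real LLM): learn next-word frequencies
# from a tiny invented corpus, then always emit the most probable
# next word. A real model works on tokens and billions of learned
# parameters, but the objective is the same: plausible, not true.
corpus = "the cat sat on the mat . the cat ran on the mat .".split()

next_words = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_words[current][following] += 1

def most_likely_next(word):
    # Pick whichever word most often followed `word` in training.
    return next_words[word].most_common(1)[0][0]

# Generate fluent-looking text with no notion of meaning or truth.
word, output = "the", ["the"]
for _ in range(5):
    word = most_likely_next(word)
    output.append(word)

print(" ".join(output))
```

Note that the generator never consults anything beyond word-adjacency statistics: it can produce a perfectly grammatical sentence about a cat without any concept of what a cat is, let alone whether the sentence is accurate. Scale that mechanic up enormously and you get fluency without understanding – exactly the con Unsworth describes.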
Audiences, who are always sharper than we give them credit for, are onto the con. They can instinctively sense the difference between content that feels human and content that feels hollow or questionable. The result? A collapse in trust, one that engagement numbers make painfully clear: organic engagement on Facebook and X fell to just 0.15% in 2025, according to research by growth-onomics, while Instagram reach dropped from 10–15% of followers in 2020 to 2–3% today.
“Audiences are onto the con. They can instinctively sense the difference between content that feels human versus content that feels hollow.”
Algorithmic changes are often blamed for these metrics, but the message in them isn’t just that audiences have become harder to find. It’s that they’ve become increasingly suspicious of the content they’re reading and harder to convince of its truth. The prevalence of AI slop, it turns out, has come at a steep price: not just a diminished ability to capture an audience’s attention, but a diminished baseline assumption of good faith.
Algorithmically overloaded and perpetually pitched, today’s consumers have developed a finely tuned internal alarm that every post, every outreach, every campaign now trips: What’s the angle? Did a human actually write this? Does anyone behind this brand actually believe in, or feel passionately about, what it’s selling?
When content is ubiquitous and cheap to produce, every piece – including the genuine, carefully considered and crafted variety – is viewed with the same suspicion. Which means we’re not just drowning in sub-par content. We’re losing our ability to distinguish the quality goods from the slop, and finding it harder to trust either. The stories that register as human feel almost radical right now, because so much of the content surrounding them doesn’t. According to the Because of Marketing newsletter, this dearth of evocative human stories explains why socially native formats like street-style interview programs – Subway Takes, Are You Okay? – are outperforming traditional marketing activities.
“We’re not just drowning in eye-glazing AI-generated content. We’re losing our ability to distinguish between the quality goods and the slop.”
All of this explains why storytellers are, once again, in high demand and being wooed to swoop in and save the day. In December, The Wall Street Journal published “Companies are desperately seeking storytellers”, a story documenting how organizations are scrambling to find leaders who can help them articulate who they are, why they exist, and how they can connect with increasingly skeptical audiences. The article notes that executive mentions of “storytelling” on earnings calls jumped from 147 in 2015 to 469 in 2025. And corporate job listings? In 2025, LinkedIn’s postings for “storyteller” jobs doubled – over 50,000 in marketing alone. Google is hiring a customer storytelling manager. Vanta is offering a $274,000-a-year salary for a head of storytelling. Notion recently merged its communications, social media, and influencer functions into a single 10-person team, calling it simply the storytelling team.
What companies are really searching for, in my view, isn’t storytelling as a skill set – although storytelling skills remain valuable – so much as the capacity to discern the real story: what their organization does, and does uniquely; why it does it; and how to translate those ideas into a coherent message that feels honest and true, delivered in a way their audience will actually believe. That’s a fundamentally different brief than “produce more content”, and it’s one that no language model, however sophisticated, can fully answer, because alongside the ability to write coherently, it requires the judgment to know what’s worth saying and the curiosity and patience to keep looking until you find it.
Perhaps the best way to explain what I mean by that last point is to share a bit of my history. I came to brand strategy through journalism, which means that before I understood brand positioning or messaging architecture, I learned the craft of interviewing: how to ask questions and sniff out a story. If you’ve ever watched a great journalist conduct an interview, or been interviewed by one, you’ll know that an interview isn’t simply a conversation. It’s a controlled conversation. It depends on the interviewer organizing questions strategically and developing the particular discipline of sitting with the subject long enough, and listening deeply enough, for the real story to surface – the one they didn’t plan to tell you, or maybe didn’t even realize was the story at all.
That journalistic training turned out to be the most useful skill I brought into this work, because it taught me that a good story is less constructed than uncovered. That holds whether you’re eliciting an individual’s story or an organization’s: the story lives in the people doing the work, in the synergy – or lack thereof – between the board and the people on the ground, in the decisions that never made it into the annual report. AI can generate a coherent mission statement. What it can’t do is sit across from your executive director, notice the way she pauses before answering a particular question, and understand that the story lies in that pause.
“My journalistic training turned out to be the most useful skill I brought into this work, because it taught me that a good story is less constructed than uncovered. That takes judgment.”
The journalist, the brand strategist, the editor with 20 years’ experience all know how to ask the right questions, and how to distinguish a reliable source from an unreliable one, an illuminating quotation from filler, and a genuine narrative arc from a list of facts dressed up as one. AI can do genuinely useful work – compressing research, detecting patterns, accelerating the structural tasks that precede the hard work of writing – if prompted by someone equipped to supply context and critical judgment. What it cannot do is substitute for that judgment itself.
The editorial experience and instincts I’m describing are developed over time. Those who possess them know what to leave in, what to leave out, and how to determine what’s missing. That’s why, almost 10 years ago, we saw the value of pairing our brand strategy skills with journalistic and editorial ones, creating a complementary newsroom of sorts through a partnership with our sister agency, Content Writers Group.
For purpose-driven organizations in particular, whose stories are already richly layered and human, this moment represents a genuine competitive advantage – if you choose to claim it. Mission-driven work generates exactly the kind of specificity, human stakes, and values-in-action that cut through the noise. In an algorithmic economy that keeps raising the cost of being heard, and in which AI-generated content keeps raising the volume competing for our attention, the one currency that still compounds – that no platform can throttle and no model can replicate – is trust. Your challenge isn’t finding stories or struggling to earn trust; you already enjoy an embarrassment of riches in both. It’s deciding to rely on living, breathing storytellers to ensure your stories rise above the slop, where your audience can find and embrace them.