I understand the predicament (and the benefits). People are now required to produce more with less, using the new tools available. With AI, it’s easier than ever to turn out highly technical, sophisticated, genuinely interesting videos and writing; of course the imperative to create more of that content has increased. Given the pressures, lots of brands are choosing to publish at less than 100% quality—and I get that, too.

That said, I want to make the argument for maintaining that high quality bar for publication, despite the time and energy it takes. It’s not a character flaw to use AI tools to help draft content. Far from it! But I think it’s worth keeping your final draft standard high (even if it means outsourcing, or publishing at lower volumes) for a few key reasons.

First, it’s a relationship issue. As your audience, I want to feel that you respect my time and attention enough to thoroughly QA your work.

More importantly, it’s a results issue. AI drafts, without rigorous human direction and QA, tend to be somewhat generic and easily lost in the sea of information out there. The now-ubiquitous AI “tells” (e.g., cliché and hyperbole, repetition, overuse of certain sentence structures, inaccurate information, circular logic) are quality problems, some more severe than others. These AI-related quality problems make it needlessly hard for audiences to trust the content, which in turn makes it harder for the content to do what it’s meant to do. And ineffectiveness is a fatal flaw.

Content can certainly get results without being trustworthy, but not the durable results that most business leaders want. I’m thinking here about deepfakes, click/ragebait, and AI slop. That stuff is pretty darn effective at what it’s supposed to do, which is to grab attention. It’s easy to act on whatever we just saw without stopping to interrogate it, especially when we’re tired, stressed out, or in pain (and we’re all at least one of these things at least some of the time). Billions of dollars have been made on that human tendency.

It’s uncomfortable to know that our brains are wired this way, but we’re not all reactive lizard brain. We have compassion and discernment, unlike AI. And fortunately, we’re the deciders, using content to help us with knowledge-sharing, analysis, and messaging. Unless we cede that decider role, content—and the business results it gets—has us behind it, making judgment calls about intent, method, and standards for success. That’s a good thing. It means accountability is still possible, which drives trust, which drives results.

Keeping that 100% quality bar is worth it because it builds trust, and in the end, it seems to me that businesses that put trust first have a higher chance of thriving—no matter how sophisticated the tools become.

What do you think? I’m at kreilly@steyer.net and I’d love to hear from you. Does it bother you to see AI tells in professional content? Or is the standard changing? Are the AI models getting so good at writing and revising that the tells are vanishing?

Thanks,
Katelyn

Photo by Arthur A on Unsplash