AI & Marketing Strategy · April 2026
I picked up the Wall Street Journal this morning and found two stories that had no business being on separate pages. One was about brands rushing to put "No AI" disclaimers on their content. Aerie made it a whole campaign. The other was about a company called Eko running a warehouse in Bentonville, Arkansas, where hand models, food stylists, and former theater directors spend their days photographing products so that AI can work with the images downstream. And the Washington Post that same morning ran photographs of a proposed Trump Presidential Library with a big label on each one: "AI Generated."
I am not alarmed by any of this. I have been in marketing long enough to remember when "all natural" meant nothing legally and everything commercially. "No AI" is headed the same direction. It is a positioning move, and positioning moves fade when the novelty does.
What interests me more is the question nobody in these stories is asking directly: which parts of marketing actually need a human, and which parts have we just assumed do?
I Read That Story Differently Than You Might
When I saw the AI disclaimer piece, my first thought was not "brands are protecting authenticity." It was "brands are responding to a specific fear that is real for some people and irrelevant for others."
If you are a photographer or a production artist, AI-generated imagery is an existential conversation. I get that. But I am not reading it from that seat. I have spent 25 years trying to get marketing organizations to say something worth saying. Most of the content-quality problems I have seen in my career have had nothing to do with the tools people used. They have come from the absence of a clear perspective behind the work.
That has not changed. AI has just made it faster to produce content that reveals the absence of a perspective. The volume goes up. The signal does not.
The Eko Story Is the One Worth Paying Attention To
Eko runs what it calls a "capture factory" in Bentonville. Hundreds of employees photograph products for Walmart, Best Buy, and Target, from every angle, on movie-studio-style stages, with lighting adjusted by hand and fingerprints wiped off metal surfaces before each shot. Walmart has put more than $300 million into Eko since 2018. Creating an Eko file takes 10 minutes for a bottle of vitamins and half a day for a large refrigerator.
This is the part I want marketing people to sit with: Eko is not replacing humans with AI. Eko is using humans to make AI reliable. The hand model is still there. The food stylist is still there. They just moved earlier in the process, into the specification work that makes everything downstream accurate.
I have been saying for a while that Adobe, with Nvidia's infrastructure underneath Firefly, can produce specification-accurate creative more consistently than a human production team on a deadline. I believe that. Product imagery, templated assets, regulated content where accuracy is the whole job, that is where AI earns its place and does it better.
But that is a fundamentally different job from writing an opinion, crafting a campaign that needs to connect with a specific cultural moment, or putting a message together that has to shift how someone thinks. You cannot specify your way to that. You need someone with a real point of view, one that comes from actual experience and changes when the situation changes.
Slop Is Not an AI Problem. It Is a Brief Problem.
The fear driving the "No AI" movement is mostly a fear of slop: generic, undifferentiated content that fills every channel and says nothing. That fear is legitimate. But slop was not invented by AI. Ask a junior writer to produce content with no real brief behind it and you get the same result, just slower and more expensive.
The quality problem in AI-generated content is a specification problem. If the brief has no genuine perspective in it, the output will not either. AI is efficient at revealing that. It used to take longer to discover you had nothing to say. Now you find out faster.
That is actually useful information, if you are willing to hear it.
What I Would Actually Do
If I were advising a marketing team right now, I would not start with a disclosure policy. I would start with an honest inventory of the content the team produces and sort it into two buckets.
Bucket one: work where voice, perspective, and the ability to change your mind in response to what is happening in the world actually matter. Opinion pieces. Campaign concepts built around a specific cultural tension. Messaging that has to move a specific person in a specific situation. That work needs a human author. Not because AI cannot generate words, but because the words have to come from somewhere real, and "somewhere real" is not a prompt.
Bucket two: work that is fundamentally specification-driven. Product imagery. Templated assets. Content that needs to be accurate, consistent, and produced at volume. That is where AI belongs, with good human specification up front, exactly the way Eko does it.
Most organizations I have seen are not doing this audit. They are either defending all-human production on principle, or they are deploying AI everywhere without asking whether the work actually requires a human perspective. Both are mistakes. They just point in opposite directions.
Go through your content calendar and ask honestly: which of these pieces requires someone to have actually thought something, and which ones are really just production tasks dressed up as creative work?
The answer will probably surprise you. A lot of what we call "creative" is specification work in disguise. And some of what we treat as production is the stuff that actually shapes how a buyer thinks about you.
Figure out which is which. Then put the right tool on the right job. The label on the content matters a lot less than the thinking behind it.
Coffee, Patrick. "Brands' New Authenticity Flex: 'No AI' Advertising Disclaimers." The Wall Street Journal, 7 Apr. 2026, p. B1.
Nassauer, Sarah. "Company Is Working to Create AI-Ready Catalog of All We Buy." The Wall Street Journal, 7 Apr. 2026, p. B5.
