dollar·commerce
Platforms

Would Miranda Priestly be out of a job in five years’ time?

Andrew Watson·September 3, 2025

“How good is AI at generating creative?” The magic question we get asked all the time. In truth, this is a very open-ended question, because there are things it does well and others it does terribly. Understandably, brands are still a little nervous about handing over the keys to their creative kingdom before having full faith that it can both create and adapt imagery to stay on-brand.

If I’m the creative director at American Eagle and I ask GPT-5, “Create an image for a jeans campaign with a trending actress. It needs to be cheeky, play on words, be a little controversial, and have a high chance of going viral,” would I suddenly end up with a Sydney Sweeney ad? I’m not sure the technology has that much character to build from inception.

Great at doing, not at dreaming: AI’s creative paradox

Within Advantage+ Shopping Campaigns (ASC), brands can toggle Meta’s AI enhancements on or off, keeping full control of their imagery and creative if they choose. But like any advanced LLM, the more data you feed the system, the better it becomes at aligning output with expectations. So what happens if too many brands hold back and insist on manual control? In effect, it slows Meta’s ability to learn which creative edits and enhancements drive clicks and engagement within a given category.
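For brands that do want to keep control, the opt-out can also be set programmatically. A minimal sketch, assuming the Marketing API’s documented `degrees_of_freedom_spec` structure for standard enhancements; the page ID, image hash, and link below are placeholders, and `build_creative_payload` is a hypothetical helper, not a Meta SDK function:

```python
import json

def build_creative_payload(page_id: str, image_hash: str) -> dict:
    """Build an ad-creative payload with Meta's AI 'standard enhancements'
    opted out. Placeholder values throughout; POST this to the
    /act_<AD_ACCOUNT_ID>/adcreatives Graph API edge in practice."""
    return {
        "object_story_spec": {
            "page_id": page_id,
            "link_data": {
                "image_hash": image_hash,
                "link": "https://example.com",  # placeholder destination
            },
        },
        # Opt this creative out of automatic enhancements (image touch-ups,
        # text variations, and similar AI edits).
        "degrees_of_freedom_spec": {
            "creative_features_spec": {
                "standard_enhancements": {"enroll_status": "OPT_OUT"}
            }
        },
    }

payload = build_creative_payload("<PAGE_ID>", "<IMAGE_HASH>")
print(json.dumps(payload["degrees_of_freedom_spec"], indent=2))
```

The trade-off described above still applies: opting out keeps the imagery untouched, but it also withholds the engagement signal Meta would otherwise learn from.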

Zuckerberg has already hinted at the long game. In a recent interview, he explained that the vision for Facebook ads is a fully end-to-end solution inside Business Manager: you set the budget, connect your site, and the engine does the rest. That future implies creative will eventually be auto-generated directly from whatever assets Meta can fetch from your site, spinning them into ad variations it believes will convert.

It’s hard to imagine a world where design-driven brands will rely on software to conceptualize campaigns. Sure, we’ve had CGI for decades, so producing polished visuals isn’t the challenge. The real hurdle is ideation, the kind of cultural intuition that gave us “Diamonds are a girl’s best friend” or “Just Do It.” If Meta’s creative engine still struggles to get the basics right in everyday categories, how realistic is it to expect it will soon evolve into a fully fledged, end-to-end automated marketing machine? Hopefully, we still have time.

‘WTF is going on with this ad’

I got an email from a wonderful client last week that could be summed up as: “WTF is this?!” The brand is in the fashion space, naturally hyper-conscious of branding, messaging, and every visual detail you’d expect from a fashion house. The issue? Meta’s Advantage+ campaign had taken their static model shots and “enhanced” them into auto-generated GIFs. Faces were altered, expressions shifted, and in one case the movement was so unnatural it looked like the model was having a stroke on camera. So yes, the WTF was more than justified.

Even with auto-enhancements turned off and the creative shifted into a non-ASC prospecting campaign, the “bug” persisted. Sure, the quick fix is to kill the ad, but that leaves you with a string of nagging questions: Am I the only one seeing this placement? What if this distorted GIF is the one actually driving sales? Are customers seeing it alongside the statics? What if the model sees it? And do I really pull the plug, knowing I spent thousands of dollars on that shoot?
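One practical way to answer “am I the only one seeing this?” is to pull per-placement previews of the live ad instead of trusting a single Ads Manager view. A sketch, assuming the Graph API’s `/previews` edge on an ad object and its documented `ad_format` values; the ad ID and token are placeholders, and `build_preview_requests` is a hypothetical helper:

```python
# Placement formats to audit; these follow the Marketing API's preview enum.
PREVIEW_FORMATS = [
    "DESKTOP_FEED_STANDARD",
    "MOBILE_FEED_STANDARD",
    "INSTAGRAM_STANDARD",
    "INSTAGRAM_STORY",
]

def build_preview_requests(ad_id: str, access_token: str) -> list:
    """Return one GET request spec per placement format, for fetching what
    Meta actually renders on each surface (e.g. with requests.get)."""
    return [
        {
            "url": f"https://graph.facebook.com/v19.0/{ad_id}/previews",
            "params": {"ad_format": fmt, "access_token": access_token},
        }
        for fmt in PREVIEW_FORMATS
    ]

for req in build_preview_requests("<AD_ID>", "<ACCESS_TOKEN>"):
    print(req["params"]["ad_format"], "->", req["url"])
```

It won’t catch every variation the delivery system generates, but it at least surfaces what each major placement is being served before you decide whether to pull the plug.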

For me, this raises a bigger question. When a brand shoots content, there are image rights, usage rights, contractual obligations, and clear limitations on how that content can be adapted, especially when it involves altering someone’s face for an ad. If you’re selling coffee beans, you’re unlikely to face much pushback. But if you’re working with models whose appearance is their livelihood, the stakes are entirely different. How long before we see models or brands suing Meta for “enhancements” that distort a person’s likeness?

What’s becoming increasingly clear, especially in conversations with media buyers and agency partners, is that Meta is more than willing to sacrifice brand integrity in its race toward full automation. The LLM is already in overdrive, and there’s no slowing it down. In reality, models and brands will need to start building contract clauses that protect them from AI-generated “enhancements,” especially if ad engines begin creating content that was never actually shot on camera. As it stands, until Mark’s master plan comes to life, brands will be left battling bugs and hitting speed bumps along the way.

Meet UpSpring — Creative Strategy Software

Ever wondered which moments of your video actually keep customers watching? Or which hooks are driving your competitors’ top-performing ads?

As media buyers, we know the strongest ad engines are built on feedback loops, giving strategists the tools to learn, analyze, and iterate on creative concepts quickly. That’s where UpSpring comes in. Think of it as Motion on steroids: a platform built to surface deep creative insights that Ads Manager alone just can’t provide.

For any creative strategist or brand struggling to get beyond surface-level metrics, UpSpring makes it easy to uncover what really moves the needle and turn those insights into stronger, higher-converting ads.

Originally published on Substack.