dollar·commerce
Platforms

If Meta's falling behind in creative generation, where will it look to next?

Andrew Watson·December 17, 2025

Two recent acquisitions are easy to wave away as product housekeeping: Figma buying Weavy AI ($200M) and Canva buying Magic Brief (undisclosed). That story is convenient, but incomplete.

What these deals actually signal is a shift in where advertising decisions are made: upstream, in the tools where creative is conceived, structured and deployed. Design used to sit safely upstream, exported and interpreted elsewhere, but Canva now lets founders publish directly to Meta, which is convenient if you’re doing the dog work yourself.

If design tools can also pull performance data back into the same environment, the question becomes less about workflow and more about ownership. Who actually owns the most important part of the ad stack now?

The value is shifting to whoever owns the creative feedback loop

Weavy sits closer to execution than most design tools. It lets a prompt engineer or creative operator run requests through multiple AI models at once, compare outputs and iterate quickly on ad-ready assets. In practice, this compresses what used to be slow, manual creative work into something closer to real-time decision making, without the usual back-and-forth.

Magic Brief lives further upstream. It captures briefs generated by designers, competitor brand references and the thinking behind why an ad should exist before anything is produced. On the surface, these look like sensible feature expansions. Together, they map both ends of the creative decision process, from early intent to final output.

That distinction matters more than the tools themselves. Design platforms now see creative intent before it becomes performance. They see what founders ask for, what angles get explored and which ideas quietly die in the first round. None of this shows up in Ads Manager, at least not in the same way. By the time an ad launches, the most important filtering has already happened, usually in a design file or a prompt someone wrote in a rush. Sure, ads still largely live or die in the creative testing phase of an ad account, but none of the ideation happens in Ads Manager.

Why is this a cleaner signal than performance data? Because performance explains what worked after money was spent. Intent shows what people believed might work before committing. If you are trying to predict outcomes, not just explain them after the fact, that difference matters.

Once intent data is paired with deployment and feedback, the loop tightens. Prompts inform assets. Assets inform launches. Results flow back into the same environment. Over time, the tool doesn’t just help you make ads. It starts to suggest what deserves to exist, which is where things stop being purely about workflow.

From an investor perspective, this is the shift. Whoever owns that feedback loop doesn’t need to replace ad platforms. They just need to influence what gets tested next. And in a world where velocity beats optimisation, that influence compounds quickly, whether anyone calls it out or not.

You could always just…buy it?

For a long time, Meta didn’t need to see how ads were made. It only needed to see what happened after they launched. Creative went in, performance came out, and the system learned from the chaos in between. Even if brands didn’t understand why something worked, Meta usually did, or at least knew enough to suggest what to try next.

That advantage starts to erode when creative decisions happen upstream. If prompts are engineered, variations filtered and assets effectively pre-optimised before deployment, Meta only ever sees the final output. It learns from results, but it never sees the rejected ideas, the near misses or the creative intent that shaped the work. That context now lives somewhere else.

This doesn’t break Meta’s AI. But it does narrow its field of vision. The platform learns from what is launched, not from what was considered. Which may explain why Meta suddenly appears very interested in creative tooling again, and why Mark Zuckerberg is reportedly offering $100M+ AI compensation packages, as if talent alone might close a data gap.

This isn’t unique to Meta. Even OpenAI has acknowledged a quiet “code red”, effectively conceding that Google’s Gemini models are developing faster and producing more accurate outputs in certain areas. When the people building the models admit the pace is uncomfortable, it’s usually because something structural is shifting.

If Meta can’t observe the creative feedback loop upstream, the incentive becomes obvious. Either integrate tightly with the tools that already own it (Canva’s deployment features could help here if Meta gains access to the creation phase), or pull that loop back inside the platform entirely by acquiring them. Auto-enhancement features, in-platform asset creation and increasingly opinionated suggestions all point in the same direction.

If that isn’t enough, a more familiar outcome is also possible: a wave of acquisitions in early-stage creative platforms like Motion, Foreplay and others, echoing the Shopify SaaS buying spree of 2021-22, likely at enthusiastic valuations. After all, due diligence in the traditional sense is a little ‘off-brand’ in e-commerce, I’ve come to realise.

Originally published on Substack.