
If your Return on Ad Spend has dipped or your Cost Per Acquisition has crept up lately, it might not be your targeting or your spend. It might be Meta itself.
The company’s latest quiet rollout, the Andromeda update, is reshaping how Facebook and Instagram ads perform. And for many advertisers, the early results haven’t been kind.
But what looks like a performance drop is actually the start of something much bigger: a new ad era where creative variety, not manual optimization, decides who wins.
Here’s what’s changing, why it matters, and how you can adapt fast.
Launched by Meta’s engineering team in late 2024, Andromeda is a next-generation personalized ads retrieval engine built on NVIDIA Grace Hopper Superchips and Meta’s own MTIA accelerators.
It powers Advantage+ automation, and it’s designed to do something Meta’s older systems never could: evaluate and serve tens of millions of ad candidates in milliseconds, choosing the most relevant creative for each person.
In simple terms:
Meta has built a new brain that learns from creative diversity, not manual targeting.
According to Meta’s internal data, the system is already delivering early performance improvements.
This isn’t just a back-end refresh. It’s a full retraining of Meta’s ad ecosystem to reward variety, personalization, and creative volume.
If you’ve noticed campaign results declining, you’re not alone. Andromeda’s retrieval system doesn’t learn from narrow targeting or static testing anymore. It learns from creative range.
Instead of optimizing toward one “hero” ad, the algorithm now favors advertisers who can supply it with constant, diverse creative inputs.
In practice, that means the machine decides what works based on how much creative variety you give it.
Or, more simply put:
In 2025, your ad’s biggest performance lever isn’t targeting. It’s variety.
Andromeda marks the beginning of what many in the industry are calling Creative Darwinism: survival of the most adaptive creative.
Meta’s automation is quietly removing the old advantages of manual media-buying tricks and targeting hacks. The algorithm now optimizes in real time, leaving one variable still under human control: the idea itself.
The brands that thrive will be the ones that evolve into high-velocity creative systems that feed Meta’s engine with diverse, consistent, and high-quality variations. The ones that don’t will be left behind by an ecosystem that favors evolution over repetition.
As with all Darwinian shifts, adaptation isn’t optional. It’s survival.
That’s where Pencil comes in.
We built Pencil to help teams produce high-quality creative variety at scale, the exact input Meta’s algorithm now rewards most.
Here’s how you can make that shift:
Ask agents to explore radically different ideas.
A simple prompt like “Look at A2 and suggest something completely different” can unlock entirely new creative directions in seconds, generating dozens of unique variations without needing new briefs or lengthy approvals.
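The fan-out that prompt enables can be sketched in a few lines. This is a hypothetical illustration, not Pencil's actual API: the helper name, the prompt phrasing, and the concept IDs are all assumptions, showing only how a handful of "divergence" prompts multiplies a small set of concepts into many distinct briefs.

```python
from itertools import product

# Hypothetical sketch: pair existing concept IDs with "divergence" prompts
# to fan a small set of ideas out into many distinct creative directions.
# Names and prompt wording are illustrative, not Pencil's real interface.

DIVERGENCE_PROMPTS = [
    "suggest something completely different",
    "flip the tone from playful to serious",
    "reframe the hook around a customer pain point",
    "swap the visual style for bold flat illustration",
]

def fan_out(concept_ids, prompts=DIVERGENCE_PROMPTS):
    """Return one variation brief per (concept, prompt) pair."""
    return [
        f"Look at {cid} and {prompt}"
        for cid, prompt in product(concept_ids, prompts)
    ]

briefs = fan_out(["A1", "A2"])
print(len(briefs))  # 2 concepts x 4 prompts = 8 briefs
```

Even two starting concepts and four prompts yield eight directions, which is the kind of creative volume the retrieval system rewards.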
Creative variety doesn’t mean chaos.
By syncing your brand assets inside Pencil, agents can reference logos, fonts, and imagery to ensure every concept remains visually consistent, even as your creative directions diverge.
Different generative models produce different styles and tones.
In Pencil, you can mix outputs from OpenAI’s Image 1, Google’s Nano-banana, and others to ensure your content always looks fresh.
Meta’s retrieval system thrives on that diversity, and Pencil makes it automatic.
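One simple way to picture that mixing is round-robin routing of briefs across models, so no single model's house style dominates the pool. This is a hedged sketch under stated assumptions: the model identifiers echo the names mentioned above, but the routing function and its behavior are illustrative, not how Pencil actually schedules generations.

```python
from itertools import cycle

# Hypothetical sketch: rotate variation briefs across several image models
# so the resulting creative pool mixes styles rather than repeating one.
# Model identifiers are placeholders, not real API model strings.

MODELS = ["openai-image-1", "google-nano-banana", "another-model"]

def assign_models(briefs, models=MODELS):
    """Pair each brief with a model in round-robin order."""
    return list(zip(briefs, cycle(models)))

jobs = assign_models([f"brief-{i}" for i in range(5)])
print(jobs[3])  # brief-3 wraps back around to the first model
```

The point of the design is the rotation itself: stylistic diversity comes for free from the assignment, before any prompt-level variation is added.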
Meta rewards variety. Pencil makes variety effortless.
Andromeda isn’t just a technical update. It’s a philosophical one.
Meta’s machine learning has automated the old world of manual optimization.
What’s left for advertisers is the creative frontier: a space where imagination becomes performance data.
The winners in this new landscape won’t be the ones tweaking bids or chasing targeting shortcuts. They’ll be the ones bold enough to experiment, diversify, and evolve ideas faster than ever.
We’re moving from human-assisted automation to AI-assisted imagination, and the gap between the two is where the next generation of creative leaders will emerge.