If you are juggling pilots, pressure, and a very human team that deserves clear answers, this recap is for you. Based on our live conversation with Dan Fogelman from Indeed, and our COO Tobias Cummins, here is how a large, mission-driven brand went from cautious trials to a working AI operating model that ships real creative, safely and at speed.
Trust unlocks adoption.
Give people a ring-fenced space to explore multiple models with brand context built in, and keep human reviews mandatory. When the environment feels safe, creative teams lean in.
Dan’s take: Pencil created “a safe space for our creative team to explore and expand,” because the models are ring-fenced and aligned to brand guidelines. The surprise for leaders is how quickly previously “off-limits” models become usable once governed inside Pencil.
“AI may become the producer. Humans are the prompters, reviewers, editors, teachers.”
Win a lane, then widen.
Indeed ran a clear path that everyone could understand: vendor selection, user-acceptance testing with the project management POC, rollout to the creative team, then expansion across marketing. UAT was timeboxed and tied to a few high-value use cases, which made pass-or-fail decisions fast and unambiguous.
The journey took roughly sixteen months from first tests to cross-team rollout. Momentum stayed visible through weekly office hours led by Pencil and internal “side quests” where early adopters shared wins and taught peers. That mix of structure and public wins reduced friction and pulled more teams in.
“Early adopters set the tone. We let them present their wins and adoption followed.”
Training sticks when it fits Monday morning reality.
Enablement should mirror your briefs, channels, and handoffs. Indeed built an internal course and handbook that shows exactly where Pencil fits into the wider tool stack, plus short videos for new joiners, and checklists that live inside existing project steps.
Once training matches how work actually flows, confidence rises. People know where AI enters the process and what good looks like for the brand.
“We developed an internal training program and a handbook so people know where Pencil fits in our larger tool ecosystem.”
Targets live at the task level, not as one giant number.
Indeed set goals at specific stages like concept proofs, variation rounds, and localisation. A simple scorecard tracked time saved per stage and was updated as skills improved. Leadership could see real progress without waiting for a single headline metric.
They set a ballpark target of up to 50% time saved on a few key steps. Some stages hit that early, others needed more runway, which was acceptable because the progress was visible. There was also a creative upside. As Dan put it, AI collapses the cost of proof.
“We can try seasonal concepts or locations without flights or reshoots and decide what works, faster.”
Make frequent model releases useful with a simple activation playbook.
Aggregate models in one place so teams do not tool-hop. Build a habit of same-day activation for meaningful releases. Share simple guidance on when to adopt, when to retire, and how to communicate change so the stack grows without breaking process.
Handled this way, rapid updates feel like momentum, not risk.
“We started with 8 models. We are now at 33. That growth happened in months, and the team’s enthusiasm grew every day.”
AI is a partner, not a replacement. Humans prompt, review, and decide. The win is faster proof at lower cost, which unlocks better choices.
Start with safety, scale in phases, teach the workflow, measure the work, and turn model velocity into an advantage with simple governance.
“We are meticulously optimistic. The mission stays human. AI amplifies it.”
Interested in these five moves? Feel free to adapt them and make them your own.