How generative AI can help advertising agencies pitch 

Saatchi & Saatchi London senior creative Avani Maan explains how generative AI has transformed the agency's pitch process.

This piece is going to age terribly. As I write this, Google has just updated Gemini AI. You can now feed it an hour of video and it’ll understand the entire thing. Every frame, plotline, word of dialogue. It’s seriously impressive, and it captivated the internet… for about an hour. That’s all OpenAI gave Google before announcing its text-to-video platform, Sora, which instantly made everything else feel outdated.

They serve very different purposes, but the rate of innovation keeps ramping up. What’s new today will be old in a week. And we’ll get used to it quickly. 

We’re already getting bored. 

We’ve had our fill of making people look like Shrek, turning emails into Shakespearean prose and generating songs about fruit in the style of Eminem. 

But now that some of the initial wonder has gone out of AI (and before it becomes sentient), we can focus on what it really is: a tool.

And it’s perfect for pitching.

Back in the day 

Last year, generative AI began to have a transformative effect on how we pitch.

While there are some very serious conversations to be had around the legality of using some of these models in a production setting, they're perfect for helping to conceptualise an idea and bring it to life. Then we can craft, work with talented humans, and do it all for real in production.

We used generative AI to quickly build out entire campaign worlds for pitches. We could show what a Christmas store window would look like if it were filled with carnivorous plants. Generate images of scenes from the script that perfectly reflected what we had in our heads. And just generally have fun bringing a lot of different ideas to life and seeing how they panned out.

All of that's possible without AI (we managed to do our jobs pre-2023), but the speed and agility it affords have made it an invaluable tool. It also opens up the playing field. A copywriter who can't draw can write a prompt. Endless Google Image searching can come to an end. And sharing the visual workload frees up designers to focus their energy on what they do best: design.

A good idea should stand on its own without all the bells and whistles. But in a pitch situation, it certainly doesn’t hurt.

Cool… so you used an image generator… 

See, we’re bored already. Things move fast. So what’s next for AI and pitching?

Ask your data to explain itself 

It's going to be a lot easier (and, dare I say it, more fun) to engage with data and use it to our advantage. We can throw just about anything into GPT-4 to learn more about the pitch client's business, combining the agency's own research with industry reports, economic data, social media posts… And then we can just start chatting with it. We can literally ask the data to explain itself.

The AI can perform sentiment analysis, plot graphs, uncover trends, and if that's not exciting enough, it can explain it all as if it were a Fast & Furious movie. (Maybe I'm not actually ready to let the early wonder go.) With a bit of trial and error, it's surprising how many new insights appear that can meaningfully inform the work and the pitch platform. It's definitely worth getting the data team to double-check Vin Diesel's scatterplot on shifting consumer sentiment in the confectionery aisle, though.
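For anyone curious about what that looks like outside the chat window, here's a minimal sketch of the workflow using OpenAI's Python SDK. Everything specific in it (the file names, the model name, the prompt) is a placeholder assumption rather than a recipe, and the same result can be had with no code at all by uploading the files in the ChatGPT interface and asking questions in plain English.

```python
# A minimal sketch of "chatting with the pitch data" via OpenAI's Python SDK.
# The file names, model name and prompt below are placeholder assumptions.
from pathlib import Path

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Combine the agency's research into one blob of context (hypothetical files)
sources = ["agency_research.txt", "industry_report.txt", "social_posts.txt"]
context = "\n\n".join(Path(f).read_text() for f in sources)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any capable chat model would do
    messages=[
        {"role": "system", "content": "You are a pitch strategist."},
        {
            "role": "user",
            "content": (
                "Here is everything we know about the client:\n\n" + context
                + "\n\nSummarise overall consumer sentiment, flag any trends, "
                "and explain it all as if it were a Fast & Furious movie."
            ),
        },
    ],
)
print(response.choices[0].message.content)
```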

The pitch film 

With text-to-video tools we can rethink the pitch film. Instead of trawling for found footage, we can just conjure up exactly what we want. (And IT departments can breathe a collective sigh of relief as creatives spend less time on sites like www.youtube-downloader-free.ru). 

This won’t always make sense. Found footage isn’t going anywhere. A fake video of Greece masquerading as a real one probably won’t help win the tourism pitch. Authenticity is important. But if the pitch idea is less grounded in real life, we’ll have more options at our disposal. 

If there’s a certain character that’s central to the idea, the film could feature a version of them in the various scenes. They could talk to camera instead of having a VO narrator. We could visualise things that would normally require extensive VFX, like how a city would look after years of climate change. Or we could do away with montages and instead generate a single one-shot take, no editing or cutting between scenes required. 

Taking it a step further, we could even generate key scenes from a proposed TVC script. Or the entire thing. 

It raises some interesting questions. Could showing too much detail during a pitch backfire and restrict us later, or could it help to sell the vision and win the business?

And if Alan Partridge used AI for his pitches to the BBC, would the world have monkey tennis by now? 

Whatever happens, it’s going to happen fast. And this will all be old.

Want more insights into the biggest challenges and opportunities facing the ad industry in 2024? 

Join us at Contagious Live in London on 14 March, as we break down the findings from this year's Radar Report, which combines survey responses from more than 100 industry executives with in-depth interviews with leading creatives, strategists and marketers.

We’ll also be chatting with:

  • Tom Roach / VP of brand strategy, Jellyfish
  • Emma Perkins / head of LEGO agency
  • Lucy Jameson / founder and CSO of Uncommon Creative Studio
  • Aditya Kishore / insight director, WARC

You’ll also learn about some of the best campaigns from around the world in our rapid-fire Pitch Battle. And, of course, there will be pizza and beer and the opportunity to mingle with like-minded peers.

Tickets are just £35. Get yours here.


