Google – AI Roadtrip

Brief: An episode on Instagram Reels explains that the two characters are going on a road trip powered by AI. When a fan comments with a location idea, our team uses a purpose-built tool to generate a custom video response within minutes. Over 16 hours, we plan to create as many unique replies as possible.

Approach: The Mill worked with Google Creative Lab to combine Google's AI tools with a custom, cloud-based render pipeline developed by The Mill, capable of creating these films quickly and at scale, all driven by fan suggestions on Instagram. We used a stack of Google AI models to design a tool that balances machine efficiency with human ingenuity, keeping the creative team in the loop at every step of the process. Once the initial AI assets were generated, creators could adjust camera cuts and timing, and add two layers of animation for each phone. Each video was then submitted to a bespoke, automated Unreal Engine render pipeline hosted on Google Cloud. The live campaign ran for 14 hours, creating hundreds of videos featuring fan-suggested places, which were posted back to Google's Instagram account with the user tagged.
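
As a rough mental model of that flow, the stages could be typed like this. This is a minimal sketch only; the stage names and fields are illustrative, not taken from the production tool:

```typescript
// Illustrative pipeline states for one fan reply, end to end.
type PipelineStage =
  | "comment-ingested"   // a fan suggests a location in an Instagram comment
  | "assets-generated"   // AI models produce the initial scene assets
  | "edit-in-review"     // a creator adjusts cuts, timing, and animation layers
  | "queued-for-render"  // job handed to the Unreal Engine render farm
  | "rendered"           // final video produced on a cloud VM
  | "posted";            // reply posted back to Instagram with the fan tagged

interface VideoJob {
  id: string;
  fanHandle: string;     // the commenter to tag in the reply
  location: string;      // the fan-suggested place
  stage: PipelineStage;
}
```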

Impact: The campaign engaged directly with the fan community of #BestPhonesForever, and we hope some of our takeaways inspire you to explore your own creative applications of these technologies.

425 videos rendered
300 videos posted
00:02:15 average time between posts across 16 consecutive hours

How it Works

Non-Linear Sequence Editor

With the scene elements created, we needed to build a way for creators to adjust camera cuts and animations for each phone. We engineered the editing tool to auto-populate a timeline based on the dialog audio, meaning that a creator could theoretically click through the entire tool and still get a viable video. However, the level of control in the tool allowed for a much higher degree of customization for each video.
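
To make the auto-population concrete, here is a minimal sketch of the default behavior, assuming the dialog audio has already been segmented per line. All names and shapes are illustrative, not the production editor's actual data model:

```typescript
// One segment of dialog audio, attributed to the phone that delivers it.
interface DialogSegment {
  speaker: "pixel" | "iphone";
  startSec: number;
  endSec: number;
}

// One camera cut on the timeline.
interface CameraCut {
  cameraId: string;
  startSec: number;
  endSec: number;
}

// Default pass: cut to a close-up of whichever phone is speaking, so the
// timeline is viable with zero clicks. Creators can then override any cut.
function autoPopulateTimeline(segments: DialogSegment[]): CameraCut[] {
  return segments.map((seg) => ({
    cameraId: seg.speaker === "pixel" ? "cam_pixel_cu" : "cam_iphone_cu",
    startSec: seg.startSec,
    endSec: seg.endSec,
  }));
}
```

This is what made "click through the entire tool and still get a viable video" possible: the defaults produce a complete cut list, and manual edits only layer on top of it.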

Automating the Unreal Engine Render Pipeline

Once the scene elements were set and the timings refined in the sequence editor, the data was saved from the web tool and queued for our render farm to process.

The Mill developed a bespoke automation system for Unreal Engine, along with a special queuing system for 30 Windows virtual machines (VMs) running on Google Cloud.
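
A simple FIFO queue is enough to express the idea. The sketch below assumes jobs carry a pointer to the saved editor data; the class and field names are invented for illustration:

```typescript
// A render job as saved from the web tool.
interface RenderJob {
  id: string;
  sceneDataUrl: string; // where a VM fetches the saved editor data
}

// First-in, first-out queue; each VM claims exactly one job at a time.
class RenderQueue {
  private pending: RenderJob[] = [];

  enqueue(job: RenderJob): void {
    this.pending.push(job);
  }

  claimNext(): RenderJob | undefined {
    return this.pending.shift();
  }
}
```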

As render jobs became available, a free VM would pull the data for the next job via its local Node server. This in turn kicked off an automated process that opened an instance of Unreal Engine and loaded our project template. We configured the Unreal Engine project to dynamically load our assets and data, then set up animations and camera timings based on the data it loaded.
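
A per-VM worker loop along these lines would cover the behavior described above. This is a sketch under stated assumptions: a Node 18+ runtime (for the global fetch), a hypothetical queue endpoint, and an invented -JobData argument that the project template would read on startup. UnrealEditor-Cmd.exe and -game are real Unreal Engine entry points, but the exact production invocation is not public:

```typescript
import { spawn } from "node:child_process";

const QUEUE_URL = "http://render-queue.internal/claim"; // hypothetical endpoint

async function workLoop(): Promise<void> {
  for (;;) {
    // Ask the queue for the next job; 204 means nothing is pending yet.
    const res = await fetch(QUEUE_URL, { method: "POST" });
    if (res.status === 204) {
      await new Promise((resolve) => setTimeout(resolve, 5000));
      continue;
    }
    const job = await res.json();

    // Launch Unreal with the project template. The project itself loads the
    // job data and configures animations and camera timings from it.
    await new Promise<void>((resolve, reject) => {
      const ue = spawn("UnrealEditor-Cmd.exe", [
        "C:/Projects/Roadtrip/Roadtrip.uproject", // illustrative path
        "-game",
        `-JobData=${job.sceneDataUrl}`, // custom arg read by the template
      ]);
      ue.on("exit", (code) =>
        code === 0 ? resolve() : reject(new Error(`render failed: ${code}`))
      );
    });
  }
}

workLoop().catch(console.error);
```

Keeping the worker loop on each VM, rather than pushing jobs from a central orchestrator, means a slow or crashed machine simply stops claiming work instead of stalling the whole farm.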

Results

Read more from Google Creative Lab here.

Credits

Client: Google Creative Lab

Design & Experience: The Mill
Technology Director: Adam Smith
Creative Director: Michael Schaeffer
Managing Director: Angela Lupo
Executive Producer: Katie Tricot
Senior Producer: Dara Ó Cairbre
Producer: Michael Reiser
Director of Experience: Jocelyn Birsch
Art Director: Austin Marola
Senior Designer: James Gardner
Designers: Christopher Szeto, Eddie Livingstone, Tyler Scheitlin, David Rowley, Thomas Heckel, Yimeng Sun
Technical Artist: Jayleen Perez
Junior Technical Artist: Jack Chen
Lead Compositor: Heather Keister
Compositor: T’Naige Wallace
Animation: John Wilson
Developers: Joshua Tulloch, Keith Hoffmann, Jeffrey Gray, Chad Drobish, Dave Riegler
UX Designer: Annie BanYard (Keogh)
QA: Alexis Zerafa, Kristen Lawrence

Colour: The Mill
Colourist: Ashley Ayarza Woods
Colour Executive Producer: Alexandra Lubrano
Colour Producer: Colleen Valentino
Colour Assist: Amonnie Nicolas, Nick Yelesin
Colour Coordinator: Joanne Lee

Partner: Left Field Labs
