
There's nothing left to prompt
Notes from a Friday night in Venice with 80+ directors, filmmakers, and studio veterans on what an AI animation pipeline actually looks like in production. Director Colin Brady (Pixar, ILM) and Floyo broke down the Star Rifter pipeline shot by shot. Director-first, all the way through.
TL;DR
A real AI animation pipeline doesn't run on text prompts. On Star Rifter, every creative choice came from a director and team of artists before a single frame was rendered. Acting performances captured on phones, 3D camera blocking in Maya, hand-refined character designs, master keyframes that locked the look. Text prompts were barely necessary. The control surface was performance, not language. Floyo gave the team the shared workflow layer that made the pipeline runnable across production.
Here's a one-minute recap of the night.
A couple of weeks back we filled a small room in Venice with 80+ directors, filmmakers, producers, studio heads, VFX artists, and creative directors. Disney, DreamWorks, Nickelodeon, Paramount, Studio 51, IW Group, and a handful of independent shops. People who've built careers on stories that hold up. Co-hosted with our friends at Alibaba Cloud / Wan.
We kept the room small on purpose. The conversation was the point. The questions in this room are too important to answer over a microphone to a thousand people half-listening on a livestream.
Colin Brady and I broke down the Star Rifter pipeline shot by shot. Pre-production through delivery. What it actually looked like to make an animated short with an AI-driven render layer at the center.
The takeaway I want traveling beyond that room:
Artists have far more creative control over an AI pipeline than most conversations would have you believe.

The hour before we started. Old colleagues, new collaborators, a lot of catching up.
On Star Rifter, every creative choice in every step of the art development process came from a director and a team of artists before a single frame was rendered. Acting performances captured on phones. 3D camera blocking in Maya. Trained character models built from hand-refined designs. Master keyframes that locked the look before animation began.
Colin acted out most of the shots himself and recruited his daughter and a buddy for the rest. He built a puppet to choreograph Mooch, the alien sidekick.
When the creative is driving every aspect, from character and environment design to camera moves to body, face, and eye performance, there's nothing left to prompt. The AI handled rendering, simulation, and environmental detail. Humans handled the soul.
The story we hear in the news is about people typing descriptions into a text box and accepting whatever comes back. That's not a pipeline. That's a slot machine.
What we're building, and what creators like Colin are validating in production, treats AI as a render layer. A new kind of execution muscle for choices that still come from humans with taste, instinct, and decades of feature animation experience.

The room in Venice. Eighty seats, no livestream, no back row.
Some of the most interesting moments of the night were the small ones. Colin pointed out that the closer the camera pushes in, the more detail Wan adds. Everything we learned about CG over the last 30 years is the opposite. He called it "the upside down." We're also rebuilding Pixar-era animation tooling inside an AI pipeline (eye targets, performance isolation, lip-sync cleanup) because the models don't ship with that level of control yet. The same problems Colin and his peers solved in a 3D pipeline 30 years ago are problems we're solving again, with new code, sitting right next to the model.
The room agreed. The room also pushed back. We talked openly about what gets harder, what gets faster, what feels different on the floor with junior artists, what doesn't translate yet, where the seams still show. Someone asked if this was just a different form of motion capture. Colin's answer was honest: yes, kind of, plus a different form of modeling, and cloth, and particles, and hair, and rendering. The same old pipeline, with everything inside it changing at once.
A bigger question kept surfacing: can a film that used to cost $100M get made for $10M without losing the soul? Animation has been leaving LA for years, well before AI showed up. If the answer is yes, it's not because tools replaced people. It's because directors with real instincts get to make more, smaller, better. And the artists, animators, story people, and gag writers we've been losing get put back to work driving the pipeline.
What stays human when the toolset changes overnight? More than you'd think, if the director is in the chair the whole way through.

More rooms like this coming. Thanks to everyone who showed up.
The full Star Rifter case study, including runnable workflows you can try yourself, lives here.
Build your first production pipeline this week.
Browse production-ready workflows or talk to the Floyo team about a custom enterprise setup.
Matt Shih
Co-founder and Creative Director at Floyo
20+ years of creative experience in advertising and production. Has designed AI production pipelines for animated shorts, commercial campaigns, and studio teams shipping work for major film and advertising clients.
Last updated: March 27, 2026