Nine days ago we had a screenplay and nothing else. Today, all seven main characters exist in three dimensions.
They're not production-ready. Some of them are barely presentable. But they're real geometry with real textures, and you can rotate them and look at them from any angle. For a project that was entirely 2D a week ago, that feels significant.
The 3D Generation Pipeline
We tried three different AI tools for generating 3D models from our approved character turnarounds. Each one has strengths and weaknesses that weren't obvious until we actually ran characters through them.
TRELLIS was our first attempt. We fed it the approved turnaround sheets for Gabe, Nina, Mia, and Leo. The results for the adults were decent—recognizable characters with reasonable proportions. Gabe came out looking like himself, which after weeks of fighting character consistency in 2D felt like a minor miracle.
But TRELLIS struggled with the kids. Leo in particular came out looking flat. Not flat in the artistic sense—literally flat, like someone had taken a 3D character and squished it to half its depth. His dinosaur pajamas were there, the plush T-Rex was there, but the whole model looked like it had been run over by a steamroller. Side view? Barely a silhouette.
Hunyuan3D handled Ruben, Jenny, and Jetplane. Tencent's tool produced better depth than TRELLIS for characters with unusual proportions. Ruben's rumpled janitor uniform actually had folds and volume. Jetplane's chunky dinosaur body came out with proper roundness. Jenny was serviceable but a bit generic—the personality from the 2D design didn't fully translate.
Meshy became our rescue tool for Leo. After the TRELLIS flat-Leo disaster, we set up a Meshy API pipeline specifically to regenerate him. Meshy's approach to image-to-3D produced a model with actual depth. Not perfect, but at least recognizably three-dimensional. We also used Meshy to add PBR textures to the Gabe and Nina models—proper diffuse, roughness, and normal maps instead of the basic vertex colors TRELLIS gave us.
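The Meshy step can be sketched roughly as below. This is a hedged illustration, not our actual pipeline code: the endpoint path, field names (`image_url`, `enable_pbr`), and the `"result"` key are assumptions about Meshy's REST API, and `MESHY_API` / `API_KEY` are placeholders.

```python
import json
import urllib.request

# Assumed Meshy image-to-3D endpoint; check current API docs before use.
MESHY_API = "https://api.meshy.ai/v1/image-to-3d"

def build_task(image_url: str, enable_pbr: bool = True) -> dict:
    """Build the JSON body for one image-to-3D generation task.
    enable_pbr asks for diffuse/roughness/normal maps instead of
    plain vertex colors (field name is an assumption)."""
    return {"image_url": image_url, "enable_pbr": enable_pbr}

def submit_task(api_key: str, image_url: str) -> str:
    """POST the task and return a task id to poll (network call)."""
    body = json.dumps(build_task(image_url)).encode()
    req = urllib.request.Request(
        MESHY_API,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["result"]  # assumed response shape

# Example (requires a real key):
# task_id = submit_task(API_KEY, "https://example.com/leo_front.png")
```

Generation is asynchronous, so the real pipeline also polls the task until the model is ready before downloading the mesh and textures.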
Why Leo Came Out Flat
This is worth explaining because it's a trap that anyone doing AI 3D generation will hit.
Turnaround sheets are designed to show a character from multiple angles. Front, three-quarter, side, back. The AI 3D tools use these images to reconstruct the character's shape. But when a character is small—like a five-year-old in pajamas—and the turnaround has less detail per view, the AI doesn't get enough signal about the depth dimension.
Adult characters have more surface detail. Clothing wrinkles, body contours, accessories. The AI can infer depth from these cues. A kid in simple pajamas? Not much to work with. The model defaulted to the most conservative interpretation: mostly flat, with the front view dominating.
The fix was switching to Meshy, which handles low-detail inputs more gracefully, and giving it additional context—not just the turnaround but also reference images from the storyboards showing Leo from various angles in actual scenes.
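Widening the reference set looks something like this. The filenames and naming convention are hypothetical, not our actual asset layout; the point is the ordering: canonical turnaround views first, storyboard appearances after, so the approved design dominates the reconstruction.

```python
def build_reference_set(character: str, filenames: list[str]) -> list[str]:
    """Pick every image that shows the character: turnaround views
    first (canonical), then storyboard panels that include them.
    Filename patterns here are illustrative assumptions."""
    turnarounds = [f for f in filenames
                   if f.startswith(f"turnaround_{character}")]
    storyboards = [f for f in filenames
                   if f.startswith("storyboard_") and character in f]
    return turnarounds + storyboards
```

For Leo, this roughly doubled the depth cues the generator had to work with compared to the turnaround sheet alone.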
The Storyboard Consistency Push
While the 3D pipeline was running, we also completed a full consistency audit of Act 1. This was the big follow-up to character lock. Every panel in scenes 1 through 10 got checked against the approved turnarounds, and panels with off-model characters got regenerated.
The numbers: 27 panels regenerated in Scene 1, plus additional panels across scenes 2-10. Each regeneration used the approved turnaround as an image-to-image reference, so characters actually look like themselves now. The Scene 1 Panel 07 fix was notable: the original panel contained three extra kids who shouldn't have existed. The AI had hallucinated additional children into the scene.
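The audit-to-regeneration step can be sketched like this. Everything here is a stand-in: the audit dict shape, the turnaround paths, and the job format are hypothetical, since the post doesn't show the actual image-to-image backend.

```python
# Illustrative paths; the real project stores turnarounds elsewhere.
TURNAROUNDS = {
    "gabe": "turnarounds/gabe.png",
    "nina": "turnarounds/nina.png",
    "leo": "turnarounds/leo.png",
}

def regeneration_jobs(audit: dict[str, list[str]]) -> list[dict]:
    """Turn an audit result ({panel id: off-model characters}) into
    image-to-image jobs, one per flagged panel, with each off-model
    character's approved turnaround attached as a reference."""
    jobs = []
    for panel, characters in sorted(audit.items()):
        refs = [TURNAROUNDS[c] for c in characters if c in TURNAROUNDS]
        if refs:
            jobs.append({"panel": panel, "references": refs})
    return jobs
```

Each job then goes to the generator with the turnaround(s) as conditioning images, which is what keeps the regenerated panels on-model.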
Instagram Strategy
We also spent time today on something we'd been putting off: social media. Specifically, planning the Instagram presence for @rex_the_movie.
The strategy is built around showing the process, not just the results. Behind-the-scenes content of AI generation, before-and-after comparisons, character evolution posts, and short video clips of the production pipeline in action. People are curious about AI filmmaking, and showing the messy reality is more interesting than polished marketing.
We also built posting automation using the Instagram Graph API, so the assistant director can schedule and publish content without manual intervention. The idea is that as production generates assets, the best ones automatically get queued for social media with appropriate captions.
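Publishing through the Graph API is a two-step flow: create a media container, then publish it. The sketch below shows that flow under stated assumptions: the API version, `GRAPH` base URL, and account id are placeholders, and this is a simplified single-image path, not our full scheduler.

```python
import json
import urllib.parse
import urllib.request

# Placeholder base URL; pin the Graph API version you actually target.
GRAPH = "https://graph.facebook.com/v19.0"

def container_params(caption: str, image_url: str, token: str) -> dict:
    """Step 1 body: create a media container for a single image."""
    return {"image_url": image_url, "caption": caption,
            "access_token": token}

def publish_params(creation_id: str, token: str) -> dict:
    """Step 2 body: publish the container returned by step 1."""
    return {"creation_id": creation_id, "access_token": token}

def post(url: str, params: dict) -> dict:
    """POST form-encoded params and decode the JSON reply (network)."""
    data = urllib.parse.urlencode(params).encode()
    req = urllib.request.Request(url, data=data)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def publish_image(ig_user_id: str, token: str,
                  image_url: str, caption: str) -> str:
    """Create a container, then publish it; returns the media id."""
    container = post(f"{GRAPH}/{ig_user_id}/media",
                     container_params(caption, image_url, token))
    published = post(f"{GRAPH}/{ig_user_id}/media_publish",
                     publish_params(container["id"], token))
    return published["id"]
```

The scheduling layer sits on top of this: queued assets get captions attached, then `publish_image` fires at the scheduled time.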
Whether this actually works in practice remains to be seen. Automated posting is easy. Automated good taste is hard.
Production Page Update
The production page on the website now has a 3D character models section. It's basic—showing the first-pass models for each character—but it gives visitors a sense of where the project is headed. We went from screenplay to storyboards to 2D characters to 3D models in nine days. The pace is real.
What's Next
The models need work. Leo needs another pass. Everyone needs better textures. And we haven't even started rigging—adding the internal skeletons that let characters move and be animated. That's the next major milestone.
But we've crossed a threshold. The characters exist in a form that can eventually become animated scenes. The pipeline from concept art to turnaround to storyboard to 3D model is a real pipeline now, not a wishlist. Each step feeds into the next.
Day nine. Seven characters in 3D. The movie is taking shape. Literally.