Cold Blooded
Every frame tells a story, but not every story is what it seems.
When I think about the movies and video games that have stayed with me over the years, I always assumed it was because of the visuals: the cinematography, the lighting, and the worlds that were built. Over time, though, I realized the visuals weren't the only reason those experiences stuck with me; it was the story underneath that gave them meaning and life.
That realization is what led me to create Cold Blooded. I wanted to tell a gripping story in ninety seconds, with structure, symbolism, and themes. I wanted to create something that was not only fun to look at, but something where every visual choice drove the story forward. In many ways, it's a tribute to the movies and games that shaped how I think about visual storytelling, the ones I remember not just for how they looked but for the lasting impression they left on me.
Below I walk through my process for creating this cinematic short. I've also documented my progress in a 12-part WIP series on YouTube. For any artists out there who want to tell their own stories in Unreal Engine, I hope you find them useful!
- Mocap was done with Move.AI Pro. The output was noticeably cleaner than the iPhone version's, which significantly cut down cleanup time.
- For hands, I used a custom hand pose library I've been building across projects.
- For body and facial animation, I used UE's native Selection Sets in the Control Rig. They make selecting specific joints much faster than going through plugin widgets.
- I use indirect manipulation to keep the viewport uncluttered while animating: Indirect Manipulation
- I used the preset templates from MetaHuman Creator inside the UE editor. They come with the best-optimized blend shapes out of the box, so they're a practical starting point to just get going.
- I also textured the MetaHumans with layers of dirt and sweat to push the realism further: MetaHuman Texturing
- Live Link + iPhone: I keyframe the base dolly/pan first, then record handheld shake over it. I kept the scene as light as possible to avoid latency while recording. For the car chase scenes, I disabled the car transforms so I didn't have to chase the vehicles during the virtual camera work: Virtual Camera Workflow
- I used Live Link Hub with a webcam to stream my facial expressions onto the MetaHumans in real time. Being able to act it out on camera rather than hand-animating everything made a huge difference: Live Link Facial Capture
- Same Marvelous Designer pipeline (MetaHuman FBX → Blender → Alembic → MD → UE5). I added a wind controller in MD to simulate the shirts moving during the car chase, then retextured the clothes in Substance Painter.
- All foliage uses Nanite Foliage, which let me build a dense forest without the scene becoming too heavy to work in: Nanite Foliage
- The car is a purchased asset, but I rebuilt the mesh so the doors could open and split it into separate parts that could be animated independently. Textured in Substance Painter.
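The layered camera approach described above (a keyframed base dolly/pan with handheld shake recorded on top) boils down to additive transforms. Here is a minimal conceptual sketch in plain Python, not UE code; the motion values, frequencies, and function names are all made up for illustration:

```python
import math
import random

def base_dolly(t):
    """Keyframed base motion: a simple linear dolly along X with a slow pan (yaw)."""
    return {"x": 2.0 * t, "yaw": 10.0 * t}

def handheld_shake(t, seed=42, amplitude=0.05):
    """Stand-in for recorded handheld noise: a few sine waves with random
    phases, layered additively on top of the base move."""
    rng = random.Random(seed)
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(3)]
    freqs = [1.3, 2.7, 4.1]  # low frequencies give a handheld feel
    return amplitude * sum(
        math.sin(2.0 * math.pi * f * t + p) for f, p in zip(freqs, phases)
    )

def camera_transform(t):
    """Final camera = keyframed base + additive shake per channel."""
    base = base_dolly(t)
    return {
        "x": base["x"] + handheld_shake(t),
        "yaw": base["yaw"] + handheld_shake(t, seed=7, amplitude=0.3),
    }

# Sample the first few frames at 24 fps.
for frame in range(3):
    print(camera_transform(frame / 24.0))
```

Because the shake is a separate layer, the base move can be re-timed or swapped out without re-recording the handheld pass, which mirrors why keyframing the dolly/pan first pays off.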


