Extract camera movements and human motions from reference videos for professional video generation
Who it's for: creators who want this pipeline in ComfyUI without assembling nodes from scratch. Not for: one-click results with zero tuning — you still choose inputs, prompts, and settings.
Open preloaded workflow on RunComfy
Why RunComfy first
- Fewer missing-node surprises — run the graph in a managed environment before you mirror it locally.
- Quick GPU tryout — useful if your local VRAM or install time is the bottleneck.
- Matches the published JSON — the zip follows the same runnable workflow you can open on RunComfy.
When downloading for local ComfyUI makes sense — you want full control over which models sit on disk, batch scripting, or fully offline runs.
How to use (local ComfyUI)
1. Load inputs (images/video/audio) in the marked loader nodes.
2. Set prompts, resolution, and seeds; start with a short test run.
3. Export from the Save / Write nodes shown in the graph.
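For batch or scripted runs, you don't have to click through the UI each time: a local ComfyUI server exposes a `POST /prompt` endpoint that accepts an API-format workflow JSON (exported via "Save (API Format)" in ComfyUI). A minimal sketch, assuming a default local server at `127.0.0.1:8188`; the workflow filename below is a hypothetical placeholder for your own export:

```python
import json
import urllib.request

COMFY_URL = "http://127.0.0.1:8188"  # default local ComfyUI address (assumption)

def build_payload(workflow: dict, client_id: str = "cli") -> bytes:
    # ComfyUI's /prompt endpoint expects {"prompt": <api-format workflow dict>}
    return json.dumps({"prompt": workflow, "client_id": client_id}).encode("utf-8")

def queue_prompt(workflow: dict) -> dict:
    # Submit the workflow; ComfyUI returns a JSON body with a prompt_id
    req = urllib.request.Request(
        COMFY_URL + "/prompt",
        data=build_payload(workflow),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Hypothetical filename -- use your own API-format export of this workflow
    with open("Uni3C-Video-Reference-Motion_api.json") as f:
        wf = json.load(f)
    print(queue_prompt(wf))
```

Looping this over a folder of input images gives you simple batch generation once the graph is tuned.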
Expectations — the first run may download large model weights; cloud runs may require a free RunComfy account.
Overview
This advanced Uni3C workflow extracts camera movements and human motions from reference videos and applies them to your input images for professional video generation. Unlike traditional manual animation tools, Uni3C automatically understands motion patterns from reference footage and seamlessly transfers both camera work and character animations to new scenes. The workflow supports motion transfer between different visual styles, video-referenced camera movements, and natural human animation synthesis from any reference footage. With 4x speed optimization, you can generate high-quality videos with extracted motion patterns in minutes.
Important nodes: video reference extraction
Notes
Uni3C ComfyUI Workflow | Video-Referenced Camera & Motion Transfer — see RunComfy page for the latest node requirements.
Description
Initial release — Uni3C-Video-Reference-Motion.
