Bringing a VRM avatar into Unreal Engine 4 with real-time motion capture: a multi-tool pipeline connecting Unity, VirtualMotionCapture, and Unreal via OSC.
This post documents a technical pipeline configuration rather than a game project — specifically, the process of importing a VRM character into Unreal Engine 4 and driving it with real-time virtual motion capture. It sits at the intersection of VTubing technology and game engine integration, connecting a chain of open-source tools to produce a live-animated avatar in Unreal.
You can watch the configuration process here: YouTube
What VRM Is
VRM is an open file format for 3D humanoid avatar characters — widely used in the VTubing and virtual avatar space. It extends the standard glTF format with avatar-specific conventions: a standardized bone hierarchy, facial expression definitions (blend shapes), look-at constraints, and spring bone physics for hair and clothing. A VRM file is a complete, self-contained avatar specification that any VRM-compatible application can load and animate consistently.
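To make the "extends glTF" point concrete, here is a short Python sketch (an illustration, not part of the pipeline itself) that pulls the VRM extension block out of a .vrm file. A .vrm file is a binary glTF (GLB) container, and the avatar-specific data lives in the JSON chunk under `extensions["VRM"]` in the VRM 0.x convention:

```python
import json
import struct

def read_vrm_meta(path_or_bytes):
    """Read the VRM extension block out of a .vrm (GLB) container.

    GLB layout: a 12-byte header (magic "glTF", version, total length,
    all little-endian uint32s), then chunks; the first chunk is always
    the JSON chunk. VRM 0.x stores its avatar data (meta, humanoid bone
    map, blend shapes, spring bones) under extensions["VRM"].
    """
    data = (path_or_bytes if isinstance(path_or_bytes, bytes)
            else open(path_or_bytes, "rb").read())

    magic, _version, _length = struct.unpack_from("<4sII", data, 0)
    if magic != b"glTF":
        raise ValueError("not a GLB/VRM file")

    # First chunk header: payload length, then the 4-byte type "JSON"
    chunk_len, chunk_type = struct.unpack_from("<I4s", data, 12)
    if chunk_type != b"JSON":
        raise ValueError("unexpected first chunk type")

    gltf = json.loads(data[20:20 + chunk_len])
    return gltf.get("extensions", {}).get("VRM", {})
```

Running this against a VRoid Studio export shows the same humanoid bone map and blend shape groups that VRM4U and VirtualMotionCapture consume downstream.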
The format originated in Japan’s VTubing ecosystem and is the standard format for VTuber avatars, supported by tools like VRoid Studio, Vtuber Maker, and VirtualMotionCapture — which is the motion capture application at the center of this pipeline.
The Pipeline: Five Tools in Sequence
Getting a VRM avatar live in Unreal requires a chain of tools, each handling one step of the conversion and streaming process:
VirtualMotionCapture is a Windows application that reads motion data from VR controllers and trackers (or webcam-based face tracking) and applies it to a VRM avatar in real time. It’s the performance capture layer of the pipeline — it takes physical movement as input and produces animated avatar output. Crucially, it also broadcasts this animation data via the VMC Protocol — an OSC-based (Open Sound Control) network protocol that other applications can receive.
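For a sense of what that broadcast looks like on the wire, here is a hypothetical Python sketch that builds the OSC packet for a single bone. The `/VMC/Ext/Bone/Pos` address and its (name, position, quaternion) argument layout follow the published VMC Protocol; the helper names are my own:

```python
import struct

def _osc_str(s: str) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def vmc_bone_message(bone: str, pos, rot) -> bytes:
    """Build the OSC packet VMC broadcasts for one bone transform.

    Arguments are (name, px, py, pz, qx, qy, qz, qw): a Unity-space
    position and rotation quaternion. OSC encodes the address, then a
    type-tag string (",sfffffff"), then each argument; floats are
    big-endian 32-bit.
    """
    args = list(pos) + list(rot)
    return (_osc_str("/VMC/Ext/Bone/Pos")
            + _osc_str("," + "s" + "f" * len(args))
            + _osc_str(bone)
            + struct.pack(">%df" % len(args), *args))
```

One such packet per bone per frame, sent as UDP (by default VirtualMotionCapture targets port 39539), is essentially the whole animation stream.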
ueOSC is an Unreal Engine plugin that adds OSC network communication — the ability to receive and send OSC messages over a local network connection. This is the network layer that allows Unreal to receive the VMC animation data broadcast by VirtualMotionCapture.
VMC4UE is an Unreal plugin built on top of ueOSC that specifically handles the VMC Protocol — it receives the bone transform data broadcast by VirtualMotionCapture and applies it to a skeleton in Unreal. This is the bridge that connects the motion capture broadcast to the Unreal character.
VRM4U is an Unreal plugin that handles VRM file import — it takes a .vrm file and produces a complete Unreal character with the correct bone hierarchy, materials, and blend shapes. Without VRM4U, Unreal has no native way to load VRM files.
UniVRM and VRMMapExporter handle a preprocessing step in Unity: the VRM file’s bone map must be exported in a format VRM4U expects. This Unity-side step converts the VRM bone naming to Unreal’s convention and exports the mapping data VRM4U uses to correctly retarget the motion capture data onto the avatar’s specific skeleton.
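As an illustration of what that conversion involves, here is a hypothetical subset of the mapping in Python. The VRM side uses Unity humanoid bone names and the target side follows the UE4 mannequin convention; this is a sketch of the concept, not VRMMapExporter’s actual output format, and the full map covers the whole humanoid rig including fingers:

```python
# Illustrative subset: VRM humanoid bone names -> UE4 mannequin names.
# The specific pairings below are assumptions for illustration.
VRM_TO_UE4 = {
    "hips": "pelvis",
    "spine": "spine_01",
    "chest": "spine_02",
    "neck": "neck_01",
    "head": "head",
    "leftUpperArm": "upperarm_l",
    "leftLowerArm": "lowerarm_l",
    "leftHand": "hand_l",
    "rightUpperArm": "upperarm_r",
    "rightLowerArm": "lowerarm_r",
    "rightHand": "hand_r",
    "leftUpperLeg": "thigh_l",
    "leftLowerLeg": "calf_l",
    "leftFoot": "foot_l",
    "rightUpperLeg": "thigh_r",
    "rightLowerLeg": "calf_r",
    "rightFoot": "foot_r",
}

def retarget_name(vrm_bone: str) -> str:
    """Map a VRM humanoid bone name to its UE4 equivalent, falling
    back to the original name for bones with no mannequin counterpart."""
    return VRM_TO_UE4.get(vrm_bone, vrm_bone)
```

Without this translation step, the bone names VMC broadcasts wouldn’t line up with the skeleton VRM4U builds in Unreal, and the retargeting would silently fail.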
The Complete Flow
The full pipeline from avatar to live animation in Unreal:
- A VRM avatar file is obtained (from VRoid Studio or a character creator like the Nicovideo example linked in the original post)
- Unity + UniVRM + VRMMapExporter process the bone map and export it
- VRM4U imports the VRM file into Unreal, using the exported bone map for correct skeleton setup
- VirtualMotionCapture loads the same VRM avatar and captures motion from VR hardware or webcam
- VirtualMotionCapture broadcasts bone transforms via VMC Protocol over OSC
- ueOSC receives the OSC broadcast in Unreal
- VMC4UE applies the received transforms to the Unreal character’s skeleton in real time
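The receiving half of that flow can be sketched in a few lines of Python: a UDP socket plus a minimal OSC parser for the bone messages. This is an illustration of what ueOSC and VMC4UE do internally, not production code; in particular, real VMC traffic typically arrives as OSC bundles grouping many messages per frame, which this sketch does not handle:

```python
import socket
import struct

def parse_vmc_bone(packet: bytes):
    """Decode a /VMC/Ext/Bone/Pos OSC packet into (bone, pos, rot).
    Returns None for any other address (blend shapes, root, etc.)."""
    def read_str(buf, off):
        end = buf.index(b"\x00", off)
        size = end - off + 1  # include terminator, then pad to 4 bytes
        return buf[off:end].decode(), off + size + (-size % 4)

    addr, off = read_str(packet, 0)
    if addr != "/VMC/Ext/Bone/Pos":
        return None
    _tags, off = read_str(packet, off)   # expect ",sfffffff"
    bone, off = read_str(packet, off)
    floats = struct.unpack_from(">7f", packet, off)
    return bone, floats[:3], floats[3:]  # name, position, quaternion

def listen(port=39539):
    """Minimal receive loop. 39539 is VMC's default send port
    (an assumption: match whatever port VMC is configured to use)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        packet, _addr = sock.recvfrom(8192)
        decoded = parse_vmc_bone(packet)
        if decoded:
            bone, pos, rot = decoded
            print(bone, pos, rot)
```

In the actual pipeline this decode-and-apply step happens inside Unreal every frame, with VMC4UE writing the transforms into the character’s AnimGraph rather than printing them.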
The result is a VRM avatar visible in Unreal Engine, animated in real time by VirtualMotionCapture’s motion capture output — a live virtual character that responds to physical movement, usable for VTubing, virtual production, or interactive applications.
Context and Relevance
This pipeline sits at the convergence of VTubing technology, virtual production, and game engine integration — a space that has grown significantly since this configuration was documented in early 2021. The tools involved are all open-source and community-developed, which means the pipeline required more manual integration work than a commercial motion capture solution would — but also required no licensing costs beyond the hardware.
From a background in VFX production at DreamWorks and Sony, the underlying concepts — motion capture retargeting, bone mapping between skeletons, real-time animation streaming — are familiar from professional production pipelines. The VRM/VMC ecosystem applies the same concepts with open-source tools targeting the consumer VTubing market rather than production-grade hardware, but the technical structure is recognizably similar.
Reflection
This is one of the most niche technical posts in the series — it documents a specific multi-tool pipeline for a specific use case that most game developers won’t encounter. Its value is as a reference for anyone working at the intersection of VTubing technology and Unreal Engine, and as a demonstration of the breadth of Unreal’s integration capabilities: the engine can receive real-time animation data from external sources via OSC, import non-native character formats via community plugins, and serve as a rendering target for live virtual character applications well beyond traditional game development.