Setting up VR from scratch in Unreal Engine 4: camera and motion controllers, animated hands, grab and drop, locomotion schemes, and 3D widget interaction.
This project is a VR fundamentals demo built from scratch in Blueprints with Unreal Engine 4, developed following this Udemy course. It is the foundational VR project in this series — the one that establishes the baseline every subsequent VR project builds on. Where the Fruit Slayer project added game mechanics on top of an already-working VR setup, this one starts from an empty project and constructs the full VR interaction framework from first principles: camera, controllers, hands, locomotion, grab, and UI.
You can watch the demo here: YouTube
VR Project Configuration from Scratch
Setting up a VR project in Unreal 4 from a blank template requires several configuration steps that the VR template handles automatically — understanding what those steps are is what makes this project valuable as a learning exercise.
The essential configuration involves: enabling the appropriate VR plugin for the target headset (the OculusVR plugin for Oculus headsets in UE4), setting the default RHI to the correct rendering backend for VR performance, configuring the project’s rendering settings for stereo output (forward shading is typically preferred for VR over deferred rendering because it supports MSAA, which deferred does not), and disabling features that are expensive in VR and rarely necessary — ambient occlusion, screen space reflections, and motion blur being the main candidates.
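A plausible DefaultEngine.ini excerpt illustrating those renderer settings (exact keys can vary between UE4 versions, and the OculusVR plugin itself is enabled through the .uproject/Plugins menu rather than here):

```ini
[/Script/Engine.RendererSettings]
; Forward renderer so MSAA is available (deferred only offers TAA/FXAA)
r.ForwardShading=True
r.DefaultFeature.AntiAliasing=3
r.MSAACount=4
; Features that are expensive in VR and rarely needed
r.DefaultFeature.AmbientOcclusion=False
r.SSR.Quality=0
r.DefaultFeature.MotionBlur=False
; Render both eyes in a single pass
vr.InstancedStereo=True
```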
Understanding these settings matters because their defaults are optimized for flat-screen games, not VR. A VR project that runs on the default flat-screen settings will likely fail to hit the 72Hz or 90Hz frame rate required for comfortable VR — and a dropped frame in VR is significantly more disruptive than in a flat-screen game, where it reads as a stutter. In VR, frame drops cause reprojection artifacts that break the sense of presence.
VR Camera and Motion Controllers
The VR pawn is built around three tracked components: a Camera component that follows the headset’s position and rotation, and two MotionController components that follow the left and right physical controllers. These three components are the only real-world tracking inputs available — everything else in the VR interaction system is built on top of them.
The camera component is attached to a VROrigin scene component that defines the player’s “floor” — the point in world space that corresponds to the physical play space’s floor. Adjusting the VROrigin’s Z position relative to the character capsule allows the player’s virtual eye height to match their physical height within the play space.
Motion controller components use the platform’s XR abstraction to query controller position and rotation each frame. In UE4, they are assigned the EControllerHand::Left and EControllerHand::Right tracking sources — platform-agnostic identifiers that resolve to whatever physical controllers the headset uses.
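In C++ terms (the project itself is Blueprints), the component hierarchy might look like the sketch below; the class name AVRPawn and the property names are illustrative rather than taken from the project:

```cpp
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "MotionControllerComponent.h"
#include "VRPawn.generated.h"

UCLASS()
class AVRPawn : public APawn
{
	GENERATED_BODY()

public:
	AVRPawn()
	{
		// VROrigin marks the physical play-space floor; the headset and
		// controllers are tracked relative to this point.
		VROrigin = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));
		RootComponent = VROrigin;

		// The camera follows the HMD's position and rotation automatically.
		Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
		Camera->SetupAttachment(VROrigin);
		Camera->bLockToHmd = true;

		// Left/right motion controllers track the physical controllers through
		// the platform-agnostic EControllerHand sources mentioned above.
		LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
		LeftController->SetupAttachment(VROrigin);
		LeftController->SetTrackingSource(EControllerHand::Left);

		RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
		RightController->SetupAttachment(VROrigin);
		RightController->SetTrackingSource(EControllerHand::Right);
	}

	UPROPERTY(VisibleAnywhere) USceneComponent* VROrigin;
	UPROPERTY(VisibleAnywhere) UCameraComponent* Camera;
	UPROPERTY(VisibleAnywhere) UMotionControllerComponent* LeftController;
	UPROPERTY(VisibleAnywhere) UMotionControllerComponent* RightController;
};
```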
VR Hands with Animation Blueprint
Rather than floating controllers, the hands are represented as skeletal mesh actors driven by an Animation Blueprint. The animation blueprint blends between a set of hand poses — open, grip, point, pinch — based on the controller’s trigger and grip input values. Pressing the grip button partially closes the fingers; pressing fully closes them into a fist. The trigger drives the index finger independently for a pointing pose.
This input-driven hand animation is a significant UX improvement over static controller models. The hand’s pose communicates what the controller input is doing — a closed fist signals a grabbed state, an open hand signals ready to grab — giving the player visual confirmation of their input state without any UI. It also produces the fundamental sense of having hands in VR rather than holding plastic boxes.
The animation blueprint driving the hand poses is a simple blend tree rather than a state machine — it linearly blends between open and closed poses based on the grip axis value, producing a smooth continuous animation that tracks the physical trigger pressure in real time.
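A minimal sketch of that grip-driven blend, assuming a hand AnimInstance that exposes a 0-to-1 Grip value read by a two-pose blend node in its AnimGraph; UHandAnimInstance, LeftHandMesh, and the GripAxis_Left binding are illustrative names, not the project's:

```cpp
#include "Animation/AnimInstance.h"
#include "HandAnimInstance.generated.h"

UCLASS()
class UHandAnimInstance : public UAnimInstance
{
	GENERATED_BODY()

public:
	// Read by a two-pose blend node in the AnimGraph:
	// 0 = open hand, 1 = fully closed grip.
	UPROPERTY(BlueprintReadWrite, EditAnywhere, Category = "Hand")
	float Grip = 0.0f;
};

// In the pawn: bind the analog grip axis and forward it to the hand's anim instance.
void AVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
	Super::SetupPlayerInputComponent(PlayerInputComponent);
	PlayerInputComponent->BindAxis(TEXT("GripAxis_Left"), this, &AVRPawn::OnLeftGripAxis);
}

void AVRPawn::OnLeftGripAxis(float Value)
{
	// LeftHandMesh is the hand skeletal mesh attached to the left controller (assumed).
	if (UHandAnimInstance* Anim = Cast<UHandAnimInstance>(LeftHandMesh->GetAnimInstance()))
	{
		// The blend tracks physical grip pressure continuously, frame by frame.
		Anim->Grip = FMath::Clamp(Value, 0.0f, 1.0f);
	}
}
```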
Grab and Drop
Grab and drop is the core interaction primitive of VR — the ability to pick up and release physical objects using the hand grip input. The implementation uses an overlap sphere on the hand that detects nearby grabbable actors. When the grip button is pressed and a grabbable actor is within range, the actor is attached to the hand’s motion controller component.
The attachment method matters for feel. A hard attachment (the object’s transform is set to the hand transform every frame via SetWorldTransform) produces a snappy, zero-lag grab that feels unnatural for heavy objects. A physics constraint attachment (a weld constraint that drives the object toward the hand position via physics) produces a grab with physical weight — the object has a tiny amount of lag and resistance that makes it feel like it has mass. For lightweight objects, the difference is subtle; for heavy objects, the physics constraint approach is significantly more convincing.
Dropping reverses the process: the attachment or constraint is released, and the object’s current velocity (derived from the hand’s recent movement) is applied as an initial physics velocity. This hand velocity transfer on release is what makes thrown objects feel like they were actually thrown — without it, released objects simply fall from the hand position with no carry.
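A hedged sketch of that grab/release flow using the hard-attachment variant, with hand velocity estimated frame-to-frame for the throw; LeftGrabSphere, LeftHeldActor, LeftHandVelocity, and the Grabbable tag are assumptions for illustration:

```cpp
// Grip pressed: look for a grabbable actor inside the hand's overlap sphere.
void AVRPawn::TryGrab_Left()
{
	TArray<AActor*> Overlapping;
	LeftGrabSphere->GetOverlappingActors(Overlapping);

	for (AActor* Actor : Overlapping)
	{
		if (!Actor->ActorHasTag(TEXT("Grabbable")))
		{
			continue;
		}
		// Hard attachment: snappy, zero-lag grab. A physics-constraint
		// attachment would replace this for objects that should feel heavy.
		if (UPrimitiveComponent* Prim = Cast<UPrimitiveComponent>(Actor->GetRootComponent()))
		{
			Prim->SetSimulatePhysics(false);
		}
		Actor->AttachToComponent(LeftController, FAttachmentTransformRules::KeepWorldTransform);
		LeftHeldActor = Actor;
		break;
	}
}

// Grip released: detach and hand over the estimated hand velocity so throws carry.
void AVRPawn::Release_Left()
{
	if (!LeftHeldActor)
	{
		return;
	}
	LeftHeldActor->DetachFromActor(FDetachmentTransformRules::KeepWorldTransform);
	if (UPrimitiveComponent* Prim = Cast<UPrimitiveComponent>(LeftHeldActor->GetRootComponent()))
	{
		Prim->SetSimulatePhysics(true);
		Prim->SetPhysicsLinearVelocity(LeftHandVelocity);
	}
	LeftHeldActor = nullptr;
}

// Per-frame hand velocity estimate from controller movement, consumed on release.
void AVRPawn::Tick(float DeltaSeconds)
{
	Super::Tick(DeltaSeconds);
	const FVector Current = LeftController->GetComponentLocation();
	LeftHandVelocity = (Current - LeftHandPrevLocation) / FMath::Max(DeltaSeconds, KINDA_SMALL_NUMBER);
	LeftHandPrevLocation = Current;
}
```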
Locomotion: Thumbstick Movement and Rotation
Two locomotion schemes are implemented, representing the main options for VR movement:
Thumbstick movement translates the pawn in the direction the thumbstick is pushed, relative to the camera’s facing direction. This is the most familiar locomotion scheme — equivalent to a third-person game’s movement — but is also the most likely to cause simulator sickness because the visual motion doesn’t correspond to any physical body movement. Comfort options like movement speed limits, vignetting during movement, and snap-to-grid movement reduce but don’t eliminate this sickness potential.
Thumbstick rotation rotates the player’s view in discrete steps (snap turn) rather than smooth rotation. Smooth rotation is one of the primary causes of VR sickness; snap turn eliminates the continuous visual rotation by jumping the view in 30 or 45-degree increments. The snap is fast enough to be clearly intentional but not so instant that it’s jarring — a brief blackout or flash on the snap frames makes the transition less abrupt.
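Sketched in C++ under the same assumptions as above (a Camera reference on the pawn and a movement component such as UFloatingPawnMovement), camera-relative thumbstick movement and snap turn might look like this; the 45-degree step is illustrative:

```cpp
// Thumbstick movement, relative to where the headset is facing and flattened
// to the ground plane. Requires a movement component on the pawn (assumed).
void AVRPawn::MoveForward(float Value)
{
	FVector Forward = Camera->GetForwardVector();
	Forward.Z = 0.0f;
	AddMovementInput(Forward.GetSafeNormal(), Value);
}

void AVRPawn::MoveRight(float Value)
{
	FVector Right = Camera->GetRightVector();
	Right.Z = 0.0f;
	AddMovementInput(Right.GetSafeNormal(), Value);
}

// Snap turn: a discrete yaw step instead of continuous rotation.
void AVRPawn::SnapTurnRight()
{
	AddActorWorldRotation(FRotator(0.0f, 45.0f, 0.0f));
}
```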
Teleportation
Teleportation is the standard VR locomotion alternative for players susceptible to motion sickness — it eliminates all visual movement by jumping the player instantly to a target location. The implementation uses an arc trace from the controller — a parabolic projectile trace that finds a valid floor surface — and visualizes the arc as a spline mesh and the landing point as a target indicator decal.
When the player releases the teleport input, the character capsule is repositioned to the target location instantly. To prevent the visual snap from being jarring, a brief fade to black — executed via the headset’s layer compositing — covers the teleport and fades back in at the new position. The fade duration (typically 100–200ms) is short enough to feel responsive and long enough to mask the position change.
Valid teleport targets require the floor surface to be tagged appropriately — a TeleportTag or a specific collision channel — preventing the player from teleporting onto walls, ceilings, or out-of-bounds areas.
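A rough sketch of the teleport flow, using UE4's projectile-path prediction for the arc trace and the camera manager's fade for the blackout; the launch speed, fade timings, and the PendingTeleportTarget member are illustrative, not the project's actual values:

```cpp
#include "Kismet/GameplayStatics.h"
#include "Kismet/GameplayStaticsTypes.h"

// Parabolic trace from the controller; succeeds only if it lands on a surface
// tagged as a valid teleport floor.
bool AVRPawn::TraceTeleportArc(FVector& OutLocation)
{
	FPredictProjectilePathParams Params;
	Params.StartLocation = RightController->GetComponentLocation();
	Params.LaunchVelocity = RightController->GetForwardVector() * 900.0f; // arc "throw" speed
	Params.bTraceWithCollision = true;
	Params.ProjectileRadius = 5.0f;
	Params.TraceChannel = ECC_Visibility;

	FPredictProjectilePathResult Result;
	if (!UGameplayStatics::PredictProjectilePath(this, Params, Result))
	{
		return false;
	}
	AActor* HitActor = Result.HitResult.GetActor();
	if (!HitActor || !HitActor->ActorHasTag(TEXT("TeleportTag")))
	{
		return false;
	}
	OutLocation = Result.HitResult.Location;
	return true;
}

// On release: fade to black, move while the screen is dark, then fade back in.
void AVRPawn::BeginTeleport(const FVector& Target)
{
	PendingTeleportTarget = Target;
	APlayerCameraManager* Cam = UGameplayStatics::GetPlayerCameraManager(this, 0);
	Cam->StartCameraFade(0.0f, 1.0f, 0.15f, FLinearColor::Black, false, /*bHoldWhenFinished=*/true);

	FTimerHandle Handle;
	GetWorldTimerManager().SetTimer(Handle, FTimerDelegate::CreateLambda([this]()
	{
		SetActorLocation(PendingTeleportTarget);
		UGameplayStatics::GetPlayerCameraManager(this, 0)
			->StartCameraFade(1.0f, 0.0f, 0.15f, FLinearColor::Black);
	}), 0.15f, false);
}
```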
3D Widget and Widget Interaction
In flat-screen games, UI widgets are rendered directly to the screen and receive input from the mouse. In VR, widgets need to exist in world space — rendered on a plane within the 3D scene — and receive input from a VR pointer rather than a mouse cursor.
The WidgetComponent renders a UMG widget onto a 3D plane in world space. The WidgetInteractionComponent attached to the motion controller casts a ray from the controller forward direction and, when it intersects a WidgetComponent, synthesizes the mouse input events (hover, click, scroll) that the widget’s Blueprint logic responds to. The laser pointer beam — a visible ray from the controller to the widget — gives the player visual feedback of where their pointer is aimed.
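A sketch of that pointer setup, assuming the interaction component lives on the right controller and the trigger is bound to an interact action (both assumptions); hover events are synthesized automatically as the ray sweeps over the widget, so only press and release need explicit forwarding:

```cpp
#include "Components/WidgetInteractionComponent.h"

// In the pawn constructor: pointer component parented to the right controller.
WidgetInteraction = CreateDefaultSubobject<UWidgetInteractionComponent>(TEXT("WidgetInteraction"));
WidgetInteraction->SetupAttachment(RightController);
WidgetInteraction->InteractionDistance = 500.0f; // max pointer ray length, in cm

// Trigger input forwarded as synthesized mouse clicks on the hovered widget.
void AVRPawn::OnInteractTriggerPressed()
{
	WidgetInteraction->PressPointerKey(EKeys::LeftMouseButton);
}

void AVRPawn::OnInteractTriggerReleased()
{
	WidgetInteraction->ReleasePointerKey(EKeys::LeftMouseButton);
}
```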
This 3D widget interaction pattern is what enables VR menus, in-world control panels, and any other UI that needs to exist in the physical space of the VR environment rather than as a screen overlay.
Reflection
The VR Basic Interactions project is the most educational of the VR series precisely because it builds everything from scratch. The VR template hides the setup; starting from blank forces understanding of why each configuration step exists and what breaks without it. That understanding — knowing not just how the template is set up but why — is what makes the difference between a developer who can use the template and one who can diagnose and fix problems in it.
Every more complex VR project in this series — the Fruit Slayer, the VR Mechanics, the five VR Prototypes — assumes the baseline established here. The grab, the locomotion, the controller tracking, the 3D widget interaction all appear in those projects as extensions or refinements of the fundamentals built here. Starting from first principles with a blank project is the correct way to develop a genuine understanding of VR in Unreal, and this project does exactly that.