Unreal 5.2 – VR Prototypes


Five VR interaction paradigms in Unreal Engine 5.2 on Oculus Quest 2: physical throwing, melee collision, archery, alternative locomotion, and climbing.


This project is a collection of five VR prototypes built in Blueprints with Unreal Engine 5.2 for Oculus Quest 2, developed following this Udemy course. Each prototype isolates a different interaction paradigm — how the player uses their hands and controllers to interact with the virtual world — covering the main categories of VR mechanics that appear across most VR games and experiences.

You can watch all five prototypes in action here: YouTube


The Foundation: VR in Unreal Engine

Before the individual prototypes, it’s worth understanding the VR setup they all share. Unreal’s VR framework centers on the VR Pawn — a special actor class with a camera component tracked to the headset and two motion controller components tracked to the physical controllers. The camera and controller transforms are updated automatically each frame by the XR plugin, which abstracts the hardware differences between headset platforms behind a common API.
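
In rough terms, the shared pawn looks like the sketch below. The project builds it in Blueprints; this is an illustrative C++ equivalent, and the component names are mine, not the project's.

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "MotionControllerComponent.h"
#include "VRPawn.generated.h"

UCLASS()
class AVRPawn : public APawn
{
    GENERATED_BODY()

public:
    AVRPawn()
    {
        // Root representing the tracking-space origin (the play area).
        VROrigin = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));
        SetRootComponent(VROrigin);

        // The camera is driven by the HMD pose automatically while the XR plugin is active.
        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(VROrigin);

        // Motion controllers follow the physical controllers each frame.
        LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
        LeftController->SetupAttachment(VROrigin);
        LeftController->MotionSource = FName("Left");

        RightController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightController"));
        RightController->SetupAttachment(VROrigin);
        RightController->MotionSource = FName("Right");
    }

    UPROPERTY(VisibleAnywhere) USceneComponent* VROrigin;
    UPROPERTY(VisibleAnywhere) UCameraComponent* Camera;
    UPROPERTY(VisibleAnywhere) UMotionControllerComponent* LeftController;
    UPROPERTY(VisibleAnywhere) UMotionControllerComponent* RightController;
};
```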

For Oculus Quest 2 specifically, the project targets the OpenXR plugin — Epic’s recommended XR abstraction layer — rather than the legacy OVR plugin. OpenXR provides cross-platform compatibility and is where Epic’s active development is focused. The Quest 2 runs the experience standalone (no PC tethering required), which imposes a mobile GPU performance budget that shapes every rendering and physics decision in the project.


Thor Hammer: Physics Grab and Recall

The Thor hammer prototype explores two distinct VR interaction problems: physical grabbing and object recall.

Physical grabbing in VR is more complex than it appears. The naive approach attaches the grabbed object directly to the controller transform — the object teleports to the hand instantly. This works visually but breaks physics: the object passes through other objects during the grab animation and has no momentum when thrown. The physically correct approach constrains the object to the controller via a physics constraint (a weld or a grip constraint) while keeping it as a simulated physics body. The controller pulls the object toward it; physics handles the rest. This produces natural-feeling grabs where the object moves with appropriate resistance and carries momentum into throws.
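
A minimal sketch of a constraint-based grab, assuming the hand carries a small kinematic collision primitive and the grabbable actor exposes its simulated mesh and a physics constraint component (the project wires this up with Blueprint nodes; the names here are illustrative):

```cpp
#include "PhysicsEngine/PhysicsConstraintComponent.h"
#include "Components/PrimitiveComponent.h"

// 'HandCollision' is a small kinematic primitive attached to the motion controller;
// 'GrabMesh' is the grabbed object's simulated body.
void GrabWithConstraint(UPrimitiveComponent* HandCollision,
                        UPhysicsConstraintComponent* GrabConstraint,
                        UPrimitiveComponent* GrabMesh)
{
    // Keep the object simulating so it still collides with the world and carries momentum.
    GrabMesh->SetSimulatePhysics(true);

    // Place the constraint at the hand and lock the object to it, rather than
    // attaching the object's transform directly to the controller.
    GrabConstraint->AttachToComponent(HandCollision, FAttachmentTransformRules::SnapToTargetNotIncludingScale);
    GrabConstraint->SetConstrainedComponents(HandCollision, NAME_None, GrabMesh, NAME_None);
}

// On release, break the constraint; the body keeps whatever velocity the hand gave it.
void ReleaseGrab(UPhysicsConstraintComponent* GrabConstraint)
{
    GrabConstraint->BreakConstraint();
}
```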

The recall mechanic — the hammer returning to the player’s hand after being thrown — requires detecting when the player reaches for it (a grab input with the hand extended) and then moving the hammer back via a physics-driven home force. The hammer isn’t teleported; it applies a velocity toward the player’s hand each frame, accelerating as it approaches. A simple lerp would produce the same visual result, but the physics-driven approach respects obstacles — the hammer deflects around objects in its path rather than passing through them.
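
A sketch of what that per-frame pull could look like, with tuning numbers invented for the example (the actual recall is a Blueprint graph):

```cpp
#include "Components/PrimitiveComponent.h"

// Each frame while recall is active, steer the hammer's velocity toward the hand,
// ramping the speed up as it gets closer so it accelerates on approach.
void TickRecall(UPrimitiveComponent* HammerMesh, const FVector& HandLocation)
{
    const FVector ToHand = HandLocation - HammerMesh->GetComponentLocation();
    const float Distance = ToHand.Size();

    // Hypothetical tuning: faster pull the closer the hammer gets, within limits.
    const float PullSpeed = FMath::Clamp(2000.f - Distance, 600.f, 2000.f);

    // Setting velocity (rather than teleporting) keeps the hammer a physics body,
    // so it still deflects around obstacles on the way back.
    HammerMesh->SetPhysicsLinearVelocity(ToHand.GetSafeNormal() * PullSpeed);
}
```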


Laser Swords: Continuous Melee Collision

The laser swords prototype addresses the hardest problem in VR melee: fast-moving weapon collision. A controller moving at high speed covers significant distance between frames. Standard discrete collision detection — checking for overlaps at the current frame position — misses collisions that occurred between frames, producing the “tunneling” artifact where a fast sword swing passes through a target without registering.

The solution is sweep-based collision (also called continuous collision detection): instead of checking the sword’s current position, the engine sweeps a volume from the sword’s previous position to its current position each frame, detecting everything the sword passed through during that movement. This registers all hits regardless of swing speed.
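
In Unreal terms this maps to a per-frame sweep from the blade's previous position to its current one. A rough sketch, with the collision channel and sphere radius chosen purely for illustration:

```cpp
#include "Engine/World.h"

// Sweep a small sphere along the path the blade tip travelled this frame,
// so fast swings still register hits that happened between frames.
void TickSwordSweep(UWorld* World, const FVector& PrevTip, const FVector& CurrTip, AActor* SwordOwner)
{
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(SwordOwner);

    const bool bHit = World->SweepSingleByChannel(
        Hit, PrevTip, CurrTip, FQuat::Identity,
        ECC_Pawn, FCollisionShape::MakeSphere(5.f), Params);

    if (bHit)
    {
        // Handle the hit (damage, effects, haptics) at Hit.ImpactPoint.
    }
}
```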

The other consideration is velocity-based damage: a sword tap and a full swing should feel different. The sword tracks its velocity each frame (the delta between its current and previous transform) and scales damage by that velocity magnitude. This produces the correct result — slow touches deal no significant damage, fast swings deal full damage — and gives the player physical ownership of the interaction.
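
A small sketch of that mapping, with the speed thresholds and damage values made up for the example:

```cpp
#include "CoreMinimal.h"

// Convert the blade tip's per-frame movement into a speed, then scale damage by it.
float ComputeSwingDamage(const FVector& CurrTip, const FVector& PrevTip, float DeltaSeconds)
{
    const float Speed = (CurrTip - PrevTip).Size() / DeltaSeconds; // cm/s

    // Below a light-touch threshold, deal nothing; at full-swing speed, deal maximum damage.
    const float MinSpeed = 150.f;
    const float MaxSpeed = 600.f;
    const float MaxDamage = 50.f;

    const float Alpha = FMath::Clamp((Speed - MinSpeed) / (MaxSpeed - MinSpeed), 0.f, 1.f);
    return Alpha * MaxDamage;
}
```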


Archery: Bow Tension and Projectile Physics

The archery prototype requires coordinating two hand interactions simultaneously: the bow hand holding the grip and the draw hand pulling the string. The string pull is tracked as the distance between the draw hand and the bow grip, clamped to the maximum draw length. This distance drives three things simultaneously: the bow string’s visual position (a dynamic mesh deformation or a spline), the draw indicator in the UI, and the arrow’s launch velocity when released.

Launch velocity is proportional to draw distance — a full draw produces maximum velocity, a partial draw produces a proportionally weaker shot. The arrow is a physics actor with ProjectileMovementComponent disabled initially; at release, it’s detached from the bow and its velocity is set to the draw direction scaled by the draw-distance-derived speed. Gravity then acts on it naturally, producing a realistic arc.
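
A sketch of the release step, assuming hypothetical MaxDrawLength and MaxLaunchSpeed tuning values (the project does the equivalent in Blueprints):

```cpp
#include "GameFramework/Actor.h"
#include "GameFramework/ProjectileMovementComponent.h"

void ReleaseArrow(AActor* Arrow, UProjectileMovementComponent* ArrowMovement,
                  const FVector& GripLocation, const FVector& DrawHandLocation,
                  float MaxDrawLength, float MaxLaunchSpeed)
{
    // Free the arrow from the bow so it can fly on its own.
    Arrow->DetachFromActor(FDetachmentTransformRules::KeepWorldTransform);

    // Draw distance, clamped to the bow's maximum draw length.
    const FVector DrawVector = GripLocation - DrawHandLocation;
    const float DrawDistance = FMath::Min(DrawVector.Size(), MaxDrawLength);

    // Launch along the draw direction, scaled by how far the string was pulled;
    // gravity (via the movement component) then produces the arc.
    ArrowMovement->Velocity = DrawVector.GetSafeNormal() * (DrawDistance / MaxDrawLength) * MaxLaunchSpeed;
    ArrowMovement->Activate(true);
}
```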

Arrow flight in VR requires a rotation fix that most implementations overlook: the arrow's mesh must rotate to face its current velocity direction each frame, not just its initial launch direction. Without this, the arrow flies sideways or spins unnaturally. An FRotationMatrix::MakeFromX(velocity) call applied to the arrow's rotation each tick produces the correct behavior.
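
A minimal sketch of that per-tick fix:

```cpp
#include "GameFramework/Actor.h"

// Keep the arrow aligned with its velocity while in flight.
void TickArrowRotation(AActor* Arrow, const FVector& Velocity)
{
    if (!Velocity.IsNearlyZero())
    {
        // Point the arrow's forward (X) axis along its current velocity direction.
        Arrow->SetActorRotation(FRotationMatrix::MakeFromX(Velocity).Rotator());
    }
}
```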


Kayaking: Alternative Locomotion

Standard VR locomotion — thumbstick movement — causes simulator sickness in many players because the visual motion doesn’t match any physical sensation. Alternative locomotion schemes tie movement to physical arm motion, producing a stronger sense of presence and reducing nausea.

The kayaking prototype maps paddle strokes directly to player movement: when the player moves a controller backward through the virtual water (in the paddling motion), the pawn moves forward proportionally. The paddle’s velocity in the horizontal plane drives the pawn’s movement in the opposite direction — physics intuition applied to VR input.

The key implementation detail is distinguishing a deliberate paddle stroke from incidental controller movement. Filtering by the controller’s motion direction (only horizontal backward movement counts) and applying a minimum velocity threshold prevents unintended drift from small hand movements.
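
A sketch of that filter and mapping, with the threshold and scale values invented for the example (the project's version is a Blueprint graph):

```cpp
#include "GameFramework/Pawn.h"

// 'PaddleVelocity' is the controller's world-space velocity this frame.
void TickPaddle(APawn* Pawn, const FVector& PaddleVelocity, float DeltaSeconds)
{
    // Only consider motion in the horizontal plane.
    FVector Horizontal = PaddleVelocity;
    Horizontal.Z = 0.f;

    // Ignore small incidental hand movements.
    const float MinStrokeSpeed = 50.f; // cm/s
    if (Horizontal.Size() < MinStrokeSpeed)
    {
        return;
    }

    // Only strokes pulled backward, against the pawn's forward direction, count.
    if (FVector::DotProduct(Horizontal, Pawn->GetActorForwardVector()) >= 0.f)
    {
        return;
    }

    // Move the pawn opposite to the paddle motion, proportional to stroke speed.
    const float StrokeScale = 0.5f;
    Pawn->AddActorWorldOffset(-Horizontal * StrokeScale * DeltaSeconds, /*bSweep=*/true);
}
```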


Climbing: Grip-Based Locomotion

The climbing prototype is the most physically intuitive VR locomotion scheme: the player grabs surfaces and moves their hands to move their body. When a grip is registered on a climbable surface, the player’s pawn position is updated each frame to be offset from the grip point by the same amount the hand has moved since gripping — the hand appears to stay fixed on the surface while the body moves relative to it.
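
A sketch of the per-frame update, assuming the hand's world position was captured at the moment of the grab (names are illustrative; the project's version is a Blueprint graph):

```cpp
#include "GameFramework/Pawn.h"
#include "MotionControllerComponent.h"

// 'GripWorldLocation' is where the hand was when the grab started.
void TickClimb(APawn* Pawn, UMotionControllerComponent* GrippingHand, const FVector& GripWorldLocation)
{
    // How far the hand has drifted from the grip point this frame...
    const FVector HandDelta = GrippingHand->GetComponentLocation() - GripWorldLocation;

    // ...move the body the opposite way, so the hand appears fixed to the surface.
    Pawn->AddActorWorldOffset(-HandDelta, /*bSweep=*/true);
}
```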

This is the inverse of normal movement: instead of moving the pawn and letting the hand follow, the hand is the anchor and the pawn moves to keep it stationary relative to the grip point. Two-handed climbing extends this naturally — both grip points contribute to the pawn’s position and rotation, allowing rotation of the body as well as translation.

Releasing a grip while moving imparts the accumulated velocity to the pawn, allowing the player to jump between handholds by releasing with momentum. This requires tracking the hand’s velocity during gripping and applying it to the pawn’s physics on release.
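
A sketch of that release step, assuming a Character-based pawn and a hand velocity tracked over the last few frames of the grip (both assumptions made for illustration):

```cpp
#include "GameFramework/Character.h"
#include "GameFramework/CharacterMovementComponent.h"

void ReleaseClimbGrip(ACharacter* Character, const FVector& HandVelocity)
{
    // Hand control back to normal falling movement...
    Character->GetCharacterMovement()->SetMovementMode(MOVE_Falling);

    // ...and launch the body opposite to the hand's motion (while climbing the body
    // moves against the hand), so a fast pull-and-release throws the player upward
    // toward the next handhold.
    Character->LaunchCharacter(-HandVelocity, /*bXYOverride=*/true, /*bZOverride=*/true);
}
```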


Performance on Quest 2

All five prototypes share a standalone mobile GPU performance budget — the Quest 2 must render at 72Hz or 90Hz to maintain comfort, with no PC to offload rendering to. This shapes every decision:

- Draw calls must be minimized: instancing and merged meshes where possible.
- Dynamic shadows are expensive; many VR games on Quest 2 use baked lighting with minimal dynamic shadow casters.
- Physics simulation is limited; the throwing and collision physics in these prototypes are designed to be cheap: minimal simulated bodies, simple collision shapes, no cloth or fluid simulation.

The Oculus Performance HUD (accessible in the developer overlay) provides per-frame GPU and CPU timing that makes these costs visible during testing. Hitting the frame time budget is non-negotiable in VR — missed frames produce reprojection artifacts that break presence and cause discomfort.


Reflection

What connects all five prototypes is that each solves the same fundamental problem in a different way: how do physical hand movements in the real world translate into meaningful interactions in the virtual world? The hammer translates them into throwing and recall, the sword into swing velocity and damage, the bow into draw tension, and the paddle and climbing grips into locomotion.

This variety makes the VR prototypes collection more useful as a learning resource than any single prototype would be. Each interaction paradigm has its own implementation challenges, and working through all five builds a comprehensive mental model of VR input design that carries over to any VR project regardless of genre.
