If you have not previously implemented input motion controllers in Unreal Engine, see Unreal Engine’s Motion Controller Component Setup.
Before you start implementing hand tracking in your app, see the Hand Tracking Design Guidelines for terminology, best practices and interaction models when using hands as an input source in virtual reality.
There are also store guidelines for how hand tracking should be implemented in your app that you should be familiar with:
VRC.Quest.Input.5: Hands must render in the correct position and orientation, and must animate properly
VRC.Quest.Input.7: The application must properly respect when input is switched between controllers and hands
VRC.Quest.Input.8: The system gesture is reserved, and should not trigger any other actions within the application
The following image shows the architecture of the hand tracking implementation for Unreal Engine, and how input information from hands is routed using the same mechanism that routes controller input.
As shown in the diagram, device input remains the main source of input data for Unreal Engine. Device input routes hand input through the Unreal Engine input system the same way that controller button and stick input is routed. Pinches and pinch strength are also routed as hand input.
Hand-specific features like the mesh/skeleton and bone rotation are provided through the OculusHandTracking class, which is contained within the Input module. The OculusHandTracking class provides the Blueprint library as well as access to hand-specific data like hand scale, pointer pose, bone rotation, and more.
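As a sketch of what that library exposes from C++, the accessors below mirror the Blueprint nodes described above (hand scale, pointer pose, bone rotation). Treat the exact signatures and enum names as assumptions to verify against OculusHandTracking.h in your plugin version:

```cpp
// Sketch only: querying hand-specific data through the OculusInput module's
// UOculusHandTracking Blueprint library. Signatures may vary by plugin
// version -- verify against the header in your engine.
#include "OculusHandTracking.h"

void LogLeftHandData()
{
    // Uniform scale applied to the tracked hand relative to the reference hand.
    const float Scale = UOculusHandTracking::GetHandScale(EOculusHandType::HandLeft, 0 /*ControllerIndex*/);

    // Pointer pose: where a selection ray from the hand should originate.
    const FTransform Pointer = UOculusHandTracking::GetPointerPose(EOculusHandType::HandLeft, 0);

    // Per-bone rotation, e.g. the index fingertip.
    const FQuat IndexTip = UOculusHandTracking::GetBoneRotation(EOculusHandType::HandLeft, EBone::Index_Tip, 0);

    UE_LOG(LogTemp, Log, TEXT("Left hand scale: %f"), Scale);
}
```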
Turn On Hand Tracking in Your Project
You can turn on hand tracking in Unreal Engine in the Project Settings, which adds the com.oculus.permission.HAND_TRACKING entry to the Android manifest for your project.
Go to Edit > Project Settings, then go to Plugins and select OculusVR.
Under Hand Tracking Support, choose:
Controllers: Hand tracking will not be enabled for your app.
Controllers and Hands: A user can use hand tracking or controllers in your app.
Hands Only: A user must have hand tracking enabled on their device to use your app.
Setup
To set a high hand tracking frequency, go to Project Settings > OculusVR > Hand Tracking Frequency and select High.
Integration Details
The hand tracking integration for Unreal Engine includes the following features.
Updates to the Input Module
In summary, the input model has these additions for hand tracking:
The Input Module supports input from touch controllers and hand tracking.
The Input Module relays hand pinches and pinch strength through Unreal Engine’s input event system.
The module updates and stores new hand poses, which can be accessed through Blueprints or by the OculusHandComponent.
Specifically:
The FOculusHandState struct has been added. Similar to the controller-state structs, this struct provides the current hand state inputs and tracked state.
Pinch inputs are updated with key events and axes for pinches and pinch strength
New key names and axes for hands are registered in the Unreal Engine input system. These identify the fingers of each hand as Thumb, Index, Middle, Ring, and Pinky.
Pinches and pinch strength can be bound in the UE input settings so that their events can be associated with Blueprints or the OculusHandComponent. See Input Bindings for how to do this.
Updating Hand Pose
Use GetControllerOrientationAndPosition to get the root hand position.
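For reference, here is a minimal C++ sketch of polling the root hand pose through the IMotionController interface, the same path UMotionControllerComponent uses internally. The player index and world-to-meters scale below are assumptions:

```cpp
// Sketch: query the root hand pose via the IMotionController modular feature.
#include "IMotionController.h"
#include "Features/IModularFeatures.h"
#include "XRMotionControllerBase.h" // FXRMotionControllerBase::LeftHandSourceId

bool GetLeftHandRootPose(FRotator& OutOrientation, FVector& OutPosition)
{
    TArray<IMotionController*> Controllers =
        IModularFeatures::Get().GetModularFeatureImplementations<IMotionController>(
            IMotionController::GetModularFeatureName());

    for (IMotionController* Controller : Controllers)
    {
        // Player index 0; 100.0f is the default world-to-meters scale.
        if (Controller->GetControllerOrientationAndPosition(
                0, FXRMotionControllerBase::LeftHandSourceId,
                OutOrientation, OutPosition, 100.0f))
        {
            return true;
        }
    }
    return false;
}
```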
Input Bindings
You can bind to hand tracking inputs like pinches and pinch strength using Unreal Engine's input system. To create a new input binding with hand tracking:
Go to Edit > Project Settings, then find Engine > Input.
Under Action Mappings or Axis Mappings, add a new mapping.
For the mapping key value, search for the Oculus Hand category to bring up the various hand tracking input bindings. The following image shows an example:
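Once a mapping exists, it can be bound in C++ like any other input. In this sketch, AMyPawn is a hypothetical Pawn, and "IndexPinch" and "IndexPinchStrength" are example mapping names you would have created in the previous step:

```cpp
// Sketch: binding hypothetical hand tracking mappings in a Pawn.
// "IndexPinch" / "IndexPinchStrength" are example names created in
// Project Settings > Engine > Input and mapped to Oculus Hand keys.
#include "GameFramework/Pawn.h"
#include "Components/InputComponent.h"

void AMyPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Fires when the index pinch is registered, like a button press.
    PlayerInputComponent->BindAction("IndexPinch", IE_Pressed, this, &AMyPawn::OnIndexPinch);

    // Continuous 0..1 value, like a trigger axis.
    PlayerInputComponent->BindAxis("IndexPinchStrength", this, &AMyPawn::OnIndexPinchStrength);
}

void AMyPawn::OnIndexPinch() { /* e.g. select the object under the pointer pose */ }
void AMyPawn::OnIndexPinchStrength(float Strength) { /* e.g. drive a grab animation */ }
```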
Hand Tracking Blueprints
The Unreal Engine integration offers several resources, including a set of Blueprints.
The OculusHandComponent is the hand component provided by the OculusInput module. This component is a subclass of Unreal's UPoseableMeshComponent, and must be a child of a UMotionControllerComponent, which provides the tracking pose and late-update functionality for hands.
The component handles loading the mesh/skeleton as well as updating the bones. It also handles setting new materials for the hand and hiding hands when tracking is lost or confidence is low.
It provides options to update the root pose, update the root scale, set the pointer pose root, and enable physics capsules.
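A sketch of wiring this up in a Pawn's constructor follows. AMyHandPawn is hypothetical, and the property names (SkeletonType, MeshType, bInitializePhysics) are assumptions that should be checked against OculusHandComponent.h in your engine version:

```cpp
// Sketch: creating an OculusHandComponent under a motion controller so it
// inherits the tracked hand pose and late updates.
#include "MotionControllerComponent.h"
#include "OculusHandComponent.h"

AMyHandPawn::AMyHandPawn()
{
    RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

    // The motion controller provides the tracking pose for the left hand.
    LeftController = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
    LeftController->SetupAttachment(RootComponent);
    LeftController->MotionSource = FName("Left");

    // The hand component must be a child of the motion controller.
    LeftHand = CreateDefaultSubobject<UOculusHandComponent>(TEXT("LeftHand"));
    LeftHand->SetupAttachment(LeftController);
    LeftHand->SkeletonType = EOculusHandType::HandLeft; // assumed property name
    LeftHand->MeshType = EOculusHandType::HandLeft;     // assumed property name
    LeftHand->bInitializePhysics = true;                // enable physics capsules (assumed)
}
```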
The following image shows an example of these properties, and how to set them in Unreal Engine.
Handling System Gestures
If the user performs the system gesture to return to Home or access the menu, the gesture is surfaced through the OVRPlugin as an ovrpButton_Start signal and a status flag, similar to the user pressing the Home or menu button on a controller. You do not need special menu logic for hands in this case.
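Because the system gesture arrives like a menu button press, one input binding can cover both controllers and hands. In this sketch, "MenuPressed" is a hypothetical Action Mapping tied to the controller's menu button key, and AMyPawn is a hypothetical Pawn:

```cpp
// Sketch: a single menu binding that works for controllers and for the
// hand tracking system gesture, since both surface as the same button event.
#include "GameFramework/Pawn.h"
#include "Components/InputComponent.h"

void AMyPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);
    // "MenuPressed" is an example Action Mapping name, not a built-in key.
    PlayerInputComponent->BindAction("MenuPressed", IE_Pressed, this, &AMyPawn::ToggleInGameMenu);
}
```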
The following image shows the pinch gesture as well as the system gesture.
Dominant Hand
Dominant hand features are surfaced through Hand Status flags of the OVRPlugin. You can access this information by using the Blueprint function GetDominantHand.
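As a sketch, assuming GetDominantHand is exposed on the plugin's hand tracking Blueprint library (the owning class and exact signature may differ by plugin version):

```cpp
// Sketch: checking which hand the user has designated as dominant.
// The owning class and signature here are assumptions -- verify against
// the Blueprint library in your version of the OculusInput module.
#include "OculusHandTracking.h"

bool IsRightHandDominant()
{
    return UOculusHandTracking::GetDominantHand(0 /*ControllerIndex*/) == EOculusHandType::HandRight;
}
```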