Capsense provides logical hand poses when controllers are in use. It uses tracked controller data to drive a standard set of hand animations and poses for supported Meta Quest controllers. For consistency, Capsense provides the same hand and controller visuals seen in Home. Capsense supports two styles of hand poses:
Natural hand poses: These are designed to look as if the user is not holding a controller and is interacting naturally with their hands.
Controller hand poses: These are designed to be rendered alongside the controller itself. Capsense provides a different hand shape for each controller type; it currently supports the Quest 2, Quest 3, and Quest Pro controllers.
Benefits of Capsense
Benefit from a best-in-class logical hand implementation and future improvements instead of investing in a custom implementation.
Known limitations
When using Link on PC, controller pose data is unavailable while the controllers are not actively in use (for example, when they are lying on a table).
Compatibility
Hardware compatibility
Supported devices: Quest 2, Quest Pro, Quest 3, and all future devices.
Software compatibility
Meta XR Core SDK v62+
Feature compatibility
Fully compatible with Wide Motion Mode (WMM).
Capsense hands and body tracking through MSDK can run simultaneously, but each uses a different implementation for converting controller data to hand poses, so joint positions and orientations will differ slightly between the two.
Setup
The SDK package includes a native sample for this feature, titled XrHandDataSource.
XR_EXT_hand_tracking_data_source allows the application to create a hand tracker that can receive controller-generated hand pose data as well as data from the standard camera-tracked hands path. This is done by creating an XrHandTrackingDataSourceInfoEXT structure and passing it on the next chain of the create info provided to the xrCreateHandTrackerEXT call. When querying poses using xrLocateHandJointsEXT, the application can chain an XrHandTrackingDataSourceStateEXT structure into the output to receive information about which data source was used. The available data sources are:
XR_HAND_TRACKING_DATA_SOURCE_UNOBSTRUCTED_EXT:
This means that the tracker should use the hands tracked via the cameras for hand poses.
XR_HAND_TRACKING_DATA_SOURCE_CONTROLLER_EXT:
This means that the tracker should use the controller data to fill in the hand poses.
If both sources are provided to the hand tracker, then the runtime will use the camera tracked poses if available. Otherwise, it will try to use controller data to fill in the poses.
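The flow above can be sketched in native OpenXR code. This is a minimal sketch, not taken from the XrHandDataSource sample: it assumes an already-initialized XrInstance and XrSession, and that XR_EXT_hand_tracking and XR_EXT_hand_tracking_data_source were enabled at instance creation. The function name CreateCapsenseHandTracker is illustrative.

```c
#include <openxr/openxr.h>

// Sketch: create a hand tracker that accepts both camera-tracked and
// controller-generated hand data. The runtime prefers camera-tracked
// poses and falls back to controller data when hands are not visible.
XrHandTrackerEXT CreateCapsenseHandTracker(XrInstance instance, XrSession session) {
    XrHandTrackingDataSourceEXT dataSources[] = {
        XR_HAND_TRACKING_DATA_SOURCE_UNOBSTRUCTED_EXT,
        XR_HAND_TRACKING_DATA_SOURCE_CONTROLLER_EXT,
    };

    XrHandTrackingDataSourceInfoEXT dataSourceInfo = {XR_TYPE_HAND_TRACKING_DATA_SOURCE_INFO_EXT};
    dataSourceInfo.requestedDataSourceCount = 2;
    dataSourceInfo.requestedDataSources = dataSources;

    XrHandTrackerCreateInfoEXT createInfo = {XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    createInfo.next = &dataSourceInfo;  // chain the data source request
    createInfo.hand = XR_HAND_LEFT_EXT;
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;

    // xrCreateHandTrackerEXT is an extension function and must be loaded
    // through xrGetInstanceProcAddr before use.
    PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction*)&xrCreateHandTrackerEXT);

    XrHandTrackerEXT handTracker = XR_NULL_HANDLE;
    if (xrCreateHandTrackerEXT != NULL) {
        xrCreateHandTrackerEXT(session, &createInfo, &handTracker);
    }
    return handTracker;
}
```

Requesting both sources, as shown here, lets the runtime apply the fallback behavior described above without any per-frame logic in the application.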
XR_EXT_hand_joints_motion_range lets the application specify the constraints placed on the joint positions that are generated from controller data. The developer places an XrHandJointsMotionRangeInfoEXT structure on the next chain of the XrHandJointsLocateInfoEXT struct provided to the xrLocateHandJointsEXT call. The available options for the constraints are:
XR_HAND_JOINTS_MOTION_RANGE_UNOBSTRUCTED_EXT:
This is interpreted as providing the hands for natural/social usage. Poses provided on this path can intersect the controller geometry, so the hands shouldn’t be rendered at the same time as a controller.
XR_HAND_JOINTS_MOTION_RANGE_CONFORMING_TO_CONTROLLER_EXT:
This is a request for the hands to be wrapped around the controller geometry. When the hand poses from this path are rendered with the controller model, it should provide an immersive effect of seeing the user’s hands as they are actually placed.
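Putting the two extensions together, a per-frame query might look like the following. This is a hedged sketch, not the sample's code: it assumes the hand tracker was created with both data sources enabled and that xrLocateHandJointsEXT has already been loaded via xrGetInstanceProcAddr; the function name LocateControllerConformingJoints is illustrative.

```c
#include <openxr/openxr.h>

// Sketch: locate hand joints conforming to the controller geometry, and
// check which data source produced the pose.
void LocateControllerConformingJoints(PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT,
                                      XrHandTrackerEXT handTracker,
                                      XrSpace baseSpace, XrTime time) {
    // Request joints wrapped around the controller geometry, suitable for
    // rendering the hand together with the controller model.
    XrHandJointsMotionRangeInfoEXT motionRangeInfo = {XR_TYPE_HAND_JOINTS_MOTION_RANGE_INFO_EXT};
    motionRangeInfo.handJointsMotionRange = XR_HAND_JOINTS_MOTION_RANGE_CONFORMING_TO_CONTROLLER_EXT;

    XrHandJointsLocateInfoEXT locateInfo = {XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.next = &motionRangeInfo;  // chain the motion range request
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = time;

    // Chain a data source state struct on the output to learn which
    // source (camera or controller) was used for this frame.
    XrHandTrackingDataSourceStateEXT dataSourceState = {XR_TYPE_HAND_TRACKING_DATA_SOURCE_STATE_EXT};

    XrHandJointLocationEXT jointLocations[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = {XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.next = &dataSourceState;
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = jointLocations;

    if (xrLocateHandJointsEXT(handTracker, &locateInfo, &locations) == XR_SUCCESS &&
        locations.isActive && dataSourceState.isActive &&
        dataSourceState.dataSource == XR_HAND_TRACKING_DATA_SOURCE_CONTROLLER_EXT) {
        // Joints were generated from controller data, so it is safe to
        // render the hand together with the controller model here.
    }
}
```

An application that also supports controller-free hand rendering would instead pass XR_HAND_JOINTS_MOTION_RANGE_UNOBSTRUCTED_EXT when the controller model is hidden.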
Troubleshooting
How can I confirm Capsense is running on my headset?
In your headset, you should see either hands instead of controllers or hands holding controllers, and hand pose data should be available while the controllers are in use.
Can I evaluate the feature on my headset without changing my code?
No, using Capsense requires some code changes.