
Hand Visual

Updated: Nov 6, 2025
Hands are typically rendered using OVRSkeleton or OVRCustomSkeleton. However, because those components visualize hand data read directly from OVRHand, they won't reflect hand data that has been modified.
To solve this, Interaction SDK renders hand data through the HandVisual component, which takes an IHand as its data source. Because the source is an interface rather than OVRHand itself, you can place several HandVisual components in a scene and attach each one to a different stage of the hand data modifier chain. For instance, you can visualize raw hand tracking data and synthetic hand data at the same time by creating two HandVisual components and pointing each at the corresponding hand modifier.
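The data-source pattern described above can be sketched in a small component of your own. This is an illustrative sketch, not the HandVisual implementation: it assumes the Interaction SDK's `Oculus.Interaction.Input.IHand` interface (its `WhenHandUpdated` event and `GetJointPose` method) and the SDK's `[Interface]` serialization attribute; `JointGizmoVisual` is a hypothetical name.

```csharp
using UnityEngine;
using Oculus.Interaction;
using Oculus.Interaction.Input;

// Hypothetical sketch: a component that, like HandVisual, consumes any
// IHand data source instead of reading OVRHand directly. Assigning a
// different stage of the modifier chain visualizes that stage's data.
public class JointGizmoVisual : MonoBehaviour
{
    // Assign the raw hand, a synthetic hand, or any other IHand here.
    [SerializeField, Interface(typeof(IHand))]
    private UnityEngine.Object _hand;
    private IHand Hand => _hand as IHand;

    private void OnEnable() => Hand.WhenHandUpdated += OnHandUpdated;
    private void OnDisable() => Hand.WhenHandUpdated -= OnHandUpdated;

    private void OnHandUpdated()
    {
        // Read whatever pose data this stage of the chain produced.
        if (Hand.GetJointPose(HandJointId.HandIndexTip, out Pose pose))
        {
            Debug.DrawRay(pose.position, pose.forward * 0.01f, Color.green);
        }
    }
}
```

Two instances of a component like this, one wired to the raw hand and one to a synthetic hand, would display both data streams at once, mirroring the two-HandVisual setup described above.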
Note: If you are just getting started with this Meta XR feature, we recommend that you use Building Blocks, a Unity extension for Meta XR SDKs, to quickly add features to your project.

Learn more

Design guidelines
