Mobile OpenXR Samples

Updated: Jan 23, 2025
The Mobile OpenXR SDK includes a set of sample projects that demonstrate virtual reality app development on the Android platform and showcase high-performance virtual reality experiences on mobile devices. The native OpenXR sample projects are located in the \ovr_sdk_mobile\XrSamples folder of the Mobile OpenXR SDK.

Sample Applications and Media

The sample projects included with the SDK are provided as a convenience for development purposes. Some of these projects are similar to apps available for download from the Meta Horizon Store. Due to the potential for conflict with these versions, we do not recommend running these sample apps on the same device on which you have installed your retail experience.
Note: Due to limitations of Android ndk-build, the path to your Oculus OpenXR Mobile SDK must not contain spaces. If you have placed the SDK in a path or folder name that contains spaces, move or rename the folder before building the samples.
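For example, assuming a POSIX shell and a hypothetical install location (the paths below are illustrative, not the SDK's actual defaults), the fix is a simple rename to a space-free path:

```shell
# ndk-build cannot handle spaces in the SDK path, so move the SDK
# to a path without spaces before building the samples.
# "/tmp/Oculus SDK" stands in for wherever you unpacked the SDK.
mkdir -p "/tmp/Oculus SDK/ovr_sdk_mobile"
mv "/tmp/Oculus SDK/ovr_sdk_mobile" /tmp/ovr_sdk_mobile
ls /tmp/ovr_sdk_mobile >/dev/null && echo "SDK moved to a space-free path"
```

On Windows, the equivalent is moving the folder out of a location such as one containing spaces in its name before running the build.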
The following samples are available:
  • XrHandsFB: Demonstrates how to use hand tracking to drive simple pointer-based input and provide visual feedback for hand meshes and simple skinning.
  • XrHandsAndControllers: Demonstrates how to set up simultaneous hand and controller tracking and detect when controllers are held, in order to build a multimodal interaction experience.
  • XrCompositor_NativeActivity: A simple scene using the Android NativeActivity class to illustrate the use of the different layer types available by way of OpenXR.
  • XrPassthrough: Demonstrates the use of the Passthrough API. Uses controller input to switch between features and styles, such as basic passthrough, still and animated styles, and masked (selective) and projected (onto a mesh) passthrough.
  • XrColorSpaceFB: An educational sample about color spaces that demonstrates how to use XR_FB_color_space to specify what color space an app is authored for.
  • XrControllers: Shows how to access each of the new input actions on the Meta Quest Touch Pro controller through OpenXR and provides examples of using TruTouch advanced haptics APIs.
  • XrDynamicObjects: Provides example usage of the Dynamic Object Tracker API, showing how keyboards can be tracked and presented via passthrough on Meta Quest 3 and beyond.
  • XrSpatialAnchor: Demonstrates the capabilities of our Spatial Anchor system and provides example code for handling, maintaining, and sharing Spatial Anchors that you can use in your own project.
  • XrSpaceWarp: Demonstrates Application SpaceWarp, which is a feature that achieves a step function improvement in both performance and latency.
  • XrBodyTrackingFB: Shows the skeleton joints of your body, arms, and hands, drawn with the corresponding joint coordinate frames overlaid at each joint.
  • XrEyeTrackingSocialFB: Shows capturing eye gaze data for both eyes.
  • XrFaceTrackingFB: Demonstrates how the weights of the corresponding blendshapes change when you move facial regions such as the mouth, cheeks, and eyes.
  • XrSceneModel: Demonstrates a scene-aware experience that enables rich interaction with the user’s environment, such as floor, walls, and furniture.
  • XrVirtualKeyboard: Demonstrates how to use Virtual Keyboard where the application has control over keyboard positioning, interaction, and rendering.
  • XrMicrogestures: Shows when the user has correctly performed a microgesture thumb swipe (left, right, backward, or forward) or a thumb tap.