Input Actions are used to query devices such as a stylus or a controller for input values, like the value of a squeezable grip pad, the press of a button, or the position of the device, or to trigger haptic feedback. These actions must be set up in the Input Actions menu before they can be queried from scripts. The action definitions follow the OpenXR action specification.
By the end of this guide, you should be able to:
Get the pose of certain devices.
Get the state of buttons, thumbpads and triggers.
Determine which devices are connected.
Trigger haptic feedback.
Note
Input Actions are used to add support for certain new controllers and devices. The recommended way for developers to use Input Actions in Unity is to obtain an SDK package from the developer of the specific device you are looking to integrate.
Note
Not every controller is supported on Meta headsets. Check if your specific device is supported before trying to access it with Input Actions.
Prerequisites
Input Actions require Meta Core SDK v68 or newer for apps running on your headset, and v69 for Link support.
How do Input Actions work?
Defining Input Actions
Input Actions are defined in the Input Actions menu, accessible in the Project Settings window within Unity under Edit > Project Settings... > Meta XR > Input Actions.
Existing Devices
The device you are using may have come with an SDK or sample which includes an “Input Action Set” which already contains the appropriate action definitions. You can link that Input Action Set asset in the Input Action Sets area of the Input Actions Menu.
New Devices
If you don’t have an Input Action Set, you can create your own list of supported actions in the Input Action Definitions list.
Input Action Definitions have:
An Interaction Profile: this string specifies which device the actions should be used with, e.g. /interaction_profiles/oculus/touch_controller is the interaction profile for the Meta Touch Controller.
Each new Action needs to have:
Action Name: A name used to identify the action in code, e.g. “Tip”, “Force”, “Front Button”.
Action Type: The type of value this action returns. Actions can also be used to trigger haptic feedback via the Vibration action type.
Action Paths: These identify which input on the device this action should return. For example, /user/hand/left/input/a/click would indicate this action should return true if the A button is pressed.
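Putting these together, a single definition entered in the Input Actions menu might look like the following. The values are taken from the examples above and are illustrative only:

```
Interaction Profile: /interaction_profiles/oculus/touch_controller
Action Name:  Front Button
Action Type:  Boolean
Action Path:  /user/hand/left/input/a/click
```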
Get Action values
To query the actions you’ve defined, you can use one of several functions within OVRPlugin.
These functions return true if the query was successful, or false if an error was encountered, such as an invalid action name or an unsupported path.
This will return the interaction profile of the device currently held in the specified hand, allowing you to determine whether that device is held. Note that a device can fulfill multiple interaction profiles, and may fall back to one that is supported by your application even if it is not that exact device.
For example, a touch pro controller may instead be bound to the interaction profile of a touch controller if the application does not specifically support touch pro controllers. This can increase the range of devices your application can support, but can also lead to unexpected behavior.
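As a sketch, a per-frame query might look like the code below. The method names GetActionStateBoolean and GetCurrentInteractionProfileName are assumptions for illustration, as the source does not list the exact OVRPlugin functions; check the API for your Core SDK version (v68 or newer) for the real names and signatures.

```csharp
using UnityEngine;

public class InputActionQuery : MonoBehaviour
{
    void Update()
    {
        // NOTE: the OVRPlugin method names below are hypothetical;
        // verify them against your SDK version before use.

        // Query a Boolean action named "Front Button" in the Input
        // Actions menu. The call returns false if the action name is
        // invalid or the bound path is unsupported.
        if (OVRPlugin.GetActionStateBoolean("Front Button", out bool pressed) && pressed)
        {
            Debug.Log("Front button is pressed");
        }

        // Check which interaction profile the left-hand device is
        // currently bound to, e.g. to detect that an unsupported device
        // has fallen back to the touch controller profile.
        if (OVRPlugin.GetCurrentInteractionProfileName(OVRPlugin.Hand.HandLeft, out string profile))
        {
            Debug.Log("Left hand profile: " + profile);
        }
    }
}
```

Because the return value signals success rather than the input state itself, always check it before using the out parameter.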
Haptic feedback
Input Actions can also be used to trigger haptic feedback. By calling OVRPlugin.TriggerVibrationAction you can trigger the matching device to vibrate.
Duration is the duration of the vibration in seconds.
Amplitude is the intensity of the vibration, normalized in a 0 - 1 range.
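For example, a minimal sketch of triggering a pulse is shown below. OVRPlugin.TriggerVibrationAction is named above, but its parameter list here (action name, duration, amplitude) is assumed from the parameter descriptions, so verify the exact signature in your SDK version; "Haptic" is a hypothetical Vibration-type action name.

```csharp
using UnityEngine;

public class HapticPulse : MonoBehaviour
{
    public void Pulse()
    {
        // "Haptic" is a hypothetical Vibration-type action defined in
        // the Input Actions menu. Duration is in seconds; amplitude is
        // normalized 0 - 1. The real signature may take additional
        // parameters (e.g. the target hand); check OVRPlugin.
        OVRPlugin.TriggerVibrationAction("Haptic", 0.1f, 0.5f);
    }
}
```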
Troubleshooting
One common cause of errors when using Input Actions is binding to incorrect or unsupported paths. If you are unable to get results for actions, check that the paths you have entered exactly match those in the OpenXR specification for your device. If you are running on Android, you may be able to find more information about specific errors through Logcat.
Not every device can be used with Input Actions. Currently, only Meta first-party controllers and the MX Ink Stylus are supported.
Unsupported controllers
Unsupported controllers may bind to several different interaction profiles and can still be accessed by creating Input Action Definitions for a more widely supported interaction profile. For example, if the application has not specifically created Input Actions for the MX Ink interaction profile, the stylus will instead appear to the app as a touch controller and use a touch controller interaction profile. Using this method, you may be able to gain support for devices not explicitly supported by this system.
Learn more
Related topics
Now that you know how to define and query Input Actions, the following links may provide more information about Actions.
The OpenXR Action Tutorial can provide more information on OpenXR actions and how they are used.
The OpenXR Interaction Profiles specification can provide examples of Interaction Profiles and more information about how they are determined and bound when using different controllers.