
Controllers

Updated: Feb 4, 2026
Explore the use of controllers to improve user experiences in immersive applications. This page delves into controllers as an input method, outlining their benefits, challenges, capabilities, operational mechanics, best practices, and design principles.

Usage

Controllers are handheld input devices that extend the input capabilities of hands.
Equipped with buttons, joysticks, and various other controls, these devices enable a wider range of interactions and allow for greater precision.

Terminology

Here are the various components, key characteristics, and commonly used terms you should know:
Controller terminology
1. Trigger buttons
  • Located on the front of the controller.
  • Primarily used for selection and triggering actions.
  • They are pressure-sensitive, ensuring a responsive experience.
  • Quest Pro Controllers feature additional touch sensitivity, offering touch motion interaction options.
2. Y, X, A and B buttons
  • These are the face buttons on the controller.
  • These buttons can be used for triggering different application/context actions, depending on the design.
3. Thumbstick
  • Each controller features a thumbstick.
  • It enables smooth manipulation, such as scrolling in a UI or navigating within a virtual environment.
  • The thumbstick can also be pressed down to act as an additional button.
4. Grip buttons
  • Located on the sides of each controller.
  • Simulate gripping actions, such as holding onto interactables. Some apps use them for different purposes, like triggering an action.
5. Menu button
  • Located on the left controller.
  • This button is designed to open and close the menu of the active application, allowing access to settings, options, and other features specific to that app.
6. Meta button
  • Located on the right controller.
  • This button is used to open and close the Quest universal menu, allowing users to adjust settings, view notifications, and change or close apps.

Technology

This section offers insights into the technology behind controllers, focusing on their functionality, accuracy, calibration, and inherent technological limitations, along with strategies to mitigate these limitations and enhance the design of effective interactions.

How it works

Meta Quest controllers are equipped with a variety of sensors, including an accelerometer and gyroscope, which continuously monitor their movement, orientation, and position. Additional sensors such as cameras and an array of infrared LEDs enhance the precision of tracking in 3D space. The Meta Quest headset’s (HMD) own sensors also play a crucial role in accurately determining the controllers’ location and movements. In addition to spatial tracking, the controllers feature buttons, triggers, and thumbsticks that enable users to interact with virtual content in diverse ways. These inputs are wirelessly transmitted to the Meta Quest headset and processed to produce corresponding actions.
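As a simplified illustration of this sensor fusion, a one-dimensional complementary filter blends fast but drifting gyroscope integration with slower, drift-free camera-based orientation measurements. This is an illustrative sketch only, not Meta's actual tracking code:

```python
# Illustrative 1-D complementary filter: the gyroscope gives low-latency
# motion that accumulates drift, while the camera provides an absolute but
# slower reference that continuously corrects it.

def fuse(prev_angle, gyro_rate, camera_angle, dt, alpha=0.98):
    """Blend integrated gyro motion with an absolute camera reading."""
    predicted = prev_angle + gyro_rate * dt  # fast prediction, drifts over time
    # Keep most of the prediction, but pull toward the camera measurement.
    return alpha * predicted + (1 - alpha) * camera_angle
```

Real tracking stacks fuse full 6DOF state with Kalman-style filters, but the same principle applies: inertial data for responsiveness, optical data for long-term stability.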
Various factors can influence the accuracy of the controllers, including environment variability. For further details on these challenges and the strategies used to mitigate them, refer to the section below on limitations and mitigations. Understanding these factors is crucial for designing effective and immersive user experiences.
The technology behind the controllers has evolved over time, resulting in different capabilities and best practices. To date, our controllers fall into two main groups:
6DOF controllers These controllers have six degrees of freedom (6DOF), enabling both orientation and positional tracking (roll, pitch, and yaw rotation, plus movement along the x, y, and z axes). This allows controllers to function as virtual hands, interacting spatially with the virtual world.
These controllers shipped with PC-based HMDs, such as the Rift.
Self-tracked controllers Self-tracked controllers are 6DOF controllers that use onboard sensors to track their position and orientation in 3D space. They do not require external sensors or cameras to operate, making them more convenient and flexible than other types of controllers.
These controllers ship with standalone HMDs, such as Meta Quest, and offer a more immersive and interactive experience for users.

Accuracy

Controller tracking can deliver sub-millimeter precision through the use of both classical computer vision and machine learning models. Machine perception cameras identify and triangulate the infrared LEDs on the controller, providing sub-pixel accuracy, while machine learning models are trained to estimate the pose of the controller when LEDs are occluded or lighting conditions are difficult. These systems leverage each other to maximize controller performance.
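The triangulation step can be illustrated with the classic pinhole stereo relation, where an LED's depth follows from the pixel disparity between two cameras a known baseline apart. This is a deliberate simplification; the real system uses multiple cameras and sub-pixel refinement:

```python
# Simplified stereo triangulation: two cameras separated by a known
# baseline observe the same infrared LED at slightly different pixel
# positions. The depth follows from the disparity between those pixels.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo relation: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px
```

Note how depth precision degrades as disparity shrinks: distant LEDs produce small disparities, so a fraction of a pixel of measurement error translates into a larger depth error, which is one reason sub-pixel LED localization matters.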

Limitations and mitigations

Every technology comes with its own set of limitations and challenges that must be addressed in order to ensure optimal performance and usability.

Tracking volume

For Meta Quest 1/2/3 controllers, there is a fundamental limitation based on the tracking volume defined by the headset cameras. Primarily, the headset needs to see some portion of the controllers in order to estimate their pose, especially when starting tracking.
Controller tracking
Mitigation - Design: Controllers include an inertial measurement unit (IMU) that can estimate position and orientation for a short period when the controller is outside the FOV of the tracking cameras; however, this estimate drifts quickly, suffering a rapid degradation in accuracy. Self-tracked controllers do not suffer from this limitation.
Mitigation - Code: For use cases that are known to stress the position/orientation of the controller with respect to the tracking volume, several physics-based constraint systems are used to assist with controller tracking.
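The rapid IMU drift mentioned above can be illustrated numerically: a small constant accelerometer bias, integrated twice (acceleration to velocity to position), produces a position error that grows roughly with the square of time. A toy sketch:

```python
# Toy illustration of IMU dead-reckoning drift (not production code):
# a constant accelerometer bias is integrated twice per time step, so the
# resulting position error grows roughly quadratically with elapsed time.

def drift_error(bias_mps2, dt, steps):
    """Accumulated position error from a constant accelerometer bias."""
    velocity = 0.0
    position = 0.0
    for _ in range(steps):
        velocity += bias_mps2 * dt   # first integration: velocity error grows
        position += velocity * dt    # second integration: position error compounds
    return position
```

With a bias of just 0.05 m/s² the position error roughly quadruples when the out-of-view time doubles, which is why camera-based corrections are needed within a second or two.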

Lighting conditions

While controllers are fairly robust to a wide variety of lighting conditions, specific scenarios can cause degradation in tracking quality. For Quest 1/2/3 controllers, the most impactful conditions are high-intensity lighting and direct sunlight. This is primarily because intense sunlight washes out the infrared LEDs; particularly bright indoor lighting can also wash them out.
Lighting levels
Mitigation - Design: Our devices adapt to varying light conditions, with continuous improvements through updates and new models. To ensure optimal functionality, advise users to be mindful of their environment’s lighting, for example through a note at application launch.

Design

Learn about input primitives, grasp the design principles, take into account comfort factors, and explore essential dos and don’ts.

Interactions

Discover the various input capabilities and interaction methods that use controllers as an input modality:
Targeting
Target objects using controllers in two ways: directly by touching (colliding with) an interactable, or indirectly by using a ray cast.
Selection
Let the user choose or activate a targeted interactable with the controller, for example by using the trigger button.
  • Trigger buttons: Select, accept
  • Grip buttons: Grab
  • Thumbstick: Menu navigation, locomotion
  • X/A: Select, accept
  • Y/B: Back out, cancel
  • Meta button: Open or close the Quest universal menu. This is an OS-level feature and is not customizable.
  • Menu button: Bring up or dismiss a pause or in-app menu.
These guidelines describe the recommended default behavior for most applications and situations. If a situation doesn't work well with these conventions, the application should choose whatever mapping makes the most sense.
For a comprehensive overview of all input modalities and their corresponding Input Primitives, please visit the following page: Input primitives
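As an illustration of the indirect (ray-cast) targeting described above, a minimal hit test between a controller ray and a spherical interactable can be sketched as follows. All names here are hypothetical, not a specific SDK API:

```python
# Illustrative ray-vs-sphere hit test for indirect targeting: does a ray
# from the controller pass within `radius` of an interactable's center?

def ray_hits_sphere(origin, direction, center, radius):
    """Return True if a normalized ray from `origin` hits the sphere."""
    # Vector from the ray origin to the sphere center.
    oc = [c - o for o, c in zip(origin, center)]
    # Projection of that vector onto the (unit-length) ray direction.
    t = sum(a * b for a, b in zip(oc, direction))
    if t < 0:
        return False  # interactable is behind the controller
    # Closest point on the ray to the sphere center.
    closest = [o + t * d for o, d in zip(origin, direction)]
    dist_sq = sum((p - c) ** 2 for p, c in zip(closest, center))
    return dist_sq <= radius ** 2
```

Real interaction frameworks typically ray-cast against colliders and return the nearest hit, but the geometric core is the same closest-point test.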

Design principles

Explore the fundamental concepts that shape intuitive and user-friendly interactions for controller input modalities.

6DOF controller best practices

The recommendations and best practices in this section are specific to 6DOF controllers, such as the Meta Quest and Meta Touch controllers.
Input mapping
Input mapping involves assigning specific functions to different controls on a controller. To avoid forcing users to relearn controls or repeatedly look at their controllers, it's crucial to align functions with common practices so that the controls feel intuitive. For guidelines on mapping input functions to maintain consistency across a wide array of Meta Quest titles, please refer to the input primitives page.
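One common way to keep mappings consistent yet remappable is an indirection layer: gameplay code asks for abstract actions rather than raw buttons, and a default binding table follows platform conventions. The button and action names below are illustrative, not an SDK enum:

```python
# Hypothetical remappable input layer: gameplay queries abstract actions
# ("select", "grab") instead of hardware buttons, so defaults can follow
# platform conventions while remaining user-remappable.

DEFAULT_BINDINGS = {
    "trigger": "select",
    "grip": "grab",
    "thumbstick": "locomotion",
    "a": "select",
    "b": "cancel",
}

def action_for(button, bindings=DEFAULT_BINDINGS):
    """Resolve a hardware button to its bound action, or None if unbound."""
    return bindings.get(button)
```

Because gameplay code never hard-codes button names, swapping a user's custom binding table in for `DEFAULT_BINDINGS` requires no other changes.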
Multimodal
Multimodal interactions refer to providing multiple input modalities, such as controllers, hands, voice, and more, for the user to interact with. By incorporating multiple modalities into an application, users can naturally choose the interaction method that serves them best in a given moment. This creates a more seamless and enjoyable experience, as well as increased accessibility.
Accessibility
Users may be left-handed, right-handed, or ambidextrous. To accommodate all users, ensure that interactions can be performed with either hand when two controllers are available. Ideally, plan to support multimodal interactions, allowing for single-controller usage. Exercise caution with two-hand interactions. For further guidance, refer to the accessibility page.
Controller representation
  • Choose hand models that users can customize to increase comfort. Avoid overly realistic models that don't match the user's physical hands. Semi-transparent or robotic hands are generally more acceptable as they cater to a diverse range of users. See the hand representation page for more guidance.
  • Avoid animations that move the hands without user input, except for minimal, expected animations like gun recoil.
  • Implement object grabbing by snapping the object into the correct alignment when gripped.
Movement representation
Maintain a 1:1 ratio between the movement of the user's controller in the physical world and the movement of its virtual representation, whether rotational or translational. If you exaggerate the user's movement in fully immersive experiences, make the exaggeration large enough (for example, 4x) that it is readily obvious it is not a natural sensory experience.
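This mapping (and a deliberately obvious exaggeration) can be sketched as a simple gain applied to the physical displacement. The function is illustrative, not an SDK call:

```python
# Illustrative movement mapping: virtual hand motion tracks the physical
# controller 1:1 by default; any amplification should use an obviously
# unnatural gain (e.g. 4x) rather than a subtle one.

def virtual_delta(physical_delta, gain=1.0):
    """Map a physical controller displacement (x, y, z) to virtual space."""
    return tuple(gain * d for d in physical_delta)
```

With the default `gain=1.0` the virtual hand exactly mirrors the physical controller; subtle gains like 1.2x are the ones most likely to feel uncanny.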
Haptic feedback
Incorporate haptic feedback to enhance the realism of interactions and confirm user actions. For more detailed guidance, refer to the haptics page.
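As a hypothetical example of confirming an action, an app might drive the platform's haptics API with a short decaying "click" envelope, sending one amplitude value per frame. The envelope generator below is illustrative only:

```python
# Hypothetical haptic "click" envelope: a short burst whose amplitude
# decays over a few frames. A real app would feed each value to the
# platform's haptics API on successive updates.

def click_envelope(frames=4, start=1.0, decay=0.5):
    """Return per-frame amplitudes (0..1) for a brief confirmation pulse."""
    amp, out = start, []
    for _ in range(frames):
        out.append(round(amp, 3))
        amp *= decay
    return out
```

Short, sharply decaying pulses read as discrete confirmations, while longer, flatter envelopes read as continuous texture or resistance.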

Comfort

Correct hand

The hardware is ergonomically designed to ensure maximum comfort for the hands. To enhance user experience, visually indicate which controller is intended for the left or right hand. This is particularly beneficial in scenarios where controllers are set down and picked up frequently, ensuring users consistently select the correct controller for the appropriate hand.

Dos and don’ts

Below is a list of recommended practices when using controllers in an immersive experience:
DO Use the Menu Button (≡) for Menus: This is strongly recommended, as it provides a more consistent experience across titles.
DO Implement Haptic Feedback: Use haptic feedback to enhance the realism of interactions and provide confirmation of user actions, which helps in creating a more immersive experience. See haptics for more.
DO Maintain Consistent Input Mapping: Align controller functions with common practices across different applications to minimize the learning curve and make the interaction intuitive. See input maps for more.
DON'T Rely Solely on Controllers for All Interactions: While controllers are a primary input method, they should not be the only modality for interaction. Ensure accessibility by incorporating alternative input methods that cater to different user needs and contexts.

Next steps

Designing experiences

  • Input Modalities: Discover all the various input modalities.
  • Hands: Examine hands-based input methods.
  • Head: Examine head-based input methods.
  • Voice: Learn how to design voice-enabled experiences.
  • Peripherals: Learn how to design experiences that leverage peripherals.