Design

Microgestures

Updated: Jan 29, 2026
Microgestures (MG) are smaller than regular gestures. They involve subtle, indirect, thumb-based movements that do not target specific objects. These low-effort gestures speed up navigation and provide shortcuts. This page explains how microgesture design principles expand hand-based input to enable comfortable, lean-back interaction.

What is a microgesture?

Microgestures are a set of small, thumb-based gestures that are part of the Input Hierarchy within Hands. The side of the index finger guides the thumb’s placement. This allows the thumb to rest directly on the finger, creating a pseudo-haptic interaction experience.
Thumb Tap

The thumb briefly touches the side of the index finger.

Thumb Double Tap

A quick sequence of two thumb taps in a row. Tap-tap.

Horizontal Swipes

A swift movement of the thumb along the side of the index finger, from left-to-right or right-to-left.

Vertical Swipes

A swift up-and-down motion of the thumb against the side of the index finger.

Terminology

The following characteristics and frequently used terms surrounding microgestures are helpful to become familiar with:
CV
Computer Vision (CV) is the camera-based visual sensing method used to detect microgestures, as well as all other gestures.
Discrete
Refers to a distinct, recognizable action or state. Current microgestures act as discrete inputs, firing once upon the gesture's completion (on tap, not on release), as opposed to a stateful input.
Continuous
Refers to an ongoing action or state. A continuous input persists over time, as opposed to a discrete input. Currently, microgestures are discrete inputs.
Indirect
Indirect interactions occur when users engage with interactables without direct contact. Examples include microgestures, raycasting, gaze targeting, and voice commands.
D-pad
A D-pad (directional pad) is a flat, typically thumb-operated, four-way control found on game controllers, remote controls, and other electronic devices. It allows for input in four directions: up, down, left, and right. Microgestures are comparable to a directional pad.
Non-targeted
In interaction design, "non-targeted" refers to an interaction that is not directed at a specific UI element. It is often described as a directional signal that can be interpreted by the system for general awareness or background processes, rather than for direct manipulation of an object.
Lower effort interaction
Refers to an interaction that requires minimal physical or cognitive exertion from the user to achieve a desired outcome.
Shortcut
This refers to enabling users to move through content or virtual spaces more quickly and efficiently, often by reducing the number of steps or physical effort required.

Technology

Microgestures can be detected using Computer Vision (CV) with camera sensors on Meta Quest headsets. The set of microgestures remains the same for all users, regardless of the technology used.

Computer vision (CV)

Pros

  • No extra hardware needed, can leverage existing Meta Quest headsets.
  • Detects hand position, so it can be used as a continuous targeting option.

Cons

  • The field of view is often a limitation for gesture detection.
  • Users must keep their hands higher to keep them in the field of view. This sometimes increases physical effort.

Design

Familiarize yourself with the input primitives; this will help you understand the design principles, consider the ergonomic factors, and discover the key microgesture dos and don'ts.

Interaction primitives

To be an effective input modality, microgestures need to be capable of performing the following interaction primitives:

Selection for system navigation and shortcuts

Selection is not performed with microgestures. Instead, it uses the Index-tap gesture as the primary input, for consistency and reliability. This approach frees microgestures like Thumb-tap and double Thumb-tap for contextual actions and shortcuts.
Index-tap for Selection
The Index-tap gesture, performed with the index finger, is the recommended and standard method for selection, including while using microgestures. It is taught as the primary selection method for hands, offering reliable interaction and a continuous signal. Do NOT use Thumb-tap for selection; Thumb-tap is reserved for contextual actions.

System navigation best practices

Do's

  • Use Index-tap to select and hold for continuous control.
  • Use Index-tap to select and microgesture swipes to scroll.

Don'ts

  • Use Thumb-tap for select and hold.
  • Use Thumb-tap for selection.

Contextual actions

Thumb-tap for contextual actions:
Since the pinch is designated for selection, the Thumb-tap microgesture can be used effectively for contextual actions. These include locomotion, media controls (for example, play/pause and mute/unmute), and other application-specific shortcuts. This makes Thumb-tap a versatile, low-effort input option.
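This split between selection and contextual actions can be sketched as a small event router. A hedged illustration: the event names (`index_pinch`, `thumb_tap`), the context keys, and the handler registry are assumptions for the sketch, not Meta SDK identifiers.

```python
def make_router(contextual_actions):
    """contextual_actions maps an app context name to a zero-arg handler."""
    def route(event, context):
        if event == "index_pinch":      # reserved for selection
            return "select"
        if event == "thumb_tap":        # free for contextual shortcuts
            handler = contextual_actions.get(context)
            return handler() if handler else None
        return None                     # unrecognized gestures are ignored
    return route

# Example registry: Thumb-tap means different things in different contexts.
route = make_router({
    "media_player": lambda: "toggle_play_pause",
    "idle_hand": lambda: "teleport",
})
```

Routing by context is what keeps Thumb-tap "versatile": the same low-effort gesture triggers play/pause in a media app and teleportation when the hand is idle, without ever colliding with selection.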

Contextual actions best practices

DO use Thumb-tap for teleportation.
DO use Thumb-tap to play/pause a music player.

Targeting

Microgestures are “non-targeted” discrete interactions that enable navigation through directional signals, similar to D-pads. There are two types of cursor designs: static and dynamic.
  • Dynamic Cursor: The user moves the cursor to the target.
  • Static Cursor: The user moves the content beneath a static cursor.
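The two cursor designs can be illustrated with a toy one-dimensional model: a dynamic cursor travels across fixed content, while a static cursor stays put and the content shifts underneath it. All names here are hypothetical, not SDK API.

```python
def move_selection(cursor, content_offset, delta, style):
    """Apply one swipe of magnitude `delta` under the given cursor style."""
    if style == "dynamic":      # cursor travels toward the target
        return cursor + delta, content_offset
    if style == "static":       # content shifts under a fixed cursor
        return cursor, content_offset - delta
    raise ValueError(f"unknown cursor style: {style}")
```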

Targeting best practices

Avoid combining microgestures with hand-ray or head-ray targeting, as users can find these combinations fatiguing and unintuitive.

Paging and scrolling

Browsing content can follow two different methods: paging and scrolling, either vertically or horizontally. However, because the current input is discrete rather than continuous, only paging, as step-based navigation, is recommended.
Paging:
Moves through content in discrete steps or pages. Each swipe or gesture takes a user to the next page, making it clear where one section ends and the other begins. This method uses discrete input and is ideal for content with distinct sections like media feeds or slideshows.
Scrolling:
Continuous, smooth navigation through content. The user's thumb location is mapped to the content's location as it scrolls, so users can move fluidly through large amounts of information without breaks. This method uses a continuous input and suits content without clear endings, like web browsers or articles.
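Because current microgestures are discrete, paging reduces to stepping a page index once per swipe, clamped so swiping past either end is a no-op. A minimal sketch, assuming hypothetical swipe event names:

```python
def next_page(current, swipe, page_count):
    """Advance one page per discrete swipe, clamped to the valid range."""
    step = {"swipe_right": 1, "swipe_left": -1}.get(swipe, 0)
    return max(0, min(page_count - 1, current + step))
```

Clamping rather than wrapping makes the boundaries of the content obvious, which matches paging's strength: the user always knows where one section ends and the next begins.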

Paging and scrolling best practices

Do's

  • Use Thumb-swipes for paging.
  • Use Index-hold for scrolling.
  • Use the discrete version of microgestures.

Don'ts

  • Use microgestures for scrolling.
  • Use Thumb-swipes for scrolling.

Locomotion

Microgestures can be used for locomotion, allowing users to navigate through virtual space with low effort. They are similar to using a thumbstick on controllers, making them a natural choice for locomotion. Since current microgestures capture only discrete interaction (on tap, not release) and do not capture continuous interaction, they currently support only a subset of locomotion types.
To enable locomotion, initiate a single thumb-tap when your hand is in a neutral or idle state (not raycasting on a UI). This action will activate the locomotion method your users have selected in the user preferences of your app:
Teleport
  • Teleport: Move your arm around to aim at your chosen next location and tap to teleport.
  • Snap turn: Using your right hand, swipe your right thumb left/right to turn in the desired direction.
  • Steps: Using your left hand, swipe your left thumb left/right/up/down to move a step in the desired direction.
  • Crouch: Using your right hand, swipe your right thumb down.
Telepath
  • Telepath: Move your arm around to aim at your chosen next location and tap to walk.
  • Snap turn: Using your right hand, swipe your right thumb left/right to turn in the desired direction.
  • Steps: Using your left hand, swipe your left thumb left/right/up/down to move a step in the desired direction.
  • Crouch: Using your right hand, swipe your right thumb down.
  • Jump: Swipe your right thumb up.
Walkingsticks
  • Walkingsticks: Walk by swinging your hands/arms.
  • Snap turn: Using your right hand, swipe your right thumb left/right to turn in the desired direction.
  • Steps: Using your left hand, swipe your left thumb left/right/up/down to move a step in the desired direction.
  • Crouch: Using your right hand, swipe your right thumb down.
  • Jump: Push off the ground by moving both hands down.
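The mode-dependent mappings above can be summarized in a lookup table. This is an illustrative sketch only: the mode, hand, gesture, and action names are assumptions, as is assigning the activating tap to the right hand, and the step directions are one possible reading of "move a step in the desired direction".

```python
def locomotion_action(mode, hand, gesture):
    """Map a discrete microgesture to a locomotion action for the given mode.

    Returns None when the gesture has no meaning in that mode, so stray
    swipes are ignored rather than misinterpreted.
    """
    common = {
        ("right", "swipe_left"):  "snap_turn_left",
        ("right", "swipe_right"): "snap_turn_right",
        ("right", "swipe_down"):  "crouch",
        ("left",  "swipe_up"):    "step_forward",
        ("left",  "swipe_down"):  "step_back",
        ("left",  "swipe_left"):  "step_left",
        ("left",  "swipe_right"): "step_right",
    }
    per_mode = {
        "teleport":      {("right", "tap"): "teleport"},
        "telepath":      {("right", "tap"): "walk_to_aim",
                          ("right", "swipe_up"): "jump"},
        "walkingsticks": {},  # walking and jumping come from arm motion
    }
    return {**common, **per_mode.get(mode, {})}.get((hand, gesture))
```

Splitting the table into a shared core plus per-mode overrides mirrors the lists above: snap turns, steps, and crouch behave identically across modes, while tap and jump vary.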
More information can be found under Hand-based locomotion, and the gestures can be tested in the Meta Interaction SDK Samples app.

Design principles

Explore concepts that shape user-friendly interactions with microgestures.

Gesture tiers and custom gestures

It is possible with ISDK to create custom gestures, though this is often relevant only to the subset of apps that need input options beyond the tier 1 system gestures. Think about how well a certain pose or gesture will scale to your app's target audience. Below is a simple framework for thinking about the different gesture tiers:
Tier 1 - System gestures with high reliability:
  • Index pinch
  • Microgesture tap
  • Microgesture swipe left/right/up/down
  • Stick to only using our tier 1 system gestures in a way that is consistent with OS panel interactions.
Tier 2 - Custom static poses detected by heuristics:
  • These can be authored by the developer in ISDK.
    • Included are thumbs-up, thumbs-down, rock, paper, scissors, and stop poses.
Tier 3 - Custom expressive gestures detected by heuristics:
  • These can also be authored by the developer in ISDK.
    • This includes a swipe gesture in the ISDK samples.
  • Lower reliability for gestures that are changing both the hand pose and transform space at the same time.
  • Higher reliability for a gesture that’s a static pose moving through space.
For all categories, occlusion is a concern. Avoid poses and gestures where critical joints (for example, the index finger in an index tap) might be occluded by other fingers or the back of the hand.

Gesture tutorials

As a new input method with no visible controls, microgestures benefit from instructions that help first-time users learn the gestures and see how to perform them correctly. This can involve videos or 3D hand animations in the immersive experience to illustrate how users should curl their hand and move their thumb for each gesture.
Tutorials should provide clear instructions and positive reinforcement when the gesture is successfully performed. Opportunities for practice should also be given to allow the user to build muscle memory for the gestures.
In this example tutorial, a narrator walks the user through each gesture, breaking down the thumb motion into steps with an accompanying animated virtual hand. Particles are emitted from the thumb when the intended gesture is successfully performed and travel in the same direction as the swipe to reinforce the direction of movement.

Directional bias

When used for 2D content, users may have different mental models for whether directional swipes are inverted or non-inverted. When used in this context, applications should support the ability to invert the direction of movement to match individual user expectations. The following video shows the use of non-inverted (left) and inverted (right) horizontal swipes to page between different images.
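Supporting both mental models amounts to exposing an inversion preference and flipping the sign of the swipe's page step. A sketch with hypothetical event names, taking "non-inverted" to mean a left swipe pulls in the next page; the opposite convention is equally valid, which is exactly why the preference should exist rather than being hard-coded.

```python
def page_delta(swipe, inverted=False):
    """Return the page step for a horizontal swipe under the user's preference.

    Non-inverted (assumed convention): swiping left brings in the next page.
    Inverted: the same motion drags the content itself, stepping the other way.
    """
    delta = {"swipe_left": 1, "swipe_right": -1}.get(swipe, 0)
    return -delta if inverted else delta
```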

Interaction disambiguation and gating

Microgestures may be used in a multi-modal environment supporting different forms of hand interaction, such as direct manipulation and far field raycasting. In some cases, this may increase the chance of unintentionally performing an action.
To address this, gating logic based on the user’s interaction context and hand posture can minimize accidental activations. For instance, microgestures can be disabled on the user’s hand when they are near a grabbable object or poking a virtual plane.
Microgestures can also be gated through hand postures, such as requiring the user to slightly curl their hand with their thumb facing upwards, as if holding a remote control. This gate additionally leads the user to perform the gestures in an optimal way for the headset tracking volume by having the thumb un-occluded by other fingers.
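The gating rules described here can be expressed as a simple predicate over the hand's state. The `HandState` fields below are assumptions for illustration; a real implementation would derive them from the tracking data.

```python
from dataclasses import dataclass

@dataclass
class HandState:
    near_grabbable: bool   # hand is close to a grabbable object
    poking_surface: bool   # hand is poking a virtual plane
    fingers_curled: bool   # hand slightly curled, as if holding a remote
    thumb_up: bool         # thumb facing upwards, un-occluded

def microgestures_enabled(hand: HandState) -> bool:
    """Gate microgestures on interaction context and hand posture."""
    if hand.near_grabbable or hand.poking_surface:
        return False  # avoid clashing with direct manipulation
    # Posture gate: the "remote control" pose also keeps the thumb
    # un-occluded and inside the headset's optimal tracking volume.
    return hand.fingers_curled and hand.thumb_up
```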

Ergonomics

Lower-effort interaction

Microgestures are a low-effort input that leverages a relaxed hand posture, typically with the palm oriented sideways and the fingers slightly open. Input is performed through minimal, discrete thumb movements, allowing for swift interaction with minimal physical strain.
Hand in a comfortable pose.

Mitigating fatigue

Consideration should be taken to minimize user fatigue that can occur from physical arm and finger motions.

Thumb swipes
For thumb swipes in particular, fast and repetitive swiping over longer periods of time may lead to users feeling some fatigue. We recommend using the gestures in applications with a more moderate pace where users can pause between gestures if required, such as for casual browsing of media content or for moving around a space in a social VR setting.
Holding the hand up to raycast and use microgestures is fatiguing.

Targeting

Fatigue may also occur when the user’s arms are held in a static position in mid-air. Avoid scenarios which require high, outstretched static arm postures for extended durations while using microgestures. Users should be able to move and position their hand and arm freely.

Microgestures design dos and don’ts

Below is a list of our recommended practices when using hands in your immersive experience:
DO utilize MG to navigate a system, using the thumb for directional input, similar to a D-pad or TV remote control.
DON'T use MG with a head-ray as the default option. The head-ray is used as a fallback input when other options are unavailable.
DO utilize MG for teleportation, aiming by moving the hand to the desired position.
DON'T combine MG with hand-ray for targeting, because holding the arm up and steady is more fatiguing compared to using a pinch and move interaction.

Next steps

Designing experiences

Explore more design guidelines and learn how to design great experiences for your app users.