Use Oculus Lipsync

End-of-Life Notice for Oculus Spatializer Plugin
The Oculus Spatializer Plugin has been replaced by the Meta XR Audio SDK and is now in its end-of-life stage. It will not receive any further support beyond v47, and we strongly discourage its use. Please see the Meta XR Audio SDK documentation for your specific engine:
- Meta XR Audio SDK for Unity Native
- Meta XR Audio SDK for FMOD and Unity
- Meta XR Audio SDK for Wwise and Unity
- Meta XR Audio SDK for Unreal Native
- Meta XR Audio SDK for FMOD and Unreal
- Meta XR Audio SDK for Wwise and Unreal
This documentation is no longer being updated and is subject to removal.
Using the native API involves three phases:
  • Setup
  • The main loop
  • Shutdown

Walkthrough

After you have completed the steps in Oculus Lipsync Setup for Native Development, follow these steps to add Oculus Lipsync support to a new application:
  1. To initialize Oculus Lipsync, call ovrLipSync_InitializeEx().
  2. Call ovrLipSync_CreateContextEx() and check the return value to confirm that context creation succeeded.
  3. Integrate Lipsync into your application’s audio loop. This involves:
    • Obtaining audio input data.
    • Processing the audio input, using one of the ovrLipSync_ProcessFrame functions.
    • Passing the results of the Lipsync prediction to the rendering pipeline.
  4. Destroy the created resources during shutdown. For more information, see the ovrLipSync_DestroyContext() and ovrLipSync_Shutdown() functions.
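The steps above can be sketched in C as follows. This is a minimal sketch, not a definitive implementation: the exact signatures, result codes, and enum names (for example ovrLipSyncSuccess, ovrLipSyncContextProvider_Enhanced, and the ovrLipSyncFrame fields) are assumptions based on the SDK's OVRLipSync.h header, and app_is_running(), read_microphone(), and drive_blendshapes() are hypothetical stand-ins for your application's own loop, audio capture, and rendering code. Check them against the header that ships with your SDK version.

```c
// Sketch of the three Lipsync phases: setup, main loop, shutdown.
// Assumes the Oculus Lipsync native SDK is on the include path.
#include "OVRLipSync.h"

#define SAMPLE_RATE 48000   // audio sample rate in Hz (assumed value)
#define BUFFER_SIZE 1024    // samples per audio frame (assumed value)

int main(void) {
    // --- Setup ---
    if (ovrLipSync_InitializeEx(SAMPLE_RATE, BUFFER_SIZE) != ovrLipSyncSuccess)
        return 1;

    ovrLipSyncContext ctx = 0;
    if (ovrLipSync_CreateContextEx(&ctx, ovrLipSyncContextProvider_Enhanced,
                                   SAMPLE_RATE, 1 /* enable acceleration */)
            != ovrLipSyncSuccess) {
        ovrLipSync_Shutdown();
        return 1;
    }

    // --- Main loop ---
    float audio[BUFFER_SIZE];
    float visemes[ovrLipSyncViseme_Count] = {0};
    ovrLipSyncFrame frame = {0};
    frame.visemes = visemes;                 // prediction results land here
    frame.visemesLength = ovrLipSyncViseme_Count;

    while (app_is_running()) {               // hypothetical app-loop predicate
        read_microphone(audio, BUFFER_SIZE); // hypothetical audio capture
        if (ovrLipSync_ProcessFrame(ctx, audio, &frame) == ovrLipSyncSuccess) {
            // Pass the viseme weights to the rendering pipeline.
            drive_blendshapes(frame.visemes, frame.visemesLength); // hypothetical
        }
    }

    // --- Shutdown ---
    ovrLipSync_DestroyContext(ctx);
    ovrLipSync_Shutdown();
    return 0;
}
```

Because ovrLipSync_ProcessFrame writes its prediction into the frame you supply, the same frame struct and viseme buffer can be reused across iterations of the audio loop.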