The Inworld Lip-sync plugin provides a component that lets you detect the visemes being spoken by an Inworld Character.
Lip-sync support is experimental.

Get Started

Follow the steps below to set up the Inworld Lip-sync component on your character to detect visemes.
1

Add the Inworld Lip-sync component to your character

Adding the InworldLipsync component to the character actor
2

Open the Animation Blueprint for your character

Opening the Animation Blueprint

Once the Animation Blueprint has been opened, navigate to the Event Graph.

Navigating to the Event Graph
3

Add the Event Blueprint Begin Play to your Event Graph

We will use this event to initialize the functions necessary for lip-sync.

Adding the Event Blueprint Begin Play node
4

Get the Inworld Lip-sync component from the owning actor

Off the Begin Play node, get the Inworld Lip-sync component from the owning actor, then bind a function to the On Viseme Update event on the lip-sync component. The easiest way to do this is to use the Create Event node from the Bind Event node and select the Create a matching function option. The image below shows what your graph should look like.

Creating the event binding for On Viseme Update
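Conceptually, the Bind Event step registers a callback that the lip-sync component invokes on every viseme update. The sketch below models that pattern in Python; the LipsyncComponent class, its method names, and the blend payload are illustrative stand-ins, not the plugin's real API.

```python
# Hypothetical stand-in for the Inworld lip-sync component's
# On Viseme Update event; not the plugin's real API.
class LipsyncComponent:
    def __init__(self):
        self._handlers = []

    def bind_on_viseme_update(self, handler):
        """Equivalent of 'Bind Event to On Viseme Update' in the Event Graph."""
        self._handlers.append(handler)

    def broadcast(self, viseme_blend):
        """Simulates the component firing the event each audio frame."""
        for handler in self._handlers:
            handler(viseme_blend)

received = []
component = LipsyncComponent()
component.bind_on_viseme_update(received.append)  # the bound UpdateViseme function
component.broadcast([0.1, 0.7, 0.2])
```

Once bound, every blend the component broadcasts reaches the handler, which is where the UpdateViseme logic from the next step runs.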
5

Set up the UpdateViseme function

For the purposes of this example, we are naming the function UpdateViseme. The function has an input variable called Viseme Blend; this is what we will use to determine the viseme the character is currently speaking.

To find the current dominant viseme, break the Viseme Blend variable and get the index of the largest float in the structure. Below is a simple example of how this can be done.

Breaking the Viseme Blend and finding the max value

Next, we want to convert this index into an Inworld Viseme enumeration that we can use to drive the animations. First, convert the index of the max value to a byte.

Converting the index to a byte

Then convert the byte to an Inworld Viseme enumeration.

Converting the byte to an Inworld Viseme enum

You can then store the result in a local variable.
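The chain above (break the blend into floats, take the index of the largest one, cast it to a byte, then to the enum) can be sketched as follows. The Viseme enum entries here are placeholders; the real Inworld Viseme enumeration may define different names and values.

```python
from enum import IntEnum

# Assumed viseme enumeration; the real Inworld Viseme enum entries may differ.
class Viseme(IntEnum):
    SIL = 0
    PP = 1
    FF = 2
    TH = 3

def dominant_viseme(viseme_blend):
    """Return the viseme whose blend weight is largest.

    Mirrors the Blueprint chain: break the struct into floats,
    take the index of the max value, then convert that index
    (the byte) to the viseme enumeration.
    """
    index = max(range(len(viseme_blend)), key=lambda i: viseme_blend[i])
    return Viseme(index)  # byte -> enum conversion

current_viseme = dominant_viseme([0.05, 0.1, 0.8, 0.05])  # -> Viseme.FF
```

Storing the returned value corresponds to saving the enum into the Current Viseme local variable in the Blueprint.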
6

Select a face animation from the current viseme

Once you have the current viseme, you can use a simple Select node to choose which animation to play. This example uses simple animation assets, but you can use the viseme to drive whatever lip-sync behavior you need.

To set up the Select node, create it from the Current Viseme variable, then right-click it and use the Change Pin Type action to switch the pins to the asset type you want to use for your lip-sync animations.

Setting up the Select node with Change Pin Type

You are now ready to apply lip-sync animations to your character!
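The Select node is effectively a lookup from the Current Viseme enum to an asset. A minimal sketch of that mapping, with a placeholder enum and placeholder animation names (your real enum entries and asset names will differ):

```python
from enum import IntEnum

# Assumed viseme enumeration and placeholder animation names;
# the real Inworld enum entries and your asset names will differ.
class Viseme(IntEnum):
    SIL = 0
    PP = 1
    FF = 2

def select_animation(current_viseme):
    """Equivalent of the Select node: map the viseme enum to an animation asset."""
    animations = {
        Viseme.SIL: "Anim_Mouth_Closed",
        Viseme.PP: "Anim_Mouth_PP",
        Viseme.FF: "Anim_Mouth_FF",
    }
    # Fall back to the closed mouth if a viseme has no entry.
    return animations.get(current_viseme, "Anim_Mouth_Closed")
```

In the Blueprint, each Select pin plays the role of one dictionary entry, and the Change Pin Type action determines what kind of asset the values are.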