Get Started
Follow the steps below to set up the Inworld Lip-sync component on your character to detect visemes.

Open the Animation Blueprint for your character


Add the Event Blueprint Begin Play to your Event Graph
We will use this event to initialize the functions necessary for lip-sync.

Get the Inworld Lip-sync component from the owning actor
Off the Begin Play node, get the Inworld Lip-sync component from the owning actor, then bind a function to the On Viseme Update event on the lip-sync component. This can easily be done by adding a Create Event node from the Bind Event node and selecting the Create a matching function option. The image below shows what your graph should look like.
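In C++ terms, the Bind Event / Create Event pair amounts to registering a callback that the component invokes whenever new viseme data arrives. The sketch below models this with a plain `std::function`; the `LipSyncComponent` type, its `OnVisemeUpdate` member, and `BroadcastVisemeUpdate` are stand-ins for illustration, not the actual Inworld API.

```cpp
#include <functional>
#include <vector>

// Hypothetical stand-in for the lip-sync component's On Viseme Update
// event; in Blueprint this is the Bind Event / Create Event pair.
struct LipSyncComponent {
    // The bound "matching function" receives the per-viseme blend weights.
    std::function<void(const std::vector<float>&)> OnVisemeUpdate;

    // Simulates the component firing the event with new blend weights.
    void BroadcastVisemeUpdate(const std::vector<float>& blend) {
        if (OnVisemeUpdate) OnVisemeUpdate(blend);
    }
};
```

Binding then looks like assigning a lambda (the Blueprint "matching function") to `OnVisemeUpdate` once, at Begin Play, after which every broadcast reaches it.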

Set up the UpdateViseme function
For the purposes of this example, we are naming the function UpdateViseme. The function has an input variable called Viseme Blend; this is what we will use to determine the viseme the character is currently speaking. To get the current dominant viseme, break the Viseme Blend variable and take the largest float out of the structure. Below is a simple example of how this can be done.
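The "break the struct and take the largest float" step is an argmax over the per-viseme weights. A minimal sketch, assuming the blend can be modeled as a flat array of weights (the real struct layout may differ):

```cpp
#include <cstddef>
#include <vector>

// Returns the index of the largest blend weight, i.e. the dominant
// viseme. The blend is modeled as a plain vector of per-viseme floats.
std::size_t DominantVisemeIndex(const std::vector<float>& blend) {
    std::size_t best = 0;
    for (std::size_t i = 1; i < blend.size(); ++i) {
        if (blend[i] > blend[best]) best = i;
    }
    return best;
}
```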
Next, we want to convert this value into an Inworld Viseme enumeration that we can use to drive the animations. First, convert the max value to a byte.
Then convert the byte to an Inworld Viseme enumeration.
You can then store that into a local variable.
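The Float → Byte → Enum conversion chain described above can be sketched in C++ as two casts. The `EViseme` entries here are placeholders mirroring a viseme enumeration, not the actual Inworld values:

```cpp
#include <cstdint>

// Hypothetical enum standing in for the Inworld Viseme enumeration;
// the entries and their order are assumptions for illustration.
enum class EViseme : std::uint8_t { Sil = 0, PP, FF, TH, DD };

// Equivalent of the Blueprint Float -> Byte -> Enum conversion chain:
// truncate the float to a byte, then reinterpret it as the enum.
EViseme VisemeFromValue(float maxValue) {
    return static_cast<EViseme>(static_cast<std::uint8_t>(maxValue));
}
```

The result of this conversion is what you would then store in the local Current Viseme variable.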



Select a face animation from the current viseme
Once you have the current viseme, you can use a simple Select node to determine which animation to play. For this example we will use simple animation assets, but you can use this viseme to drive whatever behavior your lip-sync setup needs. To set up the Select node, create it from the Current Viseme variable, then right-click it and use the Change Pin Type action to select whatever asset type you want to use for your lip-sync animations.
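The Select node is essentially a switch on the viseme enum that returns an asset. A minimal sketch, where `EViseme` and the animation names are placeholders rather than real Inworld values or assets:

```cpp
#include <string>

// Placeholder enum; the actual Inworld Viseme entries may differ.
enum class EViseme { Sil, PP, FF };

// Stand-in for the Select node: maps the current viseme to a face
// animation, represented here by an asset name for illustration.
std::string SelectFaceAnimation(EViseme viseme) {
    switch (viseme) {
        case EViseme::PP: return "Anim_Viseme_PP";
        case EViseme::FF: return "Anim_Viseme_FF";
        default:          return "Anim_Viseme_Sil";  // silence / fallback
    }
}
```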
You are now ready to apply lip-sync animations to your character!

