Facial animation on Unity

I have some speech audio. What is the API for animating the facial expressions of my RPM avatar in Unity? (ie blendshapes / visemes / etc)

I’ve looked here but there are no docs on what to set: Animations - Ready Player Me

Thanks!

I see these blend shape targets under the Skinned Mesh Renderer, but shouldn't there be more?

This seems to be the answer: .glb Morph Targets missing - #3 by labris - Bugs - Babylon.js

I’ll give this a shot!

Hello,
We have a voice handler component in Unity as part of the Ready Player Me package. You can add it to your character and also choose the audio provider to be either an audio clip or real-time voice input.

You can find an example of using this on our blog post: https://readyplayer.me/blog/integrating-ready-player-me-characters-into-diverse-game-art-styles-demo-using-shaders-in-unity
Even though the blog post is about using characters with different art styles, there is a section that covers speech animation: "Bonus: Animate your NPC characters' speech".
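For anyone following along, here is a minimal sketch of wiring this up from code instead of the Inspector. The namespace, the `AudioProvider` property, and the enum name are assumptions based on this thread and may differ between SDK versions; check the top of `VoiceHandler.cs` in your imported package for the exact identifiers.

```csharp
// Sketch: attach the Ready Player Me VoiceHandler to an avatar at runtime
// and feed it a pre-recorded speech clip.
using ReadyPlayerMe; // namespace is an assumption; may differ by SDK version
using UnityEngine;

public class SetupVoice : MonoBehaviour
{
    public AudioClip speechClip; // assign your speech audio in the Inspector

    void Start()
    {
        // Add the component that drives mouth blend shapes from audio.
        var voiceHandler = gameObject.AddComponent<VoiceHandler>();

        // Property and enum names below are assumptions for illustration.
        voiceHandler.AudioProvider = AudioProviderType.AudioClip;
        voiceHandler.AudioClip = speechClip;
    }
}
```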


Thanks for the links! You’re the best.

I tried out the voice handler but the result is lacking expressiveness: the mouth movement is mostly just up and down. Is there some way to get more detailed facial movements?

So I found the script for the Voice Handler component, and it seems the facial movements are just computed directly, and quite crudely, on the device. I'll leave it alone for now, but I'll probably get better blend shape weights from some other API later on.

Running into a small issue: when I play the audio source through code, the audio plays but the facial animation is no longer working?

This is the code that I’m using:

Whoops – that code is totally wrong and doesn’t match the blog post.

How do I import the VoiceHandler type into my Unity project? If I do:

I get this error:

I have both packages already installed:

So I figured out how to import VoiceHandler but the mouth still isn’t animating.

This is the new code:

I added a debug log statement, and the blend-shape-weight calls in VoiceHandler.cs are being reached, but the mouth is not changing.

Any suggestions? Thanks!

Hello,
In which method are you trying to run this code? And where is the method being called? On Start()?

I have another script at Assets/[SCRIPT_NAME].cs that finds the RPM avatar game object by name and then tries to play the current audio clip. It's being called in a coroutine that is conditionally started from Update().
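The setup above looks roughly like this sketch. The avatar name "MyAvatar", the trigger flag, and the coroutine structure are assumptions for illustration; only the Unity APIs (`GameObject.Find`, `AudioSource`, coroutines) are standard.

```csharp
// Sketch of the described setup: find the avatar by name inside a
// coroutine that Update() starts conditionally, then play its clip.
using System.Collections;
using UnityEngine;

public class SpeechTrigger : MonoBehaviour
{
    private bool shouldSpeak; // set elsewhere when the avatar should talk

    void Update()
    {
        if (shouldSpeak)
        {
            shouldSpeak = false;
            StartCoroutine(PlaySpeech());
        }
    }

    private IEnumerator PlaySpeech()
    {
        var avatar = GameObject.Find("MyAvatar"); // name is an assumption
        var source = avatar.GetComponent<AudioSource>();
        source.Play();

        // Wait until the clip finishes before allowing the next trigger.
        yield return new WaitWhile(() => source.isPlaying);
    }
}
```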

Here are the warnings:

I’ve confirmed that I can manually set the facial blendshapes if I don’t use the Voice Handler.
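For reference, this is roughly how I'm setting the blend shapes manually. The child object name "Renderer_Head" and the viseme name "viseme_aa" are assumptions; use whatever names appear on your avatar's Skinned Mesh Renderer.

```csharp
// Sketch: drive one facial blend shape directly on the avatar's head mesh,
// bypassing the VoiceHandler entirely.
using UnityEngine;

public class ManualViseme : MonoBehaviour
{
    void Start()
    {
        // Child name is an assumption; inspect your avatar's hierarchy.
        var head = transform.Find("Renderer_Head");
        var smr = head.GetComponent<SkinnedMeshRenderer>();

        // Look up the blend shape by name; returns -1 if it doesn't exist.
        int index = smr.sharedMesh.GetBlendShapeIndex("viseme_aa");
        if (index >= 0)
            smr.SetBlendShapeWeight(index, 100f); // weights run 0 to 100
    }
}
```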

Can you try calling the coroutine in Start(), and make it wait one second before running the rest of the code using:
yield return new WaitForSeconds(1);
just to ensure the initialization in the other scripts is done.
Also, if you want the animation to loop, make sure to set GetComponent<AudioSource>().loop = true;

So I did some research and it seems that blendshapes are not supported in MR on the Apple Vision Pro at this moment: Blendshape Support - Mixed Reality (Immersive) Apps - Unity Discussions

Could RPM rerig the models by any chance?

It's not planned on our side in the short term. Reading the devs' comments, it looks like they will support blend shapes at some point. We will see if we can prioritize this on our end.