Facial Motion Capture Animation
January 2024
Note! This information is very hack-y, and quite possibly wrong.
Using Kinect to drive facial motion capture
Let me start out by saying this is A, experimental, and B, the quality sucks, so C, don't expect much. But it's still kinda fun. None of the code is compiled, so you'll need the Kinect for Windows SDK, Visual Studio, and of course, Source Filmmaker. Once you've installed everything, you should be able to just open up the .sln file and choose 'Start' or 'Debug->Start Debugging', or just hit F5 (after plugging in your Kinect).

Once the window loads and starts tracking your face, press 'Write' to start saving data to a file called json.txt (which will show up in your My Documents folder), then press 'Write' again to stop recording. The Kinect takes a minute or two to learn your face, so let it run for a bit before trying to write data. That said, its facial gesture recognition isn't great to begin with, so even with the training, your data will probably still be rough.
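If you want to sanity-check a capture before bringing it into SFM, you can just load json.txt in Python and poke at it. The field names below (a list of frames, each with a timestamp and the Kinect's animation-unit values) are my assumption about the file's layout, not a documented format, so adjust to whatever the file actually contains:

# Sketch: inspect the captured json.txt.
# Assumed layout: a list of frames, each with a time (seconds) and a dict
# of the Kinect SDK's animation-unit weights. These key names are guesses.
import json
import os

path = os.path.join(os.path.expanduser("~"), "Documents", "json.txt")
with open(path) as f:
    frames = json.load(f)

print("captured %d frames" % len(frames))
first = frames[0]
print("first frame at t=%s, action units: %s"
      % (first.get("time"), first.get("actionUnits")))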
Anyhow, once you have that data, import an HWM model into your scene in SFM, then right-click it and select 'Rig->Load Rig Script'. Browse to the ImportAnimations.py file in the SFMFaceTracker directory where you downloaded the code, then use the window that pops up to load the json.txt file that the Kinect capture produced.
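In case you're wondering how a rig script can pop up that file picker at all, here's a minimal sketch of the general pattern. It isn't the actual ImportAnimations.py; it just assumes SFM's bundled PySide (Qt) is available and that a Qt application is already running inside SFM:

# Sketch of a rig-script-style file prompt (not the real ImportAnimations.py).
# Assumes SFM's Python environment ships PySide; getOpenFileName returns a
# (path, selected_filter) tuple in PySide.
import json
from PySide import QtGui

def prompt_for_capture_file():
    path, _ = QtGui.QFileDialog.getOpenFileName(
        None, "Select Kinect capture", "", "Capture data (*.txt *.json)")
    if not path:
        return None
    with open(path) as f:
        return json.load(f)

frames = prompt_for_capture_file()
if frames is not None:
    print("loaded %d captured frames" % len(frames))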
You can find the code here: GitHub repository for Kinect facial motion capture project
What's going on under the hood
The Kinect side of this code records a set of action units, numeric weights for various facial motions, along with the time each sample was taken, and writes them out as JSON. The ImportAnimations.py file then takes that JSON data and creates animation curves within SFM for the appropriate face controls.
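Conceptually, the import step boils down to remapping each Kinect action unit onto one or more of the HWM model's face controls, then keying those controls at the captured times. The mapping table, key names, and set_key() helper below are illustrative stand-ins rather than the real script's names; the actual curve creation goes through SFM's rig-script API:

# Illustrative sketch of the action-unit -> face-control remapping idea.
# The mapping and set_key() are hypothetical; the real ImportAnimations.py
# writes keys onto the controls' animation curves via SFM's scripting API.
import json

# Assumed mapping from Kinect animation units to HWM-style control names.
AU_TO_CONTROL = {
    "AU1_JawLower": "JawOpen",
    "AU3_BrowLower": "BrowDown",
    "AU5_OuterBrowRaiser": "BrowUp",
}

def set_key(control_name, time_seconds, value):
    # Placeholder: in SFM this is where a key would be written onto the
    # control's animation curve.
    print("key %s = %.3f at t=%.3f" % (control_name, value, time_seconds))

def import_capture(path):
    with open(path) as f:
        # Assumed layout: list of {"time": ..., "actionUnits": {...}} frames.
        frames = json.load(f)
    for frame in frames:
        t = frame["time"]
        for au_name, weight in frame["actionUnits"].items():
            control = AU_TO_CONTROL.get(au_name)
            if control is not None:
                set_key(control, t, weight)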