Facial Motion Capture Animation

Note: This information is very hacky, and quite possibly wrong.

Warning: This information is for advanced users only. Errors in Python scripting can seriously mess up your SFM session. In addition, the content on this page was developed by someone with very little experience with Source, SFM, and graphics programming in general; it could well irreparably damage your scene.

Using Kinect to drive facial motion capture

Let me start out by saying that this is (a) experimental and (b) low quality, so (c) don't expect much. But it's still kind of fun. None of the code is compiled, so you'll need the Kinect for Windows SDK, Visual Studio, and of course Source Filmmaker. Once everything is installed, plug in your Kinect, open the .sln file, and choose Start (or Debug -> Start Debugging, or just hit F5).

Once the window loads and starts tracking your face, press 'Write' to start saving data to a file named json.txt (which will show up in your My Documents folder), then press 'Write' again to stop recording. The Kinect takes a minute or two to learn your face, so let it run for a bit before writing data; even with that training, its facial gesture recognition is weak, so expect the captured data to be rough.

Anyhow, once you have that data, import an HWM model into your SFM scene, right-click it, and select Rig -> Load Rig Script. Find the ImportAnimations.py file in the SFMFaceTracker directory where you downloaded the code, then use the window that pops up to load the json.txt file that the Kinect capture produced.

You can find the code here: GitHub repository for Kinect facial motion capture project

What's going on under the hood

This code takes in a stream of information from the Kinect as a set of action units sampled over time for various facial motions. ImportAnimations.py then takes that JSON input and creates animation curves within SFM for the appropriate facial controls.