This article will familiarize the user with the basics of Faceware Studio and will include links to more detailed articles and tutorial videos regarding each subject. Start here if you are using Studio for the first time or if you need to refamiliarize yourself with the workflow.
Faceware Studio is a software application for tracking a facial performance, either live or from recorded footage, and animating a digital character from that performance in realtime. Studio tracks the footage in realtime and streams the data to one of the Live Client plugins. These plugins, available for MotionBuilder, Unity, and Unreal Engine 4, connect the character to the data stream and animate the character to match the performance.
Faceware Studio represents a realtime pipeline: tracking video footage and generating animation as the performance is happening. Studio is ideal for projects with realtime digital character interactions for events and theme parks, on-set previsualization for films or games, and live online streaming events with animated characters. Studio can also be used to record facial performances within an engine to be used for games or other animations.
Faceware Studio can be installed on any PC running Windows 10. While it can also be installed on Windows 7 and Windows 8 machines with some success, only Windows 10 installations are officially supported. iOS and other mobile installs are not supported at this time.
A fresh installation requires the Faceware_Studio_X.X.X.XXX_Installer.exe, which can be obtained from the Downloads page after logging in at facewaretech.com.
Your Faceware Studio license(s) are tied to your Faceware web portal login. When Studio is launched, you will be prompted to log in with the email and password for your Faceware account. Internet access is required to launch Studio. No ticket activation is required.
The basic Studio workflow contains the following steps. Click on any of the links or videos for more information.
The Realtime Setup panel is where you will choose your video input for realtime tracking and will typically be the first step to get up and running. You can use live video (such as through a headcam or webcam), a saved video of a performance, or an image sequence as your video source. For the Input Type, select "Live" to use a live video feed or "Media" to use a prerecorded video or image sequence.
Whenever you introduce a new actor/video source, you will be required to select a tracking model based on the type of footage that you're using and then perform a quick, one-touch calibration to optimize the software's ability to detect and track your actor. In the Realtime Setup panel, under Face Tracking Model, select either Stationary Camera or Professional Headcam depending on your footage type.
For calibration, the person who will be driving the tracking should remain still, look straight ahead with a "neutral" facial expression, and then press the "Calibrate Neutral Pose" button. Pressing this button pauses the feed for a moment so that the software can properly detect where the face and its features are in the frame. This typically takes about one second and is necessary for ensuring accurate tracking. As soon as you calibrate, you'll see the tracking landmarks appear on your video, and the preview character will begin to animate.
Each of the Live Client plugins has a unique character setup process due to the requirements of the individual animation application. Click the version you are using below to find specific information about that plugin, including general installation and character setup instructions:
The final step is to stream your animation data to one of the Live Client plugins to generate animation. Go to the Streaming Panel (View>Panels>Streaming Panel if closed). Enter your desired port number, making sure it matches the port in the Live Client plugin, then turn on the Stream to Client button to start streaming animation data.
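If the Live Client plugin does not receive data, a common culprit is a mismatched or blocked port. The sketch below is a generic connectivity check, not part of Faceware's tooling: it assumes the Studio machine accepts TCP connections on the port you entered in the Streaming Panel (the host address and port number shown are placeholders; whether Studio's stream actually uses TCP, and which side listens, may differ in your setup).

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder values: substitute the IP of the machine running Studio and
# the port you configured in the Streaming Panel.
if port_reachable("127.0.0.1", 802):
    print("Port is reachable")
else:
    print("Port is closed or blocked - check the port number and firewall")
```

If the check fails, verify that both applications use the same port number and that no firewall is blocking traffic between the two machines.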
Get familiar with Studio's Animation Tuning and Motion Effects - Learn how to adjust the shape data coming out of Studio to get the exact results you want from your actor's performance on the character.
Working with Epic's MetaHumans - Learn how to animate characters from Epic's MetaHuman creator with Faceware Studio.
FAQ/Known Issues - Have questions about or issues with Studio? Take a look here first.