Model Files (*.CHR, *.CDF)
The model mesh itself is stored in a CHR file. These files are used throughout the engine for characters that include skeletons, and some familiarity with them is assumed here.
From the facial animation point of view, the two important aspects of a model are the morph targets and the skeleton. Expressions refer to these things by name, so it is possible to implement them differently in different models. This is very important, particularly for morphs, since a morph must be customized to suit each individual mesh.
If an expression refers to a bone or a morph that does not exist in the model, the expression is ignored.
Characters that appear in-game are generally combinations of a primary model and several attachments. In particular, the head is often a separate model that is attached to the body. This composite model is defined in a CDF file.
The animation system can handle the situation where the model with morphs is not the main mesh. In this case, it searches the attachments for a model that has morphs and assumes that model is the facial model. It is currently not possible for the facial system to use morphs from more than one attachment.
Only the main skeleton is affected by facial expressions that move bones. However, skin attachments share the skeleton of the master model, so they will be affected by these expressions as well. This is usually how head attachments are set up.
Facial Expression Library (*.FXL)
Facial expressions are stored in a facial expression library or FXL file. These files are basically a list of all the expressions, both primitive and compound, that a sequence can apply.
As mentioned in the previous section, expressions in the library refer to aspects of the model, such as morphs or bones, by name. Depending on the model being animated, these things may or may not be present, and in addition they may vary between files. For this reason a given expression may have a different appearance depending on the model that is loaded.
When a model is loaded, an expression library file is often auto-loaded for it. When that model then plays a sequence, it uses the expression library associated with it. The FXL file that the engine should load for a given CHR is specified in that model's .chrparams file, using the $facelib pseudo-animation.
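As an illustration, a $facelib entry might look like the fragment below. This is a sketch assuming the usual .chrparams layout of an AnimationList containing named Animation entries; the file paths are purely illustrative.

```xml
<Params>
  <AnimationList>
    <!-- Pseudo-animation: tells the engine which expression library
         to auto-load for this character (path is an example only) -->
    <Animation name="$facelib" path="Animations/human_male/facial/human_male.fxl"/>
  </AnimationList>
</Params>
```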
Facial Sequence (*.FSQ)
A facial sequence file stores the animation curves of the expressions over time, as well as an associated sound filename and various other data.
The contents of the file are basically a series of channels, each of which is the name of an expression and a curve that animates the expression over time. These expressions are referred to by name, and are defined in the expression library. Therefore playing the same sequence with two different expression libraries, or with two different models, may have quite different visual results.
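The lookup described above can be sketched as follows. This is conceptual pseudologic, not engine code: the channel and library structures, names, and linear key interpolation are all assumptions made for illustration. It also shows the behavior noted earlier, where a channel whose expression is missing from the library is simply ignored.

```python
# Conceptual sketch (not engine code): resolving an FSQ channel against an
# expression library at playback time. Data shapes here are assumptions.

def evaluate_channel(channel, library, t):
    """Return (expression, weight) for a channel at time t, or None when the
    library has no expression of that name (the channel is then ignored)."""
    expression = library.get(channel["name"])
    if expression is None:
        return None  # unknown expression: channel has no effect
    keys = channel["keys"]  # sorted (time, value) pairs
    # Clamp outside the key range, linearly interpolate inside it.
    if t <= keys[0][0]:
        return expression, keys[0][1]
    if t >= keys[-1][0]:
        return expression, keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return expression, v0 + f * (v1 - v0)

# Two different libraries give the same channel different results,
# which is why the same FSQ can look different on different models.
library = {"smile": "morph:smile_full"}
channel = {"name": "smile", "keys": [(0.0, 0.0), (1.0, 1.0)]}
print(evaluate_channel(channel, library, 0.5))
print(evaluate_channel({"name": "frown", "keys": []}, library, 0.5))  # None
```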
At run-time there are various means of triggering an FSQ to play, such as via the Flow Graph or the dialog system.
Channels in the sequence can be grouped into folders. These are usually used for logically grouping channels into a hierarchy to keep the sequence organized. However, in certain cases they can have a direct effect on the sequence, as we shall see below.
Some sequences may include channels that do not refer to expressions. These can be one of the following things:
- Balance channels;
- Procedural head animation channels;
- Vertex drag channels;
- Phoneme strength channels.
Balance channels control the balance of other expressions over time. There are two types of balance channel, which differ in which expressions they affect. The standard balance channel affects all other channels in the same folder and in all its sub-folders. The expression folder channel affects all expressions that are in a given folder in the expression library.
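The folder scoping of a standard balance channel can be sketched like this. The nested-dictionary folder model is an assumption for illustration only; the point is that the scope is the balance channel's own folder plus everything below it.

```python
# Sketch (assumed folder model, not engine code): the set of channels a
# standard balance channel affects is its folder plus all sub-folders.

def channels_in_scope(folder):
    """Collect channel names from this folder and, recursively, all
    of its sub-folders."""
    names = list(folder.get("channels", []))
    for sub in folder.get("folders", []):
        names.extend(channels_in_scope(sub))
    return names

# Hypothetical layout: a "mouth" folder with one nested sub-folder.
mouth = {"channels": ["smile", "frown"],
         "folders": [{"channels": ["lip_corner_l", "lip_corner_r"]}]}
print(channels_in_scope(mouth))
# A balance channel placed in "mouth" would affect all four channels.
```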
Procedural head animation channels are a quick way of auto-generating some basic animation for a sequence. Adding a procedural head animation channel will cause the animation system to animate expressions procedurally based on the phonemes in the sequence. This will work in concert with any custom animated channels. The value of the procedural channel represents the amplitude of the procedural generation at that time; if this value is 0 at a given time, the procedural animation is ignored there.
At the time of writing, vertex drag channels have no effect.
Phoneme strength channels control the amplitude of expressions that are played as part of lip-synching. If there is no such channel, then this amplitude will default to 1 throughout the timeline. Lip-synching can be muted or disabled by creating a phoneme strength channel and adjusting its values.
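The effect of a phoneme strength channel amounts to scaling the lip-sync weights, as sketched below. This is an assumed model of the behavior, not engine code: the weight of each lip-sync expression is multiplied by the channel's value at that time, with the strength defaulting to 1 when no such channel exists.

```python
# Sketch (assumed behavior): lip-sync amplitude is the phoneme expression's
# weight scaled by the phoneme strength channel's value at that time.

def lipsync_weight(phoneme_weight, strength=None):
    """strength is the phoneme strength channel's value at this time,
    or None when the sequence has no phoneme strength channel."""
    if strength is None:
        strength = 1.0  # default amplitude when the channel is absent
    return phoneme_weight * strength

print(lipsync_weight(0.8))        # no channel: weight unchanged
print(lipsync_weight(0.8, 0.5))   # half strength
print(lipsync_weight(0.8, 0.0))   # constant 0 mutes lip-sync
```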
Joystick Layout File (*.JOY)
Joystick files store the layout of joysticks and the channels they refer to. Therefore the same joystick layout can be used for multiple sequences, and each animator can use their own preferred layout.
A joystick layout can be used with any sequence. However, channels are referred to by name and position in folders, so it is possible that different sequences may use different expressions for a given channel. It is also possible that the sequence will lack a given channel, in which case the joystick axes that refer to it will not be operational.