
Deprecated in CRYENGINE 3.7


Character Editor is deprecated in CRYENGINE 3.7. Most of its functionality and more can be found in Character Tool.

Character Editor can still be found in the View menu: select View/Open View Pane/Character Editor (Deprecated).



The Character Editor is used by artists to 'build' characters using many individual parts (static attachments, skin attachments) and it's also used by animators to preview animations and test the blending-features. The Character Editor offers a visual interface to the programmer interface provided by the underlying animation-system, the CryAnimation module. Understanding the functionality of the Character Editor requires a deep understanding of the animation module.

Skeleton-based Animation

The primary focus will be on skeleton-based animation, since it is by far the most flexible animation technique today. The skeleton animation system combines a variety of existing skeleton animation technologies, including playback and blending of animation data as well as IK-based pose modifications. Procedural algorithms such as CCD-IK, analytic IK, example-based IK, and physical simulations are used to augment pre-authored animations. All procedural methods have in common that a computer follows the steps of an algorithm to generate artificial motions. To avoid the typical computer-generated look when combining artificial and captured animations, a warping technique is used that preserves the style and content of the base motion despite the transformations needed to comply with the constraints.

Morph-based Animations

Skeleton-animation is a very flexible animation technique, but it is not the ideal solution to create deformation caused by muscles and tendons on parts of the human body or the face. While it is possible to drive these deformations completely with a skeleton-system, the number of joints involved is high and the animation setup is very difficult. Generally, the combination of a morphing system with the skeletal system gives the greatest flexibility. The number of vertices that change in each morph target is very limited and the targets can be clearly defined. By far the most powerful feature of morph-targets is the ability to create facial animations. In many cases they can help to eliminate the need for complex joint-setups for detailed animation: typical examples are deformation artifacts at knees, elbows, neck and shoulders, where morph-targets can be used to improve skeleton-deformations. Morph-targets can even be used to generate entire animation sequences, where an artist creates an entire mesh by hand for every frame of an animation pose (e.g. opening a parachute).

Character Customization

The character pipeline uses a robust character attachment system which allows skinned, animated, or physicalized attachments to be attached to the skeleton or polygonal faces of a character, to the extent that you can even replace entire body parts such as heads, hands, or upper and lower body. A hardware-based shape deformation system allows flexible variation of the character meshes. The system supports manually and even procedurally generated examples to ensure a small memory footprint. An additional variation system based on shaders is used for dirt, decals for clothes, and camouflage shaders for the skin.

The File Menu


Deletes the character instance and the associated model.


Loads a character file. It's also possible to load static models (CGFs). Supported file formats are CHR, CDF and CGA.

  • A CHR is the actual character. It depends on the artist how much data is in a CHR file. In the simplest form it can have just a 'naked' skeleton. Or it can have a skeleton plus a physical proxy for collision detection. In the most complex form it will have a skeleton plus a weighted mesh.
  • A CDF is an XML containing a base-model plus all attachments and materials.
  • CGAs are for non-skinned meshes. They are basically a collection of rigid-body parts connected by a hierarchical skeleton-system. The steps to create CHRs and CGAs in an offline tool are very different: CGAs, unlike CHRs, don't require a rigging process. Artists can simply arrange some rigid objects in an offline tool, connect them with nodes and apply a TCB-spline animation. CGAs are used to animate mechanical objects (weapons, vehicles).

You can set the default-loaded asset for the Character Editor by setting the ca_CharEditModel CVar to the path of the model in a config file.
The default setting is "ca_CharEditModel = objects/characters/human/sdk_player/sdk_player.cdf"


Save

Saves the current Character-Definition File (CDF). A CDF is an XML file that contains all the individual parts of a character.

It is assumed that the current CDF is not loaded from a PAK-file. Save only works if the CDF was loaded from a real folder.

Save As

Saves the current Character-Definition File (CDF) in a specific place. A CDF is an XML file that contains all the individual parts of a character.

The View Menu

With the View Menu you can turn various tabs and control boxes on/off in the Character Editor.

The Toolbar

Used to control the placement of attachments. Only the move and rotate buttons are implemented. Pressing one of these buttons automatically puts the character into the default pose and stops all animations.

Search and Filter Animations

The search panel allows you to browse the available animations for the character within a folder structure. This window shows the animation-names that are set up in the .chrparams file, not the actual asset-names. An animation-name is an alias for the real asset-name. Different animation-names may point to the same asset on disk, but using the same animation-name twice in a .chrparams file is not allowed and will produce a warning.
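The alias mechanism can be sketched as a .chrparams fragment like the one below. The paths and names here are hypothetical, and the exact attribute set may vary between engine versions:

```xml
<Params>
  <AnimationList>
    <!-- animation-name "walk" is an alias for the asset on disk -->
    <Animation name="walk" path="animations\human\male\walk_forward.caf"/>
    <!-- a second name may point at the same asset on disk... -->
    <Animation name="move" path="animations\human\male\walk_forward.caf"/>
    <!-- ...but re-using the same name twice would produce a warning -->
  </AnimationList>
</Params>
```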

When you open a folder, you will see various icons in four different columns:

Column 1

The first column has three different icons:

The A stands for a normal animation-asset, usually a motion with keyframes in it. Aim-poses are also regular assets; they are simply used in a different way, so clicking on them to play them does not produce a meaningful result.

LMGs (Locomotion-Groups) represent a group of animations that all represent the same motion in a slightly different style, e.g. a run, but for different parameters, e.g. direction (forwards, sideways, backwards), speed (slow walk, fast run), and more. The Locomotion-Group enables the system to easily treat the animation set as one animation. By passing in the appropriate parameters (e.g. direction and speed) the animation system will automatically blend the animations to achieve the correct result, for example a moderately fast walk at 30° to the right.

The last type is morph targets. They have no icon.

Column 2

The icon with the arrow in one direction is a transition asset: the first and the last keyframes in the animation are not identical, so the animation is not looping. The icon with the green circle-arrow represents a looping animation. This icon appears when the first and the last keyframes in an animation are identical.

Non-looping animation on the left, and looping animation on the right.

Column 3

The '+' marks an additive animation. Assets without it are regular overwrite assets.

Column 4

The movie symbol means this is a streaming asset: the animation is loaded into memory on demand and unloaded when it has finished playing. This is usually used for cut-scene animations and/or animations that are rarely used.

Character Editor Rollup

Choose Base Character

Switches back to the base model. Because the attachment system is fully hierarchical and each attachment can have its own set of features (morph-targets, animations, etc.), you can test these out in detail by clicking on the name in the attachment view. Use this button to switch back to the base character.

Layer System

Layered blending allows you to apply an animation to only a few select bones, rather than to the whole skeleton. This technique is handled by the layer system. It supports 16 virtual layers and all of them can be combined into one single physical layer. Layered Animations are all applied at the same time onto a character.

Layer 0 is the primary layer. Usually it contains the base (full-body) animation; all animations in layer 0 are full-body animations. If an animation playing in layer 0 has no controller for a specific bone, the default transformation from the rig is used. Layer 0 is the only layer that considers the root bone, and the locomotion locator is supported only in layer 0.

Each layer can play and blend animations. Animations in higher layers overwrite animations in lower layers; as long as they don't share the same joints, the animations won't interfere. An animated root bone has no effect on layers 1-15. Creating partial-body animations works in the same way as creating all other animations; the only difference is that you delete all controllers that are not needed.
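The per-joint overwrite rule described above can be sketched as follows. This is a toy illustration, assuming each layer stores transforms only for the joints it actually animates; the joint names and values are hypothetical, not engine API:

```python
def compose_layers(layers):
    """Compose a pose: higher layers overwrite lower layers per joint.

    `layers` is ordered layer 0 first; joints absent from higher layers
    keep the transform of the highest lower layer that animates them."""
    pose = {}
    for layer in layers:
        pose.update(layer)  # joints present in a higher layer win
    return pose

base  = {"pelvis": "walk", "spine": "walk", "arm_r": "walk"}  # layer 0: full body
upper = {"spine": "aim", "arm_r": "aim"}                      # layer 1: partial body

print(compose_layers([base, upper]))
# → {'pelvis': 'walk', 'spine': 'aim', 'arm_r': 'aim'}
```

The pelvis keeps playing the layer-0 walk, while the joints the partial-body layer animates are overwritten, mirroring how a reload clip in a higher layer leaves the legs untouched.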

Combining motions works really well for mechanical objects, i.e. weapons, vehicles, doors, robots, etc. Because isolated movements are no problem for these objects, you can create independent animations for groups of joints. For example, take a helicopter with five moving parts: main rotor, rear rotor, landing gates, doors and windows. If you were to control each of these five parts by full-body motions, you would need 32 assets to cover all combinations. With full-body animations this results in a combinatorial explosion in the amount of motion-assets after a while, since you effectively need a motion-asset for every possible combination. With combined motions you can control all five parts of the helicopter independently, using just 5 partial-body assets instead of 32 full-body assets. As you can see, this is a useful tool to add variety to character animations with a very limited amount of motion-assets.
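The asset-count argument above is simple arithmetic: with full-body clips every on/off combination of the five parts needs its own asset, while partial-body clips need only one asset per part. A quick sketch:

```python
# five independently moving parts of the helicopter example
parts = ["main_rotor", "rear_rotor", "landing_gates", "doors", "windows"]

full_body_assets = 2 ** len(parts)  # one full-body clip per on/off combination
partial_assets = len(parts)         # one independent partial-body clip per part

print(full_body_assets, partial_assets)  # → 32 5
```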

This system was designed mainly for mechanical objects (i.e. weapons, vehicles), but in some cases it can be used on humans as well. Human animations in a game often consist of a base animation (standing, walking, running) and a number of modified derivatives (looking around, aiming with a gun, reloading a gun). CryENGINE offers several solutions to create these derivative animations for humans, and to cut down the number of animation-assets that need to be authored for all combinations. One solution is to generate them procedurally, another is to combine existing motions. For example, you have a character with a rifle standing or walking along who is aiming and firing at someone. As soon as he runs out of ammo, he has to stop the current animation and start the reloading. For this special case, the "weapon-reload-animation" could be added onto walking, running, standing and sitting animations, without having to author complete "walk and reload" or "run and reload" animations. The timing of the moving legs won't interfere with the partial-body animation. As long as the animations are properly planned, the two should work together regardless of the length of the animations involved.

Combining animations is equally useful for animating characters that have a large variety of equipment or weapons, while reusing basic cycles such as walks and runs.

The easiest way to create assets for partial-body animation is to remove the animation-channels for certain bones in the DCC App. That means that you have to create a set of special animations that influence just a limited set of bones and not the entire skeleton.

Start selected

It is possible to select two animations (ctrl+animation1 and ctrl+animation2) and then start both at the same time. The playback of both animations is time-warped (linear alignment). With the Blend-space control you can change the blend-weight.

Blend-space Control

The animation-system supports a data-structure called a Locomotion Group (LMG) to create parameterized motion-clips. An LMG is defined in an XML file and represents a motion-family of related motions. The kinematic, physical and/or abstract parameters stored in the asset specify the desired properties of a motion and are mapped to fixed controllers. These high-level parameters are mapped onto corresponding features in captured motion-clips. Motion parameterization has significant practical importance in video games, as it allows us to create controllable interactive animations. The current implementation is limited to three dimensions. The first three sliders below allow you to manipulate the blend-weights.

Certain features of a motion are directly mapped to fixed control-sliders. Most of them can be controlled independently.

The top bar blends
Locomotion speed is mapped to the x-axis.

The second bar blends
Turning angle is mapped to the y-axis.

The third bar blends
Uphill/downhill is mapped to the z-axis.

The fourth bar blends
Blends between 2 different animations. Moving the bar will cause the animations to influence what is shown in the main view – one animation will become more dominant over the other.

To select the animations you wish to blend between, go to the available animations browser, select the first animation, then hold Ctrl and select the second. Then hit the 'start-selected' button to start both animations. An animation can be an LMG or a simple motion-clip. This feature is used by animators to test whether certain motion-clips are 'blendable'. Linear time-warping is always used to align both motion-clips.

Parameterizing a motion-clip:
This is a technique to describe the actions of an animation. This can be something as simple as selecting a motion-style and telling the character to walk in direction D at speed X, look at position A, or aim with the rifle at object B. It simply means the game specifies a motion-parameter and the low-level animation system generates the desired motion. Animation can be very complex, but it is possible to describe many actions by a relatively small number of motion-parameters. For example, a locomotion cycle might be characterized by its speed, body-orientation, travel-direction and turn-radius. Many typical IK-tasks can be controlled by specifying a reach-, look- or aim-target, which is an arbitrary position in world-space. Such kinematic and physical properties can be combined with abstract properties like mood or style. In a game, these features need to be controlled on an interactive level by simply stating what they should be. A parametric animation system provides an intuitive interface to control a character by its natural motion-properties: the game simply specifies a parameter and the parametric system generates the desired motion based on the initial skeleton-configuration. Many of these properties (style, speed, turn-radius, body- and travel-direction) are already captured in the motion-clips. These are the natural motion-parameters; you can extract them directly out of the clips and use blending techniques to control them.

How to parameterize motion-clips:
Pre-recorded motion-clips are the fundamental unit of operation to create motion parameterizations. The most common way to do this is to collect multiple variations of a motion and arrange them in a spatial data-structure (a blend-space). At run-time, interpolation and extrapolation techniques are applied to create a practically infinite number of motions between and around these example motions.

The basic idea of this interpolation technique is to multiply each example motion M by an assigned blend-weight w and to sum them up.

The general equation used to compute the new motion M(p) is defined as:

    M(p) = w_1*M_1 + w_2*M_2 + ... + w_E*M_E    (Eq. 1)

The parameter p describes the properties of the motion and E is the number of example-motions. All parameterization methods use either this equation or a variation of it as a back-end. For interpolation the weights are usually in the range [0...1]. For extrapolation it might be acceptable for some motions to have negative weights or weights bigger than 1. In all cases, the weights should be normalized to always sum to 1.0 to avoid scaling of the animated character.
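A minimal numeric sketch of this weighted-sum blend, with the example motions simplified to one scalar channel per frame and the weights normalized to sum to 1.0 so the character is not scaled. This is an illustration of the equation, not engine code:

```python
def blend(examples, weights):
    """Weighted sum of E example motions (Eq. 1), frame by frame.

    `examples` is a list of E clips, each a list of per-frame values
    for one channel; `weights` are normalized so they sum to 1.0."""
    total = sum(weights)
    w = [x / total for x in weights]  # normalize to avoid scaling the character
    frames = len(examples[0])
    return [sum(wi * clip[k] for wi, clip in zip(w, examples))
            for k in range(frames)]

walk = [0.0, 1.0, 2.0]  # toy per-frame values of one channel
run  = [0.0, 2.0, 4.0]

print(blend([walk, run], [1.0, 1.0]))  # → [0.0, 1.5, 3.0], halfway between walk and run
```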

The big problem with this approach is that the blend-weights are totally unintuitive. They provide no information about the motion properties (speed, style, direction, reach-target). Eq. 1 is basically a direct mapping of the blend-weights to the data set, and this results in a new motion with the properties p. The problem is that it is nearly impossible to know what p is before executing the blend. For some applications (e.g. blending between abstract features like style and mood, or simple interpolations in one and two dimensions) it might be possible to control a motion by varying the interpolation weights directly, but for controlling kinematic properties in a bigger set of motions with complex correlations it is hardly possible to obtain precise control. In order to produce parameterized motions you need the inverse mapping of Eq. 1.

More concretely, the blend-weights are generated according to the desired motion properties, not the other way around. The following parameters extract the features from the LMG and assign the right blend-weights.

If this checkbox is selected, it's possible to control the speed of a character by selecting meters per second.

If this checkbox is selected, it's possible to control the turn-radius of a character by selecting radians per second.

If this checkbox is selected, the facing direction and the travel direction point in the same direction.

Stop layer
Stops playing the selected layer.

Stop all
Stops playing all selected layers.

Resets the character to the default pose it had when it was exported from the DCC tool.

Animation driven motion
This flag is set per character instance. When it is set, the real movement (translation and rotation per frame) of the character is extracted from the motion-file and passed via callback to the game-code, which moves the character in the world. This feature requires carefully authored motion-assets with a locomotion locator. A locomotion locator describes the logical movement of the motion-clip in the simplest way: for an idle-animation the locator has one single key-frame, for a locomotion-cycle it has two key-frames. All the subtle details of the motion are transferred to the pelvis.
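The extraction step can be sketched as computing per-frame deltas between locator keys and handing them to game code. This is a toy illustration of the idea, not the engine's callback API:

```python
def locator_deltas(locator_keys):
    """Per-frame 2D translation deltas of the locomotion locator.

    An idle clip has a single key, so no deltas (no movement);
    a locomotion cycle has two keys, so one delta per cycle."""
    return [(b[0] - a[0], b[1] - a[1])
            for a, b in zip(locator_keys, locator_keys[1:])]

print(locator_deltas([(0.0, 0.0)]))             # idle: → []
print(locator_deltas([(0.0, 0.0), (0.0, 1.5)])) # cycle moving 1.5 forward: → [(0.0, 1.5)]
```

In the real system such deltas would be applied to the character's world position each frame, while the visual detail of the motion stays on the pelvis.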

Animation Flags

All these flags are set 'per-animation'. Different animations used on the same instance can have different animation-flags. All flags are directly attached to an animation. That means you have to select the flags first, and then start the animation. Changing flags while an animation is playing will have no effect.
The first three flags are used to control the playback of animations. All playback flags are exclusive, that means only one of them can be enabled for an animation at a time.

Order of priority:

  • CA_MANUAL_UPDATE (disables looping and repeat)
  • CA_LOOP_ANIMATION (disables repeat)

Manual update
With this flag the animation is not updated automatically; the user needs to set the time manually. This is used for steering-wheels and mounted-weapons, where the rotation of an object is converted into animation-keytimes. In other words, the game-code uses this flag to map the aim-direction of a gun or the rotation of a steering wheel to a specific full-body keyframe.

Loop animation
Plays an animation in an endless loop until you stop the animation or start a new one. Without this flag, the animation plays once and is then removed from the FIFO-queue (First-In-First-Out).
Please refer to this documentation to create seamless animation loops.

Repeat last key
Plays an animation once and then repeats the last key-frame. The animation stays in the FIFO-queue until you stop it or start a new animation.

Transition Time Warping
This flag is used to align the length of locomotion animations (walking, running, sprinting, crouching) when transitioning between them. It uses simple linear time-warping to align motion-clips with identical features but different numbers of key-frames. Without this flag you can get foot-sliding and/or foot-crossing during transitions.
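Linear time-warping can be sketched as sampling both clips against one shared normalized phase, so a walk and a run with different key counts stay in step during the transition. A toy sketch, assuming one scalar channel per clip:

```python
def sample(clip, phase):
    """Sample a clip of keyframes at normalized phase [0..1] with linear interpolation."""
    t = phase * (len(clip) - 1)          # map shared phase onto this clip's keys
    i = min(int(t), len(clip) - 2)       # clamp so phase 1.0 hits the last segment
    f = t - i
    return clip[i] * (1.0 - f) + clip[i + 1] * f

walk = [0.0, 1.0, 2.0, 3.0, 4.0]  # 5 keys
run  = [0.0, 3.0, 6.0]            # 3 keys, same logical motion

# both clips reach their last key at phase 1.0, so the feet stay aligned
print(sample(walk, 0.5), sample(run, 0.5))  # → 2.0 3.0
```

Blending the two sampled values per frame (as in Eq. 1) then gives a transition without foot-sliding, because corresponding phases of the two cycles line up.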

Disable Multilayer Animations
Many animations are a combination of a base-animation (locomotion in layer 0) and several partial body animations in higher layers (reloading rifle, raising rifle, signals, etc). Partial body animations are not always compatible with the base animation. Reloading a rifle might work with a walk-cycle, but it will look odd when jumping over a fence or entering a vehicle. With this flag you can disable multi-layer animations for certain base-animations.

Transition Time
A transition is a smooth flow from one motion to another. This control-box allows you to adjust the transition time, i.e. how long it takes to blend from one motion to another. Rule of thumb: similar motions need shorter transition times (less than 0.5 seconds) than motions that are visually different.

Allow animation restart
By default it is not possible to start the same animation twice. In some special cases this is necessary (e.g. recoil-animations). By enabling this flag you can restart the same animation using the previously described transition-rules.

Post Processing

This is a list of procedural modifications that can be applied to an articulated model. They are applied in a post-process after the animation data has been played on the skeleton.

This is a procedural implementation of Look-IK. It works with humans and with human-like characters. To identify the bones you need to manipulate, use the bone-names. The following Biped-bones are required for Look-IK to work.

"Bip01 Spine"
"Bip01 Spine1"
"Bip01 Spine2"
"Bip01 Spine3"
"Bip01 Neck"
"Bip01 Head"

This is a procedural implementation of Aim-IK. Because aiming is much more complex than looking, and because there are special rules for how to use weapons, animator-created example poses are used for Aim-IK. Requirements for this method to work are that an Aim-Pose is assigned to an animation (this happens in the AG) and that the character has the required bones. To identify the bones you need to manipulate, use the bone-names. The following Biped-bones are required for Aim-IK to work:

"Bip01 Pelvis"
"Bip01 Spine"
"Bip01 Spine1"
"Bip01 Spine2"
"Bip01 Spine3"
"Bip01 Neck"
"Bip01 Head"
"Bip01 R Clavicle"
"Bip01 R !UpperArm"
"Bip01 R !ForeArm"
"Bip01 R Hand"
"Bip01 L Clavicle"
"Bip01 L !UpperArm"
"Bip01 L !ForeArm"
"Bip01 L Hand"

This is a procedural implementation of Foot-IK. To identify the bones you need to manipulate, use the bone-names. The following Biped-joints are required for Foot-IK to work. The ankle-joint tries to reach the spinning cube. You can control the position of the cube with the Numpad (4=left, 6=right, 8=backward, 2=forward, Ctrl+8=up, Ctrl+2=down). The IK-configurations are not always realistic: there are no joint-limits and body intersections are possible.

"Bip01 L Thigh"
"Bip01 L Calf"
"Bip01 L Foot"

This is a procedural implementation of Foot-IK. Internally Crytek uses a two-bone solver. To identify the bones you need to manipulate, use the bone-names. The following Biped-joints are required for Foot-IK to work. The ankle-joint tries to reach the spinning cube. You can control the position of the cube with the Numpad (4=left, 6=right, 8=backward, 2=forward, Ctrl+8=up, Ctrl+2=down). The IK-configurations are not always realistic: there are no joint-limits and body intersections are possible.

"Bip01 R Thigh"
"Bip01 R Calf"
"Bip01 R Foot"

This is a procedural implementation of Arm-IK. Internally Crytek uses a two-bone solver. To identify the bones you need to manipulate, use the bone-names. The following Biped-joints are required for Arm-IK to work. The hand-joint tries to reach the spinning cube. You can control the position of the cube with the Numpad (4=left, 6=right, 8=backward, 2=forward, Ctrl+8=up, Ctrl+2=down). The IK-configurations are not always realistic: there are no joint-limits and body intersections are possible.

"Bip01 L !UpperArm"
"Bip01 L !ForeArm"
"Bip01 L Hand"

This is a procedural implementation of Arm-IK. Internally Crytek uses a two-bone solver. To identify the bones you need to manipulate, use the bone-names. The following Biped-joints are required for Arm-IK to work. The hand-joint tries to reach the spinning cube. You can control the position of the cube with the Numpad (4=left, 6=right, 8=backward, 2=forward, Ctrl+8=up, Ctrl+2=down). The IK-configurations are not always realistic: there are no joint-limits and body intersections are possible.

"Bip01 R !UpperArm"
"Bip01 R !ForeArm"
"Bip01 R Hand"

Foot Anchoring
This is a feature to reduce foot-sliding caused by blending. Requirements are pre-calculated foot-plants in the resource-compiler. Use the console command ca_DrawFootPlants=1 to see the foot-plants.
This feature is deactivated in CryENGINE by default. Use ca_FootAnchoring=1 to activate it. This feature is currently under heavy reconstruction.

The following Biped-joints are required for foot-anchoring to work.

"Bip01 L Thigh"
"Bip01 L Calf"
"Bip01 L Foot"
"Bip01 L Heel"
"Bip01 L Toe0"
"Bip01 L Toe0Nub"
"Bip01 R Thigh"
"Bip01 R Calf"
"Bip01 R Foot"
"Bip01 R Heel"
"Bip01 R Toe0"
"Bip01 R Toe0Nub"

This is a unit-test for Foot-IK to test walking on uneven-ground. Use the num-pad (4,6,8,2) to tilt the ground. Use '5' to put the ground-plane back to the flat position.

Slider for Playback speed
This slider allows you to change the playback speed of an animation. It is basically a speed multiplier. 1 is normal speed. 2 is double speed. Negative values are playing the animation backwards.

Unit Test for Humans

These are test-implementations of frequently used features of the animation system. All of them are limited to humans.




Third person player control. You can control the player using the WASD-keys and mouse.

Fixed Camera

The camera of "Player Control" is usually behind the player. This checkbox detaches the camera.

Path Following

Test implementation of path following using the decoupled method.

Attached Camera

The camera of "Path Following" is usually behind the player. This checkbox detaches the camera.


Test implementation of "idle 2 move" parameterization.


Test implementation of "idle step" parameterization.

Use Morph Targets

Activates/deactivates usage of morph-targets.

Linear Morph Sequence

Plays all morph-targets in alphabetical order. Used for tweening animations without bones; currently used for the opening of the parachute.


If "Linear Morph Sequence" is activated, then you can use this bar to play the morphs in alphabetical order.


It's possible to change the default compression parameters directly in the Character Editor. To do this, load a reference model, then select an animation or a group of animations and change the rotation/position error thresholds in the controls. You can choose a ready-made preset here or create a new one with different settings.

Change an error threshold for rotations per bone/group of bones here.

Change an error threshold for positions per bone/group of bones here.

Save Preset
Saves a preset under a new name. The preset will then be available in the presets list.

Animation Control Tab

This panel allows users to preview and explore animations frame by frame. It also allows attaching events to selected animation-assets. Events are user-defined notifications for the base application. The event system uses callbacks to inform the base application that a certain keyframe in an animation has been passed and that this is the right moment for the application to start an action (set a foot-plant, print text, play a sound, trigger a particle-effect, etc.). Every event has a name and a keytime [0...1]. It can also have custom parameters or a bone-name. Events are directly tied to the animation-asset (and not to the animation-name in the .chrparams file). Each animation-asset can have an unlimited amount of events.

You can use the Animation Control Tab to create animation segments (Footstep Markup).

For more information on how to precisely time sounds within an animation using anim events, please refer to Sounds in Animations.

When loading a CDF or a CHR, the event database is automatically loaded as well if there is an entry in the .chrparams file.

Usually there is one single event database per model-type. All humans share the same database; aliens and animals have their own databases. It is possible to create a different database per character, but this is not recommended. Try to avoid identical event-names.

The animation-control is used to select the keytime to place an event. The length of the animation is always in the range [0...1].

Animation asset
The animation currently playing. It is important to note that this is not showing the animation-name from the .chrparams file, but the real asset name with the full file-path. The file-path is unique; the animation-name is not.

Modify Events

When loading a CDF or a CHR, the event database is automatically loaded as well if there is an entry in the .chrparams file.

Saves the updated event database

New Event
Creates a new animation event. You need to specify the event-name and the keytime.

Delete Event
Deletes the currently selected animation event

Select Sound
This is a convenient function because events were mainly used to trigger sound-effects. Events are by no means limited to sound. In this case you can select a sound-path as a parameter. A parameter can be any string or number. It's also possible to choose a joint where you want to play the sound. All sounds are in 3d-space.

Use selected Effect
Open the Database View and select a particle effect. You can then assign this effect to the character, trigger it and preview it. It's also possible to choose a joint where you want to attach the particle-emitter.

For more information, please see the Using Particles in Animations tutorial.

Attachments Tab

The attachment-system (and the shape-deformation system) was one of the main-reasons to create the Character Editor. Many games usually have 10 or 20 different character-models and they clone these models and use them over and over again. Using attachments is one way to avoid cloning and to give a character a unique look. Bone, face, and skin attachments are supported.

You can attach static or animated objects to the bones or the mesh of the character. Examples are different weapons, helmets, pockets. You can replace entire body parts, e.g. heads, hands, legs and shoes.

The attachment system is designed as a fully hierarchical system. That means you can build a character, store the CDF and then attach this CDF to another character. This feature gives the best results if you use it in combination with shape-deformation.

In the attachment window you can create an attachment-socket. A 'socket' is an empty attachment without assigned geometry. It has a name, position and orientation and the attachment features (bone, face, skin). It's possible to visualize them with the console-command ca_DrawEmptyAttachments=1. The properties of this attachment are shown in the attachment properties to the right. Empty attachments can be used by the game-code to attach objects to the characters at runtime, e.g. different weapons attached to the right hand of NPCs.

Before creating an attachment socket it is necessary to put the character into default-pose. You need to click the move arrow to put the character into default-pose.

When the character is in default-pose, you will see the text "reset-mode" in the top-right corner of the screen. This is important before creating the attachments.

If you move the mouse-cursor over the character in reset-mode, the base-model is shown with a red wireframe and all attachments with a green wireframe.

After creating an attachment for your character, click Apply (sometimes more than once) and save the file as a .CDF.

The CDF file stores references to all your attachments and can also be edited manually in a text editor.
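As a rough sketch, a CDF is a small XML file along these lines. The paths, attachment names, and bone names below are hypothetical, and the exact attribute names can vary between engine versions, so treat a CDF saved by the editor as the authoritative reference:

```xml
<!-- Hypothetical example: a base model with one bone attachment
     and one empty socket to be filled by game code at runtime. -->
<CharacterDefinition>
  <Model File="Objects/Characters/hero/hero.chr"/>
  <AttachmentList>
    <Attachment AName="helmet" Type="CA_BONE" BoneName="Bip01 Head"
                Binding="Objects/Characters/hero/helmet.cgf"/>
    <Attachment AName="weapon_socket" Type="CA_BONE" BoneName="Bip01 R Hand"/>
  </AttachmentList>
</CharacterDefinition>
```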

Attachment properties

Bone attachment
Attaches the object to the bone chosen in the drop-down list to the right. Attachments can be 'relative' to the selected bone or perfectly aligned with it; alignment is controlled by the Align Bone Attachment checkbox.

Face attachment
Face attachments are projected onto the surface of the mesh and work in combination with shape deformation: they always stay on the surface of the mesh, even when the character becomes thinner or fatter (bone attachments do not). Face attachments can only be assigned to the base model (the model with the red wireframe).



If you want to assign a face attachment to a part of a character that is actually a skin attachment, load the skin .chr, create the face attachment on it, and save it as a skin .cdf. Then use that .cdf, and the face attachment will appear on the correct body part. Visual problems may occur if artists later remodel the character.

Skin attachment
A skin attachment is an attachment that has a skeleton of its own. This makes it possible to replace body parts (head, hands, legs). It is very important that the bone names of the skeleton in the skin attachment are identical to the bone names in the base skeleton. A skin attachment can (and should) have fewer bones than the base skeleton; for maximum performance it should contain only the bones necessary to deform its body part. Each skin attachment can have its own character features (morph targets, attachments, etc.). A skin attachment cannot have animations of its own: to animate a skin attachment and deform its mesh, the skeleton pose of the base character is always used.

Align bone attachment
This checkbox only works in combination with bone attachments. If set, the coordinate frame of the geometry is aligned with the bone; in other words, the rotation of the joint and the rotation of the attachment are identical.

Hide attachment
Hides the currently selected attachment. This can be stored as the default state, but it is usually toggled at run time.

Physicalized attachments
Dangling character attachments use fairly simple dedicated code (directly in the animation system) that can only handle one-hinge attachments with joint limits; it does not even handle collisions with the character or with other attachments.

To turn on dangling for a particular attachment, specify the two planes that form the edge used as the hinge. The planes are specified in the attachment's coordinate space: the first plane is the one the attachment "lies" on in its resting state, and the second is another plane it contacts. For the most typical attachment orientation, the natural plane selection is positive y and positive z (+y, +z).

The default attachment orientation corresponds to a joint angle of 0. The maximum rotation angle is specified by the "limit" property. Use the "damping" property to prevent the "pendulum effect" (values up to 10 are reasonable, but can go even higher when needed).

Note that the changes take effect only after you press the "Apply" button.

How the planes are defined:
The axes of the local coordinate system can be seen as the normals of planes with the same name: the +y axis is the normal of the +y plane, the -y axis is the normal of the -y plane, and the +z axis is the normal of the +z plane (see the color coding in the picture below). In the example above, the +y and +z planes are chosen; the hinge axis is created along the edge between them.

Shows the full path of the object assigned to the attachment socket. Supported file types are CGF, CHR, CGA, and CDF. The name of the object is shown here, with a "..." button to the right to browse for and assign a different object.

Material selection for the assigned object. The name of the material assigned to the object is shown here, with a "..." button to the right to browse for and assign a different material. "Dflt" assigns the default material.

Deletes the assigned model from the attachment socket. It does not delete the attachment itself.

Assigns a model to the attachment socket.

Phys Props

Using Phys Props, you can store the settings for your rope in the CDF file.

To use Phys Props, create a bone attachment for your first rope bone (Rope01 Seg01) and click Apply. You can then click Phys Props Alive to adjust the settings for your rope.

Once you have created the bone attachment for your rope, make sure to save the CDF file.

For more information on Character Ropes, please refer to the Rope section in Cloth setup document. You can also check the Rope Setup document.

Set Material Tab

Open Material Editor...
Opens the Material Editor with the base character's default material selected. You can assign a different material and store it permanently in the .cdf file.

Sets the material selected in the Material Editor to the Character.

Restores the original material.

Shape Deformation Tab (obsolete)



The shape-deformation (and the attachment system) is one of the main features of the Character Editor.

This feature allows changing the shape of a character, even at run time. Create one base model and two variations of it; these variations can be a thin and a fat version of the same model. Only the vertex positions of each variation are stored, so the memory impact is very small. In the DCC application, assign vertex colors to the base model; then pick a vertex color and use the slider to interpolate between the variations. This feature has almost no CPU impact: the video hardware handles everything on its own.

Shape deformation blends between three models. Only the vertex positions differ per model in the vertex buffer; tangents and UVs are shared. By default, the blend models are created procedurally: a 'thin' and a 'fat' version of the base character, generated by scaling the mesh along the bone directions. It is very important that the thin/fat variations have exactly the same vertex count as the base mesh. Instead of the default blend models you can also supply your own: take the base model, change the vertex positions, and store the files with a '_thin.chr' and a '_fat.chr' suffix in the same folder. This technique is not limited to thin and fat versions; you can blend between any custom meshes. If LODs are used, it is important to apply the same setup to all LODs. Additionally, you can use vertex colors to define which parts of the body to blend; up to eight different parts can be selected. The RGB channels of the vertex colors serve as a lookup index: each channel is either 0 or 255, which gives 8 possible combinations and therefore 8 lookup indices.








black (R=0, G=0, B=0) = index 0
red (R=255, G=0, B=0) = index 1
green (R=0, G=255, B=0) = index 2
yellow (R=255, G=255, B=0) = index 3
blue (R=0, G=0, B=255) = index 4
magenta (R=255, G=0, B=255) = index 5
cyan (R=0, G=255, B=255) = index 6
white (R=255, G=255, B=255) = index 7
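The color-to-index mapping above is just a 3-bit pattern (red = bit 0, green = bit 1, blue = bit 2). A minimal sketch of the lookup, assuming each channel is strictly 0 or 255:

```python
def shape_lookup_index(r, g, b):
    """Map a vertex color (channels 0 or 255) to a shape-deformation
    lookup index: black=0, red=1, green=2, yellow=3, blue=4,
    magenta=5, cyan=6, white=7."""
    # Each non-zero channel contributes one bit of the 3-bit index.
    return (1 if r else 0) | (2 if g else 0) | (4 if b else 0)

print(shape_lookup_index(255, 255, 0))  # yellow -> 3
```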

Draws a wireframe on top of the character. By default the wireframe color is white. For characters with shape deformation, you can see the colors of the different body parts defined by the artist.

Deformation Slider
Select colors from the channels and move the slider left/right to blend between the three models. Values in the range [-1...+1] produce interpolation; all values outside this range produce extrapolation, which can give undefined results.
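One plausible reading of the slider semantics, sketched per vertex (this illustrates interpolation vs. extrapolation only; it is not the engine's actual GPU blend code):

```python
def blend_position(base, thin, fat, t):
    """Blend one vertex position between three shape models.
    t = -1 gives the thin model, 0 the base, +1 the fat model;
    |t| > 1 extrapolates past the authored shapes (results undefined)."""
    # Pick the target shape by slider sign, weight it by |t|.
    target = fat if t >= 0.0 else thin
    return tuple(b + abs(t) * (v - b) for b, v in zip(base, target))

print(blend_position((0.0, 1.0, 0.0), (-1.0, 1.0, 0.0), (1.0, 1.0, 0.0), 0.5))
# -> (0.5, 1.0, 0.0)
```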

Uniform Scaling
Scales the entire character including animation.


Debug Options

The debug options can be accessed at the bottom of the Character Editor Rollup.

Debug Option


Display physics

Displays the character's physical collision mesh.

Disable Char. Phys

Stops updating character physics such as rope or cloth. If Display physics is enabled, the character's physics will appear frozen.


Shows the character in wireframe mode.


Toggles the floor grid


Draws the coordinate frame of the character. This is the base matrix of the character in the world. When "Animation driven motion" is disabled, the base is always identity in the Character Editor. With ADM enabled, the locomotion-locator movement is applied to the base. The coordinate frame has three axes (red, green, blue): red points to the right, green forward, and blue up.


Draws the locomotion locator of the character, i.e. the y-vector of the root-joint quaternion. With ADM enabled, the locator direction and the y-component of the base are identical. The locomotion locator has a position and a direction; the direction is the facing direction of the character, usually a green arrow pointing forward.


Toggles lighting on and off in the scene


When turned on, the diffuse lighting in the scene will rotate around your character.


Sets the background color of the main view


Sets the ambient lighting on the character in the scene

LightDiffuse 1, 2, 3

Sets the coloration of the 3 diffuse lights in your scene


For per-pixel lighting, a so-called tangent matrix is stored per vertex. This checkbox shows the tangent component.


For per-pixel lighting, a so-called tangent matrix is stored per vertex. This checkbox shows the binormal component.


For per-pixel lighting, a so-called tangent matrix is stored per vertex. This checkbox shows the normal component.


Shows the character skeleton and joints


Shows the name of each joint over the character


Shows the position and coordinate-system of the joint and the relative and absolute values


If you activate this checkbox and click on an animation then you can see every single keyframe as a skeleton rendering


Updates only the root bone.


Shows the caps of the LMG you're currently playing


Shows the position of where the animation starts


If enabled, shows the path the character will travel in one second (provided the parameters don't change)


Draws the locomotion locator and the travel direction. The locomotion locator is a green arrow pointing forward; the travel direction is a yellow arrow whose length represents the velocity (m/sec).


If enabled, you can use the Num-pad to control the strafe-directions


Converts the 'bad' parameters used for motion-parameterization into more natural parameters, based on turning angle, slope and strafe directions


Shows real speed after motion-parameterisation


Displays real travel speed and real turn speed.


With ForceLOD enabled, you can select and hold a specific LOD regardless of your position in relation to the camera


Enable ForceLOD, and select the desired LOD level


Shows debug information for any shaders used in the scene


Disables LOD. Deprecated.


AttachCamera enables a first-person perspective attached to the Weapon Bone.


Adjusts the frequency of camera rotation.


Adjusts the amount of camera wobble.


Sets the field of view of the camera used to view your scene.
