Animation: Skeletons

December 23, 2022

Animation

Animation is a large topic! So I've decided to write a couple of separate posts about animation in Chronicle. This first post will cover the "basics" of a simple FBX pipeline for importing skinned meshes and skeletons. Future animation posts will cover things like clips, blend trees, state machines, and IK.

Spaces

Before we dig into those topics I wanted to talk a bit about spaces first. Most game devs will be familiar with dealing with world and local space: world space being a thing's position in the world, and local space being relative to the thing itself. When working with animation you almost always have to deal with a third space, what I call bone local space. This will be covered more in the skeletons section, but skeletons are made up of bones, and those bones have parents. So bone local space is a bone's transform relative to its parent (or in some cases the bind/reference pose). This is important for things like IK or making runtime adjustments to a character's pose, since parts of the character's skeleton will need to be updated, and you do this by recalculating bone local transforms and working down the bone chain.

Because of this "extra" space I tend to use the terms World, Model, and Local space. Model is what most people would call local (the origin of the model is what everything in this space is relative to), and Local is the parent-bone-relative space. This renaming helps me keep track of what I'm talking/thinking about when working in lower level animation systems.
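As a small sketch of how Local and Model space relate (the function and parameter names here are mine, not Chronicle's, and glm stands in for the engine's own math types), going from bone local space to model space is just a walk up the parent chain:

```cpp
#include <vector>
#include <glm/glm.hpp> // stand-in math library

// Model-space transform of a bone = its bone-local transform composed with
// every ancestor's local transform, walking up the parent chain.
glm::mat4 BoneLocalToModel(const std::vector<glm::mat4>& localTransforms,
                           const std::vector<int>& parentIndices, // -1 for the root
                           int boneIndex)
{
    glm::mat4 model = localTransforms[boneIndex];
    for (int parent = parentIndices[boneIndex]; parent != -1; parent = parentIndices[parent])
        model = localTransforms[parent] * model;
    return model;
}
```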

Skeletons

Before we can talk about animation clips, we need something to animate! This is where skeletons and skinned meshes come in. Skeletons are pretty basic data wise. They are normally just a flat list of bones, and each bone will contain its name (and probably a hash for non-debug builds), its parent bone index, and its model transform. The model transform will be a 4x4 matrix, or a position vector and quaternion, representing the bone's T/A-pose position and rotation in the skeleton. T/A-pose is the term used to describe what a skeleton and/or skinned mesh looks like with no animation applied. The parent index is the index of the bone's parent; skeletons are almost always a hierarchy. For example, think about a human skeleton: the hips/pelvis can be thought of as the root bone, which has no parent, and off of that you'd have the spine and left/right legs as its children. The image above of a simple skeleton widget is a good example of a basic human skeleton and its bone relationships. With these parent indices the bone local transforms can be calculated if needed as well. Sometimes you may also have a bind pose transform in the skeleton data. The bind pose transform is used to calculate the delta that needs to be applied to the vertices of a skinned mesh. The bind pose is almost always just the inverse of a bone's model space transform, so it can easily be calculated when loading the skeleton.
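To make that layout concrete, here's a minimal sketch of what such a skeleton could look like in code; Bone, Skeleton, and ComputeBindPose are my placeholder names, not Chronicle's actual types.

```cpp
#include <cstdint>
#include <string>
#include <vector>
#include <glm/glm.hpp> // stand-in math library

struct Bone
{
    std::string name;           // plus a hash for non-debug builds
    uint32_t    nameHash;
    int         parentIndex;    // -1 for the root bone
    glm::mat4   modelTransform; // T/A-pose transform in model space
};

struct Skeleton
{
    std::vector<Bone> bones;

    // The bind pose is just the inverse of each bone's model-space T/A-pose
    // transform, so it can be computed when the skeleton is loaded.
    std::vector<glm::mat4> ComputeBindPose() const
    {
        std::vector<glm::mat4> bindPose;
        bindPose.reserve(bones.size());
        for (const Bone& bone : bones)
            bindPose.push_back(glm::inverse(bone.modelTransform));
        return bindPose;
    }
};
```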

So as you can guess, the skeleton and skinned mesh are tightly coupled. A skinned mesh is just like a normal mesh with vertex and material data, but it will have an extra set of data for each vertex called bone weights. For a mesh to be skinned, each vertex needs to be assigned bones from the skeleton along with a weight for how much influence each bone has on its position. In most cases you won't need more than 4 bone weights per vertex (faces and cinematics being the outliers that can have more). Bone weights are just a flat list of integer/float pairs: the integer being the index of the bone, and the float being the influence of that bone (0-1). The influences should add up to 1 in most cases. With the index of the bone being baked into the mesh data, you can see how the two are tightly coupled; if the skeleton changes, the mesh will also need its skinning updated to keep everything in line.

I'm not a rendering engineer, but I figured it would be good to give a quick overview of how all of this is applied to actually render a moving skinned mesh. It all comes back to the bind pose transforms mentioned above. After a character's animation update has finished and you have its final pose (model space bone transforms) calculated, you take each bone and multiply it by its bind pose transform. This gives you the delta from the T/A-pose. These deltas are what get sent to the renderer/shader. Each vertex is then moved by transforming it by the bones it's weighted to, by the amount of their influence. This is all done in the vertex shader (see image below).
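Here's a rough CPU-side sketch of that math; the real work happens in the vertex shader, and the names here are placeholders for illustration only.

```cpp
#include <glm/glm.hpp>

// Blend the vertex by the "delta from T/A-pose" matrices (final pose * bind pose)
// of the bones it's weighted to.
glm::vec3 SkinVertex(const glm::vec3& restPosition,
                     const int boneIndices[4], const float weights[4],
                     const glm::mat4* deltaMatrices) // finalPose[i] * bindPose[i], per bone
{
    glm::vec4 skinned(0.0f);
    for (int i = 0; i < 4; ++i)
        skinned += weights[i] * (deltaMatrices[boneIndices[i]] * glm::vec4(restPosition, 1.0f));
    return glm::vec3(skinned);
}
```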

Skeleton and Skinned Mesh Pipeline

Now that I've covered what a skeleton and skinned mesh are, I'm going to go over how I handle converting source (FBX/OBJ) skeleton and mesh data into Chronicle's runtime format. There are a lot of source data file formats for meshes, skeletons, and animation; FBX, DAE, OBJ, and glTF just to name a few. In my experience FBX tends to be the most often used for games, as it's Autodesk's format and Autodesk tools (Maya, Max, MotionBuilder, etc.) are normally what's chosen for game development. So do you pick one source format and only allow that? Do you allow multiple? Do you load those file formats directly in your engine/game? For Chronicle I wanted to support multiple source formats, since I tend to source my mesh and animation data from places like Mixamo and Patreon since I'm not an artist. I also didn't want the engine runtime to have a dependency on any of these source formats, so I decided to add a conversion step into my pipeline that converts multiple source data formats into Chronicle's runtime data format for meshes, skeletons, and animation.

Writing all of these different importers would be quite a bit of work, and wasn't something I was looking to do myself. So I decided to use Asset-Importer-Lib (AssImp) to handle that part. AssImp is a great open-source project that supports a large number of file formats. AssImp works by having importers for all of these different formats that parse the source data files and convert them all to the same in-memory representation. So I use AssImp to parse the source files, and then I process the in-memory AssImp scene data to export the data I care about into the Chronicle formats.

So let's dig into how Chronicle uses this library and the custom binary file formats for a skinned mesh and skeleton.

Processing Data with AssImp

The above image has a code snippet covering the basics of spinning up an importer in AssImp and loading a source file. First up, I create a child Assimp::LogStream class so I can redirect logging from AssImp to Chronicle's logging. I also have a constant set up with all the log severities I want to catch. In the ExportFile function, you first create the Assimp::DefaultLogger and then attach your custom LogStream instance to it; this ensures any logs from AssImp you care about get directed to your logging system. Now you're ready to create the Assimp::Importer, which is what is used to import the source file and return the in-memory format.
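Since the snippet itself isn't reproduced here, below is a minimal sketch of that setup using AssImp's logging API; ChronicleLogStream, kAssImpLogSeverity, ExportFile, and the printf stand-in are my names, not Chronicle's actual code.

```cpp
#include <cstdio>
#include <assimp/DefaultLogger.hpp>
#include <assimp/Importer.hpp>
#include <assimp/LogStream.hpp>

// Redirect AssImp logging into the engine's own logging.
class ChronicleLogStream : public Assimp::LogStream
{
public:
    void write(const char* message) override
    {
        // Forward to your own logging here; printf is just a stand-in.
        printf("[AssImp] %s", message);
    }
};

// The log severities I care about.
constexpr unsigned int kAssImpLogSeverity =
    Assimp::Logger::Err | Assimp::Logger::Warn | Assimp::Logger::Info;

bool ExportFile(const char* sourcePath)
{
    // No default streams; everything goes through the custom stream.
    Assimp::DefaultLogger::create("", Assimp::Logger::NORMAL, 0);
    Assimp::DefaultLogger::get()->attachStream(new ChronicleLogStream(), kAssImpLogSeverity);

    Assimp::Importer importer;
    // ... ReadFile and processing go here (see the next sketch) ...

    Assimp::DefaultLogger::kill();
    return true;
}
```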

With the Importer now created, you can read a source file into an aiScene object. You'll notice in the above image that the ReadFile method also takes a flags parameter. These flags can be used to enable/disable extra processing of the source file. For Chronicle I always use the same handful of flags, plus the one optional globalScale flag for meshes I want to manually set the scale for. See the AssImp GitHub for more information on the available post-process flags. Now we finally have an aiScene with our processed source data in it that we can use to generate our own binary format, or, if so inclined, you could even use it at runtime.
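The post doesn't list the exact flag set Chronicle uses, so here's a sketch with a common combination of post-process flags plus the optional global scale override; LoadScene is just my name for the helper.

```cpp
#include <assimp/Importer.hpp>
#include <assimp/config.h>
#include <assimp/postprocess.h>
#include <assimp/scene.h>

const aiScene* LoadScene(Assimp::Importer& importer, const char* sourcePath, float globalScale)
{
    unsigned int flags = aiProcess_Triangulate
                       | aiProcess_JoinIdenticalVertices
                       | aiProcess_LimitBoneWeights     // clamp influences per vertex
                       | aiProcess_GenSmoothNormals
                       | aiProcess_CalcTangentSpace;

    if (globalScale > 0.0f)
    {
        // Optional manual scale override.
        importer.SetPropertyFloat(AI_CONFIG_GLOBAL_SCALE_FACTOR_KEY, globalScale);
        flags |= aiProcess_GlobalScale;
    }

    const aiScene* scene = importer.ReadFile(sourcePath, flags);
    if (!scene || !scene->mRootNode)
    {
        // importer.GetErrorString() explains why the load failed.
        return nullptr;
    }
    return scene;
}
```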

Above is the MeshConverter widget that is available in both Chronicle's world/object and animation editors for processing source data into Chronicle's runtime formats. It's got a handful of settings for overriding things like scale, rotation, and material parameters. It also has options for whether animation and physics data should be generated from the source mesh data. All of these settings are saved in an intermediate file (.meshX), which is just a simple XML file used only by this widget. Currently this does make some assumptions about where animation source files live relative to the mesh source file, and it just collects all files in the expected location for processing alongside the mesh file.

When the export button is clicked, all of these settings are saved to the meshX file and forwarded to the export function, which creates an Importer for AssImp as shown above and then loads the source mesh file into an aiScene. I'm not going to cover how the mesh data is currently processed, but it's pretty straightforward vertex, triangle index, and material data. I will, however, cover how the skinning data is processed, since that is a core part of this topic and that data lives in the mesh data.

The first step is to walk the aiScene hierarchy to collect all the bones of the skeleton and their hierarchy. The image to the right is just a small snippet showing how to interact with the scene and access the root node. The settings structure being used here is all of the settings/overrides provided by the MeshConverter widget shown above. Below is the GetBoneHierarchy function that I'll go over next.
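Since neither snippet is inlined here, this is a rough reconstruction of both pieces based on the description that follows; it reuses the Bone struct from the earlier sketch (name, parentIndex, modelTransform), and the ToGlm conversion helper and the exact signature are my stand-ins rather than Chronicle's actual code.

```cpp
#include <string>
#include <vector>
#include <assimp/scene.h>
#include <glm/glm.hpp>

// Stand-in: convert AssImp's row-major aiMatrix4x4 into a column-major glm matrix.
static glm::mat4 ToGlm(const aiMatrix4x4& m)
{
    return glm::mat4(m.a1, m.b1, m.c1, m.d1,
                     m.a2, m.b2, m.c2, m.d2,
                     m.a3, m.b3, m.c3, m.d3,
                     m.a4, m.b4, m.c4, m.d4);
}

void GetBoneHierarchy(const aiNode* currentNode, std::vector<Bone>& boneList)
{
    // Find the index of currentNode in the bone list so its children get the
    // right parent index. A string compare for now; a hash map would be faster.
    int parentIndex = -1;
    for (int i = 0; i < static_cast<int>(boneList.size()); ++i)
    {
        if (boneList[i].name == currentNode->mName.C_Str())
        {
            parentIndex = i;
            break;
        }
    }

    // Add every child that looks like a bone; nodes holding meshes are skipped.
    for (unsigned int i = 0; i < currentNode->mNumChildren; ++i)
    {
        const aiNode* child = currentNode->mChildren[i];
        if (child->mNumMeshes > 0)
            continue;

        Bone bone;
        bone.name           = child->mName.C_Str();
        bone.parentIndex    = parentIndex;
        bone.modelTransform = ToGlm(child->mTransformation); // node's transformation matrix
        boneList.push_back(bone);
    }

    // Then recurse into the children to build out the rest of the hierarchy.
    for (unsigned int i = 0; i < currentNode->mNumChildren; ++i)
        GetBoneHierarchy(currentNode->mChildren[i], boneList);
}

// Kicking it off from the scene root:
// std::vector<Bone> boneList;
// GetBoneHierarchy(scene->mRootNode, boneList);
```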

GetBoneHierarchy is a recursive function that will walk the children of the currentNode, collecting the skeleton information required: mainly bone names, transforms, and parent indices. The first loop finds the index of the currentNode in the boneList so that the correct parentIndex can be set on any children of the node. Currently it's just a string compare, and this could for sure be improved, but it works, and none of my skeletons are currently large enough for this to cause an issue.

Next we iterate over all the children of the currentNode. If a node contains any meshes it's skipped, as it's not a bone node. There are probably some edge cases where this assumption may not hold, but so far it has held up for all the data I've used (Mixamo and some other models from Patreon). Each node also has aiMetadata on it that could be used to check for any additional information you may be expecting in your source data. If a child node is assumed to be a bone, we convert the AssImp matrix into a Chronicle matrix and add the new bone to the boneList.

Once all of the children have been added to the boneList, we loop over the children again, calling GetBoneHierarchy on each one. This recursively builds out the list of bones, and with each bone having its parent index set we also have the skeleton hierarchy.

Now that we have the list of bones and their final indices, we can process the skinning data for each mesh as it's processed. The above is the snippet from the mesh processing that handles this. Note this is inside a loop that goes over every mesh in the scene; a lot of source file formats will have multiple meshes per file, most often split up by the material they use. You'll notice that each aiMesh appears to hold references to all the bones, but it actually only holds references to the bones it uses, so the steps above to traverse the entire scene and collect all bones are required to make sure all meshes can share the same bone indices. So we loop over every bone the current mesh is using, and the first thing we do is find its index in the boneList we filled out earlier. Again, this does a string compare and could easily be improved with a hash map for quicker lookup. Next we store the bone's offset matrix in the vector of bone offsets that will be saved as part of the runtime mesh format. When we collected all the bones earlier, you'll notice we used the node's transformation matrix, not the offset matrix. The transformation matrix is the model space transform of the bone, while the offset matrix is actually the bind/reference pose matrix (the inverse of the model space transform).
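Roughly, that per-mesh loop could look like the sketch below; boneList, boneOffsets, and ToGlm carry over from the earlier sketches, and the SkinnedVertex type with its AddWeight helper (shown just below) is my placeholder, not Chronicle's real name.

```cpp
#include <vector>
#include <assimp/scene.h>
#include <glm/glm.hpp>

// Skinning pass for one aiMesh, called from inside the loop over every mesh in
// the scene. totalVertCount is how many vertices earlier meshes have already
// contributed to the combined runtime mesh.
void ProcessMeshSkinning(const aiMesh* mesh,
                         const std::vector<Bone>& boneList,    // built by GetBoneHierarchy
                         std::vector<glm::mat4>& boneOffsets,  // sized to boneList.size()
                         std::vector<SkinnedVertex>& vertices,
                         unsigned int totalVertCount)
{
    for (unsigned int b = 0; b < mesh->mNumBones; ++b)
    {
        const aiBone* bone = mesh->mBones[b];

        // Find this bone's index in the full bone list (string compare again;
        // a hash map would make this cheaper).
        int boneIndex = -1;
        for (int i = 0; i < static_cast<int>(boneList.size()); ++i)
        {
            if (boneList[i].name == bone->mName.C_Str())
            {
                boneIndex = i;
                break;
            }
        }
        if (boneIndex == -1)
            continue;

        // The offset matrix is the bind/reference pose (inverse of the bone's
        // model space transform); it's saved with the runtime mesh data.
        boneOffsets[boneIndex] = ToGlm(bone->mOffsetMatrix);

        // Each weight pairs a vertex id with this bone's influence on it.
        for (unsigned int w = 0; w < bone->mNumWeights; ++w)
        {
            const aiVertexWeight& weight = bone->mWeights[w];
            vertices[weight.mVertexId + totalVertCount].AddWeight(boneIndex, weight.mWeight);
        }
    }
}
```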

Next we can finally loop over the weight data stored on the bone. Each weight contains a vertex id and the weight of this bone on that vertex. To the left you can see the struct being used to store this on the Chronicle mesh data per vertex. As a reminder, this is all done inside a loop going through multiple meshes that I combine into the runtime data format, so that's why you see weight.vertexId + totalVertCount when setting the weight data on a vertex.
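The struct itself isn't reproduced here, but going by the description it's along these lines; SkinnedVertex and its field names are my guesses, while the cap of 4 influences comes from the post.

```cpp
#include <cstdint>

// Per-vertex skinning data stored on the runtime mesh: up to 4 bone influences.
struct SkinnedVertex
{
    // ... position, normal, UVs, etc. ...
    uint32_t numWeights = 0;
    int      boneIndices[4] = { 0, 0, 0, 0 };
    float    boneWeights[4] = { 0.f, 0.f, 0.f, 0.f };

    void AddWeight(int boneIndex, float weight); // see below
};
```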

Now to the left you'll find the internals of the AddWeight function. In most cases this is just incrementing the number of bones used and setting the bone index and weight being added. Sometimes source data may have more than 4 weights per vertex, and currently Chronicle is set to only allow a max of 4, so this will also handle dropping the lowest weight and replacing it with the new one, or ignoring the new one if it's lower than any already set. In most cases 4 weights per vertex is more than enough; the main exception being things like faces.
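Here's a sketch of AddWeight matching the behavior described above, continuing the placeholder SkinnedVertex from the previous sketch (not the actual Chronicle implementation):

```cpp
void SkinnedVertex::AddWeight(int boneIndex, float weight)
{
    // Room left? Just append the new influence.
    if (numWeights < 4)
    {
        boneIndices[numWeights] = boneIndex;
        boneWeights[numWeights] = weight;
        ++numWeights;
        return;
    }

    // Already at 4 influences: replace the smallest existing weight,
    // but only if the new one is larger than it.
    int smallest = 0;
    for (int i = 1; i < 4; ++i)
    {
        if (boneWeights[i] < boneWeights[smallest])
            smallest = i;
    }
    if (weight > boneWeights[smallest])
    {
        boneIndices[smallest] = boneIndex;
        boneWeights[smallest] = weight;
    }
}
```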

The last step for the skinning data is that after all the meshes in the scene have been processed, each vertex normalizes all of its weights so that they add up to 1. Then this skinning data and the bone offsets are written out to the runtime mesh format alongside the rest of the mesh data. I won't cover this as it's pretty straightforward, just writing bytes to a file using std::ofstream, but the one thing I will recommend is that you always have a header in all your binary formats with, at minimum, the file version, so when you decide to make changes to a format you can handle loading old data without needing to re-save everything. For example, to the right you'll see all the mesh versions I've had over time as I've added to Chronicle.
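The normalization step is small enough to sketch in a few lines, again using the placeholder SkinnedVertex from above:

```cpp
// Scale a vertex's bone weights so their influences sum to 1.
void NormalizeWeights(SkinnedVertex& vertex)
{
    float total = 0.0f;
    for (uint32_t i = 0; i < vertex.numWeights; ++i)
        total += vertex.boneWeights[i];

    if (total > 0.0f)
    {
        for (uint32_t i = 0; i < vertex.numWeights; ++i)
            vertex.boneWeights[i] /= total;
    }
}
```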

Saving the Skeleton

So we've covered how to find all the bones and skinning data from the aiScene; now let's take a quick look at how the skeleton is actually saved. Below you'll see the bit that takes the bones we filled in above with GetBoneHierarchy, sets them on an AnimSkeleton, and then writes it to a file. I like to put my saving/loading functions in the same place, so that when you're updating one it's easy to also update the other. Of course, WriteToFile is ifdef'd out in non-tools builds, since you probably aren't saving resources like this at runtime. Note that AddOrEditFile is a helper function that will attempt to checkout/add files to source control (Perforce in my case) before the file is actually touched.
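As a rough sketch of that tools-side bit (the CHRONICLE_TOOLS define, SetBones, and the variable names are my guesses based on the description, not the real code):

```cpp
// Tools-only: hand the collected bones to an AnimSkeleton and save it out.
#ifdef CHRONICLE_TOOLS
AnimSkeleton skeleton(skeletonName);
skeleton.SetBones(boneList);

AddOrEditFile(outputPath);       // checkout/add in Perforce before touching the file
skeleton.WriteToFile(outputPath);
#endif
```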

Above you'll see the implementation of AnimSkeleton::WriteToFile. Overall I think this is pretty straightforward. Start by opening the file; the first thing I always write to any binary format is the file version, as I mentioned above. Next we write out the length of the skeleton's name, then the name. Now we're on to the meat of the skeleton: the bones. For each bone we write out the length of its name, the name, its parent index, and its model space transform. The transform is written out by splitting the matrix into translation and rotation, writing out the 3 floats for the translation and then the 4 floats for the quaternion representation of the rotation.
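Since the implementation isn't reproduced here, this is a sketch of that layout using std::ofstream; the version constant, the m_name/m_bones members, the bone-count field, and the quaternion conversion are my assumptions rather than Chronicle's exact format.

```cpp
#include <cstdint>
#include <fstream>
#include <string>
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

static constexpr uint32_t kSkeletonFileVersion = 1; // placeholder version number

bool AnimSkeleton::WriteToFile(const std::string& path) const
{
    std::ofstream file(path, std::ios::binary);
    if (!file.is_open())
        return false;

    auto writeString = [&file](const std::string& str)
    {
        uint32_t length = static_cast<uint32_t>(str.size());
        file.write(reinterpret_cast<const char*>(&length), sizeof(length));
        file.write(str.data(), length);
    };

    // Always lead with the file version so old data can still be loaded later.
    file.write(reinterpret_cast<const char*>(&kSkeletonFileVersion), sizeof(kSkeletonFileVersion));

    // Skeleton name, then the bones. A bone count is written here so the loader
    // knows how many to read back; the post doesn't spell that part out.
    writeString(m_name);
    uint32_t boneCount = static_cast<uint32_t>(m_bones.size());
    file.write(reinterpret_cast<const char*>(&boneCount), sizeof(boneCount));

    for (const Bone& bone : m_bones)
    {
        writeString(bone.name);
        file.write(reinterpret_cast<const char*>(&bone.parentIndex), sizeof(bone.parentIndex));

        // Split the model transform into translation (3 floats) + rotation quaternion (4 floats).
        glm::vec3 translation = glm::vec3(bone.modelTransform[3]);
        glm::quat rotation    = glm::quat_cast(bone.modelTransform);
        file.write(reinterpret_cast<const char*>(&translation), sizeof(float) * 3);
        file.write(reinterpret_cast<const char*>(&rotation), sizeof(float) * 4);
    }

    return true;
}
```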

Now we've got a pipeline for meshes that supports skinning, and for skeletons! This ended up being pretty long, but I hope you found it informative for how skeletons and meshes work together! My next post should come pretty soon; I originally wanted to cover animation clips as part of this post, but decided to make that its own post that builds on this one.