Sunday, May 13, 2012

Auto-rigging, motion capture import to Blender, Maya->Blender (was: Blending morphing with skeletal (GPU) animations)

After some time, I am continuing with fixing a few remaining issues in GPU skinning for AnimKit. I'd like to prepare another example, with facial animation and normal map skinning.

AnimKit's AppAnimKitGL blends animations on the CPU side; the code here does it on the iPhone 3GS with the morphing animation (head) applied first on the CPU, the vertex buffers then updated via glBufferData(..., GL_STREAM_DRAW), and the skeleton animation finally applied in the vertex shader.

AnimKit's AppAnimKitGL animation blending example

The character has ~2700 vertices and 18 bones, and morphing operates on almost all of the vertices; though only the face vertices actually change, it is a good example to start from. The original "all on CPU" path gives ~15 frames per second, while "morph on CPU, apply bones on GPU" runs at ~60 fps. For the facial animation I plan to restrict morphing to the face submesh only, then apply bones to all vertices; a reference sketch of this split follows.
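To make the split concrete, here is a small CPU reference of the same pipeline in Python/NumPy. This is not AnimKit code; the argument layout (per-vertex bone indices/weights, a face_indices subset for the morphs) is an assumption for illustration:

    import numpy as np

    def morph_then_skin(rest_pos, morph_deltas, morph_weights,
                        bone_mats, bone_ids, bone_weights,
                        face_indices=None):
        # rest_pos:      (N, 3) rest-pose vertex positions
        # morph_deltas:  (M, N, 3) per-target vertex offsets
        # morph_weights: (M,) blend weight per morph target
        # bone_mats:     (B, 4, 4) current bone matrices
        # bone_ids:      (N, K) bone indices per vertex
        # bone_weights:  (N, K) skinning weights per vertex
        # face_indices:  optional subset of vertices the morphs touch
        pos = rest_pos.copy()

        # Step 1: morphing on the CPU. Restricting it to the face
        # submesh (face_indices) is the planned optimization.
        idx = slice(None) if face_indices is None else face_indices
        for delta, w in zip(morph_deltas, morph_weights):
            pos[idx] += w * delta[idx]

        # Step 2: linear blend skinning. On the GPU this part runs in
        # the vertex shader; it is spelled out here for clarity.
        homo = np.concatenate([pos, np.ones((len(pos), 1))], axis=1)
        skinned = np.zeros_like(pos)
        for k in range(bone_ids.shape[1]):
            mats = bone_mats[bone_ids[:, k]]              # (N, 4, 4)
            moved = np.einsum('nij,nj->ni', mats, homo)   # (N, 4)
            skinned += bone_weights[:, k:k+1] * moved[:, :3]
        return skinned

In the actual app only step 1 runs on the CPU each frame; the updated positions are streamed to the GPU with glBufferData(..., GL_STREAM_DRAW) and step 2 happens in the vertex shader.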

After some time, this became an exercise with different tools more than handling the animation in C++ code. I saw several questions about feasible ways to use artwork from Maya and 3DS Max in Blender, so I'm going to explain the approach I took. I tried other formats (Collada, .obj), but the following gave the best results:

Converting .mb to .blend and applying textures

First, I started looking for a free 3D model I could use for the example and took this one from 3dmodelfree.com. It is only available as a Maya binary (file name ingame.mb) and comes with no rig. UV textures are available, but not visible in Maya.
I installed the FBX exporter plugin in Maya, opened the file and exported to FBX. Then I imported the .fbx file into 3DS Max and exported to a .wrl file (VRML, which Blender's Extensible 3D / X3D importer reads). When imported into Blender (2.57), it showed perfectly: 5 meshes (body, bag, head and 2 eyes) that just needed to be scaled, as in the sketch below.
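For reference, the last import-and-scale step can also be done from Blender's Python console. A minimal sketch for the 2.5x API, with the file name and the 0.1 scale factor assumed; it relies on the importer leaving the new objects selected:

    import bpy

    # Import the .wrl exported from 3DS Max (Blender's X3D importer
    # also reads VRML .wrl files).
    bpy.ops.import_scene.x3d(filepath="soldier.wrl")

    # Scale the freshly imported objects uniformly.
    for obj in bpy.context.selected_objects:
        obj.scale = (0.1, 0.1, 0.1)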
For each of the meshes (body, bag, head and the 2 eyes) I opened Edit mode and used UV unwrap to get the UV layout. I replaced the original textures with the .jpeg images available with the model. Each mesh's UV map was offset and differently sized compared to its texture, so I used Blender's UV editor to move the unwrapped mesh vertices onto the texture (conceptually an affine remap of the UV coordinates; see the sketch after this paragraph). Finally, after doing this for all the meshes, the model showed properly in Blender.
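In script form the manual alignment amounts to shifting and scaling the UV coordinates; a toy NumPy sketch (the offset and scale values are made up):

    import numpy as np

    def remap_uvs(uvs, offset, scale):
        # Shift and scale (N, 2) UV coordinates so the unwrapped
        # islands line up with the texture image.
        return (np.asarray(uvs, dtype=float) - offset) * scale

    # Example: an island sitting 0.25 too far right, at half size.
    uvs = np.array([[0.25, 0.0], [0.75, 0.0], [0.75, 0.5]])
    print(remap_uvs(uvs, offset=np.array([0.25, 0.0]),
                    scale=np.array([2.0, 2.0])))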



Mixamo auto-rigging and using a motion capture .bvh file

Google showed me a lot of Mixamo advertisements recently, so I decided to try it. I exported the model as FBX and uploaded it. Notice that the model doesn't have the standard T-pose: the hands are rotated and close to the body. The auto-rigger asked me to set positions for the chin, wrists, elbows, knees and so on. In a minute I could see a perfectly rigged model in the Mixamo web viewer (a Unity plugin), with all finger bones (~60 bones in total) in different positions. I exported the motion capture file (.bvh) from Mixamo and imported it into the Blender scene (a one-liner, sketched below); I could see the animated skeleton, the same size as the original model.
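Importing the .bvh can also be scripted from Blender's Python console; a minimal sketch, with the file name assumed:

    import bpy

    # Creates an animated armature object from the Mixamo capture.
    bpy.ops.import_anim.bvh(filepath="soldier_walk.bvh")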
There is a bit of work to apply the skeleton to the mesh, but do it several times and it becomes fast (about a minute for each new motion capture):
Select the skeleton, enable X-ray so that it is visible through the mesh, and enter Edit mode. In Edit mode the skeleton shows in its rest pose and just needs to be translated and rotated to fit the mesh. Once done, leave Edit mode (the skeleton may then appear somewhere else, but there is no need to worry). Select all soldier meshes (body, bag, head and the 2 eyes) and create a group. This is important, as the skeleton should uniformly deform meshes that belong to the same group; otherwise you would see the bag and eyes moving away from the body.
With all group members selected in Object mode, also select the skeleton and use the command Object/Parent/Set, choosing "Armature with automatic weights" (the same steps in script form are below). You might find that for parts of the original meshes the rig is not properly applied; in that case, editing the bone envelopes or extruding new bones to cover the area solves the problem.
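The grouping and parenting can be scripted too; a sketch for the Blender 2.5x API, with the object names (Body, Bag, Head, EyeL, EyeR, Armature) made up, so adjust them to your scene:

    import bpy

    mesh_names = ["Body", "Bag", "Head", "EyeL", "EyeR"]

    # Select all soldier meshes and put them into one group.
    bpy.ops.object.select_all(action='DESELECT')
    for name in mesh_names:
        bpy.data.objects[name].select = True
    bpy.ops.group.create(name="Soldier")

    # Make the armature the active object and parent the meshes to it
    # with automatic weights (same as Object/Parent/Set in the UI).
    arm = bpy.data.objects["Armature"]
    arm.select = True
    bpy.context.scene.objects.active = arm
    bpy.ops.object.parent_set(type='ARMATURE_AUTO')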

In AnimKit on iPhone


Finally, I saved the file, added it to the AppAnimKitGL example and modified the code here to use it. With 23000 vertices in the mesh and ~60 bones, it showed ~45 frames per second on the iPhone 3GS.

Before continuing with morphing and blending, I will need to check the feasibility of using a .blend file for in-game artwork. It is fast for prototyping, but startup time and memory footprint are not that good; it should be much better with PowerVR tooling, whether for textures only or also for meshes, rigs and animations. Let's see. I found a few warnings on forums related to problems with animations when using PowerVR exporters from Blender. Additionally, I spent some time learning 3DS Max and character modeling and enjoyed some of the features.





