It's been a while since I posted something for you to play with.
Well, here you go.
It will take a minute to load (lots of stuff to calculate). Once it does, just play around with the sliders to customize the guy's face. You can click and drag on his face to turn him around. Cool, huh?
Unfortunately, I can't say that I whipped this up after Thanksgiving dinner. I've actually been working on it for a week or so, trying out different methods of achieving what I wanted (and some that got nowhere close). This version is the most complex one I did, and the fact that it works is reason enough to post it.
How it works
Now the thrilling explanation. One of the features that Unity lacks at the moment is the ability to handle morph targets (aka blend shapes) imported from 3D tools like Maya or Cheetah. So, the only way to implement them is to code a system by hand. After scouring the Unity forums, I found a script that would handle simple mesh morphs but didn't do anything as complex as multiple blended attributes.
Using the basics of that script, I managed to put together a system that works pretty well. Here's the basic flow of things:
- To set everything up, I assign the base mesh (3D shape) in Unity, then create a list of attributes that I want to adjust. Each attribute has a reference mesh created by editing the original mesh to get the most extreme version of each facial attribute.
- When the scene loads, it first stores several pieces of information (this is why it takes a while to load):
- For each attribute (24 of them in this scene), it builds a list of vertices in the mesh that are affected by that attribute.
- At the same time it builds a list of offset vectors for each vertex affected by each attribute. This information stores the maximum possible offset for each vertex per attribute. Lots of data building up here...
- Then we build a list of which attributes affect each vertex. This is the converse of the first list above, and it's necessary to keep the code running smoothly.
- After all that data has been stored, the default mesh loads along with a bunch of sliders to control the various attributes.
- Each time a slider is adjusted, we calculate the proper amount to adjust each vertex affected by that attribute. This is done by using a weighted average of all the attributes affecting each vertex in question. By using a weighted average, we can combine attributes like the width and height of the eyes without getting conflicts (this part took me forever to figure out).
- Once we have the right offset vectors, we adjust the affected vertices by those values and redraw the mesh. Voila!
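The steps above can be sketched in plain Python (every name here is hypothetical, and this is ordinary Python rather than the actual Unity script): `build_offsets` precomputes the affected-vertex and offset lists for one attribute, `invert` builds the converse vertex-to-attribute table, and `blend` combines the offsets at each vertex. A simple weighted sum of offsets stands in for the post's weighted average, since the exact weighting scheme isn't spelled out.

```python
def build_offsets(base_verts, target_verts, eps=1e-6):
    """For one attribute: which vertices it affects, and the maximum offset
    for each (difference between the extreme reference mesh and the base)."""
    offsets = {}
    for i, (b, t) in enumerate(zip(base_verts, target_verts)):
        d = tuple(tc - bc for bc, tc in zip(b, t))
        if any(abs(c) > eps for c in d):
            offsets[i] = d  # vertex i is affected by this attribute
    return offsets

def invert(attr_offsets):
    """Converse table: for each vertex, the list of attributes touching it."""
    by_vertex = {}
    for name, offsets in attr_offsets.items():
        for i in offsets:
            by_vertex.setdefault(i, []).append(name)
    return by_vertex

def blend(base_verts, attr_offsets, by_vertex, weights):
    """Combine all attribute offsets per vertex, scaled by slider weights."""
    out = [list(v) for v in base_verts]
    for i, names in by_vertex.items():
        for name in names:
            w = weights.get(name, 0.0)  # slider value in [0, 1]
            off = attr_offsets[name][i]
            for axis in range(3):
                out[i][axis] += w * off[axis]
    return [tuple(v) for v in out]
```

With a two-vertex "mesh" and one attribute whose extreme shape doubles the second vertex's x coordinate, a slider weight of 0.5 moves that vertex halfway toward the extreme while leaving the untouched vertex alone.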
Because there's so much manipulation of individual vertices in this code, it was really hard to optimize. I spent almost the entire day trying to get it to run at an acceptable speed, and I still don't think it's good enough. The main problem is that when a vertex (or group of vertices) is affected by multiple attributes at once, the weighted average becomes harder to figure out and requires more processing juice. I attempted to store offset values rather than recalculating the average every time a vertex is affected, but I couldn't figure it out before dinner. This version runs well enough, so it will have to do.
Oh, and happy (late) Thanksgiving!
EDIT: I posted a new version of the file after getting some serious optimization help from Jamie. It still takes a while to start up (but not as long), and it runs considerably faster once it loads. My code now also morphs the normals as well as the vertex positions, which I had overlooked before.
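One way to morph normals alongside positions (again a hypothetical plain-Python sketch, not the actual Unity code) is to blend per-vertex normal offsets exactly like position offsets, then renormalize each result so the lighting stays correct:

```python
import math

def blend_normals(base_normals, attr_offsets, weights):
    """Blend per-vertex normal offsets like position offsets, then
    renormalize each result back to unit length."""
    out = []
    for i, n in enumerate(base_normals):
        v = list(n)
        for name, offsets in attr_offsets.items():
            off = offsets.get(i)
            if off is not None:
                w = weights.get(name, 0.0)
                for axis in range(3):
                    v[axis] += w * off[axis]
        length = math.sqrt(sum(c * c for c in v)) or 1.0
        out.append(tuple(c / length for c in v))
    return out
```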