Experiments with Animation for Eyes and Head

Update: I published a Unity asset implementing the animation algorithms described in this blog post on the Unity Asset Store.

I summarized the main things I learned from reading papers on eye animation in a short reddit post, in which I mentioned that I was experimenting with parameters for head movement. Here I follow up on that. First, the original reddit post describing what I learned about eye animation:

Compute the angle in degrees that the eye has to move, separately for the horizontal and vertical components; from that angle, the papers give you the duration and maximum speed of the eye movement:

// Max eye speed in deg/s; from "Realistic Avatar and Head Animation Using a Neurobiological Model of Visual Attention", Itti, Dhavale, Pighin
leftMaxSpeedHoriz = 473f * (1f - Mathf.Exp(-leftHorizDistance / 7.8f));

// Saccade duration in seconds; from "Eyes Alive", Lee, Badler
leftHorizDuration = 0.025f + 0.00235f * leftHorizDistance;

Compute these numbers at the start of a new movement, then plug them into Unity’s SmoothDampAngle each frame to get the eye animation. Don’t let the eye’s max speed fall below the head’s speed while the head is moving, though; otherwise the eye lags behind.
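As a plain-Python sketch of the same idea (the original is Unity C#; `step_eye` below uses a simple speed-capped step instead of Unity’s `SmoothDampAngle`, and the function names are mine):

```python
import math

def saccade_params(distance_deg):
    """Max speed (deg/s) and duration (s) for an eye movement of distance_deg degrees."""
    max_speed = 473.0 * (1.0 - math.exp(-distance_deg / 7.8))  # Itti, Dhavale, Pighin
    duration = 0.025 + 0.00235 * distance_deg                  # Lee, Badler
    return max_speed, duration

def step_eye(current_deg, target_deg, max_speed, head_speed, dt):
    """Move the eye toward the target, capped by its max speed but never
    slower than the head, so the eye does not lag behind a moving head."""
    speed = max(max_speed, head_speed)
    step = max(-speed * dt, min(speed * dt, target_deg - current_deg))
    return current_deg + step
```

For a 20-degree saccade, the formulas give a peak speed of roughly 437 deg/s and a duration of about 72 ms.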

From other papers I got another important point: eyelids. When you look up relative to your face direction, the upper and lower eyelids rise a bit; when you look down, they lower. In other words, the eyelids move with the eye. Rule of thumb: the iris keeps touching the lower eyelid. And when the eye is about to make a horizontal movement of roughly 25 degrees or more, trigger a blink that lasts longer than a normal short blink.
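A minimal sketch of these two rules in plain Python — the per-degree eyelid gains are my own illustrative guesses (tune them to your rig); the 25-degree blink threshold is the one from the paragraph above:

```python
def eyelid_offsets(eye_pitch_deg, gain_upper=0.6, gain_lower=0.4):
    """Eyelids follow vertical gaze: positive pitch (looking up) raises
    both lids, negative pitch (looking down) lowers them.
    The gains here are illustrative, not taken from any paper."""
    return gain_upper * eye_pitch_deg, gain_lower * eye_pitch_deg

def blink_for_saccade(horiz_distance_deg, threshold_deg=25.0):
    """Large horizontal saccades should be masked by a longer-than-normal blink."""
    return horiz_distance_deg >= threshold_deg
```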

You probably already know about the social triangle: we look into the left or right eye 75% of the time, and at the mouth (in a normal social context) or further down (neck etc., in an intimate context) the rest of the time. For an intimidating power stare, make the triangle left eye – right eye – forehead.
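Picking gaze targets with those proportions could look like this plain-Python sketch (the target names and the `context` parameter are my own illustration):

```python
import random

def pick_gaze_target(rng, context="social"):
    """75% of the time look at one of the eyes; the rest of the time at the
    mouth (social context) or further down (intimate context)."""
    if rng.random() < 0.75:
        return rng.choice(["left_eye", "right_eye"])
    return "mouth" if context == "social" else "neck"
```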

These are the most important points I got from the papers. There are similar formulas for head movement, but I’m not happy with my head animation yet and am still working on it.

To find better formulas for head animation, I recorded samples of my own head movement with the DK2’s rotation tracking and used linear regression to find parameters for two kinds of head movement: slow and fast. Slow head movement is used when the actor is idle and her attention jumps from one static element in her surroundings to the next. Fast movement is used when something new appears in her perceptual field and she turns her head towards it.

// Maximum head speed in deg/s, linear in the angular distance to cover
float kSpeed = isQuickMove ? 54.86f : 8.08f;
float mSpeed = isQuickMove ? 1.94f : 1.824f;
headMaxSpeedHoriz = kSpeed + mSpeed * horizDistance;

// Movement duration in seconds, also linear in the angular distance
float kDuration = isQuickMove ? 0.321f : 0.6231f;
float mDuration = isQuickMove ? 0.0116f : 0.009858f;
headHorizDuration = kDuration + mDuration * horizDistance;
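Restated in plain Python with worked numbers (the constants are the fits from the snippet above; the function name is mine):

```python
def head_params(horiz_distance_deg, quick):
    """Max head speed (deg/s) and duration (s) from the linear fits,
    for either the fast (quick=True) or slow regime."""
    k_speed, m_speed = (54.86, 1.94) if quick else (8.08, 1.824)
    k_dur, m_dur = (0.321, 0.0116) if quick else (0.6231, 0.009858)
    return (k_speed + m_speed * horiz_distance_deg,
            k_dur + m_dur * horiz_distance_deg)
```

For a 30-degree turn, the fast regime gives about 113 deg/s peak speed over about 0.67 s, while the slow regime gives about 63 deg/s over about 0.92 s.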

I updated my Oculus demo Coffee without Words with these formulas.

There are other aspects of avatar animation that I learned about from papers, which I haven’t implemented yet:

Pupil dilation: our pupils dilate when we see someone we like, and contract when we don’t.

Eyebrow flash: when we first make eye contact with someone, we usually briefly raise the eyebrows as a friendly greeting.