The loneliest talk show host: Ars tests Avatar Kinect on the Xbox 360

Microsoft's latest application for its motion-sensing controller is called Avatar Kinect, and it lets players use their avatars for something other than the usual jumping and dodging of virtual obstacles: sitting (or standing) and talking with other avatars, using your face to control your avatar's facial expressions and head movements, and even taking to the stage for a comedy or musical act. Unfortunately, the translation of your movements to your avatar is touchy, especially if the room's lighting is lacking.

We sat down with Avatar Kinect to try things out. The avatar that represents you on the Xbox's dashboard can be placed in a number of settings once the update is applied: a magical forest with mushroom chairs, a late-night-TV-style stage, a newsroom, or space. When the service is live, users will be able to invite friends or guests to the sets and speak to them, and even record and share the encounter. So how well did it work?

I directed my avatar to enter a late-night TV setting, where she bounded onto the stage to cheers from the audience and sat down in one of the chairs. The camera framed up on her and she stared back at me, lips making constant tiny motions as if she were babbling under her breath. She couldn't interpret my slack jaw.

A neutral avatar in the hotseat in the magic forest setting.

I tightened my own lips up and she closed hers, and when I gave a toothy smile, she did as well. The avatar's mouth was able to track talking if I didn't go too fast; otherwise it caught only longer, slower mouth movements.

When Avatar Kinect debuted at CES in January, the avatars' ability to track eyebrow movements was one of their more impressive talents, but my avatar struggled with it. If I raised my eyebrows comically high, hers would pop up a second later. If I narrowed them, my avatar refused to imitate my anger unless I tilted my head backward or forward to shorten the line between my eyes and eyebrows.

Some eyebrow work in Avatar Kinect.

Better lighting produced better tracking, particularly of facial movements; the system works best with studio-style lights pointed at the subject. In an average living room, though, you'll be impressed by the accuracy less often than you'll be amused, and then frustrated.

Avatars can interpret head tilts side to side as well as slight turns, but will spasm if you try to turn all the way to the side as if addressing a guest on one of your shows. Raising and lowering your arms at your sides, as well as moving them from the elbow in front of your body, translates well, although individual finger movements can't be seen. One of the most puppet-like aspects of the avatars is that they can't appear to touch their own bodies—say, put their hands over their mouths in disbelief, or scratch their heads.

Eggs falling from the sky is an emote in the magic forest. We don't know, either.

Users can also trigger "emotes" while performing as their avatar, including prompts that make the setting do something, like shooting off pyrotechnics on the rock concert stage or making the toadstools cheer in the magic forest. As with recording the session, you trigger an emote with a controller button push, meaning that if you want to use these features, you'll have to gesture while holding a controller in your hand.

Unfortunately, none of the settings seems to let your avatar move anything below the waist, meaning all you can do is sit (or stand) around and talk. Performances cannot be broadcast live, either, which seems like a loss for the platform.

Some gesture and expression experimentation in Avatar Kinect.

Microsoft has pointed out the potential therapeutic applications of a setting where users are effectively talking through puppets. An avatar provides some anonymity and separation from your own person, which could be freeing for users who have something to say but are self-conscious about their appearance.

Of course, the total package of a Kinect isn't ideal for the delivery of any kind of performance—the microphone is terrible, and without exquisite lighting your avatar is less a translation of you than an exhibition of alien hand/eyebrow/lips syndrome, less human and more marionette.

Arm control works fairly well, though as the video shows, raising your arms too far above your head may cause your avatar to spasm.

While the intended use of Avatar Kinect may falter, we hope it doesn't stay isolated in a subsection of Kinect Fun Labs, where it's little more than an exhibition. Facial expressions and puppetlike controls deserve a place in actual games, and we hope developers will be able to integrate them into gameplay in the near future. Dare we imagine a console-based MMO where players can freely beetle their brows at one another? Or a game like L.A. Noire, where your own facial expressions could play as large a role as that of your suspects?

In the meantime, Avatar Kinect may also find use as a vector for animated video sequences. Similar tools, like Xtranormal, have spawned a catalog of popular videos.

Avatar Kinect will be available this afternoon, and since it's free for everyone until it goes Gold-only on September 8, the barrier to entry is low. But we remain more interested in what its capabilities suggest for the future of motion-control gameplay.