Starship Devlog – Feb 2017

They each have an attention-focus object, and several animations are blended to turn the actor's spine, head, and eyes toward that focus object. They can now point at things too!
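That kind of look-at blending might be sketched like this. The bone names, per-bone weights, and the simple linear yaw blend are my illustration, not the actual rig:

```python
# Hypothetical per-bone weights: the spine turns only a little toward the
# attention focus, the head more, and the eyes swing most of the way.
LOOK_WEIGHTS = {"spine": 0.2, "head": 0.5, "eyes": 0.9}

def blend_look_at(rest_angle, focus_angle, weight):
    """Linearly blend a bone's rest yaw toward the yaw of the focus object."""
    return rest_angle + (focus_angle - rest_angle) * weight

def look_at(focus_angle, rest_angles):
    """Return the blended yaw for each bone in the look-at chain."""
    return {bone: blend_look_at(rest_angles.get(bone, 0.0), focus_angle, w)
            for bone, w in LOOK_WEIGHTS.items()}

# An actor at rest (all yaws 0) looking at a focus 60 degrees to the left:
angles = look_at(60.0, {})  # eyes swing furthest, the spine only a little
```

Running several such blends at once, layered over the base animation, is what keeps the actor looking alive rather than robotically snapping to the target.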

They can also trigger audio clips and move their mouths in time with the audio, and markup on those audio files can instruct the virtual actors to overlay more specific animations: to nod or shake their head, to shrug, or to perform any other predefined action.
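The markup side could look something like the following. The `[action@seconds]` tag syntax here is invented for illustration; the post doesn't say what format the real markup uses:

```python
import re

# Hypothetical markup: tags like [nod@1.5] embedded alongside an audio clip,
# meaning "play the 'nod' overlay animation 1.5 seconds into the clip".
TAG = re.compile(r"\[(\w+)@([\d.]+)\]")

def parse_overlays(markup):
    """Extract (time, action) cues from a marked-up line, sorted by time."""
    cues = [(float(t), action) for action, t in TAG.findall(markup)]
    return sorted(cues)

line = "Well [shrug@0.4] I suppose so. [nod@2.1]"
print(parse_overlays(line))  # [(0.4, 'shrug'), (2.1, 'nod')]
```

At playback, the engine would pop each cue off the list as the audio clock passes its timestamp and fire the matching overlay animation.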

There’s an auto-cam flying about the place to give a flat-screen cut of what’s going on, which is what you see in the valentine test animation. Of course, this is all really written for VR, where camera movement can be taken over by the script but head-tracking at least stays under user control.
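A toy version of an auto-cam rule might look like this: cut to a point in front of whoever is speaking, and fall back to a wide shot when nobody is. The actor data layout and the framing rule are assumptions for the sketch, not how the real auto-cam works:

```python
def frame_shot(actors, distance=2.5):
    """Return (camera_position, look_target) for a simple flat-screen cut."""
    speakers = [a for a in actors if a["speaking"]]
    if speakers:
        # Close shot: park the camera a couple of metres along the
        # direction the speaker is facing, looking back at them.
        x, y, z = speakers[0]["pos"]
        fx, fy, fz = speakers[0]["facing"]  # unit vector the actor faces along
        return (x + fx * distance, y, z + fz * distance), (x, y, z)
    # Wide shot: look at the average position, camera pulled back along z.
    n = len(actors)
    cx = sum(a["pos"][0] for a in actors) / n
    cy = sum(a["pos"][1] for a in actors) / n
    cz = sum(a["pos"][2] for a in actors) / n
    return (cx, cy, cz + 6.0), (cx, cy, cz)
```

A real auto-cam would also smooth between cuts and avoid clipping through the set, but the core job is the same: pick a shot from the current scene state.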

I mentioned a need for voices. If you’d like to help, record a WAV of yourself reading a few lines of dialogue from your favorite book and email it to me. I’ll be in touch.

The next month or so will mostly be 3D modeling work, I think: revamping the Simon and Anxi characters, plus some lighting and texturing to generally improve the ship itself.