In rebuilding our Unity developer toolset from the ground up, we started by rearchitecting the interfaces that receive data from the Leap Motion device. Moving up the tech stack, we then refactored most of the mid-level Unity scripts that pair Leap hand data with 3D models and manage those representations. Most recently, we’ve moved another step up our stack to the code that drives the 3D models themselves.

These hands are not only assets that can be used directly in projects, but also examples of the variety of ways hands can be created. In short, the Hands Module unlocks the power to drive many types of hand model implementations, which is critical to supporting as many types of projects as possible.

To boost development with the Hands Module, we’ve added some new capabilities to our Core Assets that allow for multiple hand models per hand, and for enabling and disabling combinations of these representations at runtime. The Hands Module includes a Hands_Viewer_Demo scene which serves double duty as a hands gallery and as an example of how to control these model groups at runtime. If you run that scene, try pressing the 0 key to hide all the hand pairs, then press the 4 and 7 keys. This reveals a transparent rigged hand that is dynamically sized to the user’s hand, along with parametrically generated hands that fit inside the transparent hand.

The simple example script CycleHandPairs controls the enabling, disabling, and toggling of the hand pairs in this scene. It illustrates how to call HandPool’s DisableGroup(), EnableGroup(), or ToggleGroup() methods with the name you give that ModelGroup in the Inspector.
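The same pattern can be sketched in a few lines. This is not the actual CycleHandPairs source; the group names below are placeholders that must match the ModelGroup names you set on HandPool in the Inspector, and the Leap.Unity namespace is an assumption about where HandPool lives in your Core Assets version:

```csharp
using UnityEngine;
using Leap.Unity; // assumed namespace for HandPool in Core Assets

// Sketch of a CycleHandPairs-style controller. "Rigged_Hands" and
// "Poly_Hands" are placeholder group names -- use whatever names you
// gave your ModelGroups on the HandPool component.
public class HandGroupSwitcher : MonoBehaviour {
  public HandPool handPool;

  void Update() {
    if (Input.GetKeyDown(KeyCode.Alpha0)) {
      // Hide everything.
      handPool.DisableGroup("Rigged_Hands");
      handPool.DisableGroup("Poly_Hands");
    }
    if (Input.GetKeyDown(KeyCode.Alpha1)) {
      handPool.EnableGroup("Rigged_Hands");
    }
    if (Input.GetKeyDown(KeyCode.Alpha2)) {
      // Flip the group's current state.
      handPool.ToggleGroup("Poly_Hands");
    }
  }
}
```

Because groups are addressed by name, a script like this needs no references to the hand prefabs themselves, only to the HandPool.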

Hand Model Examples

Whether driving a full character, first-person arms, or disembodied hands, 3D meshes animated with skeletal deformations are one of the most common approaches to showing hands in games and VR. With this release, we’ve updated our RiggedHand.cs and RiggedFinger.cs scripts to work with Leap Motion Orion tracking and to improve workflow for 3D mesh hands.

The RiggedHand script is an implementation of the IHandModel class that provides methods that drive the 3D model’s transforms with Leap Hand data. The rigged IHandModel and the Leap Hand data are paired to become an IHandRepresentation and are then managed and driven by the HandPool and LeapHandController classes. RiggedHand gets assigned to the top of a hand model’s hierarchy and maintains a list of references to RiggedFinger scripts. The RiggedFinger components get attached to the top of each finger’s hierarchy and maintain a list of references to that finger’s bone transforms.
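To make this architecture concrete, here is a hedged sketch of a minimal IHandModel implementation. The exact abstract members of IHandModel vary by Core Assets version; the Handedness and HandModelType properties, the GetLeapHand()/SetLeapHand() methods, the ToVector3() conversion extension, and the Leap.Unity namespace are all assumptions here:

```csharp
using UnityEngine;
using Leap;
using Leap.Unity; // assumed namespace for IHandModel in Core Assets

// Sketch of a minimal IHandModel implementation that just visualizes the
// palm. Member signatures are assumptions and may differ in your version.
public class DebugPalmHand : IHandModel {
  [SerializeField] private Chirality handedness;
  private Hand _hand;

  public override Chirality Handedness {
    get { return handedness; }
  }

  public override ModelType HandModelType {
    get { return ModelType.Graphics; }
  }

  public override Hand GetLeapHand() { return _hand; }
  public override void SetLeapHand(Hand hand) { _hand = hand; }

  public override void InitHand() {
    // One-time setup goes here (build meshes, cache transforms, etc.).
  }

  public override void UpdateHand() {
    // Called with fresh Leap Hand data each frame; draw a short ray
    // along the palm normal so the tracked palm is visible in the scene.
    Debug.DrawRay(_hand.PalmPosition.ToVector3(),
                  _hand.PalmNormal.ToVector3() * 0.1f, Color.green);
  }
}
```

Once such a script is placed on a prefab and registered in a HandPool ModelGroup, the HandPool and LeapHandController pair it with live Leap Hand data exactly as they do for RiggedHand.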

RiggedHand and RiggedFinger derive from the HandModel and FingerModel scripts, respectively, as do the RigidHand and RigidFinger classes, which drive Leap Motion physics hands. The HandModel and FingerModel classes provide a collection of methods for calculating the positions and rotations of a model’s various transforms. The RiggedHand and RiggedFinger scripts then use these methods to update the hand and fingers.

The first set of example hands we’ve included in the Hands Module, LoPoly_Rigged_Hand_Left and LoPoly_Rigged_Hand_Right, are low-polygon, stylized, non-realistic hands. Consisting of a mere 800 polygons and a single mesh for each hand, they provide performant hands for VR on both desktop and mobile. These hand models can be used with a wide variety of shaders to provide the foundation for many possible styles.

These meshes are also weighted and sculpted to allow their skeleton transforms to be driven by rotations only, or – with the DeformPosition switch in the RiggedFinger components – to be driven with bone positions as well. This allows the rigged hands to be deformed to match the size and proportions of the user’s hands in real time. In the Hands_Viewer_Demo Unity scene, you can see this by comparing hand pair 1 to hand pair 2, and hand pair 3 to hand pair 4.

Another setting worth noting in these examples is in the RiggedHand component. The Model Palm at Leap Wrist switch is set to True. This allows the script to accommodate typical hand skeletons which have the palm transform located at or near the wrist. Under the hood, this directs the RiggedHand component to drive the rigged hand’s palm transform with a combination of the Leap Motion data’s wrist position and the Leap Motion palm rotation.

Another set of hand prefabs, the PepperBaseCut and PepperBaseFull hands, provide examples of much higher-polygon, realistically sculpted hands. At 20,000 polygons each, these hands aren’t suitable for mobile applications, but are included to illustrate a variety of hand models. Again, these models can be used with a variety of shaders. These prefabs have RiggedHand’s Model Palm at Leap Wrist set to False, and provide an example of driving a hierarchy whose palm transform is located at the palm center.

Finally, we’ve included some parametrically generated hands that use the PolyHand and PolyFinger classes. PolyFinger actually constructs its own mesh in InitFinger() and updates its vertices each frame in UpdateFinger(). This illustrates a completely different approach from RiggedHand, in that the mesh is created dynamically at runtime. Because of the scene persistence feature we added – where IHandModel calls these methods through its InitHand() and UpdateHand() methods at Editor time – you can see what these PolyFingers look like in the Scene view as you construct your project.
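The build-once, update-per-frame pattern that PolyFinger uses can be illustrated with the standard Unity Mesh API. This is a generic sketch of the technique, not PolyFinger’s actual code:

```csharp
using UnityEngine;

// Generic illustration of the build-once, update-per-frame mesh pattern
// (analogous to PolyFinger's InitFinger()/UpdateFinger(), but not its code).
[RequireComponent(typeof(MeshFilter))]
public class DynamicQuad : MonoBehaviour {
  private Mesh _mesh;
  private Vector3[] _verts;

  void Start() {
    // InitFinger() analog: construct the mesh once.
    _mesh = new Mesh();
    _verts = new Vector3[] {
      new Vector3(0, 0, 0), new Vector3(1, 0, 0),
      new Vector3(0, 1, 0), new Vector3(1, 1, 0)
    };
    _mesh.vertices = _verts;
    _mesh.triangles = new int[] { 0, 2, 1, 2, 3, 1 };
    _mesh.RecalculateNormals();
    GetComponent<MeshFilter>().mesh = _mesh;
  }

  void Update() {
    // UpdateFinger() analog: move the vertices every frame, then
    // reassign the array so Unity uploads the new geometry.
    for (int i = 0; i < _verts.Length; i++) {
      _verts[i].z = 0.1f * Mathf.Sin(Time.time + i);
    }
    _mesh.vertices = _verts;
    _mesh.RecalculateNormals();
  }
}
```

Rebuilding geometry this way trades the up-front authoring cost of a rigged mesh for full per-frame control over the shape, which is why the parametric hands can resize themselves to the tracked hand without any skinning.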

In the next blog post in this series, we’ll walk through the process of modeling and rigging your own hands from scratch in Maya, then setting these hands up in Unity and creating prefabs. And as part of that process, we’ll step through the improved setup for rigged hands.