For all their sophistication, vehicle and flight simulators are relatively self-contained systems — trainees are basically sitting in a box. That won’t fly for training dismounted infantrymen.

One approach to this challenge is embodied in the Dismounted Soldier Training System, which uses a joystick mounted on a weapon to move players around virtual terrain. Another approach is apparent in Pointman, now under development at the Naval Research Laboratory in Washington, D.C., which lets users control their onscreen avatars by moving their legs.

A modification for Bohemia Interactive’s Virtual Battlespace 2 software, Pointman relies on the classic desktop trainer setup, but with a few technological twists. A computer-mounted tracking device follows three LEDs on a headpiece, giving the avatar matching head tilt and body lean. A gamepad, chosen because many members of the military have gaming experience, manipulates the weapon. On the floor, a set of pedals adapted from helicopter training sims makes the avatar walk.
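As a rough illustration of how a tracker following three LEDs can yield head tilt and lean, here is a minimal sketch. The LED layout (two at ear level, one above), the coordinate frame, and the math are illustrative assumptions, not NRL's actual solver:

```python
# Hypothetical sketch: deriving head roll (lean) and pitch (tilt) from
# three tracked LED positions. Conventions here are assumptions; real
# head trackers ship their own pose solvers.
import math

def head_tilt_and_lean(left, right, top):
    """Estimate roll and pitch in degrees from three LED positions,
    each an (x, y, z) tuple: left/right at ear level, one above."""
    # Roll: how far the ear-to-ear line departs from horizontal.
    dx = right[0] - left[0]
    dy = right[1] - left[1]
    roll = math.degrees(math.atan2(dy, dx))
    # Pitch: how far the midpoint-to-top vector leans forward or back
    # (z taken as the forward axis in this assumed frame).
    mid = tuple((a + b) / 2 for a, b in zip(left, right))
    pitch = math.degrees(math.atan2(top[2] - mid[2], top[1] - mid[1]))
    return roll, pitch
```

Feeding those angles to the avatar each frame is what gives it the "matching head tilt and body lean" the article describes.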

Sliding the pedals back and forth makes the avatar step, while depressing both pedals puts it in a prone position. The actions don’t match the real-world motions precisely; instead they strive for what project lead Jim Templeman calls “cognitive realism.” In fact, he said, the entire purpose of Pointman is to “extend the range and precision of actions,” meaning trainees can make their avatars use appropriate tactics. That added fidelity can change the outcome of the decisions they make within the exercise.
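The pedal mapping described above can be sketched in a few lines. The data types, thresholds, and action names here are illustrative assumptions, not NRL's implementation:

```python
# Hypothetical sketch of the pedal-to-avatar mapping: alternating slides
# read as steps; pressing both pedals down reads as going prone.
from dataclasses import dataclass

@dataclass
class Pedal:
    slide: float   # -1.0 (full back) .. 1.0 (full forward)
    press: float   #  0.0 (up)        .. 1.0 (fully depressed)

PRONE_PRESS = 0.8  # both pedals depressed past this -> prone (assumed)
STEP_DELTA = 0.3   # slide offset between feet that counts as a step (assumed)

def avatar_action(left: Pedal, right: Pedal) -> str:
    if left.press >= PRONE_PRESS and right.press >= PRONE_PRESS:
        return "prone"
    # One foot forward, the other back reads as stepping; step rate could
    # scale with how quickly the pedals are exchanged.
    if abs(left.slide - right.slide) >= STEP_DELTA:
        return "step"
    return "stand"
```

The point of such a mapping is the "cognitive realism" Templeman describes: the gesture is not a literal stride, but it gives the trainee deliberate, precise control over posture and movement.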

“Your avatar would crouch appropriately,” said David Dunfee, a ground combat element subject matter expert in the training and education capabilities division of the Marine Corps, who saw Pointman tested.

The avatar in VBS2 could crawl up and look out over terrain without being exposed, or peek (“pie”) around corners. Without Pointman’s modifications, testers would have to expose more of their bodies to fire than normal, misrepresenting the scenario and causing operations to fail even though the correct tactics were used.

Pointman is the outcome of about 14 years of research, beginning with the Gaiter project in 1998, which fitted users with a harness to keep them from marching into a wall as they walked in place in a virtual world. Researchers saw that existing systems had trouble with precise actions, such as pieing a corner or dodging for cover. “For us, the missing point was control,” said Templeman.

Templeman also found that differences between the Gaiter simulation and real life could cause real-world reflexes to intrude on the gaming. For example, a rifleman’s ability to bring the weapon up quickly and precisely didn’t always carry over to the virtual version, because of both latency and differences between what the game showed and what the human eye expected to see.

“If those two don’t align, you’re going to have problems,” he said. In response, Pointman developers placed a marker on the screen where the sight of the weapon would come up. A flick of the wrist raises the weapon and places the sight on the marker.
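That sight-marker idea can be sketched as a tiny state machine: the screen always shows where the sight will land, and a fast wrist flick toggles the weapon up so the sight snaps to that point. The threshold and class below are assumptions for illustration, not NRL's code:

```python
# Illustrative sketch (not NRL's implementation) of the sight-marker idea.
class WeaponView:
    """Tracks whether the weapon is raised and where its sight draws."""

    FLICK_RATE = 2.0  # assumed pitch-rate threshold that reads as a flick

    def __init__(self, marker_xy):
        self.marker_xy = marker_xy  # fixed on-screen marker for the sight
        self.raised = False

    def update(self, pitch_rate):
        """Feed the latest wrist pitch rate; return sight coords or None."""
        if pitch_rate >= self.FLICK_RATE:
            self.raised = True
        elif pitch_rate <= -self.FLICK_RATE:
            self.raised = False
        # When raised, the sight lands exactly on the pre-displayed marker,
        # keeping the virtual sight picture aligned with what the eye expects.
        return self.marker_xy if self.raised else None
```

Snapping to a known marker sidesteps the latency problem: the trainee never has to chase a drifting sight, because its destination is visible before the weapon comes up.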

Although Pointman has worked on some of the problems associated with dismounted training, researchers still need to test whether it makes a difference in training tactics.

“Those types of things are better for trying to replicate actual movement in a simulation, but we still don’t know what the true learning transference is going to be with those kinds of applications,” Dunfee said.

The VBS2 Connection

Researchers demonstrated the system in 2009, when it was still linked to ManSim, first-person shooter software from Lockheed Martin. When it didn’t find a home, Templeman decided to change platforms so Pointman could interface with VBS2, which is widely used for training across the U.S. military. What the team expected to be a quick conversion has taken the past two years to perfect.

“VBS2 is a very complicated, very rich environment,” Templeman said. “In order to have a complete interface, it turned out to be a lot of work.”

Last year, Pointman got a workout at Marine Corps Base Hawaii from Marines home from Afghanistan, who declared the program realistic in its control, view, movement, and use of cover and posture. But glitches appeared when they tried to play with a full squad instead of four-man teams, and the testers judged the program inadequate for team training.

“The system was crashing when you had more than four to six people on the system,” Dunfee said. “That’s a problem if we can’t have 13 guys, or up to a platoon, with 42 guys, in the simulation at the same time.”

Templeman said NRL has since improved the system’s ability to handle more users and hopes to iron out any remaining glitches as the program rolls out for further testing. His hope is that the program will go to the Training and Education Command or, even better, the Marine Corps’ program manager for training systems.

Pointman also has potential as troops draw down and their roles change overseas. Templeman says the team plans to add peacekeeping scenarios so that individuals can experience combat and then transition into interacting with locals. It could interface with something like Alelo, which already provides language and cultural training and works with VBS2. Next steps would be to add eye tracking, gesture recognition and facial expression tracking.

Some of the work is already done. Researchers finished a version of the hands-free mode, where the weapon is absent and the avatar can gesture or point, in September. However, there remains the question of what kind of cultural training to produce as the U.S. shifts its focus.

“We don’t know where we’re going next. So what cultural environments do you build?” Dunfee said, noting that one plan would be to use generic environments for each command, such as Africa Command.

“We have a number of different problems that are very, very difficult and very diverse, such as the cultural interaction modules or the terrain modules,” Dunfee said. “At the end of the day, what does the squad need that’s going to gain efficiencies utilizing simulations?”