Marine study: Simulations translate to live-fire performance

Simulation systems used for training have plenty of supporters among military leaders, but doubters remain when it comes to training on tactical vehicles and weapons. After all, how could a computer-generated virtual scenario compare to actually working with real equipment on real terrain with real targets?

As it turns out, pretty well, according to a study by the Marine Corps Systems Command of M1A1 tank crews working with the simulation-based Advanced Gunnery Training System (AGTS). The command’s Program Manager Training Systems set out to quantify AGTS’ effectiveness and found that it significantly improved Marines’ proficiency—and translated to live-fire performance—while saving the Corps millions of dollars in training costs, according to a release from the command.

The proof-of-concept study evaluated three tank crews performing 10 tasks over time, which added up to hundreds of scores. Although the study’s authors acknowledge that three crews is a small sample compared with the total number of crews across the Corps, they say the scores were consistent enough to show AGTS’ effectiveness.

The study breaks training time up into four quarters, each representing three weeks in the semi-annual training schedule. Between the first and fourth quarters, the crews’ cumulative average scores across all tasks improved from 47 percent to 73 percent. (Crew members, from the II Marine Expeditionary Force at Camp Lejeune, N.C., ranged in rank from Lance Corporal to Lieutenant, averaged 25 years of age and, while they had some experience or familiarity with the M1A1, all were new to their positions and the training exercises.)

The study also measured how well AGTS training translated to live-fire performance for each of the crews. Crew 1, for instance, started training with an AGTS score of 63.6 and finished training with a score of 93.0. The crew’s subsequent live-fire qualification score was 90.7, well above the 70 necessary to qualify. Crew 2’s training scores went from 55.2 to 81.9, followed by a live-fire score of 85.0. Crew 3 had the lowest scores but also showed the most improvement, raising its average training scores from 53.6 to 87.0 (a jump of 62 percent) and finishing with a qualifying live-fire score of 78.0.
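The relative-improvement figure quoted for Crew 3 follows directly from its start and end scores. A minimal sketch of that arithmetic, using the scores reported above (the dictionary layout and variable names are ours, not the study's; only Crew 3's 62 percent figure appears in the report):

```python
# Start and end AGTS training scores for each crew, as reported in the study.
scores = {
    "Crew 1": (63.6, 93.0),
    "Crew 2": (55.2, 81.9),
    "Crew 3": (53.6, 87.0),
}

for crew, (start, end) in scores.items():
    # Relative improvement: (end - start) / start, expressed as a percentage.
    pct = (end - start) / start * 100
    print(f"{crew}: {start} -> {end} ({pct:.0f}% improvement)")
```

Running this reproduces the 62 percent jump the study cites for Crew 3; the analogous figures for the other two crews are not stated in the report.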

The study notes that each crew’s live-fire qualification scores were close to their final training scores (in the case of Crew 2, even higher) and concludes that this can be attributed to AGTS training. “Trends in scores by task for each crew indicate that task proficiency, achieved in the AGTS, transfers to [live-fire qualification],” the report states.

The study’s cost avoidance figures are even more dramatic, for the obvious reason that firing virtual rounds at virtual targets requires a lot less money than firing live rounds. Operating the simulator for three crews cost a total of $7,208, the study’s authors said. If all those virtual rounds were live, the cost for the three crews would run $1,524,663, giving the Marines a net cost avoidance of $1,517,455. And since there are two qualifications annually, the costs avoided by using AGTS for three crews amount to just over $3 million per year, the authors state.
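The annual figure is just the per-qualification savings doubled. A quick sketch of the arithmetic, using the dollar figures from the report (variable names are our own):

```python
# Per-qualification costs for three crews, from the study.
simulator_cost = 7_208        # cost of running AGTS
live_fire_cost = 1_524_663    # cost if the same rounds were fired live

# Net savings for one qualification cycle.
cost_avoidance = live_fire_cost - simulator_cost
print(cost_avoidance)         # 1517455

# Two qualifications per year -> the "just over $3 million" figure.
annual_avoidance = cost_avoidance * 2
print(annual_avoidance)       # 3034910
```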

The results could support Marine leaders’ push to incorporate more simulations in training. Using simulations currently is an option for unit commanders, but at a conference last December in Orlando, Fla., Brig. Gen. Joseph Shrader, commander of Marine Corps Systems Command, suggested that at least some simulations should be made mandatory. Marine Corps Commandant Gen. Joseph Dunford, in his planning guidance released in January, said he expects the Marine Air-Ground Task Force to “make extensive use of simulators where appropriate.”

Having hard numbers from the M1A1 study can help make the case, particularly for ground systems. Simulator training is required as part of training for aviation and some ground systems (including the M1A1) but not for many others on the ground, said Lt. Col. Walter Yates, program manager for Training Systems. Results from the study could help change that.

“My job is to go back now and do the validations and studies, such as this one for AGTS, and submit them to [Training and Education Command] and our requirement sponsors,” Yates said. “Then we can say we put the academic rigor behind it, and we’ve measured that skill is improving.”