WASHINGTON — It’s a familiar scene in Afghanistan: Donkey carts and hawkers fill the market square. Bearded men in traditional garb drink tea and search for work. A US visitor scans their faces, trying to distinguish friend from foe.

But this village is virtual. The villagers are bits of software code, and the Americans who “visit’’ are players in a videogame-like program designed not only for training purposes but for intelligence analysis.

The program, which loosely resembles the game SimCity, is part of a US government effort to develop sophisticated computer models of real Afghan villages — complete with virtual people based on actual inhabitants — in an attempt to predict their reaction to US raids and humanitarian aid.

The project, spearheaded by a University of Pennsylvania engineer at the behest of an undisclosed US government agency, straddles the line between research and intelligence as part of a wider US effort to design software capable of forecasting human behavior in war zones.

This type of research, often referred to as “human terrain mapping,’’ has attracted increased funding in recent years from US military planners who believe it will become a crucial tool for combating terrorism and insurgencies. If successful, supporters say, it could save lives by helping the US government anticipate — and address — violent conflicts before they erupt.

“War is the worst-case scenario,’’ said Barry Silverman, who has received more than $500,000 in federal funding to build a virtual Afghan village. “The goal is to resolve things long before that.’’

But the military’s effort to build computer models based on the behavior, beliefs, and habits of real people interviewed by Army teams of social scientists on the ground has raised unprecedented ethical questions and sparked a fierce debate across government agencies and in academia. Critics question whether the models really work, and whether they will ultimately harm the people who are being depicted.

“Are we going to detain someone if a computer predicts that he will become an insurgent?’’ asked Hugh Gusterson, an anthropologist at George Mason University.

“The real danger of models is their seductiveness. They can be so realistic and powerful that it is easy to forget they are just a model, and people start to rely on them more and more.’’

The concerns were so great that the US Department of Energy, which controls the national laboratories that own some of the most sophisticated computers in the country, has pushed back against recent efforts to enlist its scientists in the work.

Citing uncertainty about how the military will use this research, Energy Secretary Steven Chu issued a memo late last year barring employees from working with data about individuals, warning that doing so could violate a federal law mandating that human research subjects never be harmed.

“The lack of full disclosure of the purpose and the potential repercussions to subjects recruited for participation . . . undermines any . . . ability to review such work against federal requirements for the protection of human research volunteers,’’ Chu wrote in December.

The project also adds fuel to an ongoing debate over whether social scientists should ply their trade for the military, since some virtual villages are created using surveys taken by embedded social scientists known as human terrain teams.

The American Anthropological Association has come out against the teams, saying that military work violates the profession’s code of ethics, a stance that has made it harder for the teams to recruit.

On Capitol Hill, where Congress is weighing the fate of the human terrain program, staffers expressed optimism that the new ethical conundrums will be sorted out over time.

“As the technology matures, you can shape what the bounds of the research should be,’’ said one senior staffer on the House Armed Services Committee, who was not authorized to be quoted by name.

The US military has built computer models of the real world for decades. In the 1970s and 1980s, the Pentagon developed flight simulators to train pilots, and war games to teach soldiers to counter virtual missiles. By the 1990s, the Pentagon began populating simulators with increasingly realistic virtual people.

The first human models were crude, and were programmed to react the same way every time — for instance, to run when a gun is pointed at them.

But now they can be programmed to respond to an increasingly complex set of motivations, including religion, emotional status, economic incentives, and tribal and familial relationships in the virtual world, according to Greg Zacharias, president of Charles River Analytics, a Cambridge-based software company that designs virtual characters for Air Force training programs.

After the Sept. 11 attacks, the US military and intelligence agencies turned to this technology not only for training, but to help understand — and even predict — the behavior of real people overseas.

Silverman got interested in the work after he designed software to help anticipate human error on NASA satellites. Today, he is one of a handful of modelers who attempt to reproduce human belief systems in the virtual world.

In 2007, he began developing a training program for the US Marines that simulates fictional villages, tribes, and families in southern Afghanistan. This summer, it will be used for the first time.

But just as Silverman was perfecting that program, known as Non-Kin, a US agency that he would not identify asked him to model an actual Afghan village (he won’t say which one) and the people in it, using real data collected from the field.

“They said, ‘There is no reason why you can’t create a real village,’ ’’ Silverman recalled. “This was not for training. It was for intel collection and analysis.’’

Silverman received initial funding for that village and is currently awaiting a second grant to complete it. In the meantime, he is developing a generic Afghan district using opinion polls, ethnographic data, and biographical descriptions of local leaders collected by the human terrain teams.

Silverman believes that one day, the whole of southern Afghanistan will be recreated in a vast computer model.

“I think the goal in the long run would be to just crank out village after village,’’ he said.

It is not his first attempt to model real people. In the past decade, a US agency has paid him to model the Palestinian intifada, Al Qaeda figures, leaders in the Middle East, and 27 Iraqi political figures. In 2008, he was asked to make detailed guesses about events in Bangladesh, Thailand, Sri Lanka, and Vietnam, based on his models. The US agency measured his forecasts against real events, and they turned out to be more than 80 percent accurate, he said.

Shortly after the human terrain teams were launched in 2005, the Marines paid Silverman to study what could be done with data they had collected. He published a paper arguing that it should be fed into simulators to help forecast events.

Since then, the human terrain teams have shifted their data collection methods from open-ended reports toward more rigid questionnaires that can easily be uploaded into a database, according to former terrain team members. John Allison, an anthropologist who began training as a team member last November but has since resigned, said the teams were taught to upload the data into a classified Pentagon network known as SIPRNet, where it is distributed to a host of US agencies, some of which pass it on to analysts like Silverman.

Steve Fondacaro, the project manager for the Army’s human terrain system, which oversees the data-collection teams, said the information is primarily used by commanders on the ground to design effective development projects. He said the data are not used to harm anyone. But he also acknowledged that he does not know what other agencies do with the information.

“I don’t spend a lot of time tracking down what the government people are doing with the data that we access on the ground,’’ he said, adding that he did not know about Silverman’s project.

But the effort has sparked alarm among some academics who say it is reminiscent of past efforts to use social science for military ends.

Joy Rohde, a historian at the University of Michigan who is writing a book about the military’s use of social scientists during the Cold War, likened the effort to the 1964 Army project known as Camelot, which sought to identify the causes of violent revolutions in Latin America in order to prevent them.

“That’s exactly the national security posture that is creating the terrorist problem,’’ she said.

Randy Borum, a behavioral science professor who served on a Pentagon-affiliated task force that examined human terrain research projects, said the models were useful for training, but could be problematic if used to make important decisions.

“Whether some of these models accurately predict real-world events is almost unknowable,’’ he said.

But others say computer models are an improvement on old-fashioned informants and spies because they are designed to catch connections that human analysts might miss. Models help policy-makers tackle a host of complex problems, from electricity grid expansions to snarled traffic.

Navy Captain Dylan Schmorrow, who works in the Human Social Culture Behavior Modeling program in the Office of the Secretary of Defense, an effort launched in 2008 to improve the science behind the models, said he was hopeful that they would one day be as sophisticated as forecasts of the weather and economic activity.

“If we are lucky, in 10 years, people will say, ‘we really started understanding how and what to model,’ ’’ he said.

Schmorrow added that the military’s interest has increased exponentially in recent years, although the programs are impossible to track since many are classified. His own budget has more than doubled to $25 million to support some 50 separate research projects, while the budget for human terrain teams has grown from $20 million in 2005 to nearly $100 million this year.

The funding has flowed to a wide range of projects. Iris Bohnet, a professor at Harvard’s Kennedy School, received funding to study how trust is built in the Middle East. George Cybenko of Dartmouth’s Laboratory for Human Terrain examines what computer networks can tell us about the people who use them.

Edward MacKerrow, a physicist at Los Alamos National Laboratory, simulates the Afghan opium trade, complete with fictional opium farmers and warlords programmed with their own economic incentives and attitudes toward US troops.

MacKerrow, who has never used information about real Afghans (the analysis of information about individuals is now barred at national laboratories), said he hopes his replication of the Afghan drug economy will help US officials test different solutions there.

“We understand little bits about the weather, and those weather model simulations are pretty accurate,’’ said MacKerrow. “The big question is: If we know little bits and pieces about issues, attitudes, behaviors, psychology, can’t we try to simulate society?’’

“The jury is still out.’’
