
Commentators have identified bias as a serious problem in the forensic setting. As one commentator noted: “To the extent that we are aware of our vulnerability to bias, we may be able to control it. In fact, a feature of good scientific practice is the institution of processes—such as blind testing, the use of precise measurements, standardized procedures, statistical analysis—that control for bias.” A 1996 National Academy of Sciences report on DNA testing recommended that laboratory procedures “be designed with safeguards to detect bias and to identify cases of true ambiguity. Potential ambiguities should be documented. . . .”

There are two types of bias: motivational and cognitive. Cognitive bias reflects the tendency to see what one expects to see, and it typically affects decisions in cases of ambiguity. Motivational bias arises when lab personnel’s often-close association with the police influences their conclusions, often subconsciously.

Motivational bias. Commentators have argued for the establishment of crime laboratories that are independent of law enforcement in order to minimize police pressure that may bias lab results. A prominent forensic scientist has commented that it “is important to recognize that the police agency controls the formal and informal system of rewards and sanctions for the laboratory examiners. Many of these laboratories make their services available only to law enforcement agencies. All of these factors raise a legitimate issue regarding the objectivity of laboratory personnel.” A former lab director has noted: “Many forensic scientists at the state police labs . . . saw their role as members of the state’s attorney’s team. They thought they were prosecution witnesses.”

The problem is not unique to this country. According to a British court: “Forensic scientists may become partisan. The very fact that the police seek their assistance may create a relationship between the police and the forensic scientists. And the adversarial character of the proceedings tends to promote this process. Forensic scientists employed by the government may come to see their function as helping the police. They may lose their objectivity.”

In 2002, the Illinois Governor’s Commission on Capital Punishment similarly recommended the creation of an independent state laboratory to provide forensic services.

Cognitive bias. One form of cognitive bias is the “observer effect.” A simple example illustrates the point. When a pharmaceutical company wants to introduce a new drug, the test trials are conducted double-blind: neither the patient nor the physician knows whether the patient is receiving the new drug or a placebo (the control). Numerous studies have demonstrated that physicians who know their patients are receiving a new drug tend to see positive results, even when there are none. In short, what we know affects what we perceive. It is simply human nature.

A slightly different type of cognitive bias is called “confirmation bias.” The psychological literature on lineups provides an illustration. Eyewitnesses with reservations about their identifications often become positive after learning that the person they identified is viewed by the police as the prime suspect. The same phenomenon may occur when external information is provided to lab analysts. For example, Professor Peter DeForest has described investigators who responded to inconclusive lab results by saying to forensic examiners: “Would it help if I told you we know he’s the guy who did it?” One crime lab examiner, “who has worked in the crime lab system since 1998, said she tried not to be swayed by detectives’ belief that they had a strong suspect. ‘We’re all human,’ she said. ‘I tried not to let it influence me. But I can’t say it never does.’”

Confirmation bias also arose in the misidentification of fingerprints in the terrorist train bombing in Madrid on March 11, 2004. The FBI got it wrong, misidentifying Brandon Mayfield, a Portland, Oregon, lawyer, as the source of the crime scene prints. Spanish authorities cleared Mayfield and matched the fingerprints to an Algerian national. To its credit, the FBI initiated an investigation using outside experts. The resulting report raised a number of disquieting issues. The mistake was attributed in part to “confirmation bias.” In other words, once the examiner made up his mind, he saw what he expected to see during his re-examinations. A second review by another examiner was not conducted blind—that examiner knew a positive identification had already been made by the first examiner—and thus was also subject to the influence of confirmation bias. Moreover, the culture at the laboratory was poorly suited to detecting mistakes. As the report noted, “To disagree was not an expected response.”

The experiment. As a result of the Mayfield case, several British researchers led by Itiel E. Dror devised an experiment to test whether external influences can affect the identification process. In particular, they were concerned with confirmation bias of the kind that occurred in the Mayfield identification. In the experiment, fingerprint examiners who were unfamiliar with the Mayfield prints were asked by colleagues to compare a crime scene print with a suspect print. “They were told that the pair of prints was the one that was erroneously matched by the FBI as the Madrid bomber, thus creating an extraneous context that the prints were a non-match.” The participants were then instructed to ignore this information. The prints, in fact, came from cases that each of the participants had previously matched. Of the five examiners, only one still judged the prints to be a match. The other four changed their opinions: three directly contradicted their prior identifications, and the fourth concluded that there was insufficient data to reach a definite conclusion. Dror and his colleagues noted, “This is striking given that all five experts had seen the identical fingerprints previously and all had decided that the prints were a sound and definite match.”

Conclusion. Outside information from an investigation should not be given to the analyst interpreting forensic results; the examiner should generally be “blind” to the case’s circumstances and other evidence. ABA Standards for Criminal Justice, DNA Evidence, Standard 3.1, provides, in part, that testing laboratories should “follow procedures designed to minimize bias when interpreting test results.”