Ecker says this effect, known as the 'continued influence effect of misinformation', occurs even if the retraction itself is understood, believed, and remembered.

In their latest set of experiments, which builds on previous research, the researchers manipulated the strength of both the provided misinformation and the later retraction.

Repetition was used to encode the misinformation more strongly. Cognitive load, in which attention is divided between two tasks, was used to weaken or dilute the messages.

More than 160 undergraduate students were recruited and randomly placed in test groups of approximately 20. They were given a news report about a warehouse fire.

Initially, the fire was reported to have been caused by volatile materials negligently stored in a closet, but subsequent reports stated that the closet was empty.

"Some were also given another task to distract them a bit and then we give them a questionnaire, which checks basic memory for what they read," says Ecker.

"Most importantly, they are then tested with inference questions that target part of the event they read about and can be answered by using the misinformation."

As expected, the team found stronger retractions were effective in reducing the effects associated with strongly encoded misinformation; but even they failed to eliminate the effect, including when the misinformation was only weakly encoded.

"Despite best efforts to correct misinformation it can't be completely eliminated," says Ecker.

Unerasable damage

According to Ecker, there are two types of misinformation: that which is intended to mislead (for example, lies and propaganda), and that which has no underlying motive. The latter includes unfolding news events or crime investigations, where information arrives piecemeal and may require later correction.

Both are subject to the continued influence effect.

The researchers explain that this effect has implications for a number of real-world scenarios, such as avoidance of the MMR vaccine due to fears over autism, or when jurors are instructed to discount a witness's testimony or disregard evidence.

"If you make them [jurors] suspicious of why that information was presented in the first place, such as by saying it was a deliberate attempt to mislead you, then they can more readily dismiss it," says Ecker.

"Also, if you explain to people that corrected misinformation will continue to influence their thinking to a larger extent than most of us are aware of, that can also help to reduce the effects of the misinformation."

Unchanging beliefs

Ecker says the team has also studied the influence of other factors, such as strongly held beliefs (worldview) and emotion, on the continued influence effect.

While emotion was found to have no significant effect, worldview did: someone with a strongly held opinion is unlikely to change it.

"If you believe in something strongly and it's really important to you as a person [your worldview] you will cling to that no matter what," Ecker remarks.

He says one example of this is climate change.

"People who believe strongly in the free market, those opposed to any kind of regulations … will be much more likely to continue to believe humans are not causing climate change even in the face of overwhelming scientific evidence that humans are causing climate change."