Anyone who has chuckled at an outrageous headline blazing across a supermarket tabloid can tell you that understanding an idea and believing in that idea do not go hand-in-hand. Otherwise, checkout-stand regulars would accept such proclamations as "Cave-Men Looked Like Elvis!" as articles of faith.

Think again, shoppers. Enquiring minds not only want to know; they also tend to believe, at least initially, what they read and hear, according to psychologist Daniel T. Gilbert of the University of Texas at Austin.

"Much recent research converges on a single point -- people are credulous creatures who find it very easy to believe and very difficult to doubt," Gilbert argues in an article scheduled for the March AMERICAN PSYCHOLOGIST.

His contention may spark debate, but it hardly qualifies as unprecedented. More than 2,300 years ago, the Greek philosopher Aristotle said the ability to doubt is rare, emerging only among cultivated, educated persons.

Aristotle's claim has gained support in the past decade from several studies indicating that young children generally accept the statements of adults uncritically -- a tendency that often distorts youngsters' eyewitness accounts of crimes.

However, current psychological theories of belief formation lean more heavily on the notions of another philosopher -- Rene Descartes. The influential 17th-century French thinker maintained that the mind effortlessly and automatically takes in new ideas, which remain in limbo until verified or rejected by conscious, rational analysis.

Descartes' separation of comprehension from critical assessment -- although less well-known than his separation of mind from body -- continues to influence scientific assumptions about how people think, Gilbert maintains. For instance, computer scientists typically design state-of-the-art systems modeling language acquisition and other mental abilities to ingest information in a "neutral" form before determining that information's usefulness or destination.

But Dutch philosopher Baruch Spinoza, writing shortly after Descartes' death, offered an entirely different perspective on thought. Spinoza argued that to comprehend an idea, a person must simultaneously accept it as true. Conscious analysis -- which, depending on the idea, may occur almost immediately or with considerable effort -- allows the mind to reject what it initially accepted as fact.

Spinoza's seemingly preposterous claim finds backing from three experiments reported by Gilbert and his co-workers in the October 1990 JOURNAL OF PERSONALITY AND SOCIAL PSYCHOLOGY. The experiments test a basic assumption of Spinoza's theory: If people initially believe both true and false ideas, interrupting the mental evaluation of those ideas should interfere with the ability to reject bogus claims while leaving true notions with their seal of approval intact.

In the first of those studies, 35 college students learned the meaning of fictitious nouns -- which they were told represented Hopi Indian words -- by reading definitions on a computer screen, such as "a twyrin is a doctor." Immediately after each definition appeared, the computer displayed the word "true" or "false" to indicate whether the statement was correct. On some trials, a tone sounded just after the computer affirmed or denied a statement. Because students had to press a response button when they heard the tone, it momentarily distracted their attention.

Interruption by the tone caused a substantial increase in the number of computer-denied propositions that the students later accepted as true on an identification test. On the other hand, interrupted students were not more likely to label a computer-affirmed definition as false.

By initially accepting both true and false ideas, the volunteers apparently thought in a Spinozan fashion, Gilbert asserts. Distraction undermined the subsequent thought needed to scrutinize and reject denied claims, but affirmed claims -- already accepted -- suffered no such damage.

Descartes' scheme, in contrast, assumes that interruptions play equal-opportunity havoc with the rational evaluation of both affirmed and denied statements.

In the second study, 20 students viewed a series of smiling male faces shown on a video monitor. On some trials, the monitor displayed the word "true" or "false" before showing a face, to signal whether the man expressed genuine or feigned happiness. On other trials, signal words appeared after the students saw a face.

Students who were distracted by pressing a button at the sound of a tone just after viewing each face usually misidentified false smiles as genuine, but not vice versa. Even those informed ahead of time that a smile was false often labeled it as genuine if they were subsequently interrupted. In other words, when distractions derailed their train of thought, volunteers who had been given reason to doubt false information nevertheless tended to accept that information as true.

In the final study, the researchers presented 30 students with descriptive phrases about an imaginary animal called a glark. Participants then decided whether new propositions about glarks were true or false. During this task, they were occasionally told to read a statement about glarks as quickly as possible without gauging its veracity. Each of these phrases appeared a second time during the test for evaluation as true or false.

Students probably accepted quickly read propositions at first, rather than treating them neutrally, Gilbert argues. They later reported one-quarter of the speed-read false statements as true, whereas they identified nearly all the speed-read true statements correctly.

"We're naive Cartesians," Gilbert contends. "We assume beliefs are under conscious control at all times. But beliefs can be created merely by passively accepting information without attempting to analyze it."

He points to other lines of research that support his argument. For instance, psycholinguists established nearly 20 years ago that people presented with true and false sentences generally take less time to determine the accuracy of the true statements. One research team wrote that when individuals read assertions, they "start with the truth index set to true."

Psycholinguistic work also suggests that the comprehension of a denial (say, "armadillos are not herbivorous") first involves grasping the concept under dispute ("armadillos are herbivorous"). A Spinozan mind employing this mental tactic should at times believe what has clearly been denied, Gilbert points out.

A 1981 study directed by psychologist Daniel M. Wegner of Trinity University in San Antonio, Texas, illustrates this paradox. In a finding of particular interest to journalists, students who read propositions such as "Bob Talbert not linked to Mafia" reported markedly more negative impressions of the fictitious Talbert than did students who read neutral statements such as "Bob Talbert celebrates birthday."

People also automatically tend to seek out evidence that confirms their beliefs about others. Studies have shown, for example, that volunteers led to believe in the outgoing nature of a young woman later asked her questions concentrating on the extent of her sociability, while neglecting to probe for shy or reticent aspects of her personality.

In related work, psychologists studying persuasion and lie detection have observed that people often believe what others tell them without question. Opinions about others, as well as autobiographical claims, often gain acceptance more readily when the listener performs a competing task that diverts attention from the speaker's message.

"People who sell used cars and vacuum cleaners have long known about the persuasive power of timed interruptions and diversions," Gilbert notes.

Many brainwashing and coercion techniques rely on extreme methods to fragment the attention of political prisoners, he adds. Interrogators often keep prisoners awake for days at a time and then browbeat the exhausted captives with an ideological barrage they find difficult to resist. Forced confessions also exert insidious effects: After writing and reciting a captor's message many times over, weary prisoners start to doubt their own opinions.

The same principles extend beyond used car lots and dictators' dungeons, warns psychologist John A. Bargh of New York University. "My hunch is that control over automatic, unconscious influences on judgment and behavior is not usually exercised," says Bargh, who co-edited a compilation of research on the subject (Unintended Thought, 1989, Guilford Press, New York). "It's not that people are lazy. They tend to think these influences don't exist, and often don't have the luxury of extended thought about what they hear or read from moment to moment."

Moreover, Gilbert argues, just as healthy people immediately believe what they see, doubting their eyes only on rare occasions, so must they initially believe what they read or hear, if only for a fleeting moment.

Gilbert and his co-workers have yet to study whether distracted attention increases the likelihood of believing obviously outrageous assertions. Although Spinoza's theory holds that a statement such as "Hitler was a woman" meets instant acceptance and almost as quickly goes up in flames as contradictory evidence leaps to mind, that prediction proves difficult to study in the laboratory.