Sam Washington processes blood samples for a drug study before putting them in the freezer at the Greenville (S.C.) Hospital System Cancer Treatment Center on Thursday, Nov. 29, 2012. / Heidi Heilbrunn, The Greenville (S.C.) News

by Liv Osby, The Greenville (S.C.) News

GREENVILLE, S.C. -- Conflicting research reports on eggs, hormone therapy, Omega-3 fatty acid supplements and other health issues have left many a bit suspicious about the claims. Even experts say consumers are right to take some research with a healthy dose of skepticism.

Fay Hart, of Greer, S.C., has an egg for breakfast every morning with her grits.

Even though a new study says yolks may be as bad as smoking for the cardiovascular system, she says that doesn't make sense to her.

"I just don't buy their conclusions," she says. "I know people going on 100 years old and have eaten eggs all their lives. There are a lot of question marks to all of this."

Indeed. Over the years, research has alternately painted eggs as a healthy source of protein or a heart attack in a shell.

Hormone therapy was supposed to protect women from heart disease. Then researchers said it doesn't.

In general, says Dr. Jerry Youkey, dean of the University of South Carolina School of Medicine-Greenville, there are epidemiological studies and clinical trials.

In epidemiological studies, researchers look for common factors to explain why one group of patients has some kind of condition and another group doesn't, he says. Like eggs and heart disease.

Clinical trials typically compare different therapies, or a therapy against a placebo, in two groups of patients with a common disease to see which works better, he says.

Though there can be problems with both, Youkey says, the real culprit confusing the public is epidemiological studies.

"The amount of stock people put in them is sometimes unfortunate," he said. "I've told medical students here to try and remember everything they can, but to recognize that probably half the things that are dogma now will be proven ineffective in the next decade."

Study bias

Sample size, or the number of participants in a study, is important, says Dr. Hal Croswell, director of clinical research at Bon Secours St. Francis Health System. And a larger sample generally provides stronger evidence when trying to determine whether a drug is effective.

Using a biased group is also problematic, says Hanage, citing the yolk study.

Published in the August issue of the journal Atherosclerosis, it concluded that egg consumption had about two-thirds the effect of smoking on the development of clogged arteries, and suggested that people at risk of cardiovascular disease avoid eating egg yolks.

But Hanage says it focused on people over 60 with cardiovascular risk factors.

"That is already a very biased sample," he says. "This doesn't mean the work is completely wrong or they haven't found out anything interesting. But you have to think about the reason they did that ... and you need to be careful about applying that to different age groups."

Other forms of bias include how the study population is defined, how a drug or device is used, and the statistics used to measure the response, all of which can impact the validity of the research, Croswell says. And studies often have variations in all of those aspects, he says.

"You need to realize there are other factors going on that they've not taken into consideration that are probably just as likely to be responsible for the differences as what they've chosen," says Youkey.

Who's funding the study is something else to look for.

Hart says that's something she always takes into account.

"People who are putting studies out believe it, and want you to believe it," she says. "But there's a bottom line called money. And I read something discriminately and question ... who funded the research."

Few studies are perfect, says Croswell, and most findings need to be validated and examined in the appropriate context.

"Drawing conclusions from a lot of these epidemiological studies, with all the variables involved, gets confusing for everybody," he says.

But some research is inherently better than others, says Hanage. A meta-analysis, for example, looks at all the existing studies, ranks their validity, and draws a conclusion based on that evidence, he says.

Some studies look for effects over many years, or aim to provide evidence that shores up earlier findings, Croswell says.

And the holy grail of studies -- the randomized double-blind controlled trial, in which a therapy is given to only some study participants and no one knows who they are -- requires a lot of funding and broad scientific thinking, he says, but carries much more weight.

"If you have a double-blind randomized multi-site clinical trial, it oftentimes can settle a question, such as whether to use cholesterol medicine or not," adds Diller. "But those things are expensive and rare."

A piece of the puzzle

But while the public tends to look at one study as the definitive work, scientists think of it as one more piece of the puzzle, says Diller.

Youkey says physicians and scientists typically view study findings more contextually. And he advises consumers to ask themselves if what they're reading passes the common-sense test.

"Read them. Find them interesting," he says. "And realize that there are some things that over time have proven absolutely legitimate and some that haven't."

Hanage says the most useful things for the public to remember are to think about how many people are in the study, what ax the researchers may have to grind, whether the study is biased in any way, and whether it controlled for confounding factors.

"And once you're pretty sure they've done all those things and been responsible," he says, "you can start paying it heed."