TUESDAY, May 31 (HealthDay News) -- A bevy of studies linking genes, proteins and other so-called "biomarkers" with certain diseases has vastly overstated the connections, new research suggests.
Analyzing 35 of the most frequently cited studies published between 1991 and 2006 in 10 renowned biomedical journals, study authors found that fewer than half of the biomarkers studied revealed statistically significant links to disease in larger follow-up trials.
Only 20 percent of the selected associations increased a patient's relative risk for a condition by more than 35 percent, the researchers found.
"I'm not terribly surprised because I had seen on a case-by-case basis these highly cited biomarkers that were not keeping the promises that had been made," said study author Dr. John Ioannidis, chief of the Stanford Prevention Research Center in Palo Alto, Calif. "I think the key message for researchers is that one should not depend on the results of a single study, no matter how spectacular the results are. We need to see replication and look at the bigger picture."
The study is published in the June 1 issue of the Journal of the American Medical Association.
The studies, each of which had been referenced in at least 400 later papers, looked at the links between biomarkers -- including specific genes or germs, levels of blood proteins and other molecules -- and the likelihood of developing conditions such as cancer and heart disease.
Because of the common scientific practice of citing previous supporting research in new studies, landmark research is often repeatedly referenced, making its results appear incontrovertible, Ioannidis said. But larger subsequent trials often report less spectacular or even statistically insignificant links between the same biomarkers and certain diseases.
Ioannidis noted that studies with larger numbers of patients, or those compiling the results of several independent studies (called meta-analyses), are more likely to be accurate than smaller trials. For example, in 29 of the 35 highly cited studies -- or 83 percent -- the corresponding meta-analysis revealed a smaller biomarker effect.
The weaker or non-existent associations are not due to fraud or poor study design, Ioannidis has cautioned. Part of the reason small studies are not confirmed by larger ones is simple statistical variability: small samples can produce extreme results by chance. Consider flipping coins: someone may get four heads in a row in a handful of flips, but over hundreds of flips the ratio of heads to tails will settle near 50:50.
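The coin-flip analogy can be illustrated with a short simulation (a hypothetical sketch for illustration only, not part of the study itself): with only a few flips, the share of heads swings widely, but with many flips it hovers near 50 percent.

```python
import random

# Illustrative only: shows how small samples produce extreme-looking
# results by chance, while large samples settle near the true rate.
random.seed(1)  # fixed seed so the run is reproducible

for n in (10, 100, 10_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(f"{n:>6} flips: {heads / n:.3f} heads")
```

The same chance variation explains why a dramatic biomarker association seen in a small early study can shrink, or vanish, in a larger follow-up.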
"Some biomarkers definitely work, so I think they can be very helpful. But just because one study suggests they're going to be effective, we can't just assume it will work across the board," said Dr. Stephanie Bernik, chief of surgical oncology at Lenox Hill Hospital in New York City.
"These things are very exciting when you find them," she added. "It's important for us to look for markers . . . but we have to keep a critical mind, because what at first seems like a very important finding may not pan out."
Fierce competition among scientists to report significant findings and a hopeful public eager for advances in the fight against dreaded diseases may add to any exaggeration of a study's importance, Ioannidis and other experts agreed.
The studies analyzed included one linking the BRCA1 mutation with colon cancer, another tying blood levels of C-reactive protein to cardiovascular disease, and one associating levels of the amino acid homocysteine with vascular disease. Researchers have also reported recent biomarker evidence in conditions such as Alzheimer's disease, leukemia and kidney disease.
The researchers reported that meta-analyses found all but a few of the associations had nominal statistical significance, although some appeared to have "no predictive value." If clinicians continue to use biomarkers whose value is questionable or overestimated, this could cause "a major escalation of health costs," with limited benefits, they warned.
"There are a number of reasons people pay attention to the first or more dramatic finding. People get more excited about it," said Dr. Marc L. Gordon, a neurologist and Alzheimer's researcher at the Feinstein Institute for Medical Research in Manhasset, N.Y. "This is not to say that studies shouldn't be done . . . but we need to analyze the whole range of data."
Ioannidis, also a professor of disease prevention at Stanford, agreed. "For the general public, the information is not bad to be exposed to," he said. "It's challenging, it's intriguing, and there's a constant flow of it. But be very skeptical about using the information to make changes in lifestyle . . . or clinical practice. Many of these claims will not be validated; some of them might."
The U.S. National Institute of Environmental Health Sciences has more information about biomarkers.
SOURCES: John Ioannidis, M.D., chief, Stanford Prevention Research Center, professor, disease prevention, Stanford University, Palo Alto, Calif.; Stephanie Bernik, M.D., chief of surgical oncology, Lenox Hill Hospital, New York City; Marc L. Gordon, M.D., neurologist, Alzheimer's researcher, The Feinstein Institute for Medical Research, Manhasset, N.Y.; June 1, 2011, Journal of the American Medical Association