Arriving at the right dose is a major clinical challenge. Ideal dosages vary greatly from patient to patient, depending on factors such as age, gender, weight and other medications taken. Recently, clinicians identified two genes, CYP2C9 and VKORC1, that also affect warfarin tolerance, but nobody was sure about the exact role these genes play.
Physicians usually start patients on a dose of five milligrams per day.
"For about 50 percent of the population, this dose works fine," says Page. But for the other 50 percent or so, serious problems can arise when patients take levels that are either too high or too low.
In the study, researchers collected a variety of models used or suggested by the clinicians at the participating institutions to determine warfarin doses.
"The goal of each model is to look at several variables and say, depending on the weight of each variable and the combination of them all, which dose a patient should get," explains Page, an expert on a modeling approach called machine learning.
Researchers tested the performance of 14 models by comparing them to patient outcomes. They found that two models, an older technique called linear regression and a newer one called support vector regression, were equally successful when a combination of clinical information and genetic data was included.
"We found that incorporating a richer set of clinical variables significantly improved dosing," Page says. "Adding genetic information improved the predictions even more."
And while the model worked very well for patients on the standard five milligrams per day, it worked even better for the other half, those who could either bleed or clot excessively.
In the end, the researchers chose the linear regression model, with eight variables, as the best.
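The linear regression approach the researchers settled on can be sketched in code. The sketch below is purely illustrative: the variables, coefficients, and synthetic data are assumptions for demonstration, not the study's actual eight-variable model, and the genotype indicators are simplified to binary flags.

```python
import numpy as np

# Illustrative sketch of warfarin dose prediction via linear regression.
# All feature names and coefficients here are hypothetical.
rng = np.random.default_rng(0)
n = 200

# Synthetic clinical variables
age = rng.uniform(30, 80, n)             # years
weight = rng.uniform(50, 110, n)         # kg
# Synthetic genetic variables (simplified to carrier yes/no)
cyp2c9_variant = rng.integers(0, 2, n)   # carries a CYP2C9 variant allele
vkorc1_variant = rng.integers(0, 2, n)   # carries a VKORC1 variant allele

# Hypothetical "true" dose relationship (mg/day) plus noise,
# standing in for real patient outcomes
dose = (9.0 - 0.04 * age + 0.03 * weight
        - 1.5 * cyp2c9_variant - 2.0 * vkorc1_variant
        + rng.normal(0, 0.3, n))

# Fit ordinary least squares: dose ~ X @ beta
X = np.column_stack([np.ones(n), age, weight,
                     cyp2c9_variant, vkorc1_variant])
beta, *_ = np.linalg.lstsq(X, dose, rcond=None)

# Predict a dose for a new (hypothetical) patient:
# 65 years old, 80 kg, CYP2C9 variant carrier, VKORC1 wild type
patient = np.array([1.0, 65.0, 80.0, 1.0, 0.0])
predicted_dose = float(patient @ beta)
print(f"predicted dose: {predicted_dose:.1f} mg/day")
```

The fitted weights show how each variable pushes the recommended dose up or down, which is what makes such models easy for clinicians to inspect compared with less transparent techniques.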
Contact: Dian Land
University of Wisconsin-Madison