Alzheimer's Blood Test Raises Ethical Questions
by Jon Hamilton, NPR, March 9, 2014, 1:03 PM
An experimental blood test can identify people in their 70s who are likely to develop Alzheimer's disease in the next two or three years. The test is accurate more than 90 percent of the time, scientists reported Sunday in Nature Medicine.
The finding could lead to a quick and easy way for seniors to assess their risk of Alzheimer's, says Dr. Howard Federoff, a professor of neurology at Georgetown University. And that would be a "game changer," he says, if researchers find a treatment that can slow down or stop the disease.
But because there is still no way to halt Alzheimer's, Federoff says, people considering the test would have to decide whether they are prepared to get results that "could be life-altering."
The idea of predicting Alzheimer's isn't new. It's already possible to detect signs of the disease long before symptoms like memory loss begin to appear. But the existing tests require either a spinal tap, which is painful, or an MRI scan, which is time-consuming and expensive.
So Federoff and a team of researchers set out to find something better. They took blood samples from 525 people age 70 and older. Then, he says, they looked to see who developed Alzheimer's in the next five years.
The goal was to find some difference between the blood of people who developed Alzheimer's and the blood of people who remained "cognitively normal," Federoff says. And after sifting through more than 4,000 potential "biomarkers," he says, "We discovered that 10 blood lipids [fats] predicted whether someone would go on to develop cognitive impairment or Alzheimer's."
The results need to be confirmed, and the approach still needs to be tried in people of different ages and from different racial groups, Federoff says. Even so, he says, it raises the possibility that in the not-too-distant future, many more people will know their risk of Alzheimer's.
That knowledge can be a good thing, says Dr. Jason Karlawish, a professor of medicine, medical ethics and health policy at the University of Pennsylvania. That's been shown among people who chose to be tested for a gene that increases the risk of Alzheimer's, he says.
"Knowing their risk of developing cognitive impairment is very relevant to making plans around retirement and where they live," he says. "So there is certainly a role for knowing that information."
On the other hand, people who have the Alzheimer's gene and know it tend to rate their own memories as worse than do people who have the gene but don't know it, he says. And knowing you carry the gene also seems to hurt people's performance on memory tests.
But the biggest concern about Alzheimer's testing probably has to do with questions of stigma and identity, Karlawish says. "How will other people interact with you if they learn that you have this information?" he says. "And how will you think about your own brain and your sort of sense of self?"
The stigma and fear surrounding Alzheimer's may decrease, though, as our understanding of the disease changes, Karlawish says. Right now, people still tend to think that "Either you have Alzheimer's disease dementia or you're normal, you don't have it," he says.
But research has shown that's not really true, Karlawish says. Alzheimer's is a bit like heart disease. It starts with biological changes that occur years before symptoms appear. And there is no bright line separating healthy people from those in early stages of the disease.