Gut Check is a periodic look at health claims made by studies, newsmakers, or conventional wisdom. We ask: Should you believe this?
The Claim: The percentage of people developing dementia each year is falling significantly, reports a study out Wednesday in the New England Journal of Medicine, raising hope that some cases can be prevented and, possibly, that the worst forecasts of a “looming dementia crisis” in the United States are overblown.
The Backstory: Wait, what?
For 30 years experts have warned that Alzheimer’s and other forms of dementia are going to trigger medical Armageddon in the United States. As baby boomers reach their 70s and 80s, when the likelihood of developing dementia soars, the ever-growing number of patients is expected to overwhelm the health care system, causing “overworked caregivers, overloaded nursing homes, an overwhelmed health care system, overtaxed state and federal budgets,” former Surgeon General Dr. David Satcher wrote in 2014.
Last year, the Alzheimer’s Association reported that 1 in 3 Americans 85-and-older had Alzheimer’s. Simple math says that as more people live to 85 and beyond, one-third of a bigger number will mean a lot more people with the disease. Yet there have been hints that the most dire projections, based on extrapolating today’s rates of dementia to tomorrow’s larger populations of the elderly, are overly alarmist.
First Take: The long-running Framingham Heart Study has dementia data on some 5,000 people 60-and-over, going back to 1975. Rather than counting dementia cases via medical records — which can be problematic because, as awareness of a disease and diagnostic standards change, the number of reported cases can change even if the underlying reality hasn’t — the researchers used their own, unchanging diagnostic criteria.
They found that the rate of dementia was 3.6 percent in the late 1970s and early 1980s, falling to 2.8 percent a decade later, 2.2 percent a decade after that, and 2 percent in the late 2000s and early 2010s.
That means 44 percent fewer Framingham-ites were developing dementia in the most recent period than in the late 1970s, after taking into account that the group had aged. Rates of vascular dementia fell the most. Rates of Alzheimer’s also declined, but the drop just missed being statistically significant.
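The headline figure can be sanity-checked with a quick back-of-the-envelope calculation. Below is a sketch using the rates reported above; note that the study's 44 percent figure is age-adjusted, so this raw comparison only approximates it, and the variable names are my own:

```python
# Dementia rates by period, as reported in the article (percent).
rates = {
    "late 1970s-early 1980s": 3.6,
    "a decade later": 2.8,
    "a decade after that": 2.2,
    "late 2000s-early 2010s": 2.0,
}

first = rates["late 1970s-early 1980s"]
latest = rates["late 2000s-early 2010s"]

# Relative decline from the first period to the most recent one.
decline = (first - latest) / first
print(f"Overall relative decline: {decline:.0%}")  # roughly 44%
```

The raw drop from 3.6 percent to 2.0 percent works out to about 44 percent, in line with the study's age-adjusted estimate.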
The age-specific data were particularly encouraging. By the most recent period, people in their 60s had a 62 percent lower risk of developing dementia than people the same age did four decades earlier. People in their 70s had a 36 percent lower risk. People 80-and-older had a 32 percent lower risk.
In addition, people are developing dementia later in life than in the past: the average age of diagnosis rose from 80 to 85 over the last four decades.
Those results offer evidence that it’s possible to prevent, or at least delay, dementia. “There are things people can do right now to decrease their risk of cognitive decline, and, although to a lesser extent, full-blown dementia,” said Keith Fargo, director of scientific programs at the Alzheimer’s Association, who was not involved in the study.
The Framingham team, led by Claudia Satizabal and Dr. Sudha Seshadri of Boston University School of Medicine, identified better cardiovascular health as a key reason for the decline in dementia, but education mattered, too. The drop in dementia rates occurred only in people with at least a high school diploma. That suggests a role for “cognitive reserve” — basically, the more you know and the nimbler your brain, the more you can stand to lose before you have dementia.
“In general, people with higher education are protected from cognitive decline,” said Satizabal (who offered an early look at her team’s results at the 2014 Alzheimer’s Association International Conference). But people who never completed high school also had much poorer heart health, so the education effect may be a proxy for cardiovascular well-being.
Second Take: Scientists not involved in the study said the Framingham team did a good job of ensuring that the drop in dementia was real, and not an artifact of changing diagnoses. Attributing the decline, at least in part, to better cardiovascular health also makes sense, offering hope that more cases can be prevented or delayed.
But the Framingham cohort is overwhelmingly white, and therefore not representative of the racial and ethnic diversity of the United States as a whole, so the results might not extend to the entire population.
In an independent study also published Wednesday — the release was pushed up to coincide with the Framingham results — researchers analyzing the health records of 274,000 Northern Californians found significantly different dementia rates in different groups: 19.3 per 1,000 among whites, 26.6 among African-Americans, 15.2 among Asian-Americans.
Another development that might offset the Framingham findings: the rise in obesity and diabetes, which are risk factors for dementia.
Because of that and other considerations, the Alzheimer’s Association is sticking with its projection that, by 2050, the prevalence of Alzheimer’s will nearly triple, to 13.8 million Americans. “The projections we have are based on more representative samples,” Fargo said. “It would be premature to change” them.
That’s the nub of the debate.
Dr. David Jones, of Harvard Medical School, said the number of future cases “might not be as bad” as current projections say. The reluctance to dial down those forecasts might reflect “public health catastrophism,” added Dr. Jeremy Greene of Johns Hopkins, who, together with Jones, coauthored an accompanying perspective article on the Framingham study.
“It’s risky for advocates [to temper the most dire forecasts] for fear that it might bring a loss of funding,” Greene said.
The Takeaway: Dementia can likely be delayed and even prevented. And while ever more Americans will develop dementia as the population ages, the calamitous projections need another look.