“Challenges with memory and thinking have emerged as a leading health issue reported by U.S. adults,” says Adam de Havenon, associate professor of neurology at the Yale School of Medicine.

A 2025 Yale study authored by de Havenon, which tracked 4.5 million adults over 10 years, found an alarming increase in self-reported cognitive disability, particularly among adults ages 18 to 34. The younger cohort’s rate nearly doubled over the decade, from 5.1% in 2013 to 9.7% in 2023, driving most of the overall increase. By comparison, the rate among adults overall rose more modestly, from 5.3% to 7.4%, over the same period.

Is There a Youth Dementia Epidemic?

While the findings are a cause for concern, they do not necessarily suggest an emerging dementia epidemic. “This isn’t a diagnosis of dementia or even of cognitive impairment,” de Havenon explained. “It’s a subjective report of people saying they’re having serious difficulty concentrating, remembering, or making decisions. With dementia, there’s a structural brain disease and a specific pathology that’s injuring the brain and leading to cognitive impairment.”

That said, the Yale study notes that these findings should be investigated further, “as growing cognitive problems among the population can pose future healthcare and workplace consequences.” Because participants in the Yale study did not undergo brain imaging, there is no way of knowing yet whether they show the structural brain changes associated with dementia. Further research would be needed to determine whether early self-reported cognitive decline is linked to those changes.

But if such a link is established, the economic stakes would be enormous: a study published in Frontiers in Neurology notes that dementia cost the global economy $1.3 trillion in 2019. That is what makes research into dementia treatments, from behavioral interventions to anti-inflammatory nasal sprays, so important.

The Yale study also found a connection between self-reported cognitive difficulty and socioeconomic factors among the participants, suggesting that the difficulties “may be becoming more widespread, especially among younger adults, and that social and structural factors likely play a key role.”

Is Technology to Blame?

Although de Havenon’s study relied on subjective self-reports, other research points in the same direction. Earlier this year, neuroscientist Jared Cooney Horvath provided written testimony before the U.S. Senate Committee on Commerce, Science, and Transportation, noting that “over the past two decades, the cognitive development of children across much of the developed world has stalled and, in many domains, reversed.”

Horvath placed the blame not on technology itself but on federal policy that “continues to incentivize large-scale digital adoption without demanding independent efficacy evidence, privacy protections, and developmental safeguards,” which “risks compounding long-term educational and workforce harm.”

For two decades, state governments have invested in providing students with laptops and tablets, digitizing classroom functions, and effectively making Gen Z the beta test for a digital-first education. The result? Despite having unprecedented access to information from an early age, Gen Z has become the first generation to score lower on standardized tests than the generation before it.

Undoing Decades of Damage

Horvath says the fix