Technology use may be associated with a lower risk for dementia, study finds

Poor sleep in middle age may put you at risk for dementia later in life, according to a study published Wednesday in the journal Neurology. The authors asked 589 participants around age 40 to self-report whether they experienced several characteristics of poor sleep: short sleep duration, bad sleep quality, difficulty initiating sleep, difficulty maintaining sleep, early morning awakening and daytime sleepiness. Fifteen years later, the authors performed MRI brain scans on those same participants. Subjects who reported two to three poor sleep characteristics were 1.6 years older in so-called brain age, while subjects who reported more than three characteristics were 2.6 years older in brain age. Co-author Clémence Cavaillès told MedPage Today, "Advanced brain age is associated with cognitive decline and Alzheimer's-related atrophy patterns. Therefore, poor sleep may be an important target for early interventions aimed at preventing neurocognitive decline." If you're trying to improve your sleep, clinical psychologist Dr. Shelby Harris told CBS News, cut back on caffeine and alcohol before bed, try relaxation techniques and make sure you get enough exercise.

With the first generation of people exposed widely to technology now approaching old age, how has its use affected their risk of cognitive decline?

That's a question researchers from two Texas universities sought to answer in a new meta-analysis study, a review of previous studies, published Monday in the journal Nature Human Behavior. The query investigates the "digital dementia hypothesis," which argues that lifetime use may increase reliance on technology and weaken cognitive abilities over time.


"We say a really active brain in youth and midlife is a brain that is more resilient later," said Dr. Amit Sachdev, medical director of the department of neurology and ophthalmology at Michigan State University, who wasn't involved in the study.

But the authors discovered that the digital dementia hypothesis may not bear out: Their analysis of 57 studies totaling 411,430 older adults found technology use was associated with a 42% lower risk of cognitive impairment, which was defined as a diagnosis of mild cognitive impairment or dementia, or as subpar performance on cognitive tests.

Forms of technology included computers, smartphones, internet, email, social media or "mixed/multiple uses," according to the new study.

"That these effects were found in studies even when factors like education, income, and other lifestyle factors were adjusted was also encouraging: the effect doesn't seem just due to other brain health factors," co-lead study author Dr. Jared Benge, associate professor in the department of neurology at the University of Texas at Austin's Dell Medical School, said via email.

The authors searched eight databases for studies published through 2024, and the 57 chosen for their main analysis included 20 studies that followed participants for about six years on average and 37 cross-sectional studies, which measure health data and outcomes at one point in time. The adults were age 68 on average at the beginning of the studies.

While technology use was generally linked with a lower risk of cognitive decline, the findings for social media use were inconsistent, the authors said.

None of the 136 studies the authors reviewed overall reported an increased risk of cognitive impairment correlated with technology use — a consistency that is "really quite rare," said co-lead study author Dr. Michael Scullin, professor of psychology and neuroscience at Baylor University, via email.

The research is "a really well-organized and -executed meta-analysis of essentially the entire field over the last 18 years or 20 years," said Dr. Christopher Anderson, chief of the division of stroke and cerebrovascular diseases at Brigham and Women's Hospital in Boston. Anderson wasn't involved in the study.

But if you're thinking the study's findings mean you're free to use technology to your heart's content since your brain will be fine anyway — not so fast.

"Our findings are not a blanket endorsement of mindless scrolling," Benge, who is also a clinical neuropsychologist at UT Health Austin's Comprehensive Memory Center, said. "They are instead a hint that the generation that gave us the internet has found ways to get some net positive benefits from these tools to the brain."

And despite the study's significance, there are still many uncertainties about the relationships between various aspects of technology use and brain health.

Technology use and the brain

One of the study's limitations is that it doesn't have details on how people were using technological devices, experts said. As a result, it's unclear whether participants were using computers or phones in ways that meaningfully exercised their brains, or which specific uses may be most associated with cognitive protection.

Because the studies lacked information on how much time participants spent using technology, it's also unknown whether there is a harmful threshold or whether only a little time is needed for cognitive benefits, Anderson said.

These questions are difficult "to try to answer, because the sheer volume of technology exposures that we have to navigate is so high," Sachdev said. "To isolate one technology exposure and its effect is difficult, and to measure just a whole ecosystem of technology exposures and … their aggregate effect is also a challenge."

Additionally, "the amount that we can extrapolate from this study towards future generations is very unclear, given the ubiquity of technology today that people are exposed to and have been exposed to from their birth," Anderson said.

"When you think about the kind of technology that this cohort would've been interacting with earlier in their lives, it's a time when you had to really work to use technology," Anderson added.

Their brains were also already well formed, Benge said.

The study may support the alternative to the digital dementia hypothesis, which is the cognitive reserve theory. The theory "contends that exposure to complex mental activities leads to better cognitive well-being in older age," even in the face of age-related brain changes, according to the study.

That technology may reduce risk of cognitive decline by helping us be more neurologically active is possible, Sachdev said. Technology use can also foster social connection in some instances, and social isolation has been linked with greater odds of developing dementia.

It's also possible that older adults who are using technology may already have more active and resilient brains, explaining their engagement with technology.

Managing your technology use

Best practices for technology use with respect to cognitive health can't be inferred from the study, since it didn't have specifics on participants' use habits, experts said.

But "it does support that a healthy mix of activities is likely to be the most beneficial, and that fits with other literature on the topic as well," Anderson said. "What this probably does more than anything else is provide some reassurance that there's no association between at least moderate use of technology and cognitive decline."

Engaging in moderation is best, Sachdev said. And technology use should largely bring joy, genuine connection, creativity and intellectual stimulation to your life, experts said.

"It should be productive in some way," he added, and entertaining yourself can sometimes meet that requirement. But if you're experiencing eye or neck strain from sitting in front of a screen, that's a sign you're using technology too much.

"Too much of anything can be a bad thing," Sachdev said. "Identifying the purpose and the duration and then executing along those lines is how we would advise for most topics."

Some older adults have avoided technology use, thinking it's too difficult to learn. But Scullin and others have found even people with mild dementia can be trained to use such devices, he said. Though sometimes frustrating, the difficulty "is a reflection of the mental stimulation afforded through learning the device," Scullin added.