by Michael Smith
It’s not just TV — video games are also associated with an increased risk of attention problems in children, researchers said.
A large study found that children who spent more than three hours in front of a computer or television screen — whether playing video games or watching TV — were significantly more likely to have attention problems, according to Edward Swing, MS, of Iowa State University in Ames, Iowa, and colleagues.
The finding comes from a 13-month longitudinal study of 1,323 children ages 6 through 12, but a similar link was found in a cross-sectional snapshot of 210 university undergraduates (most between ages 18 and 24), Swing and colleagues reported online in Pediatrics. Roughly half of the participants in each group were female.
Among the middle childhood cohort, whose median total screen time was 3.86 hours a day, those above the median had an odds ratio for developing attention problems of 1.81 (95% CI, 1.56 to 2.11). Among the sample of late adolescents and young adults, whose median total screen time was 4.36 hours a day, those above the median had an odds ratio for attention problems of 2.04 (95% CI, 1.45 to 2.88).
The findings suggest that the risk of attention problems could be reduced if parents limited children’s viewing and video game time to a total of two hours a day, as recommended by the American Academy of Pediatrics, Swing and colleagues said.
The effects of television on children’s attention spans have been demonstrated in many studies, Swing and colleagues said. (See TV’s Effect on Child Behavior Is in the Timing) But the impact of video games has received comparatively little study, they added.
To help fill the gap, they took advantage of a cohort of children recruited from 10 schools in two Midwest states as part of an obesity prevention project. The children, their parents, and their teachers reported on video game use, television watching, and attention problems four times over 13 months.
In addition, they recruited undergraduates at a large public university, also in the Midwest, for a single laboratory session in which participants reported their video game and television exposure and completed the Adult ADHD Self-Report Scale, the Brief Self-Control Scale, and the Barratt Impulsiveness Scale.
Among the children, they found the median daily exposure to television was 2.99 hours and to video games was 0.66 hours, for a total of 3.86 hours a day of screen time. Those above the medians were also significantly more likely to have attention problems:
* For those above the median in television viewing, the odds ratio for attention problems was 1.55, with a 95% confidence interval from 1.33 to 1.79.
* For video games, the corresponding odds ratio was 1.82, with a 95% confidence interval from 1.57 to 2.11.
* For both combined, the odds ratio was 1.81, with a 95% confidence interval from 1.56 to 2.11.
Among the undergrads, odds ratios were similar — 1.68 for television, 1.82 for video games, and 2.04 for both combined. None of the confidence intervals included unity.
The two forms of screen time were independent predictors of the risk of attention problems, Swing and colleagues said.
When the researchers compared children whose screen time was above or below two hours a day, they found those above the cutoff had 67% higher odds of attention problems than those below. The odds ratio was 1.67, with a 95% confidence interval from 1.27 to 2.21.
The pattern was similar in the undergrad sample — an odds ratio of 2.23, with a 95% confidence interval from 1.13 to 4.39.
The study should be interpreted with some caution, the researchers stressed, since in neither sample can a causal relationship be demonstrated between video games and attention problems. In addition, the authors noted that it’s possible that some forms of television and video games cause a problem, while others do not.
Michael Smith is a MedPage Today North American Correspondent.
Originally published in MedPage Today. Visit MedPageToday.com for more pediatrics news.