Studies showing that players of action video games are smarter than non-gamers in certain key ways are mostly flawed, according to a paper by psychology researchers published this week.
The paper, written by Walter Boot and Daniel Blakely of Florida State University and Daniel Simons of the Beckman Institute for Advanced Science and Technology and published in Frontiers in Cognition, disputes the methodology of various studies that have found that frequent action video game players outperform non-gamers in areas of perception and cognition.
The authors write that "[t]he possibility that video game training transfers broadly to other aspects of cognition is exciting because training on one task rarely improves performance on others." But they say that such findings are dubious because of improperly conducted research "common to most (if not all) of the published studies documenting gaming effects."
The researchers first broadly question conclusions that action video gaming improves people's performance in measures of perception and cognition.
"Even if gamers do outperform non-gamers, the difference might not be caused by gaming," they write. "[P]eople may become action gamers because they have the types of abilities required to excel at these games, or a third factor might influence both cognitive abilities and gaming."
The paper then turns its attention to comparative studies that actively recruit expert gamers to pit them against non-gamers in tests of perception and cognition. The authors say that a person's knowledge that they've been recruited as an expert gamer could increase their desire to perform well in testing, while non-gamers with no knowledge of why they're in such a study might not be so motivated, affecting results for both groups.
"Almost all studies comparing expert and novice gamers either neglect to report how subjects were recruited or make no effort to hide the nature of the study from participants," according to the authors.
"Many studies recruit experts through advertisements explicitly seeking people with game experience, thereby violating a core principle of experimental design and introducing the potential for differential demand characteristics."
A better approach, the authors write, would be to use covert recruitment techniques in training studies: measure the cognitive and perceptual abilities of unskilled participants after they are trained on an action video game, and compare them with similarly unskilled participants who receive no training or, better yet, who are trained on a non-action video game.
But even game-training studies that do that can be flawed in their methodology, according to the researchers, who question whether many such studies have suitable control groups to offset a "placebo effect" that could affect outcomes.
"No game-training studies have taken the necessary precautions to avoid differential placebo effects across training conditions and outcome measures," the authors write. "In fact, no published studies have tested whether participants expect to improve as a result of training. Without an adequate control for placebo effects, any conclusion that game training caused cognitive improvements is premature: the benefit could be due to the expectation that a benefit should accrue."
Copyright © 2010 Ziff Davis Publishing Holdings Inc.