Brain Training Games’ Efficacy Claims Called Into Question
Do brain training programs actually work to improve memory and cognition? The scientific community is split — but a recent meta-analysis seems to indicate that brain-game makers have not adequately demonstrated the truth behind their claims of success.
October 7, 2016
Brain-training programs like Lumosity and LearningRx have long promised improved memory, faster processing speed, and sharper problem-solving skills — all through the power of computer-based cognitive games. But a recent meta-analysis of the effectiveness of these so-called “brain games” calls the game makers’ claims into question — finding that their validating studies were too small, poorly designed, or entirely misleading.
In the study, published in the October 2016 issue of Psychological Science in the Public Interest, Daniel Simons, Ph.D., and his team looked at more than 130 existing studies on the efficacy of brain training. The researchers assessed them all on a range of factors, including sample size, use of a control group, and defensible extrapolation of conclusions.
Simons said his team found that the majority of studies “did not really adhere to what we think of as best practices.” Most had at least one major flaw in design or analysis; this included all of the studies cited by brain-training companies as proof of their products’ efficacy. Many failed to account for the placebo effect, which is common in brain-training studies; participants often do better on a test after a period of training and are convinced that they’ve become more competent — when in fact they’re just putting in more effort.
“It’s disappointing that the evidence isn’t stronger,” said Simons. “It would be really nice if you could play some games and have it radically change your cognitive abilities. But the studies don’t show that on objectively measured, real-world outcomes.”
A few studies showed that subjects did in fact improve on specific tasks — but those studies’ authors extrapolated the findings to other day-to-day tasks without evidence to support such conclusions.
“It is not that people do not improve — they do, but only at playing the particular game,” said Russell Barkley, Ph.D., in a Facebook post commenting on the study. “There is little or no generalization to natural settings or to larger cognitive domains, such as working memory, that are supposed to be improved from practicing specific cognitive training games.” Dr. Barkley was not involved in the current research.
The study grew out of an ongoing debate in the scientific community about the power of brain training, Simons said. In October 2014, 75 researchers in the fields of cognitive psychology and neuroscience published an open letter disputing the marketing claims made by the largest brain-training companies. Shortly after, a rebuttal was published: a group of 133 scientists signed a letter in support of brain training, arguing that “brain plasticity is a lifelong phenomenon” and that, while more research was needed, the brain-training industry was using best practices to support its claims and was not actively misleading the public with its results.
This meta-analysis lends credence to the naysayers, and even some of the signers of the rebuttal letter were swayed.
“The evaluation was very even-handed and raised many excellent points,” said George Rebok, a psychologist at Johns Hopkins University who studies brain training and who signed the rebuttal. “It really helped raise the bar in terms of the level of science that we must aspire to.”