DNB-meta-analysis (Link Bibliography)

DNB-meta-analysis links:

  1. DNB-FAQ

  2. #analysis

  3. #control-groups

  4. #paymentextrinsic-motivation

  5. #training-time

  6. #training-type

  7. #biases

  8. #iq-test-time

  9. DNB-FAQ#jaeggi-2008

  10. DNB-FAQ#support

  11. DNB-FAQ#criticism

  12. Monica Melby-Lervåg, Charles Hulme (2013-02):

    It has been suggested that working memory training programs are effective both as treatments for attention-deficit/hyperactivity disorder (ADHD) and other cognitive disorders in children and as a tool to improve cognitive ability and scholastic attainment in typically developing children and adults. However, effects across studies appear to be variable, and a systematic meta-analytic review was undertaken. To be included in the review, studies had to be randomized controlled trials or quasi-experiments without randomization, have a treatment, and have either a treated group or an untreated control group.

    23 studies with 30 group comparisons met the criteria for inclusion. The studies included involved clinical samples and samples of typically developing children and adults. Meta-analyses indicated that the programs produced reliable short-term improvements in working memory skills. For verbal working memory, these near-transfer effects were not sustained at follow-up, whereas for visuospatial working memory, limited evidence suggested that such effects might be maintained. More importantly, there was no convincing evidence of the generalization of working memory training to other skills (nonverbal and verbal ability, inhibitory processes in attention, word decoding, and arithmetic).

    The authors conclude that memory training programs appear to produce short-term, specific training effects that do not generalize. Possible limitations of the review (including age differences in the samples and the variety of different clinical conditions included) are noted. However, current findings cast doubt on both the clinical relevance of working memory training programs and their utility as methods of enhancing cognitive functioning in typically developing children and healthy adults.

    [Keywords: working memory training, ADHD, attention, learning disabilities]

  13. 2015-schwaighofer.pdf

  14. http://pps.sagepub.com/content/11/4/512.full

  15. 2019-kassai.pdf: Reka Kassai, Judit Futo, Zsolt Demetrovics, Zsofia K. Takacs (2019-01-01; dual-n-back):

    In the present meta-analysis we examined the near-transfer and far-transfer effects of training components of children’s executive function skills: working memory, inhibitory control, and cognitive flexibility.

    We found a near-transfer effect (g+ = 0.44, k = 43, p < 0.001) showing that the interventions in the primary studies were successful in training the targeted components. However, we found no convincing evidence of far-transfer (g+ = 0.11, k = 17, p = 0.11). That is, training a component did not have a statistically-significant effect on the untrained components.

    By showing the absence of benefits that generalize beyond the trained components, we question the practical relevance of training specific skills in isolation. Furthermore, the present results might explain the absence of far-transfer effects of working memory training on academic skills (⁠; Sala & Gobet 2017).
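The g+ figures quoted above are bias-corrected standardized mean differences (Hedges' g). As a rough illustration of what such a statistic measures, here is a minimal sketch of the computation from two-group summary statistics; the numbers in the usage note are invented, not taken from Kassai et al.:

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    # pooled standard deviation across treatment and control groups
    sp = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                   / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp          # Cohen's d
    # approximate correction factor J for small samples
    j = 1 - 3 / (4 * (n_t + n_c) - 9)
    return j * d
```

For example, `hedges_g(10.0, 9.0, 2.0, 2.0, 20, 20)` gives d = 0.5 shrunk slightly to about 0.49; with larger samples the correction becomes negligible.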

  16. Thomas S. Redick (2019-05-16):

    Seventeen years and hundreds of studies after the first journal article on working memory training was published, evidence for the efficacy of working memory training is still wanting. Numerous studies show that individuals who repeatedly practice computerized working memory tasks improve on those tasks and closely related variants. Critically, although individual studies have shown improvements in untrained abilities and behaviors, systematic reviews of the broader literature show that studies producing large, positive findings are often those with the most methodological shortcomings. The current review discusses the past, present, and future status of working memory training, including consideration of factors that might influence working memory training and transfer efficacy.

  17. 2014-au.pdf

  18. 2015-dougherty.pdf

  19. Amit Lampit, Harry Hallock, Michael Valenzuela (2014-09-29):

    Background: New effective interventions to attenuate age-related cognitive decline are a global priority. Computerized cognitive training (CCT) is believed to be safe and can be inexpensive, but neither its efficacy in enhancing cognitive performance in healthy older adults nor the impact of design factors on such efficacy has been systematically analyzed. Our aim therefore was to quantitatively assess whether CCT programs can enhance cognition in healthy older adults, discriminate responsive from nonresponsive cognitive domains, and identify the most salient design factors.

    Methods and Findings: We systematically searched Medline, Embase, and PsycINFO for relevant studies from the databases’ inception to 9 July 2014. Eligible studies were randomized controlled trials investigating the effects of ≥4 h of CCT on performance in neuropsychological tests in older adults without dementia or other cognitive impairment. Fifty-two studies encompassing 4,885 participants were eligible. Intervention designs varied considerably, but after removal of one outlier, heterogeneity across studies was small (I² = 29.92%). There was no systematic evidence of publication bias. The overall effect size (Hedges’ g, random-effects model) for CCT versus control was small and statistically-significant, g = 0.22 (95% CI 0.15 to 0.29). Small to moderate effect sizes were found for nonverbal memory, g = 0.24 (95% CI 0.09 to 0.38); verbal memory, g = 0.08 (95% CI 0.01 to 0.15); working memory (WM), g = 0.22 (95% CI 0.09 to 0.35); processing speed, g = 0.31 (95% CI 0.11 to 0.50); and visuospatial skills, g = 0.30 (95% CI 0.07 to 0.54). No statistically-significant effects were found for executive functions and attention. Moderator analyses revealed that home-based administration was ineffective compared to group-based training, and that more than three training sessions per week was ineffective versus three or fewer. There was no evidence for the effectiveness of WM training, and only weak evidence for sessions less than 30 min. These results are limited to healthy older adults, and do not address the durability of training effects.

    Conclusions: CCT is modestly effective at improving cognitive performance in healthy older adults, but efficacy varies across cognitive domains and is largely determined by design choices. Unsupervised at-home training and training more than three times per week are specifically ineffective. Further research is required to enhance efficacy of the intervention.
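Pooled estimates like the g = 0.22 and I² = 29.92% above typically come from a random-effects model. As a minimal sketch of the classic DerSimonian-Laird version (the metafor package linked below implements this and more robust estimators), using invented study-level effects and variances:

```python
import math

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate.

    effects: per-study standardized mean differences (e.g. Hedges' g)
    variances: their sampling variances
    Returns (pooled g, SE, 95% CI, I^2 heterogeneity in %).
    """
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    # Cochran's Q and the DL estimate of between-study variance tau^2
    q = sum(wi * (g - fixed) ** 2 for wi, g in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights fold in tau^2, widening the CI under heterogeneity
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * g for wi, g in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, se, (pooled - 1.96 * se, pooled + 1.96 * se), i2
```

With no between-study heterogeneity (tau² = 0) this collapses to the fixed-effect inverse-variance average; the more the studies disagree, the more the weights equalize.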

  20. http://handbook.cochrane.org/chapter_9/9_analysing_data_and_undertaking_meta_analyses.htm

  21. http://handbook.cochrane.org/front_page.htm

  22. http://www.metafor-project.org/

  23. http://groups.google.com/group/brain-training

  24. http://scholar.google.com/scholar?q=%22n-back%22+AND+%28%22fluid+intelligence%22+OR+%22IQ%22%29

  25. https://pubmed.ncbi.nlm.nih.gov/?term=%22n-back%22+AND+%28%22fluid+intelligence%22+OR+%22IQ%22%29

  26. http://library.mpib-berlin.mpg.de/ft/sli/SLI_Working_2008.pdf

  27. DNB-FAQ#qiu-2009

  28. DNB-FAQ#polar-june-2009

  29. DNB-FAQ#seidler-2010

  30. DNB-FAQ#stephenson-2010

  31. DNB-FAQ#jaeggi-2010

  32. DNB-FAQ#jaeggi-2011

  33. DNB-FAQ#chooi-2011

  34. DNB-FAQ#schweizer-et-al-2011

  35. DNB-FAQ#jausovec-2012

  36. DNB-FAQ#kundu-et-al-2012

  37. DNB-FAQ#salminen-2012

  38. DNB-FAQ#redick-2012

  39. DNB-FAQ#vartanian-2013

  40. 2015-loosli.pdf

  41. 2015-loosli-supplementary.pdf

  42. http://www.termedia.pl/Journal/-74/pdf-26564-10?filename=effects%20of%20working.pdf

  43. http://cdn.neoscriber.org/cdn/serve/a6/e3/a6e3f09cd4639c7ed18a6c0042b688fa857c468c/ijpbs-inpress-inpress-5009.pdf

  44. http://eprints.lincoln.ac.uk/1932/1/MetaAnalysisPaper.pdf

  45. Replication

  46. Iodine

  47. https://www.nytimes.com/2012/05/06/opinion/sunday/iq-points-for-sale-cheap.html

  48. https://www.frontiersin.org/Journal/10.3389/fnsys.2014.00034/full

  49. 1970-cronbach.pdf

  50. https://www.frontiersin.org/articles/10.3389/fnhum.2016.00153/full

  51. 2015-protzko.pdf: “The environment in raising early intelligence: A meta-analysis of the fadeout effect”⁠, John Protzko

  52. 1989-jensen-2.pdf

  53. 2007-tenijenhuis.pdf

  54. 2014-tenijenhuis.pdf: “Are Headstart gains on the g factor? A meta-analysis”⁠, Jan te Nijenhuis, Birthe Jongeneel-Grimen, Emil O. W. Kirkegaard

  55. 2015-tenijenhuis.pdf: “Are adoption gains on the g factor? A meta-analysis”⁠, Jan te Nijenhuis, Birthe Jongeneel-Grimen, Elijah L. Armstrong

  56. http://www.klingberglab.se/pub/BergmanNutley_fluid_intelligence_2011.pdf

  57. 2012-shipstead.pdf

  58. 2013-colom.pdf: “Adaptive n-back training does not improve fluid intelligence at the construct level: Gains on individual tests suggest that training may enhance visuospatial processing”⁠, Roberto Colom, Francisco J. Román, Francisco J. Abad, Pei Chun Shih, Jesús Privado, Manuel Froufe, Sergio Escorial, Kenia Martínez, Miguel Burgaleta, M. A. Quiroga, Sherif Karama, Richard J. Haier, Paul M. Thompson, Susanne M. Jaeggi

  59. Taylor R. Hayes, Alexander A. Petrov, Per B. Sederberg (2015):

    Recent reports of training-induced gains on fluid intelligence tests have fueled an explosion of interest in cognitive training, now a billion-dollar industry. The interpretation of these results is questionable because score gains can be dominated by factors that play marginal roles in the scores themselves, and because intelligence gain is not the only possible explanation for the observed control-adjusted far transfer across tasks. Here we present novel evidence that the test score gains used to measure the efficacy of cognitive training may reflect strategy refinement instead of intelligence gains. A novel scanpath analysis of eye movement data from 35 participants solving Raven’s Advanced Progressive Matrices on two separate sessions indicated that one-third of the score gains could be attributed to test-taking strategy alone, as revealed by characteristic changes in eye-fixation patterns. When the strategic contaminant was partialled out, the residual score gains were no longer significant. These results are compatible with established theories of skill acquisition suggesting that procedural knowledge tacitly acquired during training can later be utilized at posttest. Our novel method and result both underline a reason to be wary of purported intelligence gains, but also provide a way forward for testing for them in the future.

  60. 2015-ritchie.pdf

  61. 2015-estrada.pdf: “A general factor of intelligence fails to account for changes in tests’ scores after cognitive practice: A longitudinal multi-group latent-variable study”⁠, Eduardo Estrada, Emilio Ferrer, Francisco J. Abad, Francisco J. Román, Roberto Colom

  62. https://escholarship.org/content/qt1kn271xs/qt1kn271xs.pdf

  63. Pauline L. Baniqued, Courtney M. Allen, Michael B. Kranz, Kathryn Johnson, Aldis Sipolins, Charles Dickens, Nathan Ward, Alexandra Geyer, Arthur F. Kramer (2015-10-19):

    Although some studies have shown that cognitive training can produce improvements to untrained cognitive domains (far transfer), many others fail to show these effects, especially when it comes to improving fluid intelligence. The current study was designed to overcome several limitations of previous training studies by incorporating training expectancy assessments, an active control group, and “Mind Frontiers,” a video game-based mobile program comprised of six adaptive, cognitively demanding training tasks that have been found to lead to increased scores in fluid intelligence (Gf) tests. We hypothesize that such integrated training may lead to broad improvements in cognitive abilities by targeting aspects of working memory, executive function, reasoning, and problem solving. Ninety participants completed 20 hour-and-a-half long training sessions over four to five weeks, 45 of whom played Mind Frontiers and 45 of whom completed visual search and change detection tasks (active control). After training, the Mind Frontiers group improved in working memory n-back tests, a composite measure of perceptual speed, and a composite measure of reaction time in reasoning tests. No training-related improvements were found in reasoning accuracy or other working memory tests, nor in composite measures of episodic memory, selective attention, divided attention, and multi-tasking. Perceived self-improvement in the tested abilities did not differ between groups. A general expectancy difference in problem-solving was observed between groups, but this perceived benefit did not correlate with training-related improvement. In summary, although these findings provide modest evidence regarding the efficacy of an integrated cognitive training program, more research is needed to determine the utility of Mind Frontiers as a cognitive training tool.

  64. #melbylervag-hulme-2013

  65. 1991-clark.pdf

  66. 1991-anglin-instructionaltechnology.pdf

  67. 2009-zehdner.pdf

  68. Giovanni Sala, N. Deniz Aksayli, K. Semir Tatlidil, Yasuyuki Gondo, Fernand Gobet (2019-11):

    • Working memory (WM) training does not enhance older adults’ cognitive function.
    • The training slightly improves older adults’ performance in untrained memory tasks.
    • The same pattern of results is observed in younger adults.
    • The models exhibit a high degree of consistency; hence this literature is not noisy.

    Abstract:

    In the last two decades, considerable efforts have been devoted to finding a way to enhance cognitive function by cognitive training. To date, the attempt to boost broad cognitive functions in the general population has failed. However, it is still possible that some cognitive training regimens exert a positive influence on specific populations, such as older adults. In this meta-analytic review, we investigated the effects of working memory (WM) training on older adults’ cognitive skills. Three robust-variance-estimation meta-analyses (N = 2140, m = 43, and k = 698) were run to analyze the effects of the intervention on (a) the trained tasks, (b) near-transfer measures, and (c) far-transfer measures. While large effects were found for the trained tasks (g = 0.877), only modest (g = 0.274) and near-zero (g = 0.121) effects were obtained in the near-transfer and far-transfer meta-analyses, respectively. Publication-bias analysis provided adjusted estimates that were slightly lower. Moreover, when active control groups were implemented, the far-transfer effects were null (g = −0.008). Finally, the effects were highly consistent across studies (ie., low or null true heterogeneity), especially in the near-transfer and far-transfer models. While confirming the difficulty in obtaining transfer effects with cognitive training, these results corroborate recent empirical evidence suggesting that WM is not isomorphic with other fundamental cognitive skills such as fluid intelligence.

  69. 2013-rapport.pdf: “Do programs designed to train working memory, other executive functions, and attention benefit children with ADHD? A meta-analytic review of cognitive, academic, and behavioral outcomes”⁠, Mark D. Rapport, Sarah A. Orban, Michael J. Kofler, Lauren M. Friedman

  70. 2019-long.pdf: “Suggestion of cognitive enhancement improves emotion regulation”⁠, Quanshan Long, Na Hu, Hanxiao Li, Yi Zhang, Jiajin Yuan, Antao Chen

  71. 2017-sala.pdf

  72. http://journals.sagepub.com/doi/full/10.1177/0963721417712760

  73. 2019-scherer.pdf: Ronny Scherer, Fazilat Siddiq, Bárbara Sánchez Viveros (2019-07-01; psychology):

    Does computer programming teach students how to think? Learning to program computers has gained considerable popularity, and educational systems around the world are encouraging students in schools and even children in kindergartens to engage in programming activities. This popularity is based on the claim that learning computer programming improves cognitive skills, including creativity, reasoning, and mathematical skills.

    In this meta-analysis, we tested this claim performing a 3-level, random-effects meta-analysis on a sample of 105 studies and 539 effect sizes. We found evidence for a moderate, overall transfer effect (g = 0.49, 95% CI [0.37, 0.61]) and identified a strong effect for near transfer (g = 0.75, 95% CI [0.39, 1.11]) and a moderate effect for far transfer (g = 0.47, 95% CI [0.35, 0.59]). Positive transfer to situations that required creative thinking, mathematical skills, and metacognition, followed by spatial skills and reasoning existed. School achievement and literacy, however, benefited the least from learning to program. Moderator analyses revealed statistically-significantly larger transfer effects for studies with untreated control groups than those with treated (active) control groups. Moreover, published studies exhibited larger effects than gray literature.

    These findings shed light on the cognitive benefits associated with learning computer programming and contribute to the current debate surrounding the conceptualization of computer programming as a form of problem solving.

    [Keywords: cognitive skills, computational thinking, computer programming, three-level meta-analysis, transfer of skills, passive control group inflation, publication bias]

    Educational Impact and Implications Statement: In this meta-analysis, we tested the claim that learning how to program a computer improves cognitive skills even beyond programming. The results suggested that students who learned computer programming outperformed those who did not in programming skills and other cognitive skills, such as creative thinking, mathematical skills, metacognition, and reasoning. Learning computer programming has certain cognitive benefits for other domains.

    Moderators: …Statistically-significantly higher effects occurred for published literature (g = 0.60, 95% CI [0.45, 0.75]) than for gray literature (g = 0.34, 95% CI [0.15, 0.52]; QM(1) = 4.67, p = 0.03).

    Besides the publication status, only the type of treatment that control groups received (ie., treated vs. untreated) statistically-significantly explained Level 2 variance, QM(1) = 40.12, p < 0.001, R²₂ = 16.7%. More specifically, transfer effect sizes were statistically-significantly lower for studies including treated control groups (g = 0.16) than for studies including untreated control groups (g = 0.65). [0.65 / 0.16 ≈ 400% bias].
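Funnel-plot asymmetry of the kind shown in Figure 2a is often quantified with Egger's regression test. A bare-bones sketch, using invented effect sizes and standard errors rather than Scherer et al.'s data:

```python
def egger_test(effects, ses):
    """Egger's regression test for funnel-plot asymmetry (simple OLS sketch).

    Regresses each study's standardized effect (g / SE) on its precision
    (1 / SE); an intercept far from zero suggests small-study effects
    such as publication bias.
    """
    z = [g / s for g, s in zip(effects, ses)]      # standardized effects
    prec = [1.0 / s for s in ses]                  # precisions
    n = len(z)
    mx, my = sum(prec) / n, sum(z) / n
    sxx = sum((x - mx) ** 2 for x in prec)
    sxy = sum((x - mx) * (y - my) for x, y in zip(prec, z))
    slope = sxy / sxx              # estimate of the underlying pooled effect
    intercept = my - slope * mx    # Egger bias coefficient
    return intercept, slope
```

With a perfectly symmetric funnel (every study estimating the same effect), the intercept is zero and the slope recovers the common effect; real applications would also compute a standard error and t-test for the intercept, as metafor's `regtest` does.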

    Figure 2a: Funnel plot

  74. 1993-lipsey.pdf: Mark W. Lipsey, David B. Wilson (1993-12-01; psychology):

    Conventional reviews of research on the efficacy of psychological, educational, and behavioral treatments often find considerable variation in outcome among studies and, as a consequence, fail to reach firm conclusions about the overall effectiveness of the interventions in question. In contrast, meta-analysis reviews show a strong, dramatic pattern of positive overall effects that cannot readily be explained as artifacts of meta-analytic technique or generalized placebo effects. Moreover, the effects are not so small that they can be dismissed as lacking practical or clinical-significance. Although meta-analysis has limitations, there are good reasons to believe that its results are more credible than those of conventional reviews and to conclude that well-developed psychological, educational, and behavioral treatment is generally efficacious.

  75. https://statmodeling.stat.columbia.edu/wp-content/uploads/2019/02/DeQuidt_Haushofer_Roth_Demand_AER_2018.pdf

  76. 2013-boot.pdf: Walter R. Boot, Daniel J. Simons, Cary Stothart, Cassie Stutts (2013-07-09; dual-n-back):

    To draw causal conclusions about the efficacy of a psychological intervention, researchers must compare the treatment condition with a control group that accounts for improvements caused by factors other than the treatment.

    Using an active control helps to control for the possibility that improvement by the experimental group resulted from a placebo effect. Although active control groups are superior to “no-contact” controls, only when the active control group has the same expectation of improvement as the experimental group can we attribute differential improvements to the potency of the treatment. Despite the need to match expectations between treatment and control groups, almost no psychological interventions do so.

    This failure to control for expectations is not a minor omission—it is a fundamental design flaw that potentially undermines any causal inference. We illustrate these principles with a detailed example from the video-game-training literature showing how the use of an active control group does not eliminate expectation differences. The problem permeates other interventions as well, including those targeting mental health, cognition, and educational achievement.

    Fortunately, measuring expectations and adopting alternative experimental designs makes it possible to control for placebo effects, thereby increasing confidence in the causal efficacy of psychological interventions.

    [Keywords: intervention design, research methods, placebo effect]

  77. 2016-foroughi.pdf: Cyrus K. Foroughi, Samuel S. Monfort, Martin Paczynski, Patrick E. McKnight, P. M. Greenwood (2016-07-05; dual-n-back):

    Placebo effects pose problems for some intervention studies, particularly those with no clearly identified mechanism. Cognitive training falls into that category, and yet the role of placebos in cognitive interventions has not yet been critically evaluated. Here, we show clear evidence of placebo effects after a brief cognitive training routine that led to substantial fluid intelligence gains. Our goal is to emphasize the importance of ruling out alternative explanations before attributing the effect to interventions. Based on our findings, we recommend that researchers account for placebo effects before claiming treatment effects.

    Although a large body of research shows that general cognitive ability is heritable and stable in young adults, there is recent evidence that fluid intelligence can be heightened with cognitive training. Many researchers, however, have questioned the methodology of the cognitive-training studies reporting improvements in fluid intelligence: specifically, the role of placebo effects.

    We designed a procedure to intentionally induce a placebo effect via overt recruitment in an effort to evaluate the role of placebo effects in fluid intelligence gains from cognitive training. Individuals who self-selected into the placebo group by responding to a suggestive flyer showed improvements after a single, 1-h session of cognitive training that equates to a 5-point to 10-point increase on a standard IQ test. Controls responding to a non-suggestive flyer showed no improvement.

    These findings provide an alternative explanation for effects observed in the cognitive-training literature and the brain-training industry, revealing the need to account for confounds in future research.

    …We also observed differences between groups for scores on the Theories of Intelligence scale, which measures beliefs regarding the malleability of intelligence (34). The participants in the placebo group reported substantially higher scores on this index compared with controls [B = 14.96, SE = 1.93, t(48) = 7.75, p < 0.0001, d = 2.15], indicating a greater confidence that intelligence is malleable. These findings indicate that our manipulation via recruitment flyer produced statistically-significantly different groups with regard to expectancy. We did not detect differences in Need for Cognition scores (41) [B = 0.56, SE = 5.67, t(48) = 0.10, p = 0.922] (Figure 3). Together, these results support the interpretation that participants self-selected into groups based on differing expectations.

  78. https://www.youtube.com/watch?v=ZZitj9wBNTw

  79. https://www.nytimes.com/2008/04/29/health/research/29brai.html

  80. https://online.wsj.com/article/SB10001424052702304432304576371462612272884.html

  81. https://docs.google.com/file/d/0B-bpmBygrg8LUTJvazhzOFYwQUk/edit

  82. http://scottbarrykaufman.com/wp-content/uploads/2012/01/Nisbett-et-al.-2012.pdf

  83. 2000-duval.pdf

  84. http://handbook.cochrane.org/chapter_16/16_5_4_how_to_include_multiple_groups_from_one_study.htm

  85. 2010-chooi-table.pdf

  86. 2012-takeuchi.pdf: “Effects of working memory training on functional connectivity and cerebral blood flow during rest”⁠, Hikaru Takeuchi, Yasuyuki Taki, Rui Nouchi, Hiroshi Hashizume, Atsushi Sekiguchi, Yuka Kotozaki, Seishu Nakagawa, Calros M. Miyauchi, Yuko Sassa, Ryuta Kawashima

  87. Bornali Kundu, David W. Sutterer, Stephen M. Emrich, Bradley R. Postle (2013):

    Although long considered a natively endowed and fixed trait, working memory (WM) ability has recently been shown to improve with intensive training. What remains controversial and poorly understood, however, are the neural bases of these training effects and the extent to which WM training gains transfer to other cognitive tasks. Here we present evidence from human electrophysiology (EEG) and simultaneous transcranial magnetic stimulation and EEG that the transfer of WM training to other cognitive tasks is supported by changes in task-related effective connectivity in frontoparietal and parieto-occipital networks that are engaged by both the trained and transfer tasks. One consequence of this effect is greater efficiency of stimulus processing, as evidenced by changes in EEG indices of individual differences in short-term memory capacity and in visual search performance. Transfer to search-related activity provides evidence that something more fundamental than task-specific strategy or stimulus-specific representations has been learned. Furthermore, these patterns of training and transfer highlight the role of common neural systems in determining individual differences in aspects of visuospatial cognition.

  88. 2014-schmiedek.pdf

  89. Florian Schmiedek, Martin Lövdén, Ulman Lindenberger (2010):

    We examined whether positive transfer of cognitive training, which so far has been observed for individual tests only, also generalizes to cognitive abilities, thereby carrying greater promise for improving everyday intellectual competence in adulthood and old age. In the COGITO Study, 101 younger and 103 older adults practiced six tests of perceptual speed (PS), three tests of working memory (WM), and three tests of episodic memory (EM) for over 100 daily 1-h sessions. Transfer assessment included multiple tests of PS, WM, EM, and reasoning. In both age groups, reliable positive transfer was found not only for individual tests but also for cognitive abilities, represented as latent factors. Furthermore, the pattern of correlations between latent change factors of practiced and latent change factors of transfer tasks indicates systematic relations at the level of broad abilities, making the interpretation of effects as resulting from unspecific increases in motivation or self-concept less likely.

  90. http://handbook.cochrane.org/chapter_9/9_4_5_2_meta_analysis_of_change_scores.htm

  91. https://news.ycombinator.com/item?id=6469832