Analytical reproducibility and data reusability of published meta-analyses on clinical psychology interventions
Author(s):
López-Nicolás, Rubén; Lakens, Daniel; López-López, José Antonio; Rubio-Aparicio, María; Sandoval-Lentisco, Alejandro; [et al.]
Publication date:
2023
Abstract:
Meta-analysis is one of the most useful research approaches, the relevance of which relies on its credibility. Reproducibility of scientific results can be considered the minimal threshold of this credibility. We assessed the reproducibility of a sample of meta-analyses published between 2000 and 2020. From a random sample of 100 papers reporting results of meta-analyses of interventions in clinical psychology, 217 meta-analyses were selected. We first tried to retrieve the original data by recovering a data file, recoding the data from document files, or requesting it from the original authors. Second, through a multi-stage workflow, we tried to reproduce the main results of each meta-analysis. The original data were retrieved for 67% (146/217) of the meta-analyses. While this rate improved over the years, in only 5% of these cases was it possible to retrieve a data file ready for reuse. Of these 146 meta-analyses, 52 showed a discrepancy larger than 5% in the main results in the first stage. For 10 meta-analyses this discrepancy was resolved after fixing a coding error in our data retrieval process, and 15 were considered approximately reproduced in a qualitative assessment. In the remaining meta-analyses (18%, 27/146), different issues were identified in an in-depth review, such as reporting inconsistencies, lack of data, or transcription errors. Nevertheless, the numerical discrepancies were mostly minor, with little or no impact on the conclusions. Overall, one of the biggest threats to the reproducibility of meta-analysis relates to data availability and current data-sharing practices in meta-analysis.
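The abstract does not specify which pooling model underlies each "main result", so the following is only a minimal sketch of the kind of reproduction check described: recompute a pooled effect from retrieved study-level data and flag a relative discrepancy larger than 5% against the originally reported estimate. It assumes a DerSimonian-Laird random-effects model; the function names and toy data are hypothetical.

```python
# Minimal sketch (not the authors' actual pipeline): pool effect sizes with a
# DerSimonian-Laird random-effects model and flag a >5% relative discrepancy
# between the reproduced and the originally reported pooled estimate.

def dl_pooled_estimate(effects, variances):
    """DerSimonian-Laird random-effects pooled effect size."""
    w_fixed = [1.0 / v for v in variances]
    fixed_mean = sum(w * y for w, y in zip(w_fixed, effects)) / sum(w_fixed)
    q = sum(w * (y - fixed_mean) ** 2 for w, y in zip(w_fixed, effects))
    df = len(effects) - 1
    c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)  # between-study variance estimate
    w_rand = [1.0 / (v + tau2) for v in variances]
    return sum(w * y for w, y in zip(w_rand, effects)) / sum(w_rand)

def discrepancy_exceeds_5pct(reproduced, reported):
    """True if the relative discrepancy exceeds 5% of the reported value."""
    return abs(reproduced - reported) > 0.05 * abs(reported)

# Toy example: standardized mean differences and their sampling variances.
effects = [0.42, 0.31, 0.55, 0.20]
variances = [0.040, 0.055, 0.036, 0.050]
reproduced = dl_pooled_estimate(effects, variances)
print(f"reproduced estimate = {reproduced:.3f}, flagged: "
      f"{discrepancy_exceeds_5pct(reproduced, reported=0.40)}")
```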
Keyword(s):
data reusability
data sharing
meta-analysis
reproducibility
research synthesis
Collections:
- Journal articles [689]