Credibility or Barbarism: How the Replication Crisis Has Unleashed a Revolution in Psychology and Other Sciences

DOI:

https://doi.org/10.5944/ap.22.1.43231

Keywords:

replication crisis, credibility revolution, questionable research practices, meta-science, open science

Abstract

Science is currently living through critical, revolutionary times. The emergence of the "replication crisis" has posed a historic structural challenge, compounded by scientific malpractice and by problems stemming from a perverse incentive structure in the publication system. In response, the scientific community has proposed a set of reforms known as the "credibility revolution". In this article we review these developments, their chronology, their main characteristics, and how they relate to one another. Our aim is to inform and train readers in the new practices of the scientific community in Psychology and other sciences, so that they can produce and consume research that is more rigorous and replicable, and, ultimately, to raise awareness and help build a better scientific community for the challenges of the 21st century.

Published

2025-06-30

How to cite

Lecuona de la Cruz, O., Corradi, G., Angulo-Brunet, A., & García-Garzón, E. (2025). Credibilidad o barbarie: Cómo la crisis de replicación ha desatado una revolución en Psicología y otras ciencias. Acción Psicológica, 22(1), 115–136. https://doi.org/10.5944/ap.22.1.43231

Issue

Section

Special Issue: New Methodological Advances in Psychology
