Is the future of peer review automated?
Charité Univ Med Berlin, Germany. ORCID iD: 0000-0002-4830-309X
Queensland Univ Technol, Australia.
Charité Univ Med Berlin, Germany.
Linnaeus University, Faculty of Health and Life Sciences, Department of Psychology. ORCID iD: 0000-0003-1579-0730
(Additional authors and affiliations omitted.)
2022 (English). In: BMC Research Notes, E-ISSN 1756-0500, Vol. 15, no. 1, article id 203. Article in journal, Editorial material (Other academic). Published.
Abstract [en]

The rising rate of preprints and publications, combined with persistently inadequate reporting practices and problems with study design and execution, has strained the traditional peer review system. Automated screening tools could potentially enhance peer review by helping authors, journal editors, and reviewers to identify beneficial practices and common problems in preprints or submitted manuscripts. Tools can screen many papers quickly, and may be particularly helpful in assessing compliance with journal policies and with straightforward items in reporting guidelines. However, existing tools cannot understand or interpret a paper in the context of the scientific literature. Tools cannot yet determine whether the methods used are suitable to answer the research question, or whether the data support the authors' conclusions. Editors and peer reviewers are essential for assessing journal fit and the overall quality of a paper, including the experimental design, the soundness of the study's conclusions, potential impact, and innovation. Automated screening tools cannot replace peer review, but they may aid authors, reviewers, and editors in improving scientific papers. Strategies for the responsible use of automated tools in peer review may include setting performance criteria for tools, transparently reporting tool performance and use, and training users to interpret reports.
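
To make concrete the kind of "straightforward items" a screening tool can check, here is a minimal sketch in Python. It illustrates keyword-based screening only and is not the method of any tool discussed in the article; the reporting items, patterns, and example text are all hypothetical.

```python
import re

# Hypothetical reporting items and naive keyword patterns. Real screening
# tools are far more sophisticated; this only illustrates the idea of
# rule-based checks for straightforward reporting-guideline items.
CHECKS = {
    "ethics approval": re.compile(r"\bethic(s|al)\b", re.IGNORECASE),
    "data availability": re.compile(r"\bdata\b.{0,40}\bavailab", re.IGNORECASE),
    "sample size": re.compile(r"sample size|\bn\s*=\s*\d+", re.IGNORECASE),
}

def screen(manuscript_text: str) -> dict:
    """Return, per reporting item, whether the text appears to address it."""
    return {item: bool(pattern.search(manuscript_text))
            for item, pattern in CHECKS.items()}

if __name__ == "__main__":
    example = ("The study received ethics approval from the local board. "
               "We recruited n = 120 participants. "
               "Data are available in a public repository.")
    for item, found in screen(example).items():
        print(f"{item}: {'found' if found else 'flag for reviewer'}")
```

A check like this can flag a likely omission in seconds, which is exactly where automation helps; judging whether the design or conclusions are sound still requires an editor or reviewer.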

Place, publisher, year, edition, pages
Springer Nature, 2022. Vol. 15, no. 1, article id 203
Keywords [en]
Rigor, Reproducibility, Transparency, Automated screening, Peer review
National Category
History of Science and Ideas
Research subject
Social Sciences
Identifiers
URN: urn:nbn:se:lnu:diva-115184
DOI: 10.1186/s13104-022-06080-6
ISI: 000809561400002
PubMedID: 35690782
Scopus ID: 2-s2.0-85131903377
OAI: oai:DiVA.org:lnu-115184
DiVA id: diva2:1681188
Available from: 2022-07-06. Created: 2022-07-06. Last updated: 2025-02-21. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
PubMed
Scopus

Authority records

Brown, Nicholas

Search in DiVA

By author/editor
Schulz, Robert; Brown, Nicholas; Salholz-Hillel, Maia; Bandrowski, Anita
By organisation
Department of Psychology
In the same journal
BMC Research Notes
