Sciety adds three new groups evaluating COVID-19 preprints
Built to help researchers learn about the latest results, the Sciety application has expanded its growing network by showcasing open preprint evaluations from three new groups of experts.
Sciety is a tool for helping people navigate the growing preprint landscape, particularly those working in the biomedical and life sciences. We are pleased to welcome three new groups to Sciety, allowing us to play a central role in bringing together evaluations of preprints on the novel coronavirus (COVID-19). These groups are the Novel Coronavirus Research Compendium (NCRC), Rapid Reviews: COVID-19 (RR:C19) and ScreenIT. Each brings its own approach to evaluating the latest research findings, adding diverse perspectives to the application.
With the rise in the early dissemination of scientific discovery, it has become increasingly difficult to keep up with the latest research results and decide where to invest precious reading time. There has been significant growth in research outputs relating to the COVID-19 pandemic, and the need for an efficient system to help readers navigate new results has become clear. Our three new groups have met this challenge by producing high-quality screenings, reviews and summaries of key findings, and Sciety has brought them all together in one convenient and accessible place.
eLife announced support for the NCRC earlier this year, and including this group’s evaluation content on Sciety is a significant part of the collaboration. The faculty, fellows, alumni and students behind the Compendium select research for public health action and assign teams of experts to answer a series of standardised questions related to the study. The outputs are collated in a single evaluation that we present as an event in a paper’s activity stream on Sciety.
Similarly, the experts behind RR:C19 aim to keep researchers, policymakers and health leaders, as well as academic journals and the media, informed about the latest developments pertaining to the pandemic. They draw on doctoral students and an AI-driven platform, COVIDScholar, to identify interesting preprints, and then source peer reviews from academic experts. The reviews are published with a summary rating on a strength-of-evidence scale and accompanied by editorial commentary. Unlike traditional journals, they seek both to validate strong science and to debunk preprints that are receiving unwarranted attention from policymakers and the media.
ScreenIT, a pipeline of automated tools, is the brainchild of the Automated Screening Working Group. New COVID-19 preprints are screened daily, and ScreenIT generates an easy-to-read table of results for each preprint. As with the NCRC and RR:C19, ScreenIT has focused its initial efforts on COVID-19 research, checking for common issues that may affect transparency and reproducibility. Examples include the availability of open data, the inclusion of a limitations section, ethics and consent statements, reporting of participants’ sex or gender, and information about whether experiments were blinded.
These new groups join 11 others already providing evaluations via Sciety, with groups such as eLife, Review Commons and preLights also covering COVID-19-related papers.
Each group has its own custom page, where it is encouraged to provide further context about its processes and people, allowing readers to get the most out of the evaluations it has produced. In this way, Sciety itself can remain agnostic about the review models its groups choose to support, while offering a single, understandable interface for consuming the diverse range of content available.
Now that we are growing the amount of evaluation content accessible through Sciety, we’ll be concentrating on building features that give groups the opportunity to curate articles into different lists. We’ll also be improving the search function to cover the details of evaluations and lists, offering new ways to discover content directly on Sciety.
Paul Shannon, Head of Technology at eLife, who is leading the efforts around Sciety, says: “The Sciety team was formed during the COVID-19 pandemic, which means it is particularly important to us that we’ve been able to support the response to it. These groups have been innovative and are dedicated to the work they’re doing, so I’m really excited to bring all of their evaluations together on Sciety. We want to be the home of public preprint evaluation, and with these new groups we’re a great place for COVID-19 practitioners, researchers and anyone else interested in the latest developments to find high-quality evaluations on a topic that is affecting us all.”
You can easily find activity from all of these groups, either through search or via their dedicated group pages. To stay up to date with the latest developments, as well as opportunities for user testing and feedback, subscribe to our mailing list.
-----ENDS-----
For more information about the NCRC and this group’s evaluations, visit https://go.sciety.org/ncrc.
For more information about Rapid Reviews: COVID-19 and this group’s evaluations, visit https://go.sciety.org/rapid-reviews-c19.
For more information about ScreenIT and this group’s evaluations, visit https://go.sciety.org/screenit.
To read about eLife’s support for the NCRC, visit https://elifesciences.org/for-the-press/a922314b/elife-collaborates-with-novel-coronavirus-research-compendium-on-manuscript-curation-and-review.
To stay up to date with the latest developments, as well as opportunities for user testing and feedback, subscribe to our mailing list at https://sciety.org/feedback.
And to find out more about Sciety, see https://sciety.org.
Media contact
Emily Packer, Media Relations Manager
eLife and Sciety
press@sciety.org
+44 1223 855373
About
Sciety: The home of public preprint evaluation, Sciety is a new tool for helping people navigate the growing preprint landscape, particularly those working in the biomedical and life sciences. Sciety brings together the evaluation output of many different groups that provide opinions on preprints, in the hope that their multiple perspectives will inform the research discovery process.
NCRC: The 2019 Novel Coronavirus Research Compendium (NCRC) is a centralized, publicly available resource that rapidly curates and reviews the emerging scientific evidence about SARS-CoV-2 and COVID-19. Our goal is to provide accurate, relevant information for global public health action by clinicians, public health practitioners, and policy makers.
Rapid Reviews: COVID-19: One of the missions of RR:C19 is to accelerate peer review of COVID-19-related research across a wide range of disciplines and deliver almost-real-time, dependable scientific information that policymakers, scholars, and health leaders can use. We do this by soliciting rapid peer reviews of time-sensitive and interesting preprints, which are then published online and linked to the preprint servers that host the manuscripts. Underlying values of RR:C19 are transparency and reliability. Each review is published online and assigned a DOI, making the work fully citable and claimable on reviewers’ ORCID and Publons accounts. Our standard practice is to publish the reviewers’ full names and affiliations, although we will allow reviewers to publish their review anonymously upon request. RR:C19 assesses manuscripts with the same level of rigour found at top journals. Assessments include how well the paper’s conclusions are substantiated by the research, graded on the RR:C19 Strength of Evidence Scale, with supporting comments.
ScreenIT: The Automated Screening Working Group is a group of software engineers and biologists passionate about improving scientific manuscripts on a large scale. Our goal is to process every manuscript in the biomedical sciences as it is being submitted for publication, and to provide customized feedback to improve that manuscript. Our members have created tools that check for common problems in scientific manuscripts, including information needed to improve transparency and reproducibility. We have combined our tools into a single pipeline, called ScreenIT.