This article was published on May 11, 2020

How COVID-19 is making scientific research more transparent


The evolving COVID-19 pandemic has created an urgent need for scientific evidence, and quickly. We need politicians to be able to make informed decisions, we need to support the development of effective vaccines and treatments, and we need to understand the unfolding impact of the pandemic on society. The speed with which the global scientific community has risen to this sudden, pressing need is remarkable.

But science is usually a slow-moving process – a series of steps towards a better understanding, rather than individual “eureka” moments. Getting to the truth is often not straightforward, and scrutinizing claims and counter-claims is an inherent part of the scientific method. Individual studies need to be replicated to see if the original observations are robust, and often they turn out not to be.

But now we are seeing – necessarily and understandably – a rush of studies attempting to add to our modest knowledge of the SARS-CoV-2 virus and provide answers to all of the other important questions emerging from the pandemic.

Some of these studies are conducted with limited resources, rather than specific funding for the purpose, although funders such as the Wellcome Trust and the UK Medical Research Council have moved fast to provide significant support for research activity in this area.

The rise of the preprint

Scientific publishing is also changing.

Usually, scientific research is peer-reviewed before it is accepted for publication in a journal. This means that (typically) two or three researchers with relevant expertise have reviewed and critiqued the work, and often recommended revisions or even further experiments. Peer review is meant to ensure that published work meets a certain minimum quality standard, but it is by no means perfect: weak work can slip through, and strong work can be unfairly criticized and delayed.

Now we are seeing more and more results posted to preprint servers for rapid dissemination. A preprint is effectively the version of a scientific article that has not yet been peer-reviewed. It is usually posted around the same time the article is submitted to a journal for review.

Preprint servers have been around for a long time in some disciplines – notably mathematics and physics, where arXiv has been in use since 1991 – and have existed in other guises, for example as “working papers” in areas such as economics. But they have only become widespread in recent years; there are now multiple platforms supporting preprints across a range of disciplines, including biomedicine (for example, bioRxiv and medRxiv).

Often the published version of a study – the one that has passed peer review – is little different from the preprint version. But sometimes changes are required, and often important ones, such as the inclusion of additional experiments or analyses that provide greater confidence in the overall conclusions of the work.

One advantage of preprints is that they allow scrutiny from a far larger portion of the scientific community than the two or three reviewers involved in the traditional peer-review process. The danger comes when a preliminary report is interpreted as definitive.

That preprints should be treated as preliminary is well understood by researchers. In the current situation, however, results reported in preprints are increasingly being picked up by the media. For example, a study of the prevalence of SARS-CoV-2 antibodies conducted in Santa Clara County, California was reported by a number of outlets, including the Wall Street Journal, despite having been heavily criticized by some researchers.

Preprints allow researchers to get their results out more quickly, but they should be treated with caution. Credit: Shutterstock

This in itself is not entirely new, but we are seeing rapid growth in preprints as scientists attempt to put their findings in the public domain as quickly as possible – at the beginning of April 2020, around 17% of COVID-19 publications were preprints. This is coupled with a desire for equally rapid dissemination of apparently noteworthy new findings by the media. The overall sense is that the scientific process has been accelerated.

But is this entirely a good thing? There is a long-standing aphorism – originally from engineering, but perhaps applicable here – “fast, cheap, good: pick two.” We all know from personal experience that when we rush, mistakes are more likely to happen. This is simply human nature, and scientists, however well trained and well-intentioned, are human too. The fundamentals of good design, careful conduct, and thoughtful interpretation apply even when there is a pressing need for knowledge.

These different pressures – research conducted quickly and disseminated rapidly via preprints, and the media reporting those findings just as rapidly – may conspire to put us at risk of generating and communicating findings that are not robust. And we have already begun to see retractions of COVID-19 research.

Transparency is everything

Work that is still at the preprint stage should be clearly reported as such by media outlets, and readers should treat the findings as preliminary. Perhaps more importantly, we all need to recognize that our knowledge will evolve, and no single study or finding will be definitive. Understanding COVID-19 is a team effort.

The current pandemic is unprecedented in recent history and has demonstrated the strength of the global scientific community. Resources have been rapidly diverted towards understanding the virus, modeling strategies to reduce its impact, developing vaccines and treatments, and more. Collaborations – both national and international – have emerged almost overnight, and preprint servers have experienced a surge of submissions. We are making progress and at an extraordinary pace.

However, we also need to ensure that our desire for speed in the generation of knowledge does not come at the expense of quality. Given the importance and the immediacy of the challenge we face, rigorous and high-quality research is more important than ever. Transparency will be critical. By making study protocols, materials, data, and analysis plans available to other researchers, work can be scrutinized more closely, and any errors detected and corrected more rapidly. Indeed, the mere act of making our research transparent may encourage more error-checking before we release our work.

There is an urgent need for data and knowledge, but it is critically important that research is of high quality and that the knowledge generated is robust. False information is worse than no information at all.

This article is republished from The Conversation by Marcus Munafò, Professor of Biological Psychology, University of Bristol, under a Creative Commons license. Read the original article.
