In light of the ongoing SARS-CoV-2 pandemic, the question of whether virus research is safe has been widely discussed over the past 2 years in the mainstream media, on social media, and indeed by the scientific community. Eminent scientists worldwide have proposed hypotheses supporting either zoonotic transmission of the virus or various forms of laboratory release. The evidence on both sides is compelling, but the lab-leak hypothesis makes for better viewing. The debate continues. However, as we have seen with increasing regularity over the past 50 years, nature is more than capable of bowling a googly.
The lab-leak hypothesis calls into question the safety of virus research and, more broadly, of research conducted with dangerous pathogens. Our previous blog discussed the classification systems (Hazard Groups and Containment Levels) and facility design requirements for working safely with dangerous pathogens. It’s worth noting that these can be organisms that infect humans, animals, or plants. Human pathogens may affect us directly, but those that infect our food supplies have the potential to be equally devastating to human health, the economy, and society.
Research on dangerous pathogens in the UK is regulated through the Health and Safety at Work etc. Act 1974, the Control of Substances Hazardous to Health Regulations 2002, the Animal Health Act 1981, the Specified Animal Pathogens Order 2008, the Environmental Protection Act 1990, and the Genetically Modified Organisms (Contained Use) Regulations 2014.
In the UK, research facilities using dangerous pathogens may be subject to regular inspections and audits by the Health and Safety Executive, the Home Office, the Department for Environment, Food and Rural Affairs, and Counter-Terrorism Police. These agencies have the power to stop research, shut down facilities, and issue fines or prosecute organisations in breach of health and safety legislation.
Whether virus research is safe depends less on the experiments being performed and more on how and where they are conducted. For instance, there are legitimate objections to gain-of-function experiments, which have the potential to generate new strains or variants of a virus with increased pathogenicity. The argument in favour of this type of research is that it may allow scientists to characterise novel genome mutations and thus pre-empt their effects should they appear in nature. However, with great power comes great responsibility, and facilities conducting such research need a strong rationale and rigid procedures for ensuring the safe handling, disposal, and security of these biological agents.
When procedures aren’t followed, the repercussions can be severe. For example, in 2007, Foot-and-Mouth disease virus was accidentally released from a research facility in the UK. The incident led to infections on nearby farms and the subsequent culling of infected animals. This relatively small outbreak cost approximately £47 million to contain. A review ordered by the British Government suggested that the probable cause was live virus leaking from a faulty drain, which contaminated land surrounding the facility. This infectious material was then transferred to local farms on the wheels of a vehicle that had recently visited the research site. The review concluded that the leak stemmed from a lack of communication between the multiple organisations based at the site.
It is perhaps unsurprising that the main risk of infection posed by virus research is to laboratory personnel. The results of a survey conducted by Wurtz et al. (2016) and published in the European Journal of Clinical Microbiology & Infectious Diseases showed that the main causes of laboratory-acquired infections (LAIs) were technical failure of infrastructure, technical failure of equipment, failure to wear personal protective equipment, splashes and spills, bites and scratches, and inadequate compliance with safety rules. These data demonstrate that, while human error is a factor, significantly more lab workers become infected due to failures of the plant or equipment meant to protect them.
The examples above demonstrate that accidents can happen in laboratories working with dangerous pathogens. However, such events are rare when set against the millions of working hours scientists spend conducting virus research worldwide each year. Occurrences in which pathogens escape the lab and cause infections in the community, whether in humans or animals, are rarer still.
To put this into context, nature throws new pathogens at us far more regularly. Since 1997 alone, we have seen the emergence of highly pathogenic strains of Influenza virus (avian and swine), Severe Acute Respiratory Syndrome Coronavirus and Severe Acute Respiratory Syndrome Coronavirus 2, and Middle East Respiratory Syndrome Coronavirus: an average of roughly one every 5 years. A host of other highly pathogenic viruses have also been identified in the past 50 years, including Human Immunodeficiency virus, Hendra virus, Ebola virus and its close relative Marburg virus, Zika virus, and Lassa virus, to name but a few. As the last 2 years have demonstrated, emerging natural infections pose a serious risk to human health.
So, is virus research safe? In principle, the answer is yes. Research is conducted at specialist facilities by highly trained staff who are acutely aware of the hazards posed by the organisms they handle and of the regulations governing their use. However, as with most industries, scientific research carries risks and can never be 100% safe. As with all risk assessments, the costs must be weighed against the benefits. As advancements made during the SARS-CoV-2 pandemic have shown, most often, the benefits are for the greater good.
Blog by Max Handley
Edited by Reckon Better Scientific Editing