by Mark S Reed, Gavin B Stewart, Anthonia James and Ged Hall

The need for robust evidence to inform policy has never been greater, as countries around the world grapple with issues of unprecedented complexity. However, it is rare to find an individual study that conclusively resolves a major knowledge gap or controversy and whose findings are consistently reproduced by others. Instead, knowledge tends to accumulate incrementally via successive studies using different methods in different contexts, often leading to apparently contradictory findings. This can make it difficult to identify clear, evidence-informed policy options.

But there is a problem. Research funders regularly ask research teams to develop policy briefs based on the single projects they have funded, despite the fact that those projects may have considered only a single system in isolation, studied it under experimental conditions, or based their findings on case study research. Researchers are also under increasing pressure to get their findings published in top journals, and in many disciplines, the more generalisable and internationally applicable the findings are, the more likely they are to be deemed academically significant enough to warrant publication in top-flight journals. Even if researchers resist the temptation to over-claim, the editors and reviewers of many applied journals increasingly expect authors to discuss the implications of their work for policy and practice, and request revisions that may inadvertently encourage authors to over-generalise the relevance of their work beyond the contexts in which they collected their data. Your findings might be of international significance to other academics, but are you sure they are relevant to policy and practice in different national jurisdictions?

This is an important problem for at least two reasons. First, the policy options might not be sufficiently robust, and so might lead to unintended consequences. Although it might feel tempting to claim otherwise, if the evidence for a policy intervention is mixed or inconclusive, a clear finding from a single new study does not change the nature of the evidence base overall: there are still multiple studies with findings that contradict yours, as well as those that point in the same direction as your new study. Second, if changes in policy or practice are based on single studies, it is only a matter of time before a new study comes along with contradictory evidence, forcing a policy U-turn and potentially undermining the trust of policymakers and the public in research.

A new approach

Evidence synthesis methods have been developed to resolve these tensions, showing where there is robust evidence across multiple studies and contexts to inform policy and practice, and where the evidence is mixed or inconclusive and more research is needed. However, a typical systematic review takes many person-months of time; it is therefore costly and may miss time-limited windows of policy opportunity. Moreover, few researchers have the skills to conduct robust evidence synthesis, and there are few postgraduate programmes that provide this training to early career researchers.

For this reason, Policy Leeds and N8 AgriFood teamed up with evidence synthesis methodologist Dr Gavin Stewart and Professor Mark Reed, a visiting Professor at Leeds and N8 AgriFood Chair at the time, now at Scotland’s Rural College (SRUC), to deliver a training programme funded by N8 AgriFood and Research England (QR-SPF). Gavin created the evidence synthesis training programme, which aimed to give early career researchers from across the N8 institutions skills in evidence synthesis, whilst the rest of the team leveraged links in the policy world. The programme provided an opportunity to produce both peer-reviewed publications and policy briefs that could address challenges identified by the policy community. The model is remarkably simple, and it delivers evidence synthesis for policy-makers in a fraction of the time and at a fraction of the cost of traditional systematic reviews, by facilitating the production of rapid reviews and other evidence synthesis products.

This is the process:

  1. Identify evidence needs from policy colleagues in a thematic area (food and farming, in our case, via teams in Defra, Natural England, Environment Agency and Food Standards Agency in England and equivalent departments and agencies in devolved administrations)
  2. Offer training to early career researchers in rapid evidence synthesis and writing policy briefs
  3. Support trainees to write a paper and policy brief
  4. Policy colleagues get evidence and early career researchers gain new skills and potentially publications.

One study calculated that a typical systematic review costs about £100,000 and takes between 6 and 16 months, assuming five co-authors devote 10–20 hours per week to the review; another estimated that a full systematic review takes 1–2 years to complete. In contrast, despite a number of authors dropping out due to challenges posed by the pandemic, our whole programme worked out at £2,336 per review, and the majority of the cohort who were able to attend the residential training completed a first draft of their rapid review within one week. This figure included the training costs, the residential element and open access fees for two of the articles, but not project management costs for N8 AgriFood.
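As a back-of-envelope check on these figures (assuming, as seems consistent with the numbers below, that the £2,336 per-review figure was derived by dividing total spend across the eleven completed reviews):

  total programme spend ≈ 11 × £2,336 ≈ £25,700
  one traditional systematic review ≈ £100,000
  whole programme as a share of one traditional review ≈ £25,700 ÷ £100,000 ≈ 26%

On these assumptions, the entire programme cost roughly a quarter of a single conventional systematic review, while producing multiple outputs rather than one.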

Challenges

Our plans were, however, significantly disrupted by COVID: one of the two planned residential training courses had to run online, and the majority of these researchers required support over a 6–12 month period after the online training, which had not been budgeted for. Had both trainings run as residential courses, the programme could have been delivered without this additional support; as it was, about one person-month of additional time was required across two members of staff.

Having said that, only 11 of the 28 researchers who attended the original training completed evidence syntheses and/or policy briefs. Had 20 of these researchers completed their work, the cost per synthesis/brief would have roughly halved. There were a number of reasons why researchers did not complete the work:

  • The main reason for the high attrition rate was disruption to the planned residential training programme caused by COVID (five of the eight who attended the residential course completed their work). The online provision that replaced the second planned residential was spread out over time, and in hindsight, finding a way to recreate the sustained focus of a residential course in the online world may have produced better results.
  • Linked to this, the delays caused by COVID meant that some participants reached the end of their contracts and started new roles that were no longer linked to the evidence synthesis, or that were too demanding for them to complete their work. A more intensive online design might also have alleviated this issue by allowing the programme to finish earlier.
  • A few participants had not understood the commitment required of them, or the fact that they had to choose from a list of pre-determined, policy-relevant review questions. Advertising for such a course should in future emphasise the responsibilities and commitments required of participants as much as the benefits of applying to join the programme.
  • In one case, a participant dropped out of the programme because they had terminated their PhD. In other cases it was difficult to determine why people did not complete the work: because the programme had no line-management responsibility for any of the participants, it was not possible to see or negotiate the competing responsibilities that ultimately led to attrition. An important learning point is to ensure that participants are clear about expectations, and that the work is strongly aligned with ongoing work programmes in a way that is fully understood by supervisory teams or PIs.

As a result of the delays in completing the programme, many members of the policy community who had posed questions had moved on. In future, more regular dialogue with policy contacts would be desirable to ensure continuity as staff move on. As a partial solution, the team are planning policy webinars on themes that link a number of the reviews, to ensure recommendations reach the policy community. In addition to policy briefs, future programmes might also consider producing videos, interactive online media or infographics as alternative modes of communication with policy colleagues. More work could also be done to follow up with policy colleagues to determine the long-term policy impacts of the programme.

Does it work?

Overall, despite the challenges the pandemic posed to this programme of work, it was possible to provide a cohort of early career researchers with skills in evidence synthesis that they will be able to use throughout their careers. The fact that the programme led to the publication of ten evidence syntheses and/or policy briefs addressing evidence gaps identified by the policy community, for such limited resources, represents remarkably good value for money. The potential speed with which this model can deliver outputs to the policy community is also important, given that the programme was designed to run within three months, from identification of policy questions through to production of papers and briefings. Although many participants did not produce outputs, they still benefited from the training and skills development (some conducted their review but did not produce a final synthesis paper or briefing).

With some tweaking and a fair wind (one that doesn’t carry a pandemic), it should be possible to replicate this model to achieve rapid turnaround times for policy colleagues whilst continuing to build evidence synthesis skills in the research community. See for yourself by reading the outputs here: https://policyhub.n8agrifood.ac.uk/portfolio-items/rapid-evidence-synthesis-training/

Acknowledgements

This programme of work was funded by Research England’s QR-SPF fund and the N8 AgriFood programme. The work was managed by Ged Hall for the University of Leeds and Anthonia James for N8 AgriFood. Initial training and support were provided by Gavin Stewart, with project management by Anthonia James, policy brief graphic design by Belinda Morris and additional support from Mark Reed. Thanks to Professor Eric Jensen for useful suggestions for this blog.