EDITORIALS

 

How to get research into practice: first get practice into research

 

 

John Walley I,1; M Amir Khan II; Sayed Karam Shah III; Sophie Witter IV; Xiaolin Wei I

I Communicable Disease Research Programme, Nuffield International Health and Development Centre, Leeds Institute of Health Sciences, University of Leeds, 71-75 Clarendon Road, Leeds LS2 9PL, England
II Association for Social Development, Islamabad, Pakistan
III WHO Afghanistan Country Office, Kabul, Afghanistan
IV Health Economist, University of Aberdeen, Scotland

 

 

Discovering ways to increase access to and delivery of interventions is a major challenge. Typically, research is divorced from implementation, which has led to a growing literature on how to get research into practice. However, operational research is best prioritized, designed, implemented and replicated from within national programmes.

The current model for most international health service research is based on the assumption that the research community "discovers" solutions and then tries to market them to busy decision-makers and practitioners. The problem of failing to get research into policy and practice is well known. Much debate focuses on the effectiveness of different approaches to dissemination and behaviour change.1–4 This is a significant issue when trying to influence individual practitioners. Another focus is on developing the capacity of research institutions in developing countries, with the expectation that this will increase the relevance and local ownership of results.5 We argue that these two approaches are necessary but not sufficient. The aim should not be to perfect techniques of feeding results to decision-makers, but to start from the perspective of the decision-makers even before devising the questions. This means "getting practice into research".

This approach is not appropriate for research into new and untried treatments where efficacy has not been established, but should become the norm for operational research, by which we understand research into how an intervention is implemented. It is an approach that is gaining ground in the developed north, but which has even greater application in resource-constrained settings. Here, based on our experience in China, Pakistan and elsewhere, are some key considerations:

Operational research should be embedded in local programmes. It should emerge from an ongoing partnership with a national programme. This includes the processes of prioritizing, developing, conducting and disseminating research, and forms part of the national expansion of services.

Operational research should focus on local opportunities for going to scale. The first stage is to explore the options under consideration for implementation, and then to design research that informs how implementation can best be carried out. For maximum effect, it is often useful to focus on situations where resources are available from international or national agencies, but where a technical or organizational block has prevented them from being used effectively.

The research questions may be based on an understanding of the barriers to large-scale access.6 Then trials and social and economic studies can be embedded within programme sites, and provide knowledge on how to overcome these barriers and deliver effective interventions, as in Pakistan.7–9 Because these operational issues are commonly relevant to other high-burden countries, the publication of the results should have international as well as national influence.

Interventions to be evaluated should be realistic, given the resource constraints in that setting. Trial designs will vary according to the circumstances, but the key point is that the intervention is not implemented according to some kind of international ideal, relying on additional resources, but is integrated into existing health systems and carried out with the resources that will be available for eventual scale-up. Unless resource expectations are realistic, there will be no follow-up to the research.

The national programme should implement the intervention, while researchers facilitate. Researchers can act as a catalyst for action, and participate within national programme working groups to design the intervention and draft the guidelines and materials required for implementation. They can conduct the research together with the national programme and advise on the national scale-up. The main point is that the national programme implements both the existing service and the intervention being tested (as it would replicate the intervention nationally if it is found effective). Researchers can carry out any data-gathering that is over and above the routine (e.g. structured interviews or collection of cost data).

Research and programme development should be linked. The development components should run alongside the research, with technical assistance being provided on programme frameworks and operational plans. The intervention research guidelines and training materials should also be adapted and used for successful expansion of whichever modality is supported by the operational research results.

By focusing on specific obstacles, embedded research improves resource use and hence resource availability. Done well, operational research not only helps make effective use of existing internal and external resources, but also assists programme managers in mobilizing further support once successful implementation has been demonstrated.

Supporting programmes to conduct research is the best way to build capacity. The embedded research approach should also build local research capacity. A track record of successful country research helps local research teams to bid for further funding. Health programme managers in developing countries are increasingly recognizing the value of research and are setting up their own research teams.

 


 

References

1. Bero L, Grilli R, Grimshaw J, Harvey E, Oxman A, Thomson M. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. BMJ 1998;317:465-8.

2. Garner P, Kale R, Dickson R, Dans T, Salinas R. Getting research findings into practice: implementing research findings in developing countries. BMJ 1998;317:531-5.

3. Siddiqi K, Newell J, Robinson M. Getting evidence into practice: what works in developing countries? Int J Qual Health Care 2005;17:447-53.

4. Siddiqi K, Robinson M. Getting evidence into practice in developing countries. Evid Based Cardiovasc Med 2006;10:5-7.

5. Sitthi-Amorn C, Somrongthong R. Strengthening health research capacity in developing countries: a critical element for achieving health equity. BMJ 2000;321:813-7.

6. Khan A, Walley J, Newell J, Imdad N. Tuberculosis in Pakistan: socio-cultural constraints and opportunities in treatment. Soc Sci Med 2000;50:247-54.

7. Walley J, Khan M, Newell J, Khan M. Effectiveness of the direct observation component of DOTS for tuberculosis: a randomised controlled trial in Pakistan. Lancet 2001;357:664-9.

8. Khan M, Walley J, Witter S, Imran A, Safdar N. Costs and cost-effectiveness of different DOT strategies for the treatment of tuberculosis in Pakistan. Health Policy Plan 2002;17:178-86.

9. Khan M, Walley J, Witter S, Shah S, Javeed S. Tuberculosis patient compliance with DOTS and direct observation: results of a social study in Pakistan. Health Policy Plan 2005;20:354-65.

 

 

1 Correspondence to John Walley (e-mail: j.d.walley@leeds.ac.uk).
