Keywords
- HIV
- knowledge translation
- programme science
- public health
- sexually transmitted infections
- social/policy perspectives
- STD control
Three decades into the emergence of the HIV epidemic, centuries into the appearance of other sexually transmitted infections (STI), and despite the development of many efficacious individual, group and structural level interventions, it is clear that advances made in the prevention of HIV and other STI have not been sufficient to get ahead of these epidemics.1–4 As in other spheres of public health and health service delivery, a consensus emerged that central to this problem was the insufficient use of scientific evidence in planning and delivering interventions.5 To address this gap, health programme planners and implementers were encouraged to adopt ‘evidence-based approaches’ by drawing on evidence from the scientific literature and from experts to inform their decision making. Increasingly, scientists have been encouraged to engage in knowledge translation to ensure that the findings from their research are made known to policy makers, planners and implementers to guide better decisions.6
While reinforcing the need to close the gap between evidence and action, there is growing sentiment that current concepts and approaches for doing so are inadequate and that new paradigms are needed. In a recent article, Parkhurst and colleagues7 pointed to a flaw in the conceptual basis of knowledge translation. They argue that the usual paradigm of ‘getting research into practice’, by first developing ‘clear agreed-on evidence’ about interventions and then pushing that evidence into policy formulation and implementation, has two important drawbacks. First, this approach does not address how policies and programmes are to be developed when there are evidence gaps, nor is it suited to dealing with complexity in causation or interventions, or with the importance of the social and epidemiological context. Second, it tends to separate researchers from those engaged in programme development and implementation. They therefore recommend a paradigm of ‘getting research out of practice’, which engages scientists, programme planners and implementers jointly to develop and refine hypotheses about the impact of an intervention strategy, and which focuses on operational research, process evaluation and proper outcome evaluation to build the knowledge base further about what works in different contexts and why.
We strongly endorse the approach advocated by Parkhurst and colleagues.7 In fact, we propose that these concepts be taken further to incorporate the full range of prevention programme design, implementation, management and evaluation. A recently formulated approach, Program Science, may offer a framework that both expands the scope for knowledge development and provides an interface between programme and science focused on resolving programme issues.8 9
Program Science can perhaps best be defined as the systematic application of theoretical and empirical scientific knowledge to improve the design, implementation and evaluation of public health programmes. The endpoint for Program Science is population-level impact on the incidence of infections, achieved by choosing the right strategy for the right populations at the appropriate time, by doing the right things the right way, and by ensuring appropriate scale and efficiency (figure 1). As such, the focus of Program Science extends beyond the design, optimal implementation and coverage (scale-up) of combination intervention packages to the development of the prevention programme in its totality. This includes resource allocation, the definition and prioritisation of target populations, the development and prioritisation of intervention packages, and the identification of stopping rules to prevent specific interventions from being implemented indefinitely after they have ceased to be useful. In addition, Program Science incorporates the development and application of programme impact evaluation methods that are appropriate for the complex interaction between intervention packages and their context. There is also considerable scope for examining the best ways of mobilising support for interventions in the policy arena through advocacy, and within the community through organised community mobilisation processes. Towards this end, Program Science integrates different spheres of practice, including strategic planning and policy development, programme implementation and programme management, with complementary spheres of knowledge, including epidemiology, transmission dynamics, policy analysis, intervention efficacy and effectiveness, surveillance, operations research, and monitoring and evaluation.
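To make the quantitative side of this framework concrete, the following is a minimal sketch of how transmission-dynamics modelling can inform one such decision, namely allocating a fixed prevention budget between a small high-activity group and the general population. It is not from the article: the two-group susceptible-infected model, the simulate function and every parameter value (group sizes, partner-change rates, transmission probability, intervention efficacy, budget) are illustrative assumptions chosen only to show the shape of the analysis.

```python
# A minimal sketch (not from the article): a two-group susceptible-infected
# (SI) transmission model with activity-weighted proportionate mixing, used
# to compare how a fixed prevention budget performs when allocated to a
# small high-activity "core" group versus the general population.
# All names and parameter values below are illustrative assumptions.

def simulate(core_share, years=20.0, dt=0.01):
    """Cumulative infections over `years` when a fraction `core_share`
    of the prevention budget goes to the core group."""
    N = {"core": 1_000.0, "general": 99_000.0}  # assumed group sizes
    c = {"core": 20.0, "general": 1.0}          # assumed partner-change rates /year
    I = {"core": 100.0, "general": 500.0}       # assumed initial infections
    S = {g: N[g] - I[g] for g in N}
    p = 0.05           # assumed per-partnership transmission probability
    efficacy = 0.6     # assumed reduction in acquisition risk when covered
    budget = 20_000.0  # assumed person-units of coverage available
    coverage = {
        "core": min(1.0, core_share * budget / N["core"]),
        "general": min(1.0, (1.0 - core_share) * budget / N["general"]),
    }
    cumulative = 0.0
    for _ in range(round(years / dt)):
        # Probability that a randomly chosen partner is infectious, weighting
        # each group by its partner-change rate (proportionate mixing).
        pool = sum(c[g] * I[g] for g in N) / sum(c[g] * N[g] for g in N)
        for g in N:
            risk = c[g] * p * pool * (1.0 - efficacy * coverage[g])
            new_inf = min(risk * S[g] * dt, S[g])
            S[g] -= new_inf
            I[g] += new_inf
            cumulative += new_inf
    return cumulative

# Compare allocation strategies under these (assumed) parameters.
for share in (0.0, 0.5, 1.0):
    print(f"core budget share {share:.0%}: "
          f"{simulate(share):,.0f} cumulative infections over 20 years")
```

Running the script prints cumulative incidence under three allocations; whether concentrating coverage on the high-activity group averts more infections than spreading it thinly depends on the assumed epidemic structure. This is precisely the kind of context-dependent resource-allocation question that Program Science is intended to put on a shared programme and science agenda.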
To develop the concept further and to examine the application of Program Science in STI and HIV prevention, a meeting of researchers, prevention programme implementers, policy makers and funders was convened in early May 2010, supported by the Office of AIDS Research of the National Institutes of Health.10 During the meeting, programme implementers and policy makers from different contexts highlighted current knowledge gaps and the potential for science to contribute to improving programme design, implementation and impact evaluation. Researchers addressed key components of, and the evidence base for, Program Science, including mathematical modelling, complexity science, implementation science, health systems research and impact evaluation. The meeting resulted in a convergence of views and a strong impetus to form a consortium of programme and policy leaders and researchers to further define and establish the Program Science initiative.

To promote the application of the Program Science concept on the ground, participants also agreed to launch country-level Program Science projects in three countries: India, Nigeria and Kenya. These countries were selected for the diversity of their HIV epidemics, the scope for an enhanced response, and the potential for using the lessons derived in each country to establish similar processes elsewhere in its geographical region. The focus and scope of the initiative in each country will be developed in further consultation with programme leaders and scientists there. However, each will entail an integrated process of engagement between programme leaders and scientists to optimise the design, implementation and evaluation of HIV prevention programmes, with a strong emphasis on systematically generating and externalising the knowledge gained. These projects are intended to engage programme leaders and policy makers directly with scientists to address the particular prevention programme issues in these countries, and also to build an empirical base for Program Science that can be disseminated to improve HIV/STI prevention programmes and outcomes in other countries and world regions.

In addition to these projects, the National Center for HIV/AIDS, Viral Hepatitis, STD and TB Prevention at the Centers for Disease Control and Prevention has recently adopted Program Science as one of its priority initiatives, and plans to incorporate the core elements described here to promote better integration of science and prevention programme delivery in the USA. We hope that this initiative will further invigorate the movement towards bringing programme and science together to maximise the health impact of HIV/STI prevention programmes.
Key messages
The impact of programmes for the prevention of HIV and STI can be improved by closing the programme–science gap.
The standard model of knowledge translation involves generating scientific evidence in support of single interventions, followed by efforts to ensure that this evidence is used in practice.
We propose a new paradigm, Program Science, which addresses the complexity of programme design, implementation and evaluation through a broader set of activities and better integration between programme implementers and scientists.
Footnotes
Competing interests None declared.
Provenance and peer review Not commissioned; internally peer reviewed.