Article Text


S01.3 Using multiple data sources for programme evaluation: integration of program monitoring data with other research studies
BM Ramesh
University of Manitoba, Department of Community Health Sciences, Winnipeg, Canada

Abstract

Background Integration of program monitoring data with focused research studies can be a powerful approach to program evaluation and outcome assessment. This paper draws on examples from a large HIV prevention program in Karnataka, India, implemented by the University of Manitoba and funded by the Bill & Melinda Gates Foundation.

Methods Data sources included (1) routine program data to monitor coverage; (2) semi-annual assessments of behavioural outcomes using rapid, unlinked anonymous methods called Polling Booth Surveys (PBS); (3) Integrated Behavioural and Biological Surveys (IBBS); and (4) mathematical modeling of HIV transmission dynamics.

Results The program monitoring data indicated that monthly coverage of the estimated female sex workers (FSWs) increased from 68% to 76% and that monthly clinical attendance increased from 19% to 27% over a one-year period. PBS demonstrated that condom use among FSWs at last sex with any client increased from 64% to 73% over four years. IBBS indicated that HIV prevalence among FSWs declined from 25% at baseline to 13% at endline. Mathematical modeling, which used parameters drawn from these data sources, suggested that the Karnataka program averted over 80,000 infections in total. The monitoring and evaluation teams were embedded within the program, independently carrying out design, data collection, analysis and feedback.
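To illustrate the kind of calculation involved, the sketch below shows how parameters from the different data sources (e.g. condom use from PBS, prevalence from IBBS) could feed a counterfactual infections-averted estimate. This is a deliberately simplified toy model with made-up rates; it is not the transmission model used by the Karnataka program, and every parameter value here other than the 64%/73% condom-use figures from the abstract is an assumption for illustration.

```python
# Toy counterfactual model: cumulative infections with and without the
# observed rise in condom use. All parameter values except the 64% -> 73%
# condom-use change (taken from the abstract) are hypothetical.

def cumulative_infections(beta, condom_use, condom_efficacy,
                          susceptible, prevalence, years):
    """Sum new infections per year under a constant-prevalence risk model."""
    # Per-contact transmission risk, discounted by protected contacts
    eff_beta = beta * (1 - condom_use * condom_efficacy)
    s, total = float(susceptible), 0.0
    for _ in range(years):
        new = eff_beta * s * prevalence   # new infections this year
        total += new
        s -= new                          # infected people leave the pool
    return total

# Counterfactual (condom use stays at 64%) vs. program scenario (73%)
baseline = cumulative_infections(0.3, 0.64, 0.9, 100_000, 0.25, 4)
program  = cumulative_infections(0.3, 0.73, 0.9, 100_000, 0.25, 4)
averted  = baseline - program   # positive: fewer infections under the program
```

The real analysis would of course use a dynamic transmission model with many more parameters; the point is only that each data source supplies one input, which is how the integration described in the Methods section becomes an outcome estimate.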

Discussion The embeddedness of program monitoring and evaluation enabled regular feedback to program implementation on which geographies to focus on and which sub-groups to prioritize. Special intervention packages were implemented for young and high-volume FSWs.

Conclusion The examples presented here used interactive processes of data use throughout the program cycle, with regular feedback to program implementation on geographies and sub-populations lagging behind in both coverage and quality.

Disclosure No significant relationships.

S01.4 Evaluating complex public health issue violence: understanding and measuring violence and evaluating violence interventions – lessons from STRIVE

Sinead Delany-Moretlwe

Wits Reproductive Health and HIV Institute, South Africa

  • program science
  • program monitoring and evaluation
  • HIV and MNCH
