Implementing and monitoring

Monitoring and evaluation plays a key part in ensuring good programme results. While monitoring and evaluation is an accountability mechanism to ensure and report that activities are being delivered as planned, it should also be used for learning and adaptive planning. As discussed in Step 4, ongoing reflection on what is working and what isn’t lets you adapt the programme as it progresses.


How to do this?

Obtain and analyse baseline data
Baseline data can serve multiple purposes – identifying community needs, setting programme targets, determining the type of intervention and the level of implementation, and measuring programme performance and impact. Baseline data may have already been collected during the situational analysis phase or as part of formative research, but additional information might be needed at this stage to inform M&E activities.
Much of this information may already exist thanks to routine national and district data collection and should be collated and analysed to arrive at a baseline. This process is also a good opportunity to foster collaboration between different government departments at national and district level.
If new information is needed, joint WASH and NTD surveys can be conducted, using the opportunity of disease mapping to collect information on WASH and other determinants (or vice versa). See, for example, the Tropical Data methodology, which incorporates WASH indicators into disease mapping surveys. Baseline data can be presented numerically, or in map form by overlaying disease prevalence with relevant data on determinants (for example, STH prevalence and access to sanitation). Maps are powerful tools to visually represent need and progress over time. The table below lists the types of baseline information you may need for an integrated programme.
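
As a rough illustration of combining prevalence data with determinant data in tabular form, the sketch below flags districts where high STH prevalence coincides with low sanitation access. All district names, figures and thresholds here are hypothetical placeholders, not real survey results.

```python
# Hypothetical district-level baseline data: STH prevalence from disease
# mapping surveys, and household sanitation access from routine WASH
# reporting. Values are proportions (0.0 to 1.0) and are illustrative only.
sth_prevalence = {"District A": 0.42, "District B": 0.18, "District C": 0.55}
sanitation_access = {"District A": 0.35, "District B": 0.80, "District C": 0.25}

def priority_districts(prevalence, access,
                       prevalence_threshold=0.20, access_threshold=0.50):
    """Flag districts with high STH prevalence and low sanitation access."""
    return sorted(
        d for d in prevalence
        if prevalence[d] >= prevalence_threshold
        and access.get(d, 0.0) < access_threshold
    )

print(priority_districts(sth_prevalence, sanitation_access))
# With the figures above, District A and District C are flagged.
```

The same district-level join underlies the map overlays described above; a spreadsheet or GIS tool would typically do this in practice.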


Routine monitoring and reporting
Routine monitoring shows whether progress is being made against the agreed plan, so you can address challenges as they occur. Information on access to water and sanitation services is often collected at various administrative levels, so rather than collecting new information, you can arrange for this information to be shared. Regular reporting should be accompanied by supervision, either using existing structures or by undertaking joint visits (by WASH and NTD programme managers).
You may find the routine supervision guide and form useful tools for this purpose. Keep in mind that for routine supervision to be effective it should have consequences – with good performance being rewarded (for example, through recognition) and underperformance being addressed (for example, through supportive supervision, further training, etc.). The capacity needed for supervision and for analysing routine reports should be included at the planning phase of your programme.


Periodic reflection
Reflection should be part of your monitoring and evaluation plan, so you can regularly respond to questions such as:

  • Are there lessons and insights on why progress is or isn’t being achieved?
  • How can these insights be used to improve implementation or adapt the plan?
  • Are there more effective activities that can be done to achieve the objectives, or could activities have been implemented more effectively?
  • Are the findings of the original situational analysis still relevant?
  • Are there any new risks that need to be mitigated?
  • Has anything changed?
  • Have all key aspects been addressed?
  • What has changed in the environment (politically, administratively, structurally, programmatically etc.) that could be influencing (negatively, positively) expected programme achievements and goals?


To do this, it may be useful to convene a small group and together revisit the problem analysis conducted during the planning phase (see the problem analysis approaches tool). Once the reflection has taken place, make the necessary changes to your logframe in terms of new resources, activities and outputs.


Evaluation
Unlike routine monitoring, an evaluation takes place at programme milestones and at the end of the programme. An evaluation can help demonstrate impact, how effectively the programme was implemented, and the effect it has had on systems and institutions. It is often done by individuals or agencies not involved in programme delivery. The evaluation seeks to answer:

  • To what extent did the programme meet its intended goals and objectives?
  • What programme activities worked and did not work?
  • What are the significant changes and achievements?
  • What adaptations were made to the plan, or the implementation structures, to enable this?
  • What are the lessons for further changes to the programme or for other programmes?

Keep in mind that disease control programmes tend to focus on epidemiological impact evaluation using impact surveys. It is crucial to go beyond this and include:

  • An evaluation of all interventions (for example, drugs offered versus drugs taken, and access to versus use of water and sanitation);
  • Data quality assessments – e.g. how to improve the data coming from the community through to national levels;
  • A process evaluation to determine how the programme was implemented (this is often overlooked but is essential for interpreting outcomes and impact, and for identifying successful processes that can be taken to scale and replicated in similar contexts);
  • An analysis of return on investment, or a cost-benefit analysis, demonstrating the results achieved with the inputs. The WHO guide Helminth control in school-age children: a guide for managers of control programmes (second edition) provides a diagram illustrating each of these components.

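As a minimal illustration of the return-on-investment idea, the sketch below computes a cost per person treated and a simple benefit-cost ratio. All figures are hypothetical placeholders; a real analysis would use your programme's actual inputs and monetised results.

```python
# Hypothetical programme figures - substitute real inputs and results.
total_cost = 240_000          # total programme inputs, in USD
people_treated = 400_000      # e.g. people reached by mass drug administration

cost_per_person_treated = total_cost / people_treated
print(f"Cost per person treated: ${cost_per_person_treated:.2f}")

# A simple benefit-cost ratio: estimated monetised benefits over inputs.
estimated_benefits = 600_000  # hypothetical monetised health/productivity gains
benefit_cost_ratio = estimated_benefits / total_cost
print(f"Benefit-cost ratio: {benefit_cost_ratio:.1f}")
```

A ratio above 1 indicates that estimated benefits exceed inputs; in practice, the hard part is deciding which benefits to monetise and how.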

Accountability

Setting up a strong accountability structure will be essential, and accountability should be addressed at multiple levels:

  • To the community: The community should not only be aware of the purpose of the programme but should have a say in its design and implementation. This can be done in different ways, like circulating information through the media, using social mobilisation activities, or working through existing community-based administrative and other structures (schools, leadership councils, health clubs/groups) and outreach functions. This will not only provide insights into programme delivery in different social and cultural settings but can also help make sure that all parts of the community are being reached;
  • Within the Ministry of Health and other government departments: Demonstrating good results brings much needed continued resource allocation. It also helps communicate the programme’s importance to other ministries (see the programme Dashboard template for a simple way of presenting such information). For example, by highlighting aspects like value for money, a successful integrated programme provides the Ministry of Health with a valuable business case to bring to the Ministry of Finance. Results should also be shared in annual health and WASH sector reviews and performance reports, to demonstrate the contribution of the programme to the achievement of sector goals;
  • To funders and partners: Ideally, the programme should build on a strong existing health management information system put in place by the health authorities. If such a system is not in place, any additional monitoring frameworks should incorporate standardised indicators and be aligned with government systems to the extent possible, in order to reduce the burden of reporting and strengthen the health system;
  • To the international community: All NTD and WASH programmes operate within the overall global development framework (currently enshrined in the Sustainable Development Goals) and, in the case of NTDs, the WHO 2020 Roadmap and the WHO Global Strategy on WASH and NTDs. Programme successes and challenges should therefore be shared in relevant international forums and disease alliances. This will hold the programme to account, help countries learn from one another, and facilitate cross-border collaboration.


Ongoing coordination

Stakeholders and partners need to be constantly engaged. To do this, you can use and reinforce existing structures (task forces, coordination committees and government roles), which will avoid adding more meetings to already busy schedules. This should take place at all administrative levels – national, regional, district, etc. Remember that financial incentives such as per diems may not be the most effective way to keep people involved – the prospect of achieving programme goals may create even stronger motivation. It is worth investing in someone to lead this coordination. It’s important not to give up at the first hurdle; if participation falls off after the initial meetings, try to identify and address the reasons for lack of engagement.

Getting the M&E Framework right

A good logical framework (logframe) is a visual representation of the logic underlying a programme’s purpose and activities. It demonstrates the sequence of events through which a programme may contribute to positive changes, helps justify investments, and contributes to overall accountability. It is based on the concept of cause and effect: if certain activities take place under certain conditions, certain results will be delivered.


A logframe summarises:

  • What the programme is going to achieve;
  • What activities will be carried out;
  • What means/resources/inputs (human, technical, infrastructural) are required;
  • What potential problems could affect success;
  • How progress and achievements will be measured and verified.


Steps to logframe development

  • 1. Define the overall goal to which your programme contributes

    That could be poverty reduction, achievement of SDG 3 targets in your country, NTD elimination or sustained control, etc.

  • 2. Define the outcome to be achieved by the programme

    In other words, the impact the programme will have, or changes to the environment or to behaviours. This should ideally be a single outcome.

  • 3. Define the outputs for achieving that outcome

    Basically, what the programme will deliver – for example, the number of people trained, the amount of hardware produced, or the number of committees formed.

  • 4. When the programme is multi-year, include milestones

    Interim outcomes you will achieve by the end of each reporting period.

  • 5. Define the activities for achieving each output

    Essentially how the programme will be delivered. Provide a brief summary of the activities that must be implemented to accomplish each output, and provide a summary schedule of periodic meetings, monitoring events and evaluations.
    A Gantt chart is a useful tool for this purpose.

  • 6. Build in assumptions

    Statements about the uncertainty factors that may affect the programme. These should be things that are not activities in the logframe, but that affect whether or not planned activities can take place.
    Examples of this are new funding, external investments, availability of specific supplies, etc.
    Making these assumptions explicit from the beginning will help explain why certain things have or haven’t happened (for example when using the ‘five whys’ approach to problem analysis).

  • 7. Define your indicators

    You will need multiple indicators to measure changes and impact, including:
    - NTD indicators, such as incidence, prevalence, co-endemicity and intensity
    - WASH coverage, access and use indicators, such as presence and use of household latrines and improved water supply at household level and in schools and healthcare facilities
    - Indicators relating to changes in individual, family and community behaviours and perceptions over time, or proxy measures such as presence of hand-washing stations with soap and water
    - Process indicators, such as proportion of district NTD plans that include WASH activities and indicators, proportion of coordination structures with WASH and NTD representation, etc.
    - Programme and data quality indicators, such as number and quality of training sessions, quality of reported treatment data, etc., to ensure the programme is being delivered as planned.

  • 8. To accompany the logframe, prepare a risk analysis and matrix

    This will ensure that you are aware of risks and have put in place measures to deal with them. The programme risk analysis template can help you undertake this process.

The template logframe tool offers a comprehensive set of indicators for your consideration. Use the definitions and checklist for logframe development to guide the development process.
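
The indicator types in step 7 can be illustrated with a small coverage calculation. The sketch below computes an NTD treatment coverage indicator and a WASH access indicator; all figures are hypothetical, and in practice the values would come from treatment registers and household survey data.

```python
# A minimal sketch of computing two common indicator types for a logframe.
# All figures are hypothetical placeholders.

def coverage(numerator, denominator):
    """Return a coverage proportion, guarding against a zero denominator."""
    return numerator / denominator if denominator else 0.0

# NTD indicator: mass drug administration (MDA) coverage.
people_treated = 9_000
eligible_population = 10_000
mda_coverage = coverage(people_treated, eligible_population)

# WASH access/use indicator: households with a functioning latrine.
households_with_latrine = 620
households_surveyed = 1_000
latrine_access = coverage(households_with_latrine, households_surveyed)

print(f"MDA coverage: {mda_coverage:.0%}")       # 90%
print(f"Latrine access: {latrine_access:.0%}")   # 62%
```

Tracking both indicators in the same logframe is what makes the integrated WASH/NTD picture visible over time.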

Download STEP 5 tools