Lead Evaluator for the Final Evaluation of the Innovative Approaches to Response Preparedness Programme (IARP)

  • Contractor
  • Remote
  • TBD USD / Year
  • Netherlands Red Cross




  • Job applications may no longer be accepted for this opportunity.


Netherlands Red Cross

1. IARP programme background

The final evaluation will need to assess the 5-year IARP programme implemented in Ethiopia, Uganda and Kenya. The IARP programme, funded by the IKEA Foundation, aims to reduce the effects of climate-related disasters on communities, including refugee populations. This is to be achieved through setting up Forecast-based Financing (FbF) systems, supported by Data Preparedness and the use of Cash Transfer Programming (CTP) as an early action. The development and combination of these innovative approaches will put an Early Warning Early Action (EWEA) system in place that enables National Societies, in partnership with governments and other key stakeholders, to deliver cost-efficient, well-targeted and timely action to the most vulnerable people facing probable climate-related disasters.

Currently, many predictable extreme events, such as floods and droughts, result in disasters and suffering. This is aggravated by climate change. The impact of these events can be reduced or avoided if weather and climate forecast information is systematically used for early action to prepare the most at-risk communities before a disaster. However, forecasts are not always used to take early action, with governments and humanitarian organizations often starting their response operations only after a disaster has taken place. Among other things, this is due to the lack of plans and financing to take early action. If such procedures were in place, they could reduce climate-related risks to vulnerable people and save valuable time and money in humanitarian response.

Over the past five years, starting in 2018, the IARP programme has aimed to fill this critical gap with forecast-based financing (FbF), supported by data preparedness and the use of cash transfers as an early action. The National Red Cross Societies in Uganda (URCS), Ethiopia (ERCS) and Kenya (KRCS), in collaboration with their national governments and relevant stakeholders, have put an early warning early action system in place that enables them, in partnership with government agencies, meteorological services, humanitarian organizations and others, to carry out cost-effective and timely actions for the most vulnerable people. The aim of having such a system in place is to reduce the impacts of climate change and disasters, protect lives and livelihoods, and support development.

The specific objectives of this 5-year response preparedness programme were:

  • A countrywide FbF system is in place in support of early warning early action.

  • The National Societies use data to better understand risks and identify beneficiaries for greater humanitarian impact.

  • National Societies implement cash transfer programming to support early action.

2. Purpose and intended use

The evaluation will need to focus on the different result levels, ranging from long-term to intermediate and short-term outcomes, both for accountability purposes and for learning for future response preparedness projects implemented by Sister National Societies and the IFRC in collaboration with NLRC. It is important that this evaluation leads to important learning points regarding the strengths and weaknesses of this response preparedness programme for internal learning purposes. In addition, it needs to provide a strong cost-efficiency and effectiveness analysis of this type of response preparedness programming. NLRC values insights from an ‘outsider’s perspective’ and recommendations for future programming.

3. Scope

The entire 5-year IARP response preparedness programme will be evaluated based upon secondary data review (all reports that have been written in the programme period) as well as primary, quantitative and qualitative data collection. This evaluation should provide insights into the achievements and/or progress made towards the indicators. It is preferred to use the six OECD DAC evaluation criteria to measure the findings against:

  1. Relevance

  2. Effectiveness

  3. Cost-Efficiency

  4. Impact

  5. Coherence

  6. Sustainability

The evaluation will assess the achievements of the programme at community and stakeholder levels (including internal Red Cross stakeholders as well as external stakeholders), what has worked well and what has been a real challenge. This will lead to the formulation of a number of lessons learned and recommendations to further strengthen effective future preparedness programming.

Preparation for the evaluation will start in June 2022, whilst the evaluation itself will be carried out between August and December 2022 and will assess the research questions mentioned under section 5.

4. Responsibilities and lines of communication

This evaluation is commissioned by the Programme Manager of the IARP programme, based at NLRC HQ in the Netherlands. The process of the evaluation will be managed by an Evaluation Management Team (EMT), which will be headed by the NLRC PMEAL advisor and consist of other members of the NLRC team, based at HQ and in the three countries, who have been involved in the IARP programme. The EMT provides input and advice, particularly during the inception phase and at other important milestones of the evaluation that will be identified in the evaluator’s inception report. It will monitor the evaluation team (ET) regarding evaluation management, design, implementation and quality control.

The evaluation will be carried out by the ET, led by an External Lead Evaluator (the position advertised in this vacancy). The Lead Evaluator hired by the NLRC will, jointly with the NSs and in consultation with the EMT, hire an external evaluator in each of the three countries. The local evaluators will be hired and paid from the budget lines of the respective National Societies (Ethiopia, Kenya and Uganda). The evaluation team will together decide on the roles and responsibilities of each team member. Regular meetings between the ET and the EMT are to be scheduled to discuss progress.

The Lead Evaluator will manage and oversee the evaluation in all three countries and will ensure that the evaluations conducted in the three individual countries are aligned. The actual implementation of the evaluation, including the secondary and primary data collection, will be conducted by additional evaluators per country, whose selection will also involve the Lead Evaluator. The responsibility of the Lead Evaluator is therefore to provide guidance on information sources, data collection tool development, sampling methods and data analysis to these in-country evaluators. It is the responsibility of the Lead Evaluator to ensure that all the data collected eventually leads to one coherent report answering all the evaluation questions mentioned in section 5.

5. Key questions

Evaluation questions are laid out under the following six evaluation criteria. The ET, in close consultation with the EMT, is expected to further elaborate on the questions in their Inception Report. The effect of the COVID-19 pandemic should be taken into account throughout the evaluation, in order to understand what effects the pandemic has had on each of the criteria areas below.

i) Relevance

· To what extent did the programme respond to the needs of the target group, being the communities who will be supported via the established early warning early action mechanisms, the National Societies and the stakeholders involved in the national anticipatory action system?

· Looking back on the results (at institutional and community level) achieved, how relevant do we consider a response preparedness programme with a length of 5 years? What has been the added value of this long-term programme, compared to only focusing on the development of the Early Action Protocols? What is the added relevance of the data preparedness component?

· To what extent did the programme target, or have the potential to target, the groups that could benefit most from response preparedness programmes? To what extent does the programme have the potential to reach the most marginalized groups?

ii) Effectiveness

· To what extent has the programme been effective in reaching the objectives that were set out in the Theory of Change?

o To what extent has the programme been effective in institutionalizing a national anticipatory action mechanism in Ethiopia, Kenya and Uganda?

o To what extent have the NSs integrated FbF SOPs into their DRM strategies or contingency planning?

o To what extent is there a countrywide FbF system in place, with clear financial protocols and roles and responsibilities, to enable EWEA?

o To what extent are NSs data prepared in order to use data to understand risks and support targeting and prioritization of EWEA as part of FbF to increase timeliness and cost-effectiveness of their operations?

o To what extent are NSs well-prepared to implement CTP as an EWEA activity as part of FbF?

· Which challenges were encountered in reaching the intended objectives?

· To what extent have learnings gathered during the project duration across the different countries contributed to improved progress towards objectives?

· To what extent has the increased data capacity of Red Cross-National Societies and other relevant response preparedness actors led to an improved ability to be ready for effective emergency response?

· What worked and what did not work to increase preparedness for emergency response of the National Societies, local authorities and communities in Uganda, Ethiopia and Kenya?

iii) Cost-efficiency and effectiveness

· How did the NLRC, URCS, KRCS and ERCS convert their programme inputs into outputs and to what extent was it economical and efficient?

· To what extent was the cost structure of the programme appropriate to achieve intended objectives?

· How do the different modes of assistance (i.e., early actions) compare in terms of efficiency, especially when comparing cash transfers to all the other modalities?

· To what extent has it been a cost-effective choice to set up response preparedness mechanisms in Uganda, Ethiopia and Kenya, in comparison to the cost-effectiveness of regular emergency response?

iv) Impact

· What are the perceived and projected changes or effects (both positive and negative) for the Red Cross-National Societies, of the established Early Action Protocols in Uganda, Kenya and Ethiopia?

· What are the (potential) long-term changes or effects (both positive and negative) for the communities in Kenya, Uganda and Ethiopia, of the established Early Action Protocols?

v) Coherence

· To what extent have the National Societies in Uganda, Kenya and Ethiopia built strategic (long-term) partnerships with other humanitarian actors, relevant local and national authorities or other relevant stakeholders?

· To what extent was there alignment in vision and implementation between the Red Cross-National Society and local authorities and other stakeholders at country level?

· To what extent have collaborations on project implementation level in-country between the Red Cross-National Society and local authorities and other stakeholders led to better results compared to what the stakeholder could have achieved when implementing in isolation (1+1=3)?

· To what extent has the sharing of lessons learnt between the ERCS, URCS and KRCS led to better results than if learnings had only been shared in-country?

vi) Sustainability

· What factors have contributed to building an enabling environment for achieving sustainable results with this programme?

· What factors have been lacking in building an enabling environment for achieving sustainable results with this programme?

· What is the potential that the data preparedness and cash readiness capacities of the ERCS, KRCS and URCS are lasting?

· What is the potential for the FbF systems in each country to last?

7. Methodology

The methodology will adhere to the IFRC Framework for Evaluations, with particular attention to the processes upholding the standards of how evaluations should be planned, managed, conducted, and utilized (see section 9 for more information). The EMT will manage and oversee the evaluation and ensure that it upholds the IFRC standards for evaluations and the OECD DAC evaluation criteria.

The Evaluation Team is expected to propose a mixed evaluation methodology of (primarily) qualitative and quantitative research methods. It is preferred that the ET’s proposal includes an analysis of the cost-efficiency and effectiveness of the programme.
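By way of illustration only, a minimal sketch of one kind of cost-efficiency comparison that could feed into such an analysis is given below. The cost-per-person-assisted metric and all figures are hypothetical placeholders chosen for this example; they are not IARP programme data, and the ET is free to propose different metrics and data sources in its inception report.

```python
# Hypothetical sketch of a cost-efficiency comparison between early action
# (e.g., FbF-triggered cash transfers) and regular emergency response.
# All figures below are illustrative placeholders, not IARP programme data.

def cost_per_person_assisted(total_cost_eur: float, people_assisted: int) -> float:
    """Simple cost-efficiency metric: total cost divided by people assisted."""
    return total_cost_eur / people_assisted

# Placeholder inputs for the two assistance modes being compared.
early_action = cost_per_person_assisted(total_cost_eur=120_000, people_assisted=4_000)
regular_response = cost_per_person_assisted(total_cost_eur=300_000, people_assisted=5_000)

print(f"Early action:     {early_action:.2f} EUR per person assisted")
print(f"Regular response: {regular_response:.2f} EUR per person assisted")
print(f"Cost ratio (early action / regular response): {early_action / regular_response:.2f}")
```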

An overarching approach (e.g., realist approach, or contribution analysis) is considered of added value to the evaluation methodology.

Documentation to be provided by the programme team:

· Theory of Change

· Case Study on Stakeholder Engagement in Anticipatory Action

· Annual and bi-annual reports

8. Deliverables

Inception report

The inception report needs to include a detailed evaluation work plan with a clear purpose, process, methodology and activities related to data collection, analysis and reporting, accompanied by a clear timeframe with firm deadlines per deliverable. The report will, among other things, include information on roles and responsibilities, travel, and logistical arrangements for the evaluator (provided COVID-19 restrictions allow for travel). The report needs to be reviewed and approved by the EMT before data collection and analysis at the programme and project level can start.

Final Evaluation report

The evaluation report should be written in English and have a maximum of 40 pages (annexes excluded). It should be all-encompassing. The report should adhere to the NLRC format for evaluations and contain:

· Executive summary specifically addressing what has been working well and what needs improvement, in two clearly separated paragraphs.

· An introduction that outlines the purpose, scope, objectives and the process of this evaluation (building on the inception report)

· A methodology section in which the evaluation questions are clearly operationalized

· Background/context analysis of the complete programme and paragraphs per country

· A conclusion chapter with an analysis of findings following the evaluation criteria and a conclusion section with a clear synthesis containing lessons learned and recommendations. Recommendations need to be specific and feasible.

· The report should contain appropriate appendices, including data collection tools, a copy of the Terms of Reference, cited resources or bibliography, a list of those interviewed and any other relevant materials, including possible databases.

The lead evaluator will be the primary author of the final evaluation report and will submit the draft report, illustrated and explained by a presentation, to the EMT. The EMT, in turn, will validate the findings and provide input for necessary corrections in order to ensure that the final report meets expectations and complies with quality standards. The final report will be submitted one week after the receipt of the consolidated feedback from the EMT.

The evaluator will deliver debriefings on the findings to project staff and interviewees involved in the evaluation.

All products arising from this evaluation will be owned by the NLRC. The evaluator will not be allowed, without prior authorization in writing, to present any of the analytical results as his/her own work or to make use of the evaluation results for private publication purposes. NLRC values the evaluation process and, to improve it, has put in place a tool that analyses the quality of evaluations. When the evaluation adheres to the quality standards, the NLRC will publish the evaluation accompanied by a management note.

9. Evaluation quality and ethical standards

The evaluators should take all reasonable steps to ensure that the evaluation is designed and conducted to respect and protect the rights and welfare of people and the communities of which they are members, and to ensure that the evaluation is technically accurate, reliable, and legitimate, conducted in a transparent and impartial manner, and contributes to organizational learning and accountability. Therefore, the evaluation team should adhere to the evaluation standards and the specific, applicable processes outlined in the IFRC Framework for Evaluations.

The IFRC Evaluation standards are:

  1. Utility: Evaluations must be useful and used.

  2. Feasibility: Evaluations must be realistic, diplomatic and managed in a sensible, cost-effective manner.

  3. Ethics & Legality: Evaluations must be conducted in an ethical and legal manner, with particular regard for the welfare of those involved in and affected by the evaluation.

  4. Impartiality & Independence: Evaluations should be impartial, providing a comprehensive and unbiased assessment that takes into account the views of all stakeholders.

  5. Transparency: Evaluation activities should reflect an attitude of openness and transparency.

  6. Accuracy: Evaluations should be technically accurate, providing sufficient information about the data collection, analysis and interpretation methods so that their worth or merit can be determined.

  7. Participation: Stakeholders should be consulted and meaningfully involved in the evaluation process when feasible and appropriate.

  8. Collaboration: Collaboration between key operating partners in the evaluation process improves the legitimacy and utility of the evaluation.

It is also expected that the evaluation will respect the seven Fundamental Principles of the Red Cross and Red Crescent: 1) humanity, 2) impartiality, 3) neutrality, 4) independence, 5) voluntary service, 6) unity, and 7) universality. Further information can be obtained about these principles at: www.ifrc.org/what/values/principles/index.asp

10. Profile of the evaluator(s)

The evaluation will be led by an External Evaluator:

10.1 Education

a) University degree
b) Qualification in PME, DM and/or other relevant area

10.2 Experience

a) Experience with cost-efficiency and effectiveness analysis is highly desired
b) Experience leading and carrying out multi-country evaluations is required
c) Experience with working in the international humanitarian aid sector is required
d) Experience with qualitative and quantitative evaluation techniques is required
e) Experience with remote evaluation methods is required
f) Experience in leading a team consisting of both staff and volunteers is considered an asset
g) Regional experience is considered an asset

10.3 Skills

a) Fluency in spoken and written English language

b) Ability to write concise, yet comprehensive reports
c) Excellent interpersonal, analytical and organizational skills
d) Ability to work effectively in intercultural settings
e) Ability to meet tight deadlines
f) Self-sufficient in working with computers (word processing, spreadsheets, statistical software, etc.)

10.4 Knowledge

a) Technical knowledge and experience in the field of response preparedness is a requirement
b) Knowledge of Kenya, Uganda and/or Ethiopia is considered an asset

The evaluator will work together with a team consisting of at least one external evaluator in each of the other two countries, depending on where the lead external evaluator is based. It is also possible to work with additional team members; we therefore invite both consultant teams and individual consultants to apply for this position.

11. Planning

The planning for the evaluation will commence in June 2022. The evaluation is expected to be implemented between August and December 2022. The evaluation team is expected to include a timeline in the proposal. The exact dates and deliverables can be further detailed in consultation with the EMT during the inception phase.

12. Debriefing

At the end of the evaluation, in-country delegates and RC local staff and management should be debriefed on initial findings. A debriefing and presentation of initial and final findings will be scheduled with the Evaluation Management Team, as well as with other programme staff and possibly members of the International Assistance HQ Management Team.

13. Costs

The lead evaluator is requested to provide a calculation of costs for the evaluation. S/he should indicate daily fees for evaluation team members, travel costs, the estimated number of days required for different stages of the assignment and other costs.

The final ten percent of the agreed costs will be paid by NLRC after receipt and appraisal of the final evaluation report.

How to apply

Interested candidates/teams should submit their expression of interest to [email protected] and [email protected], requesting the full set of tender documents, which includes:

  1. Full Terms of Reference
  2. Further details on requirements for submitting a proposal
  3. Theory of Change documents
  4. NLRC purchasing terms and conditions
    From these documents, they will be able to submit a complete and informed proposal; the deadline for submission is 1/5/22.
