Consultants to Perform Endline Evaluation for the Community Initiative to Promote Peace (CIPP) Program at Mercy Corps

Contract in NGO / Non-Profit Associations
  • Location: Nigeria (Selected States), Mercy Corps
  • Post Date : December 2, 2023
  • Apply Before : December 8, 2023

Job Detail

  • Job ID 38008
  • Experience See Job Description
  • Gender Male or Female
  • Qualifications See Job Description

Job Description

Mercy Corps is a leading global organization that works in 40 countries around the world to alleviate suffering, poverty and oppression by helping people build secure, productive and just communities. Since 2012, Mercy Corps has worked to tackle Nigeria’s complex and evolving needs through a range of dynamic, multi-sector programs. In Nigeria, Mercy Corps is addressing the immediate humanitarian needs of vulnerable communities; enabling conflict-affected populations to transform from relief to resilience using market-driven approaches; and supporting community-led peace building efforts.

  • We are recruiting to fill the position below:

Job Title: Consultant to Perform Endline Evaluation for the Community Initiative to Promote Peace (CIPP) Program

Locations: Benue, Kaduna, Kano, Katsina, Kogi, and Plateau
Employment Type: Contract

Purpose of the Endline Evaluation
The endline evaluation seeks to measure the impact of the project by assessing its achievement of outputs and outcomes and the success of its interventions, as compared to the baseline and midterm evaluations. The evaluation will assess the extent to which the project contributed to achieving its proposed results; its performance, achievements, challenges, and best practices to inform future similar programming; and the effectiveness of intervention activities implemented by the Activity towards achieving its set goals and objectives since inception (May 2019).

It will gather and synthesize evidence about the program’s key indicators, and the processes through which they have or have not been achieved, to support learning and adaptation throughout the duration of the program. It will include direct measurement of core elements of the program’s theory of change/results framework, including capacity-building and increased knowledge, quality of services and mechanisms, successful advocacy, resolution of disputes, policy development, etc. Where possible, the endline evaluation will also draw on data and analysis from both phases of the Randomized Control Trial (RCT) evaluation and other research and strategies implemented by the Activity to demonstrate the contribution of the intervention activities towards the desired outcomes of the program.

The evaluation will accordingly review the overall approach, successes, and learning of the Activity in its progress towards meeting the Activity’s outcomes, and provide insight on the Activity’s relevance, effectiveness, efficiency, and performance against targets to help review the Activity’s results framework and theory of change (design). This process will involve a review of the Activity’s results and achievements to date, the relevance of the program and logframe indicators in determining how the program progresses towards its set results, and implementation strategies for improved efficiency, effectiveness, and well-informed decision-making within the remaining period of Activity implementation.

In summary, the endline evaluation will contribute to the following general evaluation objectives:

  • Enable the Activity to assess the efficiency, effectiveness, relevance, outcomes, and impact of the program.
  • Provide feedback to all actors and stakeholders involved on the implementation of the program activities to improve the planning, program formulation, appraisal, and effective implementation of the program in the remaining period of program implementation.
  • Ensure accountability for results to the program actors (donors), stakeholders, and participants.
  • Generate lessons learned that can be leveraged towards sustainability of project outcomes after implementation.

Proposed Evaluation Scope, Methodology, and Questions

  • The endline evaluation will cover CIPP communities in six states – Kaduna, Kano, and Katsina in Nigeria’s Northwest, and Kogi, Benue, and Plateau in the North Central region – including relevant government agencies; peace structures and leaders at the community, Local Government Area (LGA), and state levels; religious networks; civil society coalitions; and women’s associations.
  • The endline evaluation will also draw on survey data from an agreed cross-section of beneficiaries, sampled from 24 LGAs within the six (6) Northwest and North Central states where the project operates. To ensure representativeness in the sampling approach, the endline evaluation will be conducted in 24 LGAs as follows:
  • Northwest: Data will be collected at the community, LGA, and State levels
  • North central: Data will be collected only at the LGA and State levels

Proposed Evaluation Methodology

  • The evaluation will be undertaken in accordance with the Activity Monitoring, Evaluation and Learning Plan and with the support of an independent consultant(s). The evaluation process will be consultative, deploying participatory approaches that engage a range of program stakeholders, including direct program participants (male and female, mostly women), youth (male and female), peacebuilding agencies, indirect participants, government (through relevant line Ministries and Agencies), the Activity’s partners, and program staff, among others.
  • An appropriate mix of qualitative and quantitative methods will be used to analyze existing data/information and to gather and analyze new data/information where needed. New data collection should help to ensure that the Endline evaluation is based on diverse perspectives by promoting the participation of diverse groups of stakeholders. The evaluation strategy should include secondary as well as primary data based on the results framework and indicators of the program. The MEL team will be expected to propose a methodology and plan for this endline assessment that specifies the proposed data sources and analysis approaches (including sampling plans and questionnaires for any new data collection), as well as a proposal for how the various analyses will be triangulated to help ensure the credibility and validity of the findings and conclusions.
  • The consulting firm/consultant is expected to provide a means of answering the learning questions using both qualitative and quantitative means of data collection, providing a representative sampling strategy. This will include the key deliverables of developing an Inception Report detailing the process and methodologies to be employed to answer the evaluation questions; undertaking desk review of the relevant program documents and secondary analysis to further inform the interpretation of the results, and designing evaluation methodology following the below standards:
  • Design Qualitative and Quantitative Data Framework and Tools – The external evaluator (consultant) is expected to conduct a mixed-methods evaluation using tools and a work plan developed by the consultant and approved by the evaluation team listed above before the start of the evaluation. Data collection shall involve visits to a sample of program locations, meetings with program partners, targeted participants, and other key stakeholders.
  • Quantitative methodological approach – A quantitative evaluation matrix will be developed to evaluate the program’s results, allowing for a comparison of the indicators obtained in the CIPP baseline, midterm, and annual surveys with progress to date. The external consultant will supervise and lead the sample frame design, questionnaire, and fieldwork in collaboration with the CIPP consortium. Primary data collection and quantitative survey data analysis will be part of the final product. The consultant will create and refine the final evaluation survey tool, which must incorporate the same data collection instruments, statistical accuracy, and statistical power as the baseline and midterm surveys. Mercy Corps will manage the field operations, including the recruitment of enumerators, while the consultant and the CIPP MEL team will co-lead the training and testing of tools. With the sampling frame and sample size being the overarching sampling technique, the quantitative design will allow the program to investigate statistically significant changes in estimates of important outcome indicators from baseline and midterm to endline.
  • Qualitative approach – The proposed qualitative evaluation should complement survey data on program performance and capture lessons learned and best practices through a variety of qualitative methods. The consulting firm/consultant will design the overall qualitative study approach, which should consider a variety of primary data collection methods, including semi-structured in-depth interviews, focus group discussions, and observations. The sampling frame will be derived from the program database of all beneficiaries targeted by the program interventions and will be shared with the evaluation team during inception. Using a representative sampling frame, the evaluation team will be responsible for collecting and analyzing qualitative data. Data will be collected from all key stakeholders through interviews, discussions, consultative processes, and observations. Participants for the qualitative interviews will be selected using a purposive sampling approach.
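As a rough illustration of the statistical-power requirement described in the quantitative approach above, the sketch below estimates a per-round sample size for detecting a change in a proportion indicator between survey rounds. It is a minimal sketch only: the indicator values (50% to 60%), significance level, and power shown are illustrative assumptions, and a real design would also need to account for the cluster sampling across the 24 LGAs (design effect) and expected non-response.

```python
import math
from statistics import NormalDist

def sample_size_two_proportions(p1: float, p2: float,
                                alpha: float = 0.05,
                                power: float = 0.80) -> int:
    """Per-round sample size needed to detect a change from p1
    (baseline) to p2 (endline) in a proportion indicator, using a
    two-sided z-test under the normal approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2
    return math.ceil(n)

# Hypothetical indicator: agreement on a perception question moving
# from 50% to 60%, at 95% confidence and 80% power.
print(sample_size_two_proportions(0.50, 0.60))  # 385 respondents per round
```

Multiplying such a figure by a design effect (often around 2 for cluster designs) and inflating for anticipated non-response would give the fieldwork target; the specific parameters are for the bidding consultant to propose and justify.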

Evaluation (Learning) Questions

  • The evaluation proposes a set of questions to be answered to demonstrate progress against results to date, the Activity’s significant contributions towards the desired outcomes, and accountability and learning.
  • The proposed questions, if answered, will help in program reflection and adaptations to better achieve desired program outcomes.
  • The evaluation questions proposed for the endline review are illustrative and informed by the Activity evaluation approach, Theory of Change, and Logical Framework as well as the evaluation minimum standards.

At inception, the evaluation consultant(s), together with the Activity implementation team, are expected to refine and expand on the following proposed questions:

  • To what extent do conflict communities utilize the capacity and skills they have acquired to promote peaceful coexistence and collaboration and what strategies do they put in place for continuous utilization of their skills?
  • What Early Warning and Early Response (EWER) mechanisms have demonstrated effectiveness?
  • How does collaboration with various stakeholders address the root causes of conflict?
  • When and how should collaboration between different stakeholders happen?
  • What role can women play in peacebuilding and prevention of violent extremism?
  • Considering the Activity’s evaluation strategies, describe the best (effective and relevant) approach to demonstrate the Activity’s results considering the set objectives and contextual factors in place.
  • What are the key enabling factors that should be considered to strengthen implementation of the intervention outputs towards the desired outcomes?
  • The consultant will be responsible for designing an innovative mixed methodology to answer the above learning questions, ensuring the program effectively harnesses lessons learned and best practices and measures key performance indicators with high confidence levels. The methodology should be robust and flexible. In addition, the evaluation shall be designed to detect statistically significant changes in estimates of key indicators from baseline and midterm to endline.
  • The consultant will develop an inception report detailing the process and methodologies to be employed to answer the learning questions aligned to the following evaluation dimension:

Proposed Evaluation Criteria
Dimensions of the Evaluation:

  • Achievement and Progress
  • Relevance
  • Effectiveness and Efficiency
  • Sustainability
  • Adaptive Management

Cross-Cutting Themes of the Evaluation:

  • Conflict Sensitivity
  • Gender and Inclusion
  • Synergy with Other Programmes
  • Partnership and Collaborations

Required Sources of Information:

  • For each evaluation question, the consultants are expected to define the information required, sources of information, procedure for collecting data and ensuring its validity and credibility, and the method of analysis (Data Analysis Plan), interpretation, and synthesis. This will be an iterative process between the evaluation team and the programme teams at the inception stage. This process will also anchor the review of the original evaluation questions in the endline evaluation plan when designing the criteria and standards for the data required to answer them, in order to refine and finalize the learning evaluation questions.
  • In addition to the recommended sources of information, the Activity implementation team will avail the programme’s logframe and work plans to enable extraction of further learning questions for in-depth analysis.

The following data sources include both existing and new program information sources:
Existing Sources of Information:

  • Program proposal
  • Results Framework
  • Annual work plans
  • Monitoring, evaluation, and learning plan
  • Gender Assessment Report
  • Annual reports (and associated data)
  • RCT report(s), datasets, and analysis
  • Annual perception survey data and reports

New Sources of Information:

  • Participatory reflection workshops to gather experience and observations from team members, partners, participants, and other external stakeholders
  • Annual perception survey

Evaluation Findings Primary Users
The primary intended users and uses of the evaluation’s findings will include:

  • USAID as the main funder – We hope that the evaluation findings will be used to decide if and how to further support similar peacebuilding and collaboration efforts in Nigeria.
  • Mercy Corps Nigeria and its Partners – We hope that the evaluation findings will be relevant to guide review of the Activity implementation plans based on progress and achievement, and sustainability plans, refine key project documents including Activities log frames and performance indicators and tools, as well as strengthen the Activity’s evaluation strategies and development of the next funding strategy and help develop relationships with funders.
  • Implementing partners and the broader international not-for-profit organizations – for implementing partners and not for profit organizations implementing similar peacebuilding interventions in the Northwest and Northcentral Nigeria, we hope that the findings will be useful to assess their role in designing and implementing similar programmes.

In addition, we hope that the evaluation findings will not only be critical to meet the above highlighted purpose, but also serve as a learning experience, where the process of generating answers to the proposed evaluation questions will provide the Activity and the implementation team with new understanding on the Activity’s achievements and inform an effective implementation strategy for the remaining period of implementation.

Consultancy Key Tasks, Outputs and Deliverables

  • Mercy Corps will establish an evaluation team to oversee all related tasks of the endline evaluation process. The successful consultant (consultancy firm) will work closely with the Activity MEL Manager, Chief of Party, Country MEL Manager, and the Program Performance and Quality team to design and conduct this mixed-methods evaluation and the post-evaluation workshop deliverables. The CIPP Monitoring, Evaluation and Learning Manager will be responsible for the overall coordination of all tasks.

The consultant(s) will be expected to:

  • Attend initial kick off meeting for the evaluation to validate the purpose, outputs, and deliverables of the evaluation assignment.
  • Review program documents, updated work plans and quarterly/semi-annual/annual reports and all other relevant documents to inform a comprehensive evaluation methodology and matrix presented in a detailed inception report.
  • Review and use the program’s up-to-date IPTT and other performance data, including:
    • Routine monitoring data (RMD)
    • Baseline and midterm study data
    • Quarterly, Biannual and Annual progress and performance reports and/or recurrent monitoring data
    • RCT Reports
  • Review other key program documents (e.g., gender analysis, formative studies conducted for the program, etc.)
  • Review any data/information used to monitor the quality of the program’s key services/interventions (including but not limited to CARM data)
  • Prepare an inception report to include a comprehensive evaluation approach, a draft data analysis plan (DAP) appropriate for the mixed-methods study, sample design(s) and size(s), a practical method to assess the quality of the program’s key services/interventions, and a revised and final timeline for conducting the endline evaluation (one round of revisions to the inception report and DAP anticipated).
  • Prepare and present a 1-day workshop to present the evaluation plan to the evaluation team and program staff
  • Develop evaluation instruments (one round of revisions anticipated)
    • Translated into local languages (Hausa) as appropriate (firm will manage translations)
    • Including finalizing and testing of these tools as appropriate
  • Develop a final Data Analysis Plan (DAP)
  • Conduct secondary data analysis of Routine Monitoring and Program Data
  • Co-facilitate training of FGD/KII facilitators & note-takers (and enumerators where applicable)
  • Conduct and analyze key informant interviews (KII) and focus group discussions (FGD) and manage the resulting qualitative data.
  • Collect and oversee comprehensive quantitative and qualitative data collection in addition to the above as applicable
  • Encode and analyze data collected through quantitative and qualitative data collection approaches
  • Prepare and present preliminary results and analysis to the CIPP Program team including USAID through a virtual round table workshop.
  • Draft report, with the following required revisions:
    • Feedback from CIPP and PaQ Team
    • Review of updated document when feedback is addressed (by Mercy Corps)
  • Edit and finalize the:
    • Endline Evaluation Report (length not specified)
    • Lessons Learned Document (not more than 7 – 10 pages, excluding annexes)
    • Two-page infographic highlighting key program successes and working models informed by the evaluation findings.
    • At least two case studies highlighting the program’s significant contributions towards the desired outcomes.

Mercy Corps will be responsible for the following tasks:

  • Share all necessary documents with the consultant to finalize the endline methodology and data collection tools
  • Provide input for study methodology, data collection tools, and report.
  • Working space for the Consultant while in Abuja.
  • Overall accountability of the process, while supporting coordination of field activities including data collection where applicable.
  • Recruitment of facilitators, enumerators, or research assistants (where necessary)
  • Closely monitor the data collection process, ensuring quality control, daily data validation debriefing, and meeting the timelines.
  • Inform sampled audience about their involvement in the study and set specific dates for the study field schedule.
  • Approve inception report before data collection.
  • Provide additional support to the consultant study technical lead (external Consulting partner) for the field visits processes as needed such as orientation and training of enumerators, FGDs and KIIs.
  • Approving draft and final reports
  • Provision of a reporting template
  • Organize all logistics associated with Endline evaluation activities including participatory workshops, accommodation, and transportation etc.
  • Any other required support aligned to the delivery of the endline evaluation tasks

Outputs and Key Deliverables
Endline evaluation will consist of three key phases namely a) inception, b) data collection and c) finalizing the report.
Inception Phase:

  • Kick off meeting with CIPP project team, PAQ team, TSU focal point, and RLT focal point to understand the project and to collect required documents such as TOC, project proposal, results framework, Monitoring, Learning and Evaluation Plan, etc.
  • Engage in additional consultations with relevant program team members to come to a consensus on the endline evaluation methodology, field visit plan, and sampling strategy.
  • Produce an electronic copy of the draft inception report, which includes a detailed methodology and analytical framework along with tools to be used to gather any needed data/information and analyze existing and new data. The report should also specify provisions for quality assurance, data/information collection, data management and confidentiality, sampling, roles and responsibilities of team members, key milestones, and a detailed work plan.
  • Make an oral presentation of the draft inception report.
  • Incorporate feedback after Mercy Corps review of the inception report
  • Submit final Inception Report

Data Collection Phase:

  • The consultant will be provided with all the necessary contacts and assisted with the arrangement of field discussions or survey administration, where required, as per the field visit plan. This phase will include the training of enumerators, adjustments of data collection tools (FGD, KII, and surveys, if needed), planned reflection workshops with the project team, partners, and other stakeholders, and gathering existing program data and preparing it for analysis.
  • If a survey is needed, data will be collected using mobile devices with the Commcare app and then be synchronized into the Mercy Corps database.

Finalizing the Endline Evaluation Report:
This phase consists of the following steps:

  • Presentation: Present preliminary results to CIPP team for reviews and feedback.
  • Draft Endline report: Submit draft report incorporating feedback received during the preliminary results presentation. Mercy Corps will provide feedback to the endline report within seven to ten business days of receiving the draft report.
  • Final Endline report: Submit the final report to CIPP. The final report will be prepared after a few iterations of the report if the quality standards are not met within the first round. The Final report format will be agreed during the inception phase and included into the inception report.
  • The final report will include the completed analytical framework of endline assessment.
  • The soft copy of all the supporting documents and annexes shall be handed over to Mercy Corps with the final report.

Informed by the above steps, the consultant will provide the following deliverables during their contract:

  • An inception report with a comprehensive evaluation plan, including:
    • Mixed-methods evaluation design, methodologies used and sampling design/criteria, frame, size(s) for the qualitative data collection
    • The Data Analysis Plan (DAP) for the mixed methods study
    • Suggested improvements to the evaluation scope
    • Revised evaluation timeline
    • Ethical considerations, limitations, and mitigation strategies.
  • 1-day workshop to present the evaluation plan to key staff
  • 4-day training of facilitators, interviewers, and/or enumerators
  • 14 days of field data collection (both quantitative and qualitative)
  • 2 rounds of presentations of preliminary results and analysis workshops for validation with the CIPP Consortium.
  • A final report of the evaluation will be produced. The proposed reporting structure has been provided below.
  • All data sets, code books, syntax, etc.
  • Synthesized lessons learned document for the program (7 – 9 pages).
  • Program Case study profiles aligned to the program outcome themes.

Proposed Report Structure & Content
Cover Page, List of Acronyms:
Table of Contents:

  • Executive Summary: This section should be a clear and concise stand-alone document that gives readers the essential contents of the evaluation report, including a summary of major findings, lessons learned, and recommendations.
  • Methodology: This section should be sufficiently detailed to help the reader judge the accuracy of the report and its findings.
  • Limitations and mitigation strategies: This section should address constraints and limitations of the methodology, and the implications of these limitations for the findings, including whether and why any of the evaluation findings are inconclusive.
  • Evaluation Results: This section should provide a clear assessment of progress concerning indicators/targets/objectives and evaluation questions. Reference baseline and midterm evaluation information as well as program logic, a theory of change, etc.
  • Synthesis, Recommendations, and Lessons Learned: This is space for the evaluation team to think about the data and results and make concrete recommendations for current or future program improvements, pull out organizational lessons learned, and comment on data and results. Everything presented in this section should be directly linked back to the information presented in the Results section of the report.
  • Conflicts of Interest: Disclose any conflicts of interest or the appearance of conflicts of interest, including the interest of program staff in having a successful program.
  • Annexes: These should include a complete file of data collection instruments in English and translations if any; list of stakeholder groups with number and type of interactions; SOW, qualitative protocols developed and used, any data sets (these can be provided in electronic format), any required photos, participant profiles or other special documentation needed.

Limitations and Proposed Mitigation Strategies:

  • Time and access to some places may be a major limitation with regard to assessment processes in fragile contexts such as Nigeria’s Northwest and this makes it often challenging to keep up strictly with a set agenda. In addition, insecurity in some LGAs may limit movements and access to some areas and create heightened suspicion of outsiders asking questions.
  • This may potentially slow down the ability of the consultant, enumerators, or other members of the evaluation team to complete the evaluation on time. To address this issue, Mercy Corps has included a mock work plan with extra overflow days for field data collection. MC team will also work closely with the state/LGA authorities to ensure that the data collection process is conducted in the most appropriate time and acceptable conditions to all.
  • Time frame: Although the endline evaluation is scheduled within very strict timelines, considering all practical concerns, interested candidates are encouraged to develop a work plan in the proposal to speed up the process.

Evaluation Implementation Work Plan and Timelines

  • The Activity Chief of Party, the MEL Manager, the program team, and PaQ team will manage the evaluation process in coordination with the evaluation consultant to provide technical support in the refinement of the evaluation methodology and – in the case of data collection tools, inputs, and all supporting documents – to guide design and finalization of the evaluation methodology and data collection instruments.
  • The Consultancy will run for approximately 45 days between January 2024 and March 2024. Bidding firms should propose edits where needed to the timelines estimated below. The duration/level of effort included is an estimate and the firm / consultant are welcome to suggest changes to this LOE in their bid / proposed plan.

Estimated durations and activities are outlined below:
Duration – Activity (Stakeholders):

  • 1 day – Review the draft evaluation with the External Evaluator to clarify the timeframe and available budget. (External Evaluator, CoP, DCoP, Programme Manager, MEL Manager, MEL Team, Country MEL Manager)
  • 2 days – Undertake a desk review of the relevant program documents, including the proposal, implementation plans, revised program design and timelines, program implementation reports, Mercy Corps strategy documents, Annual Perception Survey Report, quarterly reports, Baseline Report, RCT Report, Gender Assessment Report, and any other relevant documents. (External Evaluator/Consultants)
  • Develop an inception report detailing the process and methodologies to be employed to answer the evaluation questions, including all evaluation tools and important time schedules for this exercise, to be presented to Mercy Corps for review and further inputs before going to the field. (External Evaluator)
  • 1 day – Provide feedback on the inception report and tools for the External Evaluator to incorporate (feedback will be consolidated from all reviewers before returning to the External Evaluator). (CoP, DCoP, Programme Manager, MEL Manager, MEL Team, Country MEL Manager)
  • 1 day – With input from the Mercy Corps Programs and MEL teams, refine data collection tools and translate them into local languages (Hausa and Pidgin) as appropriate. (External Evaluator)
  • Provide final versions of the inception report and data collection tools to Mercy Corps. (External Evaluator)
  • Consultant/firm submits a complete draft of the Data Analysis Plan (DAP) to Mercy Corps’ POC (a complete draft includes dummy tables, placeholders for charts/graphs/images, a description of how data triangulation/synthesis will be conducted, how qualitative data will be analysed, etc.); Mercy Corps’ POC distributes the draft DAP to all Mercy Corps reviewers (and donor reviewers, if required) and consolidates feedback, returning it to the consultant/firm. (External Evaluator, Mercy Corps POC)
  • Consultant/firm submits the FINAL DAP to Mercy Corps’ POC, having addressed all consolidated feedback. (External Evaluator)
  • 2 days – Train enumerators/surveyors; pre-test data collection instruments. (External Evaluator/Consortium MEL Team)
  • 1 day – Finalize data collection instruments/tools. (External Evaluator)
  • 10 days – Conduct and oversee data collection. (External Evaluator)
  • 3 days – Encode and analyze data. (External Evaluator)
  • 5 days – Prepare a draft evaluation report and learning summary. (External Evaluator)
  • 5 days – Provide detailed feedback on the draft report. (CoP, DCoP, MEL Manager, MEL Team, Country MEL Manager)
  • 1 day – Internal reflections and validation of the final report. (External Evaluator, MEL Manager, MEL Team, Country MEL Manager)
  • 3 days – Finalize the report, produce a presentation of findings, and share back with MC (not more than 30 pages; all other additions can be included as annexes). (External Evaluator)
  • After donor review of the report, incorporate any feedback from the donor for the final donor-reviewed version; data sets, code books, syntax, etc. are delivered to Mercy Corps’ POC. (External Evaluator)
  • TOTAL: 45 days

  • NB: Consultants must NOT exceed the allocated 45 working days, but may spread these days across the broader calendar timelines provided.

The following are the key deadlines for the required deliverables:

  • First draft report to be submitted by first week of March 2024
  • CIPP will review the draft report and provide feedback no later than Second week of March 2024.
  • Final report, incorporating feedback, will be due on 3rd week of March 2024.
  • Final review of donor feedback and comments once the report has been submitted (to be determined and agreed together with the evaluation team during inception).

Timeframe / Schedule:

  • The process is expected to take 45 working days including preparation, data collection and analysis, and reporting. The Consultant should be able to undertake some of the tasks concurrently to fit within the planned time frame, without compromising the quality expected. The assignment is expected to commence in January 2024, with the final endline evaluation report expected by the 3rd week of March 2024. The Consultant will commit to NOT more than the estimated total of 45 working days spread within the provided timelines.

The Consultants will report to:

  • The Program Monitoring, Evaluation and Learning (MEL) Manager, supported by the Mercy Corps Country MEL Manager.

The Consulting firm will work closely with:

  • CIPP Chief of Party, CIPP Consortium Program Team, CIPP MEL Focal Points, Consortium Program Managers, PaQ Director, Strategic Learning Manager, PaQ Manager, Grants and Reporting Manager, and Program Staff, among others.

Required Experience & Skills:
The following are the qualifications and experiences the consultants should have:

  • The Lead Consultants must have a master’s degree or Ph.D. in a Social Sciences field (e.g., Demography, Population Studies); this is also preferred for the associate consultants.
  • Strong and documented experience in conducting participatory quantitative and qualitative evaluations/studies related to governance, peacebuilding, and/or community- driven development projects
  • Strong research experience, including experience conducting major research exercises in support of large development programs – preferably in peacebuilding and governance projects – in challenging operational environments. Previous experience in northeast Nigeria is desirable.
  • A strong approach to quality assurance of the data collected.
  • A strong ethical approach to data collection – while still being able to meet the objectives of the consultancy.
  • Demonstrated experience in training local staff in quantitative and qualitative data collection tools including entry templates
  • Demonstrated experience in designing survey methodology, data collection tools, processing, and analysis of data
  • Knowledge of strategic and operational management of humanitarian operations and proven ability to provide strategic recommendations to key stakeholders.
  • Strong analytical skills and ability to synthesize and present findings, draw practical conclusions, make recommendations, and prepare well-written reports on time.
  • Demonstrated experience in both quantitative and qualitative data collection and data analysis techniques, especially in emergency contexts.
  • Data visualization skills are highly desirable.
  • The consultant(s) are expected to have strong skills in survey form design for mobile data collection (ODK, Ona, or CommCare).
  • Experience, knowledge, and clear understanding of Nigeria’s humanitarian context.
  • Experience with evaluating USAID-funded projects.
  • Good people skills and understanding of cultural sensitivities.
  • Readiness to travel to northwest and northcentral Nigeria and conduct direct standard assessment activities as well as field visits to program sites. Having a presence in West Africa or Nigeria is desirable but not essential.
  • Excellent verbal and written communication in English is required

Documents Comprising the Proposal:
Assessment and award of the assignment:

  • Mercy Corps will evaluate Technical and detailed financial proposals and award the assignment based on technical and financial feasibility. Mercy Corps reserves the right to accept or reject one or all proposals received without assigning any reason and is not bound to accept the lowest or the highest bidder. Only those shortlisted will be contacted.

Scoring Evaluation:
Trade-Off Method:

  • Mercy Corps Tender Committee will conduct a technical evaluation which will grade technical criteria on a weighted basis (each criterion is given a percentage, all together equaling 100%). Offeror’s proposals should consist of all required technical submittals so a Mercy Corps committee can thoroughly evaluate the technical criteria listed herein and assign points based on the strength of a technical submission.
  • Award criteria shall be based on the proposal’s overall “value for money” (quality, cost, delivery time, etc.) while taking into consideration donor and internal requirements and regulations. Each individual criterion has been assigned a weighting prior to the release of this tender, based on its importance to Mercy Corps in this process.
  • Offeror(s) with the best score will be accepted as the winning offeror(s), assuming the price is deemed fair and reasonable.

When performing the Scoring Evaluation, the Mercy Corps tender committee will assign points for each criterion based on the following scale:

Evaluation Criteria (each criterion is given a weight (A) and a score from 1 to 10 (B); the weighted score is A*B — the maximum weighted score per criterion is shown below):

  • Product/Service/Work Technical Specifications — Weight: 30% (maximum weighted score: 3). A comprehensive concept note with a budget and a defined methodology for responding to the evaluation questions; data collection with a strong ethical perspective; strong survey form design skills for data collection on mobile platforms; analytical abilities, including the capacity to synthesize and explain findings, draw practical conclusions, and offer recommendations; demonstrated data visualization skills.
  • Price/Cost — Weight: 20% (maximum weighted score: 2). The price quoted is acceptable and justifiable in terms of value for money.
  • Resources — Weight: 20% (maximum weighted score: 2). A minimum of an advanced degree, especially in economics, is required of the principal consultant. Both the lead and co-lead must have at least four years of experience directing evaluations for humanitarian, early recovery, and resilience programs. The lead and co-lead must be willing to travel to northeast Nigeria to perform direct standard assessment activities as well as program site visits. Fluency in English is required.
  • Corporate Capabilities — Weight: 10% (maximum weighted score: 1). Knowledge of the humanitarian situation in Nigeria; experience conducting evaluations in Nigeria or other African countries; evaluation experience with EU-funded programs; a presence in West Africa or Nigeria is preferable, but not necessary. The firm should have at least 5 years of expertise in the evaluation industry.
  • Delivery Time/Project Schedule — Weight: 20% (maximum weighted score: 2). All deliverables must be met within 45 working days, by the 3rd week of March 2024.
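As a rough illustration of the weighted trade-off scoring described above, the sketch below computes an overall score from hypothetical committee scores (the offeror scores are invented for illustration; only the weights come from this tender):

```python
# Weights (A) mirror the tender's criteria; panel scores (B, 1-10) are hypothetical.
WEIGHTS = {
    "Technical Specifications": 0.30,
    "Price/Cost": 0.20,
    "Resources": 0.20,
    "Corporate Capabilities": 0.10,
    "Delivery Time": 0.20,
}

def weighted_total(panel_scores: dict) -> float:
    """Multiply each 1-10 panel score (B) by its weight (A) and sum (A*B)."""
    return round(sum(WEIGHTS[c] * s for c, s in panel_scores.items()), 2)

# Hypothetical offeror as scored by the committee:
offeror = {
    "Technical Specifications": 8,
    "Price/Cost": 7,
    "Resources": 9,
    "Corporate Capabilities": 6,
    "Delivery Time": 10,
}

print(weighted_total(offeror))  # → 8.2 (maximum possible is 10.0)
```

The offeror with the highest weighted total would be accepted, assuming the price is deemed fair and reasonable.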

Application Closing Date
8th December, 2023.

How to Apply
Interested and qualified consultant(s) should submit the following documentation for the proposal to: using the Job Title as the subject of the mail.

  • Summarized Concept Note
  • Technical and Financial proposal clearly demonstrating a thorough understanding of this ToR and including but not limited to the following:
    • Description of the Methodology
    • Demonstrated previous experience in similar assignments and qualifications as outlined in the ToR
    • Proposed data management plan (collection, processing, and analysis).
    • Proposed timeframe detailing activities and a work plan.
    • Team composition (if applicable) and level of effort of each proposed team member (include CVs of each team member) noting identified roles and team lead
    • Proposed detailed budget aligned to the role or roles of the consultant(s)
  • At least three samples of similar assignments (preferably, but not limited to, NC/NW Nigeria contexts).

Any sub-contracting under this evaluation consultancy will not be accepted.

Note: The consultant will provide transport and accommodation logistics for his/her staff to and within Nigeria (where necessary). No transport or logistical costs (including per diem and insurance) will be provided by Mercy Corps.
