Monitoring and Evaluation (M&E)
Approaches and plans

About M&E in Africa RISING

(The following paragraphs are excerpts from the program framework, which contains additional details about M&E in Africa RISING.)

The HarvestChoice team at the International Food Policy Research Institute (IFPRI) has been charged with the monitoring and evaluation activities of Africa RISING.
As many operational activities of Africa RISING are still being finalized, the initial M&E plan is subject to further revision and refinement. This plan describes the program’s M&E approach and strategy, the currently understood and agreed intended outcomes and measurable indicators for tracking progress toward and achievement of those outcomes, the M&E methods to be used, and the initial assignment of M&E responsibilities.

Beyond the need to satisfy standard M&E requirements, this plan also describes activities designed under an expanded Africa RISING M&E scope. These include: (i) a structured stratification schema (by geography and household category) and a process for selecting action-research and control sites; (ii) a program-wide, spatially enabled M&E data management and sharing platform open to program participants and stakeholders; and (iii) initial steps toward embedding a farming-system modeling capacity in the program’s M&E toolkit.

The role of modeling is to better integrate and interpret monitoring data in ways that enhance research design, evaluation, and learning and, looking forward, support research investment targeting and the scaling of proven technologies and practices.

While highly complementary, monitoring and evaluation are distinct in both purpose and implementation. Bearing this in mind, the current M&E plan does not describe a single combined activity but describes each separately.

M&E goals and objectives

Monitoring and evaluation of project activities support effective project management, provide the data for timely reporting to project funders, and help all stakeholders learn about the project’s successes and failures. A robust M&E system should generate learning on what did and did not work; this learning should, in turn, inform the design and implementation of new interventions and catalyze adjustments to ongoing activities that might enhance efficiency and effectiveness.

Critical aid in effective management
Monitoring can be a critical aid in effective management when it provides project managers with timely information on the status of activities and the results they are achieving. This allows managers to assess the need for changes in strategy or implementation.

Reporting requirements
Auditing and monitoring staff require frequent reporting of progress and results (monitoring) from project implementers in order to provide funders with the evidence they need both to justify the expenditures underway and to maintain a flow of resources. In this regard, it is vital to have clarity and consensus on the scope and nature of the expected direct results and beneficiaries, as well as on associated indirect outcomes, be they positive or negative. For example, direct results might include increases in productivity, incomes, or nutrition in target smallholder households, whereas increased demands on women’s time or reduced catchment water yields might be key indirect consequences that require increased monitoring and, possibly, ameliorative action. The selection and/or development of appropriate indicators, as well as the determination of their associated reporting needs (e.g., metrics, frequency, and disaggregation), must account for all direct and indirect outcomes important to both clients and stakeholders.

The Africa RISING M&E strategy for meeting these needs is relevant both to USAID and to implementation partners, since the data required to satisfy donor needs are also of value to CGIAR and national research scientists and institutions for their own internal M&E. This plan, therefore, strives to meet the legitimately distinct goals and priorities of both the USAID and CGIAR M&E systems.

Development projects provide great opportunities to learn what works and what does not. This can be done through rigorous independent impact assessment and/or through evaluation(s) carried out by project staff. Given that the impact of large and highly visible projects will be reported at high levels, USAID’s evaluation policy specifies an independent (and rigorous) evaluation. However, USAID recognizes that much valuable learning can also be achieved through evaluations carried out by the project itself. Africa RISING is likely to collect or make use of large amounts of detailed information, which can support various types of evaluation, especially if the evaluation design is carefully considered at the outset of the project. The key questions that Africa RISING (at both the program and project levels) has set out to answer will inform the design of the consequent evaluations; that design is probably among the most important challenges the program faces.

Monitoring and Evaluation meetings

See other Africa RISING documents and reports related to Monitoring and Evaluation in the CGSpace repository.

Monitoring data requirement guide and tools

Monitoring data requirement guide:

Project Mapping and Monitoring Tool (PMMT): PMMT data-entry application

Beneficiary and Technology Tracking Tool (BTTT):

Other monitoring tools
Exposure template

Scaling template

Ag trial template

Data management

Africa RISING data management plan

AR data repository platform (Dataverse)

Dataverse how-to-guide

Dataverse metadata template

Impact evaluation documents

M&E plans and annual reports

  • M&E report 2011-2012

  • M&E report 2012-2013

  • M&E report 2013-2014

  • M&E report 2014-2015

  • M&E report 2015-2016

Site selection reports and maps

Africa RISING Baseline Evaluation Surveys (ARBES)

Survey instruments
  • Malawi (2013)

  • Tanzania (2014)

  • Mali (2014)

  • Ghana (2014)

  • Ethiopia (2014)

ARBES survey reports:
  • Malawi (2013)

  • Mali (2014)

  • Ghana (2014)

  • Tanzania (2014)

  • Tanzania/Babati (2014 & 2015)

ARBES Data Summary

Presentations from the January-February 2012 design workshops: