December 2012
Barbara Scola-Gähwiler
Alice Nègre
Resource Guide for Funders
A Technical Guide
CGAP is an independent policy and research center dedicated to advancing financial access for the
world’s poor. It is supported by over 30 development agencies and private foundations who share
a common mission to improve the lives of poor people. Housed at the World Bank, CGAP provides market intelligence, promotes standards, develops innovative solutions, and offers advisory services to governments, microfinance providers, donors, and investors.
© 2012, CGAP/World Bank. All rights reserved.
CGAP
1818 H St., N.W., MSN P3-300
Washington, DC 20433 USA
www.cgap.org
www.microfinancegateway.org

Table of Contents

Acknowledgments
I. Purpose of This Technical Guide
II. What Is a Portfolio Review?
III. What Questions Can a Portfolio Review Answer?
IV. The Review Process
  A. Preparation Phase
  B. Portfolio Snapshot
  C. Scoring Project Performance
  D. Stakeholder Consultation
  E. Analysis and Reporting
V. Implementing a Portfolio Review
  A. Timing and Level of Effort
  B. Terms of Reference
  C. Selection of Consultants
  D. Management of the Review Process
  E. Debriefing and Disseminating Results
VI. Annexes
  A. Terms of Reference (TOR) Template
  B. Stakeholder Consultation—Interview Checklist
  C. Sample Outline of Final Report
VII. Resources
VIII. Quoted Portfolio Reviews

Acknowledgments

This Technical Guide builds on the experiences of CGAP members that conducted reviews of their microfinance portfolios during the past three years. CGAP participated in some of these portfolio reviews, provided technical advice, or served as a member of the steering committee that supervised the review process. In particular, the authors thank AFD, UNCDF, GIZ, and EIB for sharing their experiences with us and with other CGAP members. We also thank Lene Hansen, a reviewer of this Technical Guide and an evaluator in the EIB and UNCDF portfolio reviews, and Nathalie Assouline, the lead evaluator in the AFD portfolio review. Both have made significant contributions to the development of this methodology. Our thanks also go to Richard Rosenberg, who reviewed this Technical Guide and provided inputs from his experience as a lead evaluator in portfolio reviews for IDB, UNCDF, UNDP, and the World Bank. Finally, we thank Mayada El-Zoghbi for sharing her experience as a steering committee member in the AFD and UNCDF portfolio reviews and for providing guidance during the production of this Technical Guide.
I. Purpose of This Technical Guide
This Technical Guide presents the portfolio review as an evaluation method that can help funders learn from what they fund. Microfinance portfolio reviews first appeared in the 1990s as an evaluation method that compares the performance of microfinance projects across a funder’s portfolio. At that time, funders focused on building strong, sustainable institutions that would provide access to financial services to low-income populations.
Portfolio reviews helped assess whether this goal was achieved, by using standard financial performance indicators to evaluate the sustainability of microfinance institutions (MFIs).1 Today, the focus has shifted to a broader vision of financial inclusion, which requires funders to consider project performance beyond the sustainability of a supported MFI.
As microfinance sectors evolve and the number and diversity of funders increase, development agencies have to constantly reconsider the added value they bring to their target markets and whether their operations are relevant contributions to responsible market development. In this context, portfolio reviews evolved to assess not only the financial and social performance of MFIs, but also the relevance of the project design in a given market context and the quality of delivery.
There is no standard methodology for portfolio reviews, and this Technical Guide will not deliver a ready-to-use blueprint for evaluators. Rather, it takes stock of the lessons learned so far and aims to capture emerging practices among funders of financial inclusion. Ideally, this Technical Guide will reduce the efforts for funders conducting portfolio reviews in the future and increase the validity and comparability of results.
This Technical Guide is meant to be a reference document for staff of donors and investors who design and implement portfolio reviews, as well as for the evaluation teams
who perform the reviews. It is organized as follows:
• Section II describes the objectives and characteristics of a portfolio review.
• Section III discusses the questions a portfolio review can help answer.
• Section IV describes a step-by-step process of a typical portfolio review.
• Section V offers practical advice on how to implement a portfolio review.
• The annexes include template terms of reference (TOR), template interview checklists, and a sample outline for the final report.
1. See Rosenberg (1998) and Rosenberg (2006).
II. What Is a Portfolio Review?
A portfolio review analyzes the composition of the microfinance portfolio and the performance of individual projects2 across the portfolio to assess whether a funder delivers on its microfinance strategy. Combining these two perspectives provides a comprehensive analysis of whether a funder is on the right track to achieve the overall objectives it has set for its microfinance operations. The main characteristics of a portfolio review are summarized in Table 1.
First and foremost, portfolio reviews are a learning tool for funders who want to understand what works, what doesn’t, and why. Analyzing performance across a funder’s portfolio can yield meaningful learning and can help inform future program design and strategic reorientations. Using a systematic approach to compare the main drivers of project performance throughout the portfolio helps identify common patterns of success and failure. As opposed to individual project evaluations for which lessons may be harder to extrapolate, portfolio reviews generate lessons based on the entire portfolio.
These lessons are likely to have broader validity and help funders make evidence-based decisions. In CGAP’s experience working with various funders, portfolio reviews have proven more likely to lead to actual improvements in operations compared to individual project evaluations.
Portfolio reviews also fulfill an accountability function. They assess whether funders are achieving results and whether these results are relevant in the current market context.
The reader, be it the head of the agency, civil society organizations, or beneficiaries of aid programs, can find out whether a funder delivers on its microfinance strategy and how a funder adds value. If done regularly—every three to five years—portfolio reviews help funders track performance over time and show whether strategic reorientations have led to the expected results. As such, portfolio reviews provide useful input for funders designing or revising their microfinance strategy.
2. The generic term “project” is used throughout this Technical Guide to describe an individual intervention as tracked in the funder’s monitoring and evaluation system, usually with a unique project identification number. Terminology varies across organizations, and sometimes the terms “investment” or “program” are used synonymously with “project.” In this Technical Guide, a project can include funding of one or several recipients and can be part of a set of interventions (program).
While portfolio reviews serve both purposes, there can be trade-offs between learning and accountability. Accountability requires that a portfolio review is done by independent evaluators with little or no involvement of staff responsible for the portfolio that is evaluated. However, from a learning perspective, staff involvement increases the chances that staff members know, buy into, and apply the lessons learned from a portfolio review.3 Funders and evaluators should be aware of this trade-off so that the methodology and process selected for a portfolio review are in line with the funder’s priorities.
Portfolio reviews are one of many tools within a funder’s monitoring and evaluation (M&E) system.4 They build on other data collection tools (such as those used for routine monitoring), and they do not substitute for other types of periodic evaluations (e.g., project, program, or country evaluations); rather, they complement them. When considering doing an evaluation, funders should carefully think about what drives the need for evaluation, what possible limitations they may face, and what specific question(s) they would like to answer. This reflection will help determine which method is best suited for their needs.
Portfolio reviews should be distinguished from impact evaluations. Portfolio reviews focus on the first parts of the results chain5 (inputs, outputs, and outcomes) and provide only approximate data about the impact of projects on beneficiaries. Funders interested in measuring the impact of microfinance projects on clients should consider the different impact evaluation methods that are emerging in the microfinance sector (CGAP 2011).
3. This is not unique to portfolio reviews but concerns any type of evaluation. OECD (1998, p. 10) notes, “The Principle of independence has to be balanced with the interest in promoting ownership of the evaluation products and their recommendations. At the same time, if accountability, and not lessons, is the primary purpose of an evaluation, then the independence function is critical.”
4. With the advance of results-based management as a central principle in development cooperation, development agencies have invested in building M&E systems that address diverse internal and external information needs. For more information, see World Bank (2009).
5. The “results chain” is a model often used in development cooperation to describe the sequence of development interventions, beginning with inputs used in activities that lead to outputs, outcomes, and, ultimately, impact.
III. What Questions Can a Portfolio Review Answer?
The overarching question a portfolio review aims to answer is whether a funder delivers on its microfinance strategy. To answer this question, a portfolio review typically assesses the composition of a funder’s microfinance portfolio and project performance throughout the portfolio. In other words, a funder needs to fund the right projects and these projects have to perform. Since these two dimensions are not necessarily linked, a portfolio review has to assess them both systematically.
Assessing the composition of the microfinance portfolio serves to verify whether a funder supports projects that are in line with its microfinance strategy. Evaluators assess, for example, whether the types of projects, the regional allocation, the levels of intervention, and the funding instruments correspond with the funder’s strategic objectives.
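A composition check of this kind can be sketched in a few lines of code. The snippet below is illustrative only: the project records, field names, and amounts are invented for the example and do not reflect any funder's actual data schema. It computes each category's share of total commitments, which an evaluator could then compare against the targets stated in the funder's strategy.

```python
from collections import Counter

# Hypothetical project records; all fields and values are illustrative.
projects = [
    {"id": "P001", "region": "Sub-Saharan Africa", "instrument": "grant", "commitment_usd": 2_000_000},
    {"id": "P002", "region": "South Asia", "instrument": "debt", "commitment_usd": 5_000_000},
    {"id": "P003", "region": "Sub-Saharan Africa", "instrument": "equity", "commitment_usd": 3_000_000},
]

def composition_by(projects, key):
    """Share of total commitments per category (e.g., region, instrument)."""
    totals = Counter()
    for p in projects:
        totals[p[key]] += p["commitment_usd"]
    grand_total = sum(totals.values())
    return {category: amount / grand_total for category, amount in totals.items()}

by_region = composition_by(projects, "region")
# In this toy portfolio, Sub-Saharan Africa holds 50% of commitments
# (5M of 10M), which could then be checked against the stated strategy.
```

The same function can be reused for any dimension the strategy specifies (level of intervention, funding instrument, recipient type), making the snapshot consistent across categories.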
Project performance depends on many different factors, some of which can be influenced by the funder and others that are beyond the funder’s control: funder support, recipient performance, country context, beneficiary needs, external factors, etc. (see Figure 1).
The two drivers of project performance that are especially important to assess are recipient performance (e.g., for retail financial service providers, indicators include portfolio quality, financial and operational sustainability, and outreach) and the quality of the funder’s support (e.g., staff and resource allocation, selection of recipients, monitoring, responsiveness to changes in the project environment). Recipient performance, especially the sustainability of the institution supported by the funder, is crucial for the project outcomes to continue beyond the duration of the project. The quality of the funder’s support is what the funder can most directly influence.

Figure 1. Drivers of Project Performance
To assess and compare project performance throughout the portfolio, a portfolio review uses a systematic approach based on the standard evaluation criteria: relevance, effectiveness, efficiency, impact, and sustainability (OECD 1991; see Table 2). Such a standardized approach has several benefits: (i) exhaustiveness, in that it helps answer a funder’s specific questions but can also uncover unexpected issues; (ii) comparability across projects, over time, and possibly with other funders; and (iii) usability, since the Development Assistance Committee (DAC) Principles for Evaluation of Development Assistance criteria are well known to evaluation specialists and consultants and are likely to be used in internal M&E systems.
Funders might also have specific hypotheses or questions about what works well and what doesn’t, and the portfolio review can be used to test them. For instance, is the greenfield model (i.e., a newly established MFI) more efficient in increasing outreach than investing in existing institutions? Is direct implementation more efficient than implementation via an intermediary? Is a funder’s project management efficiently supporting operations? Reflecting upfront on the specific questions a funder wants to answer with a portfolio review is likely to increase the usefulness of the review.

Table 2. DAC Evaluation Criteria

Relevance: The extent to which the objectives of a development intervention are consistent with beneficiaries’ requirements, country needs, global priorities, and partners’ and funders’ policies. Note: Retrospectively, the question of relevance often becomes a question of whether the objectives of an intervention or its design are still appropriate given changed circumstances.

Effectiveness: The extent to which the development intervention’s objectives were achieved, or are expected to be achieved, taking into account their relative importance.

Efficiency: A measure of how well economic resources/inputs (funds, expertise, time, etc.) are converted into results.

Impact: Positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.
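Scoring projects against the DAC criteria and aggregating across the portfolio can be sketched as follows. This is a minimal illustration, not the scoring methodology of any particular funder: the rating scale (1 = poor to 4 = excellent), the project identifiers, and the scores are all invented for the example. Averaging per criterion highlights portfolio-wide strengths and weaknesses rather than individual project results.

```python
# DAC criteria from Table 2 (sustainability is the fifth standard criterion).
DAC_CRITERIA = ["relevance", "effectiveness", "efficiency", "impact", "sustainability"]

# Hypothetical per-project ratings on a 1 (poor) to 4 (excellent) scale.
scores = {
    "P001": {"relevance": 4, "effectiveness": 3, "efficiency": 2, "impact": 3, "sustainability": 4},
    "P002": {"relevance": 2, "effectiveness": 2, "efficiency": 3, "impact": 2, "sustainability": 1},
}

def portfolio_averages(scores):
    """Mean score per DAC criterion across all rated projects."""
    return {
        criterion: sum(project[criterion] for project in scores.values()) / len(scores)
        for criterion in DAC_CRITERIA
    }

averages = portfolio_averages(scores)
# A low average on one criterion (e.g., sustainability at 2.5 here)
# flags a pattern worth investigating across the portfolio.
```

In practice an evaluator would also record the evidence behind each score, so that patterns surfaced by the averages can be traced back to specific projects and drivers.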