Department of Economics - University of Bologna
Centre for International Development - Bologna

Results-Based M&E and Outcome and Impact Evaluation
2015 Annual Edition
Results-Based Monitoring and Evaluation, with Ray Rist, instructor - Monday-Friday, 7-11 September 2015
Outcome and Impact Evaluation, with Michael Bamberger, instructor - Monday-Friday, 14-18 September 2015
Extended deadline for applications: 31 August 2015
Our courses

As part of the 10th Annual Edition of the joint Bologna Centre for International Development / Department of Economics
Summer School Programme on Monitoring and Evaluation, the focus of the September 2015 modules is on Results-Based Monitoring and Evaluation
(first module) and Outcome and Impact Evaluation (second module).

The first module on Results-Based Monitoring and Evaluation will provide two workshops: one on how to design and build a results-based monitoring and evaluation (M&E) system in your organization, and a second on how to design, collect, and analyze data for a case study evaluation.

The first workshop is based on the Ten Steps to a Results-Based Monitoring and Evaluation System developed by Jody Kusek and Ray Rist at The World Bank.
The first workshop is based on the Ten Steps to a Results-Based Monitoring and Evaluation System developed by Jody Kusek and Ray Rist at The World Bank. The emphasis will be on how to plan, design, and implement such a system: beginning with a readiness assessment, moving through goal setting, indicator selection, establishing baseline data and setting targets, on to data analysis and reporting, and ending with ensuring use and sustainability. This comprehensive ten-step model guides participants through the design, management, and, importantly, the sustainability of their M&E system. Throughout the workshop, participants will discuss the ten steps, the tasks needed to complete them, and the tools available to help along the way; the handbook on which the course is based describes all of these in detail. This workshop will last for three days, and reading materials will be provided.

The second workshop will last two days and will focus on the Uses (and Abuses) of Case Studies in Evaluation. Although case study methodology is often useful in addressing evaluation questions, choosing the right type of case study is as important as choosing the methodology itself. This workshop defines what a case study is, focusing on when to use one and what type of case study to deploy. Participants will critique several types of case studies and will also conduct several short fieldwork exercises. Again, reading materials will be provided.

This course is primarily targeted at officials who face the challenge of managing for results. Developing countries in particular have multiple obstacles to overcome in building M&E systems. However, results-based M&E systems are a continuous work in progress for both developed and developing countries. When implemented properly, these systems provide a continuous flow of feedback, which can help guide policymakers toward achieving the desired results. Seasoned program managers in developed countries and international organizations - where results-based M&E systems are now in place - are using this approach to gain insight into the performance of their respective organizations.

The course, which will be delivered by Ray Rist himself, can serve as a guide to designing and constructing a results-based M&E system in the public sector. Its goal is to help participants plan, design, and implement a results-based M&E system within their own organizations. In addition, the course (and the handbook on which it is based) will demonstrate how an M&E system can be a valuable tool in supporting good public management.
 
Here is the calendar for the first module:
FIRST WEEK AGENDA


The second module on Outcome and Impact Evaluation, which will be delivered by Michael Bamberger, will provide three workshops - the first on how to identify unintended outcomes of development programs; the second on incorporating gender into monitoring and evaluation systems; and the third on evaluating complex development programs.

The first workshop (two days) - Identifying unintended outcomes of development programs - addresses two questions: “Why do many development evaluations fail to detect unintended outcomes?” and “How can planners and evaluators address these challenges?” As all experienced program evaluators know, few, if any, development programs turn out exactly as planned. Such is the case of programs that prevent (intentionally or unintentionally) certain groups from accessing services or, worse, that harm the living conditions of some groups. The workshop will review why so many evaluation designs - including randomized control trials, quasi-experimental designs, theory-based evaluations, and results-based M&E systems - often fail to capture unintended outcomes.

The workshop will examine the situation with respect to theories of change (TOC). While TOCs can be, and often are, designed to identify unintended outcomes, experience shows that in many cases the agencies commissioning an evaluation are mainly interested in whether programs have achieved their intended outcomes, and evaluators are sometimes discouraged from investing time and resources in identifying unintended outcomes. We will present examples where TOCs are designed to identify unintended outcomes and will draw lessons on ways to strengthen the ability of all TOCs to address these outcomes.

The workshop will cover the following topics: (1) Classifying unintended outcomes of international development programs; (2) Examples of evaluation designs failing to identify unintended outcomes, and the often serious consequences; (3) Methodological, political, and real-world constraints that explain why conventional evaluation methodologies often fail to capture these outcomes; (4) Case studies illustrating how randomized control trials and quasi-experimental designs often fail to identify unintended outcomes; (5) Strategies that planners and evaluators can adopt to strengthen their ability to identify unintended outcomes (e.g., through a creative and innovative use of mixed-methods approaches). The workshop will encourage active group participation and will encourage participants to share their experiences on challenges and useful strategies for addressing unintended outcomes. A number of group exercises will be included so that participants can apply the tools and techniques presented in the workshop.

The second workshop (one day) - Incorporating Gender into Monitoring and Evaluation at the Country, Program and Project Levels - discusses the challenges and opportunities for M&E to address gender inequalities at the national, program and project levels. Despite significant progress, gender inequalities persist in all countries. These inequalities both negate fundamental human rights and present serious barriers to the achievement of national development objectives and the promotion of equity, human rights and social justice. Notwithstanding widespread commitment to gender equality by governments and development agencies, and despite the compelling evidence on persistent gender inequalities, conventional M&E systems fail to address gender differences.

The workshop reviews experience and provides guidelines, tools, and techniques for developing gender-responsive M&E studies and systems at the national, program, and project levels. It draws on the international experience of governments, donor agencies, and NGOs to outline the main steps in the design and implementation of gender-responsive M&E systems and approaches. The workshop will include examples of the serious problems that can arise when gender is not addressed, as well as examples where gender has been successfully addressed at the national and project levels. In addition to discussing methodological issues, the workshop will focus on the real-world challenges and opportunities of incorporating gender into organizations that have budget and professional resource constraints and where management and staff may have reservations about the value of incorporating gender.

The third workshop (two days) - Assessing the outcomes and impacts of complex programs - reviews strategies for assessing outcomes and impacts of complex development interventions, such as general budget support, multi-component sector programs, and cross-cutting thematic programs (e.g. gender mainstreaming or peace-building). The challenge for evaluators is that conventional evaluation methods, such as randomized control trials and quasi-experimental designs, as well as many qualitative methods, cannot address the complex characteristics of the contexts in which programs are conceived and implemented, or specific traits of the programs themselves. Exploring the interactions that can unfold between program features and context requires tracing multiple causal paths and allowing for emergent designs in which program objectives and intended outcomes change as the program evolves. The central message is that complexity requires innovative evaluation designs that draw on a variety of the many methods at our disposal. Practical tools and techniques are available to assess the outcomes of complex programs. Many, but not all, of the proposed approaches involve “unpacking” complex programs into a set of components or elements that are easier to evaluate. The challenge is then to “repack” the findings so as to recognize complexity and avoid oversimplification while still addressing the relevant evaluation questions.

The workshop reviews the contribution of complexity science in exploring the linkages between contexts and program designs. A wide range of practical evaluation tools is discussed and illustrated with examples from the field, including: theory-based approaches; quantitative, qualitative, and mixed-methods designs; rating systems (often based on OECD/DAC criteria); and innovative Web-based approaches, such as concept mapping and other forms of expert consultation, crowdsourcing and participatory planning, and the multiple applications of big data. Participants will apply different strategies in group exercises. Participants are encouraged to bring their own evaluations of complex interventions to illustrate promising approaches and to seek suggestions from their colleagues.

Here is the calendar for the second module:
SECOND WEEK AGENDA


Our People
Direction and management:
  • The director of the Summer Programme is Pier Giorgio Ardeni, professor of development economics at the University of Bologna, policy advisor, and expert in poverty reduction programs and statistical development.
  • The manager and coordinator is Cecilia Tinonin, PhD, junior development practitioner.
Our trainers are well-known, internationally recognized development experts with extensive field experience in low-income countries:
  • Ray C. Rist is an advisor to the Independent Evaluation Group of The World Bank and to UNDP, and a former President of the International Development Evaluation Association (IDEAS). He earned his Ph.D. in Anthropology and Sociology from Washington University in St. Louis (1970). His career includes 15 years in senior appointments in the U.S. Government, and he has held professorships at Cornell, Johns Hopkins, and The George Washington universities. He was one of the initiators and Co-Directors of the International Program for Development Evaluation Training (IPDET), and has carried out a long list of academic and professional activities in the field of evaluation - among many others for The World Bank Institute and the United States Government. He has authored or edited 32 books and written more than 140 articles and monographs in areas such as results-based management and measurement, program evaluation, policy analysis, public sector management, education, race relations, urban violence, international migration, youth unemployment, research methodology, and comparative government. He has lectured in 87 countries, received several awards and honors, and serves or has served on the editorial boards of nine professional journals. Now retired from the World Bank, Dr. Rist continues to advise organizations and governments across the globe on how to design and build results-based M&E systems.
  • Michael Bamberger has evaluated development programs throughout Africa, Asia, Latin America, and the Middle East. He worked for more than a decade with NGOs in Latin America. During his 22 years with the World Bank, he worked on M&E in many different sectors, including gender and development and evaluation capacity development. Since retiring from the Bank, he has consulted for a number of UN, multilateral, and bilateral development agencies; published widely on development evaluation; and served on the editorial boards of several evaluation journals. He was recently appointed to the International Evaluation Advisory Committee of the UNDP Evaluation Office. His most recent book (co-authored with Jim Rugh and Linda Mabry) is RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints (2nd edition, 2012).
How to apply

Enrolment fees and application deadlines


Early-Bird Registration (before 1 June 2015):
  • One module: 1,500 euro
  • Two modules: 2,500 euro
Full Registration (between 1 June and 31 July 2015; EXTENDED DEADLINE: 31 AUGUST 2015):
  • One module: 1,600 euro
  • Two modules: 3,000 euro

The enrolment fee includes tuition and materials.
Room and board and travel expenses are not included.
Various lodging arrangement suggestions can be provided on request.

A limited number of partial tuition waivers may be available, for applicants from developing countries only.
Please indicate in your application whether you need funding!

For those requesting funding, acceptance will be evaluated together with their granted fee status.

Payment instructions will be given to accepted applicants only.
Fees will have to be paid to the University of Bologna (credit cards are not accepted).


IF YOU WISH TO APPLY:
Please send an application form and a short CV to:
rbmie2015@cid-bo.org


Download an application form HERE, fill it out and send it in!

Applications will be accepted on a rolling basis.
Applicants must have an undergraduate degree and must be proficient in English.

Enrol in this summer program in the exciting setting of Bologna!
Send in an application! Tell your colleagues!

For any information, please contact:
CID Summer School programmes - Results-Based M&E and Evaluation course: rbmie2015@cid-bo.org

PIER GIORGIO ARDENI (Academic Director): piergiorgio.ardeni@unibo.it
CECILIA TINONIN (Manager): ceciliatinonin@gmail.com

Important deadlines to remember


  • Application for Early-Bird Registration: 1 June 2015
  • Credited payment (bank transfer) of Early-Bird Registration fee: 30 June 2015
  • Application for Full Registration: 31 July 2015 (EXTENDED DEADLINE: 31 AUGUST 2015)
  • Credited payment (bank transfer) of Full Registration fee: 31 August 2015