Monitoring and Evaluation Methods and Applications
Cost-Inclusive Outcome Evaluation
A Professional Summer Training Programme
for Public Servants, Experts and Practitioners
Bologna, Italy, 5-9 September 2016
As part of the 11th Annual Edition of the joint Bologna Centre for International Development / Department of Economics Summer School Programme on Monitoring and Evaluation, the programme's focus for the September 2016 course is Cost-Inclusive Outcome and Impact Evaluation.
This course is primarily targeted at officials who face the challenge of managing and evaluating results. Results-based M&E systems are a continuous work in progress for both developed and developing countries, and a cost-inclusive evaluation of outcomes is necessary. When implemented properly, these systems provide a continuous flow of information feeding back into the system, which can help guide policymakers toward achieving the desired results. Seasoned program managers in public administration and international organizations—where results-based M&E systems are now in place—are using this approach to gain insight into the performance of their respective organizations.
Module 1 (day 1, morning). Introduction to Outcome and Impact Evaluation: from M&E to IE
(day 1, afternoon). Standard Approaches to IE: Counterfactual Analysis and RCTs
Module 2 (day 2). From Social Impact to Social Value Beyond Cost-Benefit Analysis: New Metrics and Rhetorics of Performance Evaluation
Module 3 (day 3). Theory-based Evaluations: From Realist Syntheses to Developmental Approaches to Address Complex Programs and Unintended Outcomes
Module 4 (day 4). Comparing Quantitative, Qualitative and Mixed Methods - an overview of Multivariate Analysis, Regression Analysis, Social Network Analysis, In-depth Interviewing, Focus Groups, Case Studies and Qualitative Comparative Analysis
Module 5 (day 5). IE and its (Elusive) Use in Policy Design: Instrumental, Enlightening and Manipulative Uses
First module (day 1)
Morning session: Introduction to Outcome and Impact Evaluation: from Monitoring & Evaluation to Impact Evaluation
The morning session will present a variety of approaches to planning and designing evaluation and performance monitoring systems, ensuring that the benefits of evaluation outweigh its costs. This module will suggest strategies for involving intended users and other key stakeholders in evaluation planning and design, and discuss the use of logic models, evaluability assessment and other exploratory evaluation approaches, from goal setting, indicator selection, establishing baseline data and setting targets, to means of data analysis and reporting, to ensuring use and sustainability.
Special emphasis will be put on how to analyze program theories — including program activities, outputs, program outcomes, and contextual factors affecting the extent to which program activities produce desired program results.
Afternoon session: Standard Approaches to Impact Evaluation: Counterfactual Analysis and RCTs
The afternoon session will address the evaluation question of "did the program work?" by providing an overview of experimental and quasi-experimental designs. Specifically, counterfactual analysis and randomized controlled trials (RCTs) will be critically discussed as impact assessment strategies seeking to solve the methodological problem of causal attribution.
Second module (whole day)
From Social Impact to Social Value Beyond Cost-Benefit Analysis: New Metrics and Rhetorics of Performance Evaluation
The module will address the question of whether the program generated social value. Social Return on Investment (SROI) will be presented as a new technique to evaluate the impacts of programs and their value relative to their costs. From its emergence as a form of conventional benefit-cost analysis that examines the contribution of specific activities to short- and long-term outcomes of societal importance, SROI has evolved into a certified form of high-stakes performance measurement specifically for social enterprise organizations delivering health, environmental, and social care services.
Third module (whole day)
Theory-based Evaluations: From Realist Syntheses to Developmental Approaches to Address Complex Programs and Unintended Outcomes
The module will address the question of "how, why, for whom, and in what circumstances did the program work?"
Building upon the broad-based family of theory-based evaluations, this module will explore the realist, developmental and complexity-based approaches to assessing overarching programs, with multi-objective, multi-site, and multi-actor components and 'emergent' results in times of highly pervasive uncertainty. A special focus on gender-responsive evaluation approaches will also be presented to discuss such complex issues as gender inequalities and power imbalances, work-family interfaces and cooperative conflict between genders.
Fourth module (whole day)
Comparing Quantitative, Qualitative and Mixed Methods — An Overview of Multivariate Analysis, Regression Analysis, Social Network Analysis, In-depth Interviewing, Focus Groups, Case Studies and Qualitative Comparative Analysis
The module will provide an overview of qualitative and quantitative methods for data analysis and offer advice on important methods for analyzing qualitative data, the use of appropriate statistics and statistical tests, cost-effectiveness and cost-benefit analysis, meta-analyses and evaluation syntheses. Participants will discuss the requirements that must be met to use these data analysis techniques through examples illustrating their application.
Fifth module (whole day)
Impact Evaluation and its (Elusive) Use in Policy Design: Instrumental, Enlightening and Manipulative Uses
The module will describe methods for getting evaluation results used, developing options and recommendations for contracting for evaluations, and reporting findings persuasively to overcome political and bureaucratic challenges to the use of evaluation findings.
Marra, M. (2015) "Cooperating for a More Egalitarian Society: Complexity Theory for Evaluating Gender Equity", Evaluation, Vol. 21(1): 32-46.
Newcomer, K. E., Hatry, H. P., Wholey, J. S. (2015) Handbook of Practical Program Evaluation, Jossey-Bass.
Patton, M. Q., (2010) Developmental Evaluation. Applying Complexity Concepts to Enhance Innovation and Use, Guilford Press.
Rogers, P. (2008) "Using Programme Theory to Evaluate Complicated and Complex Aspects of Interventions", Evaluation, Vol. 14(1): 29-48.
Forss, K., Marra, M., Schwartz, R. (2011) Evaluating the Complex: Attribution, Contribution, and Beyond, Transaction Publishers, New Brunswick, US.
Weiss, C. (1997) Theory-based evaluation: past, present, and future. Progress and Future Directions in Evaluation: Perspectives on Theory, Practice, and Methods, 76: 41–55.
The course will be delivered by Mita Marra, expert evaluator and trainer. Mita Marra teaches Public Policy and Public Administration at the University of Salerno in Italy. Her current research interests revolve around the evaluation of complex policies for human and social development. Dr. Marra studied Economics at the University of Naples (1995) in Italy and received an MA in International Relations from the School of Advanced International Studies (SAIS) of Johns Hopkins University (Washington DC, 1998) and a PhD in Public Policy from the Trachtenberg School of Public Policy and Public Administration of the George Washington University (Washington DC, 2003). Since 1998, she has been working as a consultant with the World Bank, the United Nations Economic Commission for Europe, and a number of regional governments in Italy, conducting evaluation research on institutional development programs as well as gender equality and social policies. She has also been teaching in the World Bank's International Program in Development Evaluation Training (IPDET) at Carleton University (Canada). Dr. Marra has authored a number of articles and books. For Transaction Publishers, NJ, in 2011, with Kim Forss and Robert Schwartz, she co-edited "Evaluating the Complex: Attribution, Contribution, and Beyond." In 2014, with Kim Forss, she co-edited "Speaking Justice to Power: Ethical and Methodological Challenges for Evaluators." Mita Marra is Associate Editor of the international journal Evaluation and Program Planning (July 2015) and the president-elect of the Italian Evaluation Association AIV (April 2013).
The director of the Summer Programme is Pier Giorgio Ardeni, professor of political economy and development economics at the University of Bologna, policy advisor, and expert in monitoring and evaluation, poverty reduction programs and statistical development. Prof. Ardeni is also the current president of the Cattaneo Institute Research Foundation.