Evaluation Design Resources
A Guide to Measuring Advocacy and Policy Developed for the Annie E. Casey Foundation by Organizational Research Services, this guide offers perspective on where the field of philanthropy has been with regard to evaluating advocacy and policy work, and acknowledges the unique issues and challenges associated with measuring these efforts. It also serves as a broad call to grantmakers to engage in, expand, and advance the field of advocacy and policy evaluation, with sections on the context of advocacy evaluation and on evaluation design.
Author: Reisman, Jane, et al. (Organizational Research Services) Type: Research & Reports Date: Jan 1, 2007 Download (255.04 KB)
Advocacy Evaluation Update (Issue #3, March 2008) The third issue of Innovation Network's Advocacy Evaluation Update includes the following:
1) updates about advocacy evaluation resources available online;
2) highlights from the Advocacy and Policy Change TIG (Topical Interest Group) conference sessions (including research summaries and links to additional resources) presented at the American Evaluation Association's 2007 annual conference; and
3) interviews with three of the leaders of AEA's Advocacy and Policy Change TIG.
Author: Innovation Network Type: Newsletters & Periodicals Date: Mar 12, 2008 Download (584.99 KB)
Agency Experiences with Outcome Measurement: Survey Findings As of January 2000, 400 United Ways across the country were asking the programs they fund to identify and measure their outcomes: the benefits or changes the programs want participants to experience as a result of their services. United Ways are not alone. Many state and local government agencies, foundations, managed care systems, and accrediting bodies have added outcome measurement to the list of performance and accountability measures they require of nonprofit organizations within their sphere.
Author: United Way of America Type: Research & Reports Date: Jan 1, 2000 Download (209.74 KB)
Basic Guide to Program Evaluation Carter McNamara's guide to evaluation, part of his Free Management Library and drawn from his book Field Guide to Nonprofit Program Design, Marketing and Evaluation. This guide covers the hows and whys of evaluation, an overview of evaluation and data collection methodologies and how to choose between them, analysis and interpretation advice, and other tips. Author: McNamara, Carter (ed.) Type: Websites & Online Tools Date: Jan 18, 2008
Web Link
Catholic Relief Services' (CRS) Guidance for Developing Logical and Results Frameworks This document summarizes Catholic Relief Services’ (CRS) guidance for developing logical and results frameworks.
• A logical framework “is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan, and the changes or results you hope to achieve.” CRS’ Proframe and the U.K. Department for International Development’s (DFID’s) logframe are examples of logical frameworks.
Author: Carlisle J. Levine Type: Workbooks & Guides Date: Jan 31, 2007 Download (196.44 KB)
Coalition Assessment: Approaches for Measuring Capacity and Impact Why assess coalition capacity? How should a coalition be assessed? How can coalition assessment data be analyzed and used?
Author: Veena Pankaj, Kat Athanasiades, and Ann Emery Type: Research & Reports Date: Feb 4, 2014 Point K Pick Download (768.05 KB)
Conquering the Dusty Shelf Report: Data Visualization for Evaluation In this blog post for Visualising Data, Johanna Morariu and Ann Emery share three tactics for tackling the Dusty Shelf Report in evaluation: captivating the readers with visuals, choosing the design that's right for the reader, and strengthening the dataviz literacy of readers. Author: Johanna Morariu and Ann Emery Type: Opinion (blog, editorial) Date: May 7, 2013 Web Link
Current Advocacy Evaluation Practice Framing Paper Written for the Advocacy Evaluation Advances convening in January 2009, this paper summarizes the current state of advocacy evaluation practice. The paper identifies four evaluation design questions and then offers common responses to them: Who will do the evaluation? What will the evaluation measure? When will the evaluation take place? And what methodology will the evaluation use?
Author: Julia Coffman Type: Research & Reports Date: Jan 31, 2009 Download (297 KB)
Data and Information Visualization Throughout the Evaluation Life Cycle for Participatory Evaluation and Evaluation Capacity Innovation Network shared approaches and examples of how to incorporate innovative data and information visualization techniques throughout each stage of the evaluation life cycle to support participatory evaluation and build evaluation capacity. In the planning and design phase, mind mapping can be used to promote brainstorming and idea generation. In the data collection stage, evaluators can use creative visuals to improve stakeholder understanding of and participation in data collection, and can adhere to good design principles to create effective data collection instruments. Author: Johanna Morariu, Myia Welsh, Veena Pankaj, Melissa March Type: Presentation Slides Date: Nov 3, 2011 Web Link
Data and Information Visualization Throughout the Life Cycle for Participatory Evaluation and Evaluation Capacity Building This handout lists resources to help improve your data visualization skills. The list provides links to websites that will help you design the right color scheme (such as Design Seeds), websites that provide basic information about principles of design, and examples of how evaluators, statisticians, and computer scientists are using dataviz to help us understand data better.
Author: Johanna Morariu and Veena Pankaj Type: Websites & Online Tools Date: Jan 1, 2013 Download (392.28 KB)
Designing a Results Framework for Achieving Results: A How-To Guide A results framework is a key tool in the development landscape, enabling practitioners to discuss and establish strategic development objectives and then link interventions to intermediate outcomes and results that directly relate to those objectives. This publication provides how-to guidance for developing results frameworks. Author: Independent Evaluation Group (IEG) Type: Workbooks & Guides Date: Sep 1, 2012 Download (834.16 KB)
Dynamic Dozen: Delivery. Tips from top presenters from the American Evaluation Association This study consisted of interviews with a dozen of the top AEA presenters to get their secrets about how to make and deliver great presentations. Their comments were grouped into three stages of presenting: message, design, and delivery. This report focuses solely on Delivery.
Author: Anjanette Raber Type: Workbooks & Guides Date: Aug 1, 2012 Download (489.15 KB)
Dynamic Dozen: Design. Tips from top presenters from the American Evaluation Association This study consisted of interviews with a dozen of the top AEA presenters to get their secrets about how to make and deliver great presentations. Their comments were grouped into three stages of presenting: message, design, and delivery. This report focuses solely on Design, that is, the intentional composition of slides. While the context for their talks spanned long and short presentations, and included different types of audiences and purposes, their insights can be used or modified by evaluators for their own presentations at the AEA annual conference and elsewhere.
Author: Anjanette Raber Type: Workbooks & Guides Date: Aug 1, 2012 Download (431.09 KB)
Dynamic Dozen: Message. Tips from top presenters from the American Evaluation Association This study consisted of interviews with a dozen of the top AEA presenters to get their secrets about how to make and deliver great presentations. Their comments were grouped into three stages of presenting: message, design, and delivery. This report focuses solely on Message, that is, the mindful planning of a structured presentation. While the context for their talks spanned long and short presentations, and included different types of audiences and purposes, their insights can be used or modified by evaluators for their own presentations at the AEA annual conference and elsewhere.
Author: Anjanette Raber Type: Workbooks & Guides Date: Aug 1, 2012 Download (555.08 KB)
Evaluating System Change: A Planning Guide This methods brief provides guidance on planning effective evaluations of system change interventions. It begins with a general overview of systems theory and then outlines a three-part process for designing system change evaluations. This three-part process aligns (1) the dynamics of the targeted system or situation, (2) the dynamics of the system change intervention, and (3) the intended purpose(s) and methods of the evaluation.
Author: Margaret B. Hargreaves Type: Research & Reports Date: Apr 1, 2010 Download (1.63 MB)
Evaluation Principles and Practices: An Internal Working Paper The purpose of this document is to advance the Foundation’s existing work so that our evaluation practices become more consistent across the organization. We hope to create more common understanding of our philosophy, purpose, and expectations regarding evaluation as well as clarify staff roles and available support. With more consistency and shared understanding, we expect less wheel re-creation across program areas, greater learning from each other’s efforts, and faster progress in designing meaningful evaluations and applying the results.
Author: Fay Twersky & Karen Lindblom Type: Research & Reports Date: Jan 22, 2013 Download (1.42 MB)
Getting the Most from Evaluation In the June 6, 2007 issue of the e-newsletter "Nonprofit Tools You Can Use" from the Fieldstone Alliance, Vince Hyman introduces an excerpt from the book Information Gold Mine: Innovative Uses of Evaluation by Paul Mattessich, et al. The excerpt discusses how a Colorado nonprofit, YouthZone, used its evaluation findings in a variety of ways, including identifying infrastructure deficiencies, marketing, and informing changes to program design. Author: Vince Hyman (Fieldstone Alliance); Paul Mattessich, et al. Type: Newsletters & Periodicals Date: Jun 6, 2007 Web Link
Guía para la Formulación de Marcos Lógicos y de Resultados de Catholic Relief Services (CRS) The Spanish-language version of CRS' guidance for developing logical and results frameworks. A logical framework “is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan, and the changes or results you hope to achieve.” CRS’ Proframe and the U.K. Department for International Development’s (DFID’s) logframe are examples of logical frameworks.
Author: Carlisle J. Levine Type: Workbooks & Guides Date: Jan 31, 2007 Download (210.25 KB)
Guidance Note #3: Introduction to Mixed Methods in Impact Evaluation Mixed methods (MM) evaluations seek to integrate social science disciplines with predominantly quantitative (QUANT) and predominantly qualitative (QUAL) approaches to theory, data collection, data analysis and interpretation. The purpose is to strengthen the reliability of data, validity of the findings and recommendations, and to broaden and deepen our understanding of the processes through which program outcomes and impacts are achieved, and how these are affected by the context within which the program is implemented.
Author: Michael Bamberger Type: Workbooks & Guides Date: Sep 5, 2012 Web Link
How to Design a Monitoring and Evaluation Framework for a Policy Research Project This guidance note focuses on the designing and structuring of a monitoring and evaluation framework for policy research projects and programmes.
The primary audience for this guidance note is people designing and managing monitoring and evaluation. However, it will be a useful tool for anyone involved in monitoring and evaluation activities.
The framework presented in this guidance note is intended to be used in a flexible manner depending on the purpose and characteristics of the research project.
Author: Methods Lab Type: Workbooks & Guides Date: Jan 1, 2016 Download (346.06 KB)
How to Use Data Visualization to Better Tell Your Story Memos and metrics, emails and texts, newsletters and reports: Is your organization suffering from information overload? We consume 34 gigabytes, or 100,500 words, of information every day. Our brains are overwhelmed and struggling to keep up. Data visualization, or dataviz, is one of the strongest weapons against information overload. Author: Ann Emery Type: Opinion (blog, editorial) Date: Feb 1, 2014 Point K Pick Web Link
Impact Evaluation in Practice This book provides an overview of impact evaluation from the perspective of the World Bank.
Author: Paul J. Gertler, Sebastian Martinez, Patrick Premand, Laura B. Rawlings, Christel M. J. Vermeersch Type: Workbooks & Guides Date: Jan 1, 2011 Download (3.06 MB)
Logic Models in Participatory Evaluation Slides providing a basic introduction to the use of logic models in participatory evaluation.
Author: Douglas Bruce Type: Presentation Slides Date: Sep 1, 2011 Download (725.48 KB)
Measuring Progress Towards Safety and Justice: A Global Guide to the Design of Performance Indicators across the Justice Sector From the introduction: "This guide is written for programme managers responsible for improving the delivery of safety, security, and access to justice in any part of the world. It should also be useful to a wide variety of government officials and to anyone interested in pursuing a disciplined course of institutional reform in the safety and justice sector." Author: Vera Institute of Justice Type: Workbooks & Guides Date: Jun 1, 2003 Download (848.46 KB)