Browse Evaluation Capacity Building Resources
Pathfinder Advocate Edition: A Practical Guide to Advocacy Evaluation
Pathfinder is a practical guide to the advocacy evaluation process; this edition walks advocates through that process from start to finish. Editions for evaluators and funders are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement.
Author: Innovation Network | Type: Workbooks & Guides | Date: Nov 1, 2009 | Point K Pick | Download (1.12 MB)
Pathfinder Evaluator Edition: A Practical Guide to Advocacy Evaluation
Pathfinder is a practical guide to the advocacy evaluation process; this edition walks evaluators through that process from start to finish. Editions for advocates and funders are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement.
Author: Innovation Network | Type: Workbooks & Guides | Date: Nov 1, 2009 | Point K Pick | Download (1.4 MB)
Pathfinder Funder Edition: A Practical Guide to Advocacy Evaluation
Pathfinder is a practical guide to the advocacy evaluation process; this edition walks funders through that process from start to finish. Editions for advocates and evaluators are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement.
Author: Innovation Network | Type: Workbooks & Guides | Date: Nov 1, 2009 | Point K Pick | Download (1.17 MB)
Pathfinder: Resource List
This annotated bibliography accompanies Innovation Network's Pathfinder: A Practical Guide to Advocacy Evaluation series. It includes resources for each of the topics covered in Pathfinder.
Author: Innovation Network, Inc. | Type: Workbooks & Guides | Date: Nov 3, 2009 | Point K Pick | Download (211.66 KB)
Performance Management and Evaluation: Two Sides of the Same Coin (Presentation slides)
Performance management and evaluation: what's the difference? With an increasing emphasis on measurement and impact, service providers and their funders are pushing for ever more sophisticated evaluation approaches, such as experimental and quasi-experimental designs. However, experimental methods are rarely appropriate, feasible, or cost-effective for the majority of organizations and service providers.
Author: Isaac Castillo, Ann K. Emery | Type: Presentation Slides | Date: Oct 16, 2013 | Point K Pick | Download (1.42 MB)
Performance Monitoring and Evaluation Tips
A guide on focus groups from the U.S. Agency for International Development (USAID).
Author: USAID Center for Development Information and Evaluation | Type: Workbooks & Guides | Date: Jan 1, 1996 | Web Link
A Performance Monitoring Framework for Conservation Advocacy
This report was written in response to the New Zealand Department of Conservation’s environmental advocacy work. It "sets out a framework for monitoring the effectiveness of conservation advocacy programmes in increasing public awareness about, and involvement in, conservation." The report offers advocacy monitoring and evaluation guidelines applicable beyond the environmental advocacy field.
Author: James, Bev | Type: Research & Reports | Date: Mar 1, 2001 | Download (229.7 KB)
Philanthropic Strategies and Tactics for Change: A Concise Framework
This article discusses the various tactics grantmakers rely on to create impact. The author describes theories of change, theories of leverage, programmatic tactics, and grantmaking tactics. Within programmatic tactics, he discusses the need for evaluations to allow for ongoing program adjustments and to inform future efforts.
Author: Frumkin, Peter | Type: Research & Reports | Date: Aug 31, 2002 | Web Link
Power, Participation, and State-based Politics: An Evaluation of the Ford Foundation's Collaborations that Count Initiative
The Applied Research Center (ARC) conducted a two-year participatory evaluation to provide an account of the Ford Foundation's Collaborations That Count Initiative. The report identifies areas in which the 11 statewide collaborations succeeded and draws attention to ways in which support for collaboration might be provided more effectively in the future.
Author: Applied Research Center | Type: Research & Reports | Date: Apr 21, 2004 | Download (1.85 MB)
Program Development and Evaluation
This site provides a comprehensive set of resources on planning and implementing an evaluation. Some of its tools require a free login.
Author: University of Wisconsin - Extension | Type: Websites & Online Tools | Date: Jan 18, 2008 | Web Link
Program Evaluation Guide
A general introduction to evaluation, this guide walks the reader through the process of answering the following seven basic questions:
1. What do we want to evaluate?
2. What is the purpose of the evaluation?
3. What type of evaluation do we want to use?
4. What information do we need to answer our questions?
5. How do we get the information?
6. How will we analyze the information?
7. How will we use and share the results?
Author: Katie Cangemi and Maggie Litgen | Type: Workbooks & Guides | Date: Dec 31, 2011 | Download (195.46 KB)
Program Evaluation: Assessing and Measuring Your Program’s Performance
Innovation Network's slideshow, from a presentation to the 2007 Nonprofit Technology Conference in Washington, D.C., about the importance of planning for effective program evaluation, using our online tools, and leveraging online communities for knowledge-sharing.
Author: Innovation Network, Inc. | Type: Presentation Slides | Date: Jan 18, 2008 | Download (1.15 MB)
Program Evaluation: Igniting the Untapped Power
An introductory piece about the power of evaluation, including Innovation Network's approach and things to consider before beginning an evaluation effort.
Author: Innovation Network, Inc. | Type: Opinion (blog, editorial) | Date: Mar 1, 2002 | Download (227.62 KB)
ProPack III - The CRS Project Package: A Guide to Creating a SMILER M&E System
The approach to M&E described in this guide is called SMILER: a comprehensive, practical approach to developing a project monitoring system that incorporates processes for learning based on robust evidence. Written for CRS project managers and technical and M&E staff, it describes how to develop, in their work with partners and communities, an M&E system in which data are systematically collected, reported, and used to make project decisions.
Author: Susan Hahn, Guy Sharrock | Type: Workbooks & Guides | Date: Jun 15, 2010 | Download (1.18 MB)
Public Communication Campaign Evaluation
First in the series of five papers from the Communications Consortium Media Center (q.v.), this paper is a "scan of challenges, criticisms, practice, and opportunities." Author Julia Coffman:
- Discusses recent events in the field of advocacy evaluation;
- Examines evaluation challenges, criticisms, and practice; and
- Includes sections on relevant theory, outcomes, and evaluation design.
Author: Coffman, Julia | Type: Research & Reports | Date: May 1, 2002 | Download (190.47 KB)
Readiness for Evaluation and Learning: Assessing Grantmaker and Grantee Capacity
When undertaking a new organizational or program approach to evaluation, begin with questions of readiness: What is the existing evaluation practice of my organization or program? What is the existing evaluation capacity of my organization or program?
Author: Johanna Morariu, Innovation Network, Inc. | Type: Research & Reports | Date: Apr 1, 2012 | Point K Pick | Download (375.25 KB)
Report from "Planning, Assessing and Learning from Advocacy Workshop"
This report summarizes a four-day workshop held in Accra, Ghana, in April 2006, a collaboration between INTRAC and ActionAid. INTRAC was seeking to understand M&E as practiced on the ground as part of its preparation for an international conference; ActionAid was motivated by a desire to present findings from three years of the Action Research Project and to give participants a forum to express themselves.
Author: INTRAC | Type: Research & Reports | Date: Apr 30, 2006 | Download (47.11 KB)
Report: "Ten Considerations for Advocacy Evaluation Planning: Lessons Learned from KIDS COUNT Grantee Experiences"
The Annie E. Casey Foundation and Organizational Research Services, Inc. detail ten lessons learned from an evaluation of five KIDS COUNT grantees that began in 2007. The evaluation was designed to test some of the ideas presented in "A Guide to Measuring Advocacy and Policy," a report produced by AECF and ORS in 2006.
Author: Organizational Research Services | Type: Research & Reports | Date: Jan 1, 2009 | Download (289.93 KB)
Report: "What Makes an Effective Advocacy Organization? A Framework for Determining Advocacy Capacity"
TCC Group's Jared Raynor, Peter York, and Shao-Chee Sim authored this report, based on TCC's evaluation of a cohort of advocacy groups funded by The California Endowment.
Author: TCC Group | Type: Research & Reports | Date: Jan 1, 2009 | Download (5.81 MB)
Room for Improvement: Foundations' Support of Nonprofit Performance Assessment
Amid growing pressure on nonprofits to measure and assess their performance, the Center for Effective Philanthropy (CEP) finds that nonprofits sorely lack the support they need. CEP finds that 81 percent of the nonprofits surveyed believe that nonprofits should demonstrate the effectiveness of their work by using performance measures. CEP also finds that nonprofits very much want to be able to understand their performance and are taking steps to do so, and that they want more help with performance assessment than they currently receive from their foundation funders.
Author: Brock, Andrea; Buteau, Ellie; Herring, An-Li | Type: Research & Reports | Date: Sep 1, 2012 | Download (1.04 MB)
Sample Outcomes Chain
Outcomes can't all be attained at the same time; some rely on the earlier achievement of others. Advocacy evaluation presents unique conceptual challenges to developing an outcomes chain. This outcomes chain contains short-term, intermediate, and long-term outcomes for both policy and infrastructure, as related to a fictional advocacy campaign.
Author: Innovation Network, Inc. | Type: Tipsheets & Paper Tools | Date: Nov 23, 2005 | Download (53.03 KB)
Sample Size Calculator
A handy online calculator for determining the appropriate sample size for a given population. The page also includes a discussion of sample size and other factors that affect the confidence interval of your data collection process.
Author: Creative Research Systems | Type: Websites & Online Tools | Date: Jan 18, 2008 | Web Link
Sample Size Calculator
This online sample size calculator can help you rapidly estimate how large your sample needs to be to compensate for different margins of error and confidence levels. Free and easy to use, the page also includes good introductory information on how to apply the resource.
Author: Raosoft | Type: Websites & Online Tools | Date: Jan 1, 2004 | Web Link
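Neither calculator publishes its internal code here, but both describe the same underlying arithmetic: relating population size, margin of error, and confidence level. As a rough illustration of what such tools compute, the sketch below applies the standard formula for a proportion, n0 = z²·p·(1−p)/e², with a finite-population correction. The function and parameter names are illustrative, not taken from either site.

```python
from math import ceil
from statistics import NormalDist

def sample_size(population, margin_of_error=0.05, confidence=0.95, p=0.5):
    """Estimate the survey sample size needed for a proportion.

    A minimal sketch of the textbook formula most online sample size
    calculators describe: n0 = z^2 * p * (1 - p) / e^2, followed by a
    finite-population correction. p = 0.5 is the conservative
    (worst-case) assumption about the response distribution.
    """
    # z-score for the two-sided confidence level, e.g. ~1.96 for 95%
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    # Infinite-population sample size
    n0 = z ** 2 * p * (1 - p) / margin_of_error ** 2
    # Finite-population correction
    n = n0 / (1 + (n0 - 1) / population)
    return ceil(n)

print(sample_size(10_000))  # population of 10,000, ±5% at 95% confidence
```

For a population of 10,000 at a 5% margin of error and 95% confidence, this yields 370, which matches the figure such calculators typically report; for very large populations the result approaches the familiar 384.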
Simplifying Complex Initiative Evaluation
In this article, the author highlights two major lessons from theory-of-change and cluster evaluation about how to evaluate complex initiatives: articulating a theory of change, and using that theory as a basis for evaluation planning. She also shares tips on how she applied these lessons to meet her own evaluation challenges.
Author: Coffman, Julia | Type: Newsletters & Periodicals | Date: Jun 1, 1999 | Web Link
Smart Chart 3.0
Spitfire Strategies produced Smart Chart (now in version 3.0) as "a tool to help nonprofits make smart communications choices." This free online tool is built around a chart with five "strategic decision sections."
Author: Spitfire Strategies | Type: Websites & Online Tools | Date: Feb 1, 2004 | Web Link