Browse Practitioners' Resources
-
Program Evaluation Guide A general introduction to evaluation, this guide walks the reader through the process of answering the following seven basic questions:
1. What do we want to evaluate?
2. What is the purpose of the evaluation?
3. What type of evaluation do we want to use?
4. What information do we need to answer our questions?
5. How do we get the information?
6. How will we analyze the information?
7. How will we use and share the results?
Author: Katie Cangemi and Maggie Litgen Type: Workbooks & Guides Date: Dec 31, 2011 Download (195.46 KB) -
Program Evaluation: Assessing and Measuring Your Program’s Performance Innovation Network's slideshow (from a presentation to the 2007 Nonprofit Technology Conference in Washington, D.C.) about the importance of planning for effective program evaluation, using our online tools, and leveraging online communities for knowledge-sharing. Author: Innovation Network, Inc. Type: Presentation Slides Date: Jan 18, 2008 Download (1.15 MB) -
Program Evaluation: Igniting the Untapped Power An introductory piece about the power of evaluation, including Innovation Network's approach and things to consider before beginning an evaluation effort. Author: Innovation Network, Inc. Type: Opinion (blog, editorial) Date: Mar 1, 2002 Download (227.62 KB) -
Project Evaluation Guide: Module 7, Culturally Responsive Evaluation The purpose of this module is to alert users to the importance of culturally responsive evaluation and to explain some of its key components. It discusses strategies that have been found to be useful in conducting evaluations that are responsive to all cultures.
Author: National Science Foundation Type: Workbooks & Guides Date: Nov 9, 2010 Point K Pick Web Link -
ProPack III - The CRS Project Package: A Guide to Creating a SMILER M&E System The approach to M&E described in this guide is called SMILER. It is a comprehensive and practical approach to developing a project monitoring system that incorporates processes for learning based on robust evidence. It has been written for CRS project managers, technical, and M&E staff to guide their work with partners and communities by describing how to develop an M&E system in which data are systematically collected, reported and used to make project decisions.
Author: Susan Hahn, Guy Sharrock Type: Workbooks & Guides Date: Jun 15, 2010 Download (1.18 MB) -
Pros and Cons of Evaluation A solid two-page overview of why foundations should get involved with evaluation presented in a straightforward Pro vs. Con fashion. Author: Janet Carter Type: Opinion (blog, editorial) Date: Jan 1, 2003 Download (57.57 KB) -
Prove & Improve – A Self-Evaluation Resource for Voluntary and Community Organisations This document is intended to provide a starting point for exploring evaluation - and in particular, self evaluation of outcomes - and for thinking about some of the issues involved in this essential area of developing and running a successful project. Author: Community Evaluation Northern Ireland Type: Websites & Online Tools Date: Oct 1, 2008
Download (508.77 KB) -
Public Communication Campaign Evaluation First in the series of five papers from the Communications Consortium Media Center (q.v.), this paper is a "scan of challenges, criticisms, practice, and opportunities." Author Julia Coffman:
- Discusses recent events in the field of advocacy evaluation;
- Examines evaluation challenges, criticisms, and practice; and
- Includes sections on relevant theory, outcomes, and evaluation design.
Author: Coffman, Julia Type: Research & Reports Date: May 1, 2002 Download (190.47 KB) -
Putting the system back into systems change: A framework for understanding and changing organizational and community systems This paper provides one framework—grounded in systems thinking and change literatures—for understanding and identifying the fundamental system parts and interdependencies that can help to explain system functioning and leverage systems change. The proposed framework highlights the importance of attending to both the deep and apparent structures within a system as well as the interactions and interdependencies among these system parts. Author: Pennie G. Foster-Fishman, Branda Nowell, Huilan Yang Type: Research & Reports Date: May 18, 2007 Download (381.29 KB) -
Rapid Evaluation The purpose of this guide is to introduce the basic concepts and methods used in rapid evaluations (REs), and to demonstrate how this approach can be applied to the various stages of program development and implementation.
It covers important terms and definitions, when to use REs, and the advantages and disadvantages of REs; it also outlines the five elements of an RE and provides additional reading resources.
Author: The International Training and Education Center for Health (I-TECH) Type: Workbooks & Guides Date: Jan 1, 2008 Download (272.44 KB) -
Report from "Planning, Assessing and Learning from Advocacy Workshop" This report summarizes a 4-day workshop held in Accra, Ghana, in April 2006. The workshop was a collaboration between INTRAC and ActionAid. INTRAC was seeking to understand M&E as practiced on the ground as part of its preparation for an international conference, and ActionAid was motivated by a desire to present findings from three years of the Action Research Project and give participants a forum to express themselves. Author: INTRAC Type: Research & Reports Date: Apr 30, 2006 Download (47.11 KB) -
Report: "What Makes an Effective Advocacy Organization? A Framework for Determining Advocacy Capacity" TCC Group's Jared Raynor, Peter York and Shao-Chee Sim authored "What Makes an Effective Advocacy Organization? A Framework for Determining Advocacy Capacity" based on TCC's evaluation of a cohort of advocacy groups funded by The California Endowment. Author: TCC Group Type: Research & Reports Date: Jan 1, 2009 Download (5.81 MB) -
Room for Improvement: Foundations' Support of Nonprofit Performance Assessment Amidst growing pressure for nonprofits to measure and assess their performance, the Center for Effective Philanthropy (CEP) finds that nonprofits sorely lack the support they need. CEP finds that 81 percent of the nonprofits surveyed believe that nonprofits should demonstrate the effectiveness of their work by using performance measures.
CEP also finds that:
- Nonprofits very much want to be able to understand their performance and are taking steps to do so.
- Nonprofits want more help in performance assessment efforts than they are currently receiving from their foundation funders.
Author: Brock, Andrea; Buteau, Ellie; Herring, An-Li Type: Research & Reports Date: Sep 1, 2012 Download (1.04 MB) -
Sample Outcomes Chain Outcomes can't all be attained at the same time, and some outcomes rely on the earlier achievement of others. Advocacy evaluation presents unique conceptual challenges to developing an outcomes chain. This outcomes chain contains short-term, intermediate, and long-term outcomes for both policy and infrastructure, as related to a fictional advocacy campaign. Author: Innovation Network, Inc. Type: Tipsheets & Paper Tools Date: Nov 23, 2005
Download (53.03 KB) -
Sample Questions for Gauging Progress in Advocacy Innovation Network developed this resource about advocacy evaluation: Although successful advocacy efforts may take decades, there are methods for measuring progress over shorter periods of time. This two-page handout provides some key ideas and sample questions to consider.
Author: Innovation Network, Inc. Type: Tipsheets & Paper Tools Date: Jun 10, 2008
Download (196.08 KB) -
Sample Size Calculator This online sample size calculator can help you rapidly estimate how large your sample needs to be to compensate for different margins of error and confidence levels. Free and easy to use, the page also includes good introductory information on how to apply the resource. Author: Raosoft Type: Websites & Online Tools Date: Jan 1, 2004 Web Link -
Sample Size Calculator A handy online calculator for determining the appropriate sample size for a given population. This page also includes a discussion of sample size and other factors that affect the confidence interval of your data collection process. Author: Creative Research Systems Type: Websites & Online Tools Date: Jan 18, 2008 Web Link -
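As a rough illustration of the calculation these two tools perform, the sketch below uses Cochran's formula with an optional finite-population correction. The exact method each calculator uses is an assumption, and the function name, default values, and z-score table here are illustrative only:

```python
import math

def sample_size(margin_of_error=0.05, confidence=0.95,
                proportion=0.5, population=None):
    """Estimate required sample size via Cochran's formula.

    Assumes a two-tailed normal approximation; proportion=0.5 gives
    the most conservative (largest) estimate.
    """
    # z-scores for common confidence levels (normal approximation)
    z_scores = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}
    z = z_scores[confidence]
    # Cochran's formula for an effectively infinite population
    n = (z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    # Apply the finite-population correction when a population size is known
    if population is not None:
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# A 5% margin of error at 95% confidence for a large population
print(sample_size())                 # 385
# The same settings for a population of 1,000
print(sample_size(population=1000))  # 278
```

These results match the figures commonly quoted for a ±5% margin at 95% confidence; the finite-population correction shows why smaller populations need proportionally smaller samples.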
Seeing the Forest (Beyond the Trees): Learning Across the Experiences of Seven Advocacy Evaluators [Slides] Advocacy and policy change evaluation continues to evolve and mature, from a fledgling field a few years ago to the flourishing field of today. Evaluators are advancing as well, developing an increasingly robust collective understanding about what works for advocacy evaluation. In this session, a diverse group of seven advocacy evaluators explored and synthesized observations drawn from an array of real-world experiences. Panelists spoke to targeted questions, weaving in their wealth of experience and examples. Author: Johanna Morariu, Jara Dean-Coffey, Tom Kelly, Claire Hutchings, David Devlin-Foltz, Robin Kane, Jared Raynor, Anne Gienapp Type: Presentation Slides Date: Oct 19, 2013 Web Link -
Simplifying Complex Initiative Evaluation In this article, the author highlights two major lessons learned from theory-of-change and cluster evaluation about how to evaluate complex initiatives: articulation of a theory of change, and using the theory of change as a basis for evaluation planning. In addition, the author shares tips on how she applied the lessons to meet her evaluation challenges. Author: Coffman, Julia Type: Newsletters & Periodicals Date: Jun 1, 1999 Web Link -
Smart Chart 3.0 Spitfire Strategies produced Smart Chart (now in version 3.0) as "a tool to help nonprofits make smart communications choices." This free online tool is built around a chart with five "strategic decision sections."
Author: Spitfire Strategies Type: Websites & Online Tools Date: Feb 1, 2004
Web Link -
Social Movements and Philanthropy: How Foundations Can Support Movement Building Building on research conducted for The California Endowment, this article describes five core movement-building elements and provides a framework for activities that foundations can support to foster movement building. Movement building presents unique challenges to foundations. Because movements, by definition, must be driven by the people who are most affected, foundations cannot determine the goals and timetables of a movement. Foundation investments in movements are just that: investments for the long term. Author: Barbara Masters, M.A., and Torie Osborn, M.B.A. Type: Newsletters & Periodicals Date: Oct 2, 2010 Download (326.55 KB) -
Social Research Update: Photo-Interviewing for Research Rosalind Hurworth offers insight into multiple uses of photography in evaluation. Author: Hurworth, Rosalind Type: Research & Reports Date: Mar 1, 2003 Download (156.06 KB) -
Social Watch Developing An Evaluation (1995-2000) This discussion paper summarizes many of the issues facing the Social Watch Coordinating Committee and Reference Group, in their attempt to evaluate the progress of Social Watch (an international network concerned with poverty eradication and equality). After describing the hurdles to evaluation, the author discusses what should be evaluated, what steps should be taken to develop an evaluation methodology, and next steps. Author: van Tuijl, Peter Type: Research & Reports Date: May 22, 1999 Web Link -
Software for Nonprofit Evaluation and Case Management Many systems exist to manage the wealth of data nonprofits collect. As part of our evaluation consulting work, Innovation Network team members are often asked to recommend systems to our nonprofit clients. The purpose of this document is to share with the field our most recent scan of existing software (a/o January 2010). We hope this information is helpful to you, and we encourage you to contact us with feedback. Author: Johanna Morariu, Innovation Network, Inc. Type: Workbooks & Guides Date: Jan 31, 2010 Point K Pick Download (86.51 KB) -
Speaking for Themselves: Advocates' Perspectives on Evaluation "Speaking for Themselves: Advocates' Perspectives on Evaluation" will give you a better understanding of advocates' views on evaluation, the advocacy strategies and capacities they find effective, and current evaluation practices. Based on Innovation Network's research, the report includes recommendations for advocates, funders, and evaluators. Both the research and publication were made possible by the Annie E. Casey Foundation and The Atlantic Philanthropies. Author: Innovation Network, Inc. Type: Research & Reports Date: Aug 12, 2008 Point K Pick
Download (1.93 MB)