Browse Program Evaluation Resources
-
Approaching An Evaluation: Ten Issues to Consider Ten issues to consider before beginning an evaluation. Author: Brad Rose, Ph.D. Type: Websites & Online Tools Date: Jun 14, 2010
Web Link -
"Participatory Performance Story Reporting" Clear Horizon, an M&E company in Australia, describes its participatory approach, specifically as it relates to the method of participatory performance story reporting (PPSR). The company defines performance story reports as "essentially a short report about how a program contributed to outcomes." This page on their website includes links to other pages that describe the structure, process, and limitations of the PPSR method. Author: Clear Horizon Type: Websites & Online Tools Date: Dec 31, 2008 Download (161.39 KB) -
"Real-life Lessons Learned and Resources in Building Capacity for Advocacy and Policy Evaluation among KIDS COUNT Grantees" The Annie E. Casey Foundation and Organizational Research Services, Inc. recount ten lessons learned from an evaluation of five KIDS COUNT grantees that began in 2007. The evaluation was designed to test some of the ideas presented in "A Guide to Measuring Advocacy and Policy", a report produced by AECF and ORS in 2006. This handout, distributed at the American Evaluation Association 2008 conference, also contains a list of advocacy evaluation resources available on ORS's website. Author: Annie E. Casey Foundation; Organizational Research Services Type: Tipsheets & Paper Tools Date: Nov 6, 2008 Download (261.49 KB) -
#14ntcdataviz: DataViz! Tips, Tools, and How-tos for Visualizing Your Data [Resource Handout] This resource handout accompanied our presentation at the Nonprofit Technology Conference on March 13, 2014 in Washington, DC. Author: Johanna Morariu and Ann Emery, Innovation Network Type: Tipsheets & Paper Tools Date: Mar 13, 2014 Point K Pick Download (349.1 KB) -
#14ntcdataviz: DataViz! Tips, Tools, and How-tos for Visualizing Your Data [Slides] Presentation by Ann K. Emery, Johanna Morariu, and Andrew Means at the 2014 Nonprofit Technology Conference in Washington, DC. Author: Ann K. Emery and Johanna Morariu Type: Presentation Slides Date: Mar 13, 2014 Point K Pick Web Link -
#JAGUnity2014: DataViz for Philanthropists! Tips, Tools, and How-Tos for Communicating Better with Charts [Handout] This resource handout accompanied a presentation by Innovation Network's Johanna Morariu and Ann K. Emery at the Joint Affinity Groups' Unity Conference on June 6, 2014 in Washington, DC. Author: Johanna Morariu and Ann K. Emery, Innovation Network Type: Tipsheets & Paper Tools Date: Jun 6, 2014 Download (255.28 KB) -
#JAGUnity2014: DataViz for Philanthropists! Tips, Tools, and How-Tos for Communicating Better with Charts [Slides] Presentation by Innovation Network's Johanna Morariu and Ann K. Emery at the Joint Affinity Groups' Unity Conference, held June 6, 2014 in Washington, DC. Author: Johanna Morariu and Ann K. Emery, Innovation Network Type: Presentation Slides Date: Jun 6, 2014 Point K Pick Web Link -
#JAGUnity2014: Innovations in Evaluating Social Movements Today, social movement organizers are grappling with big questions: What is the long-term impact we are hoping to make? How can we measure the progress we've made thus far? How can we learn from past practice? On June 7, 2014, Innovation Network's William Fenn spoke on a panel with Deepak Pateriya and Sian O'Faolain of the Center for Community Change and Hillary Klein of Make the Road New York to try to answer some of these questions. Author: Will Fenn, Deepak Pateriya, Sian O'Faolain, Hillary Klein Type: Presentation Slides Date: Jun 7, 2014 Web Link -
#YNPNdc14: DataViz! Tips, Tools, and How-tos for Visualizing Your Data [Resource Handout] Johanna Morariu and Ann K. Emery presented at the Young Nonprofit Professionals Network 2014 Annual Leadership Conference, which was held on May 9, 2014 in Washington, DC. Author: Johanna Morariu and Ann K. Emery, Innovation Network Type: Tipsheets & Paper Tools Date: May 9, 2014 Point K Pick Download (253.7 KB) -
#YNPNdc14: DataViz! Tips, Tools, and How-tos for Visualizing Your Data [Slides] Johanna Morariu and Ann K. Emery presented at the Young Nonprofit Professionals Network 2014 Annual Leadership Conference, which was held on May 9, 2014 in Washington, DC. Author: Johanna Morariu and Ann K. Emery, Innovation Network Type: Presentation Slides Date: May 9, 2014 Point K Pick Web Link -
2008 Civic Engagement Evaluation Assessment and Recommendations for the Field 2008 was a historic year for civic participation in the United States. The Funders’ Committee for Civic Participation (FCCP) brings together grantmakers committed to enhancing democratic participation in all aspects of civic life. Its nearly 80 members, comprising private, public, and community foundations, collectively contributed scores of millions of dollars to non-partisan civic engagement efforts of all kinds nationwide.
Author: Lacy M. Serros Type: Research & Reports Date: Dec 1, 2009 Download (699.52 KB) -
5 Evaluation Lessons From a Recovering Program Officer (Presentation slides) As a profession, evaluators must constantly demonstrate the value of their work to remain relevant. Each evaluation represents a significant investment in resources that many argue could be used to provide more programming. Author: Will Fenn Type: Presentation Slides Date: Oct 17, 2013 Point K Pick Download (4.05 MB) -
5 Evaluation Lessons from a Recovering Program Officer (recording) As a profession, evaluators must constantly demonstrate the value of their work to remain relevant. Each evaluation represents a significant investment in resources that many argue could be used to provide more programming. A primary concern in maintaining the value proposition of the evaluation field is ensuring that evaluations remain manageable and useful to all participants. This Ignite session offered five succinct lessons from a former foundation Program Officer that can be applied across program types to help nonprofits and foundations improve evaluation coordination and use. Author: Will Fenn Type: Websites & Online Tools Date: Oct 17, 2013 Web Link -
8 Steps to Develop an Evaluation Plan Innovation Network identified eight key steps in our 2005 evaluation of a U.S. federal policy change campaign. These steps may also be useful in other advocacy evaluations. Author: Innovation Network, Inc. Type: Tipsheets & Paper Tools Date: Oct 1, 2005
Download (173.9 KB) -
9 Steps to Advocacy Evaluation Many organizations are using advocacy strategies to meet their missions. Just like any other work that foundations and nonprofits engage in, advocacy needs to be continually assessed, tweaked, and strengthened through a process of evaluation and learning. In this webinar with the National Committee for Responsive Philanthropy, Johanna Morariu and Will Fenn shared the nine steps of advocacy evaluation. The webinar is based on Innovation Network's report titled "Pathfinder: A Practical Guide to Advocacy Evaluation." The webinar took place on Wednesday, June 26, 2013. Author: Johanna Morariu and William Fenn, Innovation Network Type: Presentation Slides Date: Jun 26, 2013 Point K Pick Web Link -
A Dataviz Technique New to aea365 Johanna Morariu describes treemaps, a data visualization technique that is relatively new, especially to evaluators. The technique was created in the 1990s by Dr. Ben Shneiderman for mapping computer hard drive usage. Treemaps are useful for visualizing hierarchical data, or tree-structured data. Area is used to proportionally illustrate differences in values, e.g., how many program participants fall into each of the nested categories. She also shares resources for making your own treemaps. Author: Johanna Morariu Type: Opinion (blog, editorial) Date: Jun 23, 2013 Web Link -
A Guide to Actionable Measurement Staff members from the Bill & Melinda Gates Foundation discuss the principles and practices they use to make decisions about data collection, analysis, and reporting. Their approach, "actionable measurement," emphasizes the collection of information for specific decisions or actions.
Author: Bill & Melinda Gates Foundation Type: Research & Reports Date: May 10, 2010
Download (284.17 KB) -
A Guide to Measuring Advocacy and Policy Developed for the Annie E. Casey Foundation, this guide serves as a broad call to grantmakers to build and advance the field of advocacy and policy evaluation. The guide includes sections on the context of advocacy evaluation and evaluation design. Author: Reisman, Jane, et al. Type: Research & Reports Date: Mar 15, 2007
Download (255.04 KB) -
A Helpful Guide to Failure in Philanthropy. Use Carefully. Larry Blumenthal's article provides perspective on the fear of failure in philanthropy, and how to overcome that fear so foundations can learn from their mistakes (and other foundations' mistakes) to achieve their mission and program goals. Author: Larry Blumenthal Type: Newsletters & Periodicals Date: Jan 7, 2010 Web Link -
A Menu of Assessment Activities This brief menu of questions provides a simple, straightforward overview of evaluation activities. Author: Brett A. Magill Type: Tipsheets & Paper Tools Date: Feb 24, 2010
Download (86.05 KB) -
A Practical Guide for Engaging Stakeholders in Developing Evaluation Questions This guide describes a five-step process for engaging stakeholders in developing evaluation questions, and includes four worksheets and a case example to further facilitate the planning and implementation of your stakeholder engagement process. Step 1: Prepare for stakeholder engagement: This step includes collecting information about the program or initiative being evaluated—its history, why it came into being, what it is trying to accomplish, and what success would look like. Author: Hallie Preskill and Nathalie Jones Type: Workbooks & Guides Date: Jun 1, 2009 Download (1001.41 KB) -
A Practical Guide to Evaluating Systems Change in a Human Services System Context This Guide is for evaluators who would like a practical “way in” to thinking about systems and systems change. The key practical step the Guide takes is to limit the evaluation to a particular type of system, one that systems change initiatives often target: a human services delivery system (e.g., health, education, workforce development). Author: Nancy Latham Type: Research & Reports Date: Jan 1, 2015 Download (1.03 MB) -
Abdul Latif Jameel Poverty Action Lab Executive Training: Evaluating Social Programs 2009 This five-day program on evaluating social programs provides a thorough understanding of randomized evaluations and pragmatic step-by-step training for conducting one's own evaluation. While the course focuses on randomized evaluations, many of the topics, such as measuring outcomes and dealing with threats to the validity of an evaluation, are relevant for other methodologies.
Courses have been recorded and loaded as videos online, split up by topic. You may also access the course lecture notes and assignments. Author: Esther Duflo, Rachel Glennerster, & Abhijit Banerjee Type: Websites & Online Tools Date: May 1, 2009 Web Link -
Addressing Attribution of Cause and Effect in Small n Impact Evaluations: Towards an Integrated Framework With the results agenda in the ascendancy in the development community, there is an increasing need to demonstrate that development spending makes a difference, that it has an impact. This requirement to demonstrate results has fuelled an increase in the demand for, and production of, impact evaluations. There exists considerable consensus among impact evaluators conducting large n impact evaluations involving tests of statistical difference in outcomes between the treatment group and a properly constructed comparison group.
Author: Howard White and Daniel Phillips Type: Research & Reports Date: Jun 1, 2012
Download (637.96 KB) -
Advocacy & Policy Change Composite Logic Model and associated materials This collaborative work by more than 50 advocates, grantmakers, and evaluators offers a way to improve communication in the advocacy evaluation field by articulating common goals, outcomes, and indicators. Supplements to the Model include guiding questions, definitions, and samples based on hypothetical advocacy situations—one each for the intended "strategy" and "evaluation" uses of the Model. There is also an online tool based on the Model.
Author: Coffman, Julia, et al. Type: Tipsheets & Paper Tools Date: Apr 1, 2007
Web Link