Browse Evaluators Resources
-
Participatory Analysis: Expanding Stakeholder Involvement in Evaluation Veena Pankaj and Myia Welsh described Innovation Network's participatory approach to evaluation, highlighting how stakeholders can be involved in the analysis and interpretation of data. They also shared tips from Innovation Network's white paper titled "Participatory Analysis: Expanding Stakeholder Involvement in Evaluation." Author: Veena Pankaj and Myia Welsh Type: Opinion (blog, editorial) Date: Jun 6, 2011
Web Link -
Participatory Asset Mapping Toolkit Healthy City supports communities in identifying, organizing, and sharing their collective voice with decision makers at the local and state levels. Through its Community Research Lab, Healthy City shares best practices and methods for Community-Based Organizations (CBOs) interested in supporting their strategies with research that combines community knowledge with Healthy City technologies. Toward this aim, they have developed the Community Research Lab Toolbox. The toolbox presents research concepts, methods, and tools through topical guides and toolkits. Author: Healthy City Type: Tipsheets & Paper Tools Date: Apr 8, 2012
Download (2.22 MB) -
Partnerships for Environmental Public Health Evaluation Metrics Manual The Partnership for Environmental Public Health Evaluation Metrics Manual provides examples of tangible metrics and program tools that PEPH grantees and others can use for planning and evaluation purposes. Sample metrics include measures for activities, outcomes and impacts related to partnership building, leveraging, product development and dissemination, education and training, and capacity building.
Author: Division of Extramural Research, National Institute of Environmental Health Sciences Type: Workbooks & Guides Date: Oct 25, 2010 Web Link -
Pathfinder Evaluator Edition: A Practical Guide to Advocacy Evaluation Pathfinder is a practical guide to advocacy evaluation. This edition walks evaluators through the evaluation process from start to finish. Editions for advocates and funders are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages the adoption of a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement. Author: Innovation Network Type: Workbooks & Guides Date: Nov 1, 2009 Point K Pick Download (1.4 MB) -
Pathfinder: Resource List This annotated bibliography is meant to accompany Innovation Network's Pathfinder: A Practical Guide to Advocacy Evaluation series. It includes resources for each of the topics covered in Pathfinder. Author: Innovation Network, Inc. Type: Workbooks & Guides Date: Nov 3, 2009 Point K Pick Download (211.66 KB) -
Pathways for Change 2013 Author: The Center for Evaluation Innovation Type: Research & Reports Date: Jan 1, 2013 Download (211.06 KB) -
Paying More Attention to Paying Attention (From Introduction)
In 1998 I wrote Paying Attention: Visitors and Museum Exhibitions, a book supported by a National Science Foundation (NSF) grant called “A Meta-analysis of Visitor Time/Use in Museum Exhibitions.” The grant accomplished three main goals:
Author: Beverly Serrell Type: Research & Reports Date: Jan 1, 2010 Web Link -
Performance Management and Evaluation: Two Sides of the Same Coin (Presentation slides) Performance management and evaluation: what's the difference? With an increasing emphasis on measurement and impact, service providers and their funders are pushing for increasingly sophisticated evaluation approaches such as experimental and quasi-experimental designs. However, experimental methods are rarely appropriate, feasible, or cost-effective for the majority of organizations and service providers. Author: Isaac Castillo, Ann K. Emery Type: Presentation Slides Date: Oct 16, 2013 Point K Pick Download (1.42 MB) -
Performance Monitoring And Evaluation Tips A guide on focus groups from the U.S. Agency for International Development (USAID). Author: USAID Center for Development Information and Evaluation Type: Workbooks & Guides Date: Jan 1, 1996
Web Link -
Performance Monitoring Framework for Conservation Advocacy, A This report was written in response to the New Zealand Department of Conservation’s environmental advocacy work. It "sets out a framework for monitoring the effectiveness of conservation advocacy programmes in increasing public awareness about, and involvement in, conservation." The report offers advocacy monitoring and evaluation guidelines applicable beyond the environmental advocacy field. Author: James, Bev Type: Research & Reports Date: Mar 1, 2001 Download (229.7 KB) -
Philanthropic Freedom: A Pilot Study Hudson Institute’s Center for Global Prosperity (CGP) is pleased to announce the publication of Philanthropic Freedom: A Pilot Study, the first time that the ease of giving has been fully measured and compared across 13 countries. The pilot study and each of the detailed country reports can be downloaded for free from www.Hudson.org/PhilanthropicFreedom.
Author: Hudson Institute Center for Global Prosperity Type: Research & Reports Date: Mar 28, 2013 Download (1.31 MB) -
Philanthropy and the Social Economy: Blueprint 2015 Philanthropy and the Social Sector 2015 is an annual industry forecast about the social economy -- private resources used for public benefit. Each year, the Blueprint provides an overview of the current landscape, points to major trends, and directs your attention to horizons where you can expect some breakthroughs in the coming year. This year I'm excited to broaden my horizons to include insights from 14 countries other than the United States. This is possible due to a new working relationship with betterplace lab in Berlin.
Author: Lucy Bernholz Type: Research & Reports Date: Dec 1, 2014 Download (2.36 MB) -
Picturing Your Data is Better Than 1000 Numbers: Data Visualization Techniques for Social Change Are you intrigued by infographics and how they could improve your communication strategy? Are you interested in what it takes for an organization to systematically use data? Or are you maybe even drowning in data and looking for someone to throw you a life-saving suggestion for software and other tools? Johanna Morariu, Beth Kanter, and Brian Kennedy presented a panel on data and information visualization at the 2012 Nonprofit Tech Conference. This video is a recording of the panel.
Author: Johanna Morariu Type: Presentation Slides Date: Web Link -
Portfolio Evaluation vs. Grant Evaluation In this webinar Johanna Morariu and Ehren Reed discuss four levels of evaluation: grant-level, portfolio-level, foundation-level, and issue-level. The presentation addresses the pros and cons of these four levels, and when one level may be more appropriate than another. Also included are considerations for right-sizing your evaluation approach: design, data collection, analysis, and reporting differences between grant and portfolio evaluations. Author: Johanna Morariu and Ehren Reed Type: Websites & Online Tools Date: Feb 22, 2012 Web Link -
Program Development and Evaluation This site provides a comprehensive set of resources on planning and implementing an evaluation. Some of the tools require a free login. Author: University of Wisconsin - Extension Type: Websites & Online Tools Date: Jan 18, 2008 Web Link -
Program Evaluation Guide A general introduction to evaluation, this guide walks the reader through the process of answering the following seven basic questions:
1. What do we want to evaluate?
2. What is the purpose of the evaluation?
3. What type of evaluation do we want to use?
4. What information do we need to answer our questions?
5. How do we get the information?
6. How will we analyze the information?
7. How will we use and share the results? Author: Katie Cangemi and Maggie Litgen Type: Workbooks & Guides Date: Dec 31, 2011 Download (195.46 KB) -
Program Evaluation: Assessing and Measuring Your Program’s Performance Innovation Network's slideshow (from a presentation to the 2007 Nonprofit Technology Conference in Washington, D.C.) about the importance of planning for effective program evaluation, using our online tools, and leveraging online communities for knowledge-sharing. Author: Innovation Network, Inc. Type: Presentation Slides Date: Jan 18, 2008 Download (1.15 MB) -
Project Evaluation Guide: Module 7, Culturally Responsive Evaluation The purpose of this module is to alert users to the importance of culturally responsive evaluation and to explain some of its key components. It discusses strategies that have been found to be useful in conducting evaluations that are responsive to all cultures.
Author: National Science Foundation Type: Workbooks & Guides Date: Nov 9, 2010 Point K Pick Web Link -
Proofiness: The Dark Arts of Mathematical Deception and the Evaluation Profession Johanna Morariu describes how she explains the merit and appropriateness of qualitative designs when helping individuals and organizations design an evaluation approach or when presenting qualitative findings. Author: Johanna Morariu Type: Opinion (blog, editorial) Date: Jan 18, 2012 Web Link -
ProPack III - The CRS Project Package: A Guide to Creating a SMILER M&E System The approach to M&E described in this guide is called SMILER. It is a comprehensive and practical approach to developing a project monitoring system that incorporates processes for learning based on robust evidence. It has been written for CRS project managers, technical, and M&E staff to guide their work with partners and communities by describing how to develop an M&E system in which data are systematically collected, reported and used to make project decisions.
Author: Susan Hahn, Guy Sharrock Type: Workbooks & Guides Date: Jun 15, 2010 Download (1.18 MB) -
Pros and Cons of Evaluation A solid two-page overview of why foundations should get involved with evaluation, presented in a straightforward Pro vs. Con fashion. Author: Janet Carter Type: Opinion (blog, editorial) Date: Jan 1, 2003 Download (57.57 KB) -
Prove & Improve – A Self-Evaluation Resource for Voluntary and Community Organisations This document is intended to provide a starting point for exploring evaluation - and in particular, self-evaluation of outcomes - and for thinking about some of the issues involved in this essential area of developing and running a successful project. Author: Community Evaluation Northern Ireland Type: Websites & Online Tools Date: Oct 1, 2008
Download (508.77 KB) -
Public Will Building: What it Means and How to Evaluate It [Handout] Strategic Partner Julia Coffman (of the Center for Evaluation Innovation) and Innovation Network Director Ehren Reed led a discussion on Public Will Building for The Connecticut Health Foundation's Leadership Fellows. Author: Julia Coffman and Ehren Reed Type: Research & Reports Date: Dec 14, 2011 Web Link -
Putting the system back into systems change: A framework for understanding and changing organizational and community systems This paper provides one framework—grounded in systems thinking and change literatures—for understanding and identifying the fundamental system parts and interdependencies that can help to explain system functioning and leverage systems change. The proposed framework highlights the importance of attending to both the deep and apparent structures within a system as well as the interactions and interdependencies among these system parts. Author: Pennie G. Foster-Fishman, Branda Nowell, Huilan Yang Type: Research & Reports Date: May 18, 2007 Download (381.29 KB) -
Rapid Evaluation The purpose of this guide is to introduce the basic concepts and methods used in rapid evaluations (REs), and to demonstrate how this approach can be applied to the various stages of program development and implementation.
It covers important terms and definitions, when to use REs, the advantages and disadvantages of REs, and the five elements of an RE, and it provides additional reading resources.
Author: The International Training and Education Center for Health (I-TECH) Type: Workbooks & Guides Date: Jan 1, 2008 Download (272.44 KB)