Pathfinder Funder Edition: A Practical Guide to Advocacy Evaluation Pathfinder is a practical guide to the advocacy evaluation process. This edition guides funders through the advocacy evaluation process from start to finish. Editions for advocates and evaluators are also available. Drawn from Innovation Network’s research and consulting experience, Pathfinder encourages the adoption of a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement. Author: Innovation Network Type: Workbooks & Guides Date: Nov 1, 2009 Point K Pick Download (1.17 MB) -
Paying More Attention to Paying Attention (From Introduction)
In 1998 I wrote Paying Attention: Visitors and Museum Exhibitions, a book supported by a National Science Foundation (NSF) grant called “A Meta-analysis of Visitor Time/Use in Museum Exhibitions.” The grant accomplished three main goals:
Author: Beverly Serrell Type: Research & Reports Date: Jan 1, 2010 Web Link -
Performance Management and Evaluation: Two Sides of the Same Coin (Presentation slides) Performance management and evaluation: what's the difference? With an increasing emphasis on measurement and impact, service providers and their funders are pushing for increasingly sophisticated evaluation approaches such as experimental and quasi-experimental designs. However, experimental methods are rarely appropriate, feasible, or cost-effective for the majority of organizations and service providers. Author: Isaac Castillo, Ann K. Emery Type: Presentation Slides Date: Oct 16, 2013 Point K Pick Download (1.42 MB) -
Performance Monitoring And Evaluation Tips A guide on focus groups from the U.S. Agency for International Development (USAID). Author: USAID Center for Development Information and Evaluation Type: Workbooks & Guides Date: Jan 1, 1996
Web Link -
Philanthropic Freedom: A Pilot Study Hudson Institute’s Center for Global Prosperity (CGP) is pleased to announce the publication of Philanthropic Freedom: A Pilot Study, the first time that the ease of giving has been fully measured and compared across 13 countries. The pilot study and each of the detailed country reports can be downloaded for free from www.Hudson.org/PhilanthropicFreedom.
Author: Hudson Institute Center for Global Prosperity Type: Research & Reports Date: Mar 28, 2013 Download (1.31 MB) -
Picturing Your Data is Better Than 1000 Numbers: Data Visualization Techniques for Social Change Are you intrigued by infographics and how they could improve your communication strategy? Are you interested in what it takes for an organization to systematically use data? Or are you maybe even drowning in data and looking for someone to throw you a life-saving suggestion for software and other tools? Johanna Morariu, Beth Kanter, and Brian Kennedy presented a panel on data and information visualization at the 2012 Nonprofit Tech Conference. This video is a recording of the panel.
Author: Johanna Morariu Type: Presentation Slides Date: Web Link -
Portfolio Evaluation vs. Grant Evaluation In this webinar Johanna Morariu and Ehren Reed discuss four levels of evaluation: grant-level, portfolio-level, foundation-level, and issue-level. The presentation addresses the pros and cons of these four levels, and when one level may be more appropriate than another. Also included are considerations for right-sizing your evaluation approach: design, data collection, analysis, and reporting differences between grant and portfolio evaluations. Author: Johanna Morariu and Ehren Reed Type: Websites & Online Tools Date: Feb 22, 2012 Web Link -
Power, Participation, and State-based Politics: An Evaluation of the Ford Foundation's Collaborations that Count Initiative The Applied Research Center (ARC) conducted a two-year participatory evaluation to provide an account of the Ford Foundation's "Collaborations That Count Initiative." The report identifies areas in which the 11 statewide collaborations succeeded, and draws attention to ways in which support to collaboration might be more effectively provided in the future. Author: Applied Research Center Type: Research & Reports Date: Apr 21, 2004 Download (1.85 MB) -
Program Development and Evaluation This site provides a comprehensive set of resources on planning and implementing an evaluation. Some of their tools require a free login. Author: University of Wisconsin - Extension Type: Websites & Online Tools Date: Jan 18, 2008 Web Link -
Program Evaluation Guide A general introduction to evaluation, this guide walks the reader through the process of answering the following seven basic questions:
1. What do we want to evaluate?
2. What is the purpose of the evaluation?
3. What type of evaluation do we want to use?
4. What information do we need to answer our questions?
5. How do we get the information?
6. How will we analyze the information?
7. How will we use and share the results?
Author: Katie Cangemi and Maggie Litgen Type: Workbooks & Guides Date: Dec 31, 2011 Download (195.46 KB) -
Program Evaluation: Assessing and Measuring Your Program’s Performance Innovation Network's slideshow (from a presentation to the 2007 Nonprofit Technology Conference in Washington, D.C.) about the importance of planning for effective program evaluation, using our online tools, and leveraging online communities for knowledge-sharing. (1.15 MB .pdf file; may take some time to download.) Author: Innovation Network, Inc. Type: Presentation Slides Date: Jan 18, 2008 Download (1.15 MB) -
Proofiness: The Dark Arts of Mathematical Deception and the Evaluation Profession Johanna Morariu describes how she explains the merit and appropriateness of qualitative designs when helping individuals and organizations design an evaluation approach or when presenting qualitative findings. Author: Johanna Morariu Type: Opinion (blog, editorial) Date: Jan 18, 2012 Web Link -
ProPack III - The CRS Project Package: A Guide to Creating a SMILER M&E System The approach to M&E described in this guide is called SMILER. It is a comprehensive and practical approach to developing a project monitoring system that incorporates processes for learning based on robust evidence. It has been written for CRS project managers, technical, and M&E staff to guide their work with partners and communities by describing how to develop an M&E system in which data are systematically collected, reported and used to make project decisions.
Author: Susan Hahn, Guy Sharrock Type: Workbooks & Guides Date: Jun 15, 2010 Download (1.18 MB) -
Public Will Building: What it Means and How to Evaluate It [Handout] Strategic Partner Julia Coffman (of the Center for Evaluation Innovation) and Innovation Network Director Ehren Reed led a discussion on Public Will Building for The Connecticut Health Foundation's Leadership Fellows. Author: Julia Coffman and Ehren Reed Type: Research & Reports Date: Dec 14, 2011 Web Link -
Rapid Evaluation The purpose of this guide is to introduce the basic concepts and methods used in rapid evaluations (REs), and to demonstrate how this approach can be applied to the various stages of program development and implementation.
It covers important terms and definitions, when to use REs, the advantages and disadvantages of REs, and the five elements of an RE, and it provides additional reading resources.
Author: The International Training and Education Center for Health (I-TECH) Type: Workbooks & Guides Date: Jan 1, 2008 Download (272.44 KB) -
Remarks made at the Environmental Evaluators’ Network Forum: NAVIGATING EVALUATIVE COMPLEXITY IN THE AGE OF OBAMA The author draws on her vast evaluation experience, especially in federal evaluation, to confront issues of complexity in evaluation. She offers the idea of using comprehensive checklists, and supplies her own example.
An excerpt:
Author: Eleanor Chelimsky Type: Opinion (blog, editorial) Date: Jun 8, 2010 Point K Pick Download (81.2 KB) -
Report: "Ten Considerations for Advocacy Evaluation Planning: Lessons Learned from KIDS COUNT Grantee Experiences" The Annie E. Casey Foundation and Organizational Research Services, Inc. detail ten lessons learned from an evaluation of five KIDS COUNT grantees that began in 2007. The evaluation was designed to test some of the ideas presented in "A Guide to Measuring Advocacy and Policy", a report produced by AECF and ORS in 2006. Author: Organizational Research Services Type: Research & Reports Date: Jan 1, 2009 Download (289.93 KB) -
Sample Size Calculator This online sample size calculator can help you quickly estimate how large your sample needs to be for a given margin of error and confidence level. Free and easy to use, the page also includes good introductory information on how to apply the resource. Author: Raosoft Type: Websites & Online Tools Date: Jan 1, 2004 Web Link -
Sample Size Calculator A handy online calculator for determining the appropriate sample size for a given population. This page also includes a discussion of sample size and other factors that affect the confidence interval of your data collection process. Author: Creative Research Systems Type: Websites & Online Tools Date: Jan 18, 2008 Web Link -
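The calculation behind tools like these is typically a variant of Cochran's sample size formula for estimating a proportion, with a finite-population correction. The exact formulas the two calculators above use are not stated here, so the sketch below is an assumption about their approach, not a description of either tool:

```python
import math
from statistics import NormalDist

def sample_size(population, margin_of_error=0.05, confidence=0.95, p=0.5):
    """Estimate the sample size needed to survey a proportion.

    A sketch of Cochran's formula with a finite-population correction,
    the approach most online sample size calculators are assumed to use.
    p=0.5 is the most conservative (largest-sample) assumption.
    """
    # z-score for the two-tailed confidence level (about 1.96 at 95%)
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    # Infinite-population sample size: n0 = z^2 * p * (1 - p) / e^2
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite-population correction shrinks n0 for small populations
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# For a population of 10,000 at 95% confidence and a 5% margin of error:
print(sample_size(10_000))  # → 370
```

Note how weakly the required sample depends on population size: a population of one billion needs only about 385 respondents under the same settings, which is why these calculators ask for margin of error and confidence level first.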
Seeing the Forest (Beyond the Trees): Learning Across the Experiences of Seven Advocacy Evaluators [Slides] Advocacy and policy change evaluation continues to evolve and mature, from a fledgling field a few years ago to the flourishing field of today. Evaluators are advancing as well, developing an increasingly robust collective understanding of what works for advocacy evaluation. In this session a diverse group of seven advocacy evaluators explored and synthesized observations drawn from an array of real-world experiences. Panelists spoke to targeted questions, weaving in their wealth of experience and examples. Author: Johanna Morariu, Jara Dean-Coffey, Tom Kelly, Claire Hutchings, David Devlin-Foltz, Robin Kane, Jared Raynor, Anne Gienapp Type: Presentation Slides Date: Oct 19, 2013 Web Link -
Small Picture, Big Picture: Using the Framework for Public Policy Advocacy in a Large-Scale Advocacy Campaign We were recently tasked with guiding evaluation for a funder's national advocacy campaign, and had to make sense of advocacy data contained in 110 grants. Where did we start? Julia Coffman's Framework for Public Policy Advocacy was the perfect tool. It helped us compare strategies employed by grantees individually, as well as step back and look at strategies used across the campaign. Author: Kat Athanasiades, Veena Pankaj Type: Opinion (blog, editorial) Date: Mar 12, 2014 Web Link -
Social Research Update: Photo-Interviewing for Research Rosalind Hurworth offers insight into multiple uses of photography in evaluation. Author: Hurworth, Rosalind Type: Research & Reports Date: Mar 1, 2003 Download (156.06 KB) -
Software for Nonprofit Evaluation and Case Management Many systems exist to manage the wealth of data nonprofits collect. As part of our evaluation consulting work, Innovation Network team members are often asked to recommend systems to our nonprofit clients. The purpose of this document is to share with the field our most recent scan of existing software (as of January 2010). We hope this information is helpful to you, and we encourage you to contact us with feedback. Author: Johanna Morariu, Innovation Network, Inc. Type: Workbooks & Guides Date: Jan 31, 2010 Point K Pick Download (86.51 KB) -
Sourcebook for Evaluating Global and Regional Partnership Programs: Indicative Principles and Standards The purpose of the indicative principles and standards contained in this Sourcebook is to help improve the independence and quality of program-level evaluations of GRPPs in order to enhance the relevance and effectiveness of the programs. The principal audiences for the Sourcebook are the governing bodies and management units of GRPPs, as well as professional evaluators involved in the evaluation of these programs. Author: Independent Evaluation Group (IEG) Type: Workbooks & Guides Date: Jan 1, 2007 Download (833.37 KB) -
State of Evaluation - Russian translation The Project: Nonprofits hear a lot of talk about evaluation these days—metrics and measurements, indicators and impact, efficiency and effectiveness. Everyone, from donors to board members, seems to want evaluation results. But there was a big knowledge gap around evaluation practice: What are nonprofits really doing to evaluate their work? How are they really using evaluation results? What support are they getting? What else do they need?
Author: Innovation Network Type: Research & Reports Date: Feb 3, 2011 Download (6.78 MB)