Browse Planning Resources
-
Data Placemats: A DataViz Technique to Improve Stakeholder Understanding of Evaluation Results [Slides] At the American Evaluation Association 2012 Annual Conference, Veena Pankaj describes ways to improve stakeholder engagement, as well as ways to increase stakeholder understanding of evaluation results.
Author: Veena Pankaj Type: Presentation Slides Date: Oct 25, 2012
Web Link -
Dataviz! Or, How to Win at Communication and Influence People (Resources Handout) Are you intrigued by data and information visualization—dataviz—and how it could improve your communication strategy? Are you interested in the range of dataviz options, but unsure which is right for you? Or are you maybe even drowning in data and looking for someone to throw you a life-saving suggestion for tools to transform your data into a message?
Author: Johanna Morariu Type: Tipsheets & Paper Tools Date: May 17, 2013 Point K Pick
Download (510.3 KB) -
Designing a Results Framework for Achieving Results: A How-To Guide A results framework serves as a key tool in the development landscape, enabling practitioners to discuss and establish strategic development objectives and then link interventions to intermediate outcomes and results that directly relate to those objectives. This publication provides how-to guidance for developing results frameworks.
Author: Independent Evaluation Group (IEG) Type: Workbooks & Guides Date: Sep 1, 2012
Download (834.16 KB) -
Developing a Monitoring and Evaluation Process for Capacity Building and Empowerment Can participation and empowerment in M&E be a reality in large-scale projects and programmes? How can qualitative change be assessed in a participatory and empowering way that is also reliable and credible? This paper uses INTRAC's Central Asia programme (building NGO capacity in Kazakhstan and Kyrgyzstan) as a case study.
Author: International NGO Training and Research Centre (INTRAC) Type: Research & Reports Date: Nov 6, 2002
Download (102.5 KB) -
Do-It-Yourself Logic Models: Examples, Templates, and Checklists [Handout] Logic models are nonprofit road maps: they help you diagram where you are now and where you hope to be in the future. They are used for program planning, program management, fundraising, communications, consensus-building, and evaluation planning.
Author: Johanna Morariu and Ann Emery, Innovation Network Type: Templates & Samples Date: Feb 25, 2014 Point K Pick
Download (105.1 KB) -
Drawings as a Method of Program Evaluation and Communication with School-Age Children This article discusses using drawings as a means of obtaining children's perceptions in the evaluation process.
Author: William Evans and Jackie Reilly Type: Research & Reports Date: Oct 16, 2008
Web Link -
Enhancing Program Performance with Logic Models An online course from the University of Wisconsin-Extension on developing and applying logic models. Designed for the beginner, this user-friendly course includes an audio track, worksheets, resources, and examples.
Author: University of Wisconsin - Extension Type: Websites & Online Tools Date: Jan 1, 2002
Web Link -
Evaluability Assessment to Improve Public Health Policies, Programs, and Practices This article describes how evaluability assessment has benefited public health and could do so in the future. The authors describe the rationale, history, and evolution of evaluability assessment; outline the steps in the method and distinguish it from related concepts; and then illustrate five ways evaluability assessment can benefit public health.
Author: Laura C. Leviton, Laura Kettel Khan, Debra Rog, Nicola Dawkins, and David Cotton Type: Research & Reports Date: Jul 1, 2010
Download (380.53 KB) -
Evaluating Advocacy: A Model for Public Policy Initiatives This presentation, drawing on work done with the Coalition for Comprehensive Immigration Reform (CCIR), was given by Innovation Network staff at the 2006 American Evaluation Association Conference. It discusses advocacy evaluation in general, some inherent challenges that apply more strongly to advocacy evaluation than to the evaluation of traditional service programs, and some practical planning and evaluation structures developed through the work with CCIR.
Author: Innovation Network, Inc. Type: Presentation Slides Date: Nov 1, 2006
Download (1.42 MB) -
Evaluating Foundation-Supported Capacity Building: Lessons Learned This study of lessons learned from evaluations of philanthropic capacity-building programs used a national database of 473 programs, along with a survey of and interviews with 87 funders (82 foundations or foundation collaboratives, and five foundation-supported intermediaries), to answer two questions:
(1) How do foundations that support nonprofit capacity building evaluate their grantmaking and direct service activities?
(2) What lessons can be learned from evaluation, both to improve these programs and to justify the investments made in them?
Author: Thomas E. Backer, Jane Ellen Bleeg & Kathryn Groves Type: Research & Reports Date: Jan 1, 2010
Download (152.27 KB) -
Evaluating Social Innovation In this paper, the authors explore ways that common evaluation approaches and practices constrain innovation and offer lessons about an emerging evaluation approach—developmental evaluation—which supports the adaptation that is so crucial to innovation. For what kinds of grantmaking strategies should funders consider using developmental evaluation? What organizational conditions are necessary for it to work? How can grantmakers grapple with the challenging questions that developmental evaluation raises about innovation, accountability, rigor, and adaptation?
Author: Hallie Preskill and Tanya Beer Type: Research & Reports Date: Aug 1, 2012 Point K Pick
Download (341.7 KB) -
Evaluating System Change: A Planning Guide This methods brief provides guidance on planning effective evaluations of system change interventions. It begins with a general overview of systems theory and then outlines a three-part process for designing system change evaluations. This three-part process aligns (1) the dynamics of the targeted system or situation, (2) the dynamics of the system change intervention, and (3) the intended purpose(s) and methods of the evaluation.
Author: Margaret B. Hargreaves Type: Research & Reports Date: Apr 1, 2010
Download (1.63 MB) -
Evaluating the Effectiveness of DFID's Influence with Multilaterals This report is based on investigations carried out over five weeks involving approximately 40 organizations in the international NGO community. The report covers four main topics.
Author: Rick Davies Type: Research & Reports Date: Aug 1, 2001
Download (436.5 KB) -
Evaluation: A Beginner's Guide This document presents a user-friendly approach to the process of evaluation. It is particularly targeted at those initiating an evaluation for the first time who are involved in programs working toward the introduction of human rights concepts and values in educational curricula and teaching practices. It contains practical suggestions on how to effectively organize the evaluation of such programs in order to learn from the work implemented so far.
Author: Amnesty International Type: Workbooks & Guides Date: Jun 30, 1999
Web Link -
Evaluation as a Tool for Creating and Leading a Results-Based Learning Culture [Slides] At the Emerging Practitioners in Philanthropy 2013 National Conference in Chicago, IL, Johanna Morariu and Will Fenn discussed effective evaluation initiatives, highlighting useful tools such as those available from Innovation Network's Point K.
Author: Johanna Morariu and Will Fenn Type: Presentation Slides Date: Apr 5, 2013
Web Link -
Evaluation Capacity Building: Examples and Lessons from the Field Innovation Network developed these three introductory evaluation documents as part of Building Nonprofit Capacity to Evaluate, Learn, and Grow Impact, a workshop we presented in partnership with Grantmakers for Effective Organizations' Scaling What Works initiative.
Author: Johanna Morariu, Innovation Network, Inc. Type: Research & Reports Date: Apr 1, 2012 Point K Pick
Web Link -
Evaluation Concepts Mindmap The Evaluation Concepts Mindmap is a visual showing different considerations in evaluation, such as assessment types, the evaluation cycle, and data collection, and the components of those considerations. It is a useful tool for evaluation planning, as well as for gaining a broad perspective of what is important in the field of evaluation, all on one page.
Author: Johanna Morariu, Innovation Network Type: Tipsheets & Paper Tools Date: Mar 28, 2013 Point K Pick
Download (259.32 KB) -
Evaluation Essentials for Nonprofits: Terms, Tips, and Trends [Slides] These slides are an excerpt from a fuller presentation for the Young Nonprofit Professionals Network, held on June 2, 2014 in Washington, DC.
Author: Ann K. Emery and Johanna Morariu, Innovation Network, Inc. Type: Presentation Slides Date: Jun 2, 2014 Point K Pick
Web Link -
Evaluation for Improvement: A Seven-Step Empowerment Evaluation Approach This manual is designed to help violence prevention organizations hire an empowerment evaluator who will assist them in building their evaluation capacity through a learn-by-doing process of evaluating their own strategies. It is for state and local leaders and staff members of organizations, coalitions, government agencies, and/or partnerships working to prevent violence. Some parts of the manual may also be useful to empowerment evaluators who work with these organizations.
Author: Pamela J. Cox, Dana Keener, Tiffanee L. Woodard, & Abraham J. Wandersman Type: Workbooks & Guides Date: Jan 1, 2009
Download (2.86 MB) -
Evaluation for the Way We Work In this article, Michael Quinn Patton describes the developmental evaluation approach.
Author: Michael Quinn Patton Type: Newsletters & Periodicals Date: Mar 21, 2006
Download (886 KB) -
Evaluation Logic Model A logic model overview with links to workbooks, PowerPoint presentations, and other resources.
Author: University of Wisconsin - Extension Type: Websites & Online Tools Date: Jan 18, 2008
Web Link -
Evaluation Needs Assessment The Evaluation Needs Assessment was created for use with a grantee cohort of twelve organizations, which received grant support and capacity-building services for a period of three years.
Directions: Share the tool in advance of an in-person meeting to allow for preparation. Meet to discuss an organization's existing evaluation practice and goals for improvement. Seek mutual agreement between the evaluation capacity provider and the grantee regarding how to apply evaluation technical assistance.
Author: Johanna Morariu Type: Tipsheets & Paper Tools Date: Apr 25, 2011
Download (215.84 KB) -
Evaluation Plan Workbook (.pdf) Innovation Network's own workbook on evaluation planning. It can be used alone or in conjunction with the Evaluation Plan Builder at the Point K Learning Center.
Author: Innovation Network, Inc. Type: Workbooks & Guides Date: May 14, 2005 Point K Pick
Download (978.38 KB) -
Evaluation Plan Workbook (in MS Word) Innovation Network's own workbook: an introduction to the processes and concepts of evaluation planning.
Author: Innovation Network, Inc. Type: Templates & Samples Date: Dec 31, 2005 Point K Pick
Web Link -
Evaluation Principles and Practices: An Internal Working Paper The purpose of this document is to advance the Foundation’s existing work so that our evaluation practices become more consistent across the organization. We hope to create more common understanding of our philosophy, purpose, and expectations regarding evaluation as well as clarify staff roles and available support. With more consistency and shared understanding, we expect less wheel re-creation across program areas, greater learning from each other’s efforts, and faster progress in designing meaningful evaluations and applying the results.
Author: Fay Twersky & Karen Lindblom Type: Research & Reports Date: Jan 22, 2013
Download (1.42 MB)