Browse Design Data Resources
#YNPNdc14: DataViz! Tips, Tools, and How-tos for Visualizing Your Data [Slides] Johanna Morariu and Ann K. Emery presented at the Young Nonprofit Professionals Network 2014 Annual Leadership Conference, held on May 9, 2014, in Washington, DC. Author: Johanna Morariu and Ann K. Emery, Innovation Network Type: Presentation Slides Date: May 9, 2014 Point K Pick Web Link
A Guide to Measuring Advocacy and Policy Developed for the Annie E. Casey Foundation, this guide serves as a broad call to grantmakers to build and advance the field of advocacy and policy evaluation. The guide includes sections on the context of advocacy evaluation and evaluation design. Author: Reisman, Jane, et al. Type: Research & Reports Date: Mar 15, 2007
Download (255.04 KB)
Agency Experiences with Outcome Measurement: Survey Findings As of January 2000, 400 United Ways across the country were asking the programs they fund to identify and measure their outcomes: the benefits or changes the programs want participants to experience as a result of their services. United Ways are not alone. Many state and local government agencies, foundations, managed care systems, and accrediting bodies have added outcome measurement to the list of performance and accountability measures they require of nonprofit organizations within their sphere. Author: United Way of America Type: Research & Reports Date: Jan 1, 2000 Download (209.74 KB)
Basic Guide to Program Evaluation Carter McNamara's guide to evaluation, part of his Free Management Library and drawn from his book Field Guide to Nonprofit Program Design, Marketing and Evaluation. This guide covers the hows and whys of evaluation, an overview of evaluation and data collection methodologies and how to choose between them, analysis and interpretation advice, and other tips. Author: McNamara, Carter (ed.) Type: Websites & Online Tools Date: Jan 18, 2008
Web Link
Catholic Relief Services' (CRS) Guidance for Developing Logical and Results Frameworks This document summarizes Catholic Relief Services’ (CRS) guidance for developing logical and results frameworks.
• A logical framework “is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan, and the changes or results you hope to achieve.” CRS’ Proframe and the U.K. Department for International Development’s (DFID’s) logframe are examples of logical frameworks.
Author: Carlisle J. Levine Type: Workbooks & Guides Date: Jan 31, 2007 Download (196.44 KB)
Coalition Assessment: Approaches for Measuring Capacity and Impact Why assess coalition capacity? How should a coalition be assessed? How can coalition assessment data be analyzed and used?
Author: Veena Pankaj, Kat Athanasiades, and Ann Emery Type: Research & Reports Date: Feb 4, 2014 Point K Pick Download (768.05 KB)
Conquering the Dusty Shelf Report: Data Visualization for Evaluation In this blog post for Visualising Data, Johanna Morariu and Ann Emery share three tactics for tackling the Dusty Shelf Report in evaluation: captivating readers with visuals, choosing the design that's right for the reader, and strengthening readers' dataviz literacy. Author: Johanna Morariu and Ann Emery Type: Opinion (blog, editorial) Date: May 7, 2013 Web Link
Data and Information Visualization Throughout the Evaluation Life Cycle for Participatory Evaluation and Evaluation Capacity Innovation Network shared approaches and examples of how to incorporate innovative data and information visualization techniques throughout each stage of the evaluation life cycle to support participatory evaluation and build evaluation capacity. In the planning and design phase, mind mapping can be used to promote brainstorming and idea generation. In the data collection stage, evaluators can use creative visuals to improve stakeholder understanding of and participation in data collection, and evaluators can adhere to good design principles to create effective data collection instruments. Author: Johanna Morariu, Myia Welsh, Veena Pankaj, Melissa March Type: Presentation Slides Date: Nov 3, 2011 Web Link
Data and Information Visualization Throughout the Life Cycle for Participatory Evaluation and Evaluation Capacity Building This handout contains a list of great resources to help improve your data visualization skills. The list provides links to websites that will help you design the right color scheme (such as Design Seeds), websites that provide basic information about principles of design, and great examples of how evaluators, statisticians, and computer scientists are using dataviz to help us understand data better.
Author: Johanna Morariu and Veena Pankaj Type: Websites & Online Tools Date: Jan 1, 2013 Download (392.28 KB)
Designing a Results Framework for Achieving Results: A How-To Guide A results framework serves as a key tool in the development landscape, enabling practitioners to discuss and establish strategic development objectives and then link interventions to intermediate outcomes and results that directly relate to those objectives. This publication provides how-to guidance for developing results frameworks. Author: Independent Evaluation Group (IEG) Type: Workbooks & Guides Date: Sep 1, 2012 Download (834.16 KB)
Evaluating System Change: A Planning Guide This methods brief provides guidance on planning effective evaluations of system change interventions. It begins with a general overview of systems theory and then outlines a three-part process for designing system change evaluations. This three-part process aligns (1) the dynamics of the targeted system or situation, (2) the dynamics of the system change intervention, and (3) the intended purpose(s) and methods of the evaluation.
Author: Margaret B. Hargreaves Type: Research & Reports Date: Apr 1, 2010 Download (1.63 MB)
Evaluation Principles and Practices: An Internal Working Paper The purpose of this document is to advance the Foundation’s existing work so that our evaluation practices become more consistent across the organization. We hope to create more common understanding of our philosophy, purpose, and expectations regarding evaluation, as well as to clarify staff roles and available support. With more consistency and shared understanding, we expect less reinvention of the wheel across program areas, greater learning from each other’s efforts, and faster progress in designing meaningful evaluations and applying the results.
Author: Fay Twersky & Karen Lindblom Type: Research & Reports Date: Jan 22, 2013 Download (1.42 MB)
Guía para la Formulación de Marcos Lógicos y de Resultados de Catholic Relief Services (CRS) [Spanish-language version] Overview: This document presents a summary of CRS' guide for formulating logical and results frameworks. A logical framework “is a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan, and the changes or results you hope to achieve.” CRS’ Proframe and the U.K. Department for International Development’s (DFID’s) logframe are examples of logical frameworks.
Author: Carlisle J. Levine Type: Workbooks & Guides Date: Jan 31, 2007 Download (210.25 KB)
Guidance Note #3: Introduction to Mixed Methods in Impact Evaluation Mixed methods (MM) evaluations seek to integrate social science disciplines with predominantly quantitative (QUANT) and predominantly qualitative (QUAL) approaches to theory, data collection, data analysis and interpretation. The purpose is to strengthen the reliability of data, validity of the findings and recommendations, and to broaden and deepen our understanding of the processes through which program outcomes and impacts are achieved, and how these are affected by the context within which the program is implemented.
Author: Michael Bamberger Type: Workbooks & Guides Date: Sep 5, 2012 Web Link
How to Design a Monitoring and Evaluation Framework for a Policy Research Project This guidance note focuses on designing and structuring a monitoring and evaluation framework for policy research projects and programmes.
The primary audience for this guidance note is people who design and manage monitoring and evaluation. However, it will also be a useful tool for anyone involved in monitoring and evaluation activities.
The framework presented in this guidance note is intended to be used in a flexible manner depending on the purpose and characteristics of the research project.
Author: Methods Lab Type: Workbooks & Guides Date: Jan 1, 2016 Download (346.06 KB)
How to Use Data Visualization to Better Tell Your Story Memos and metrics, emails and texts, newsletters and reports: Is your organization suffering from information overload? We consume 34 gigabytes, or 100,500 words, of information every day. Our brains are overwhelmed and struggling to keep up. Data visualization, or dataviz, is one of the strongest weapons against information overload. Author: Ann Emery Type: Opinion (blog, editorial) Date: Feb 1, 2014 Point K Pick Web Link
Impact Evaluation in Practice This book provides an overview of impact evaluation from the perspective of the World Bank.
Author: Paul J. Gertler, Sebastian Martinez, Patrick Premand, Laura B. Rawlings, Christel M. J. Vermeersch Type: Workbooks & Guides Date: Jan 1, 2011 Download (3.06 MB)
Logic Models in Participatory Evaluation Slides providing a basic introduction to the use of logic models in participatory evaluation.
Author: Douglas Bruce Type: Presentation Slides Date: Sep 1, 2011 Download (725.48 KB)
Orientações da Catholic Relief Services (CRS) para desenvolver matrizes lógicas e estruturas de resultados [Portuguese-language version] Overview: This document summarizes Catholic Relief Services' (CRS) guidance for developing logical frameworks and results frameworks. Author: Carlisle J. Levine Type: Workbooks & Guides Date: Jan 31, 2007 Download (207.59 KB)
Portfolio Evaluation vs. Grant Evaluation In this webinar, Johanna Morariu and Ehren Reed discuss four levels of evaluation: grant-level, portfolio-level, foundation-level, and issue-level. The presentation addresses the pros and cons of these four levels of evaluation, and when one level may be more appropriate than another. Also included are considerations for right-sizing your evaluation approach: design, data collection, analysis, and reporting differences between grant and portfolio evaluations. Author: Johanna Morariu and Ehren Reed Type: Websites & Online Tools Date: Feb 22, 2012 Web Link
Proofiness: The Dark Arts of Mathematical Deception and the Evaluation Profession Johanna Morariu describes how she explains the merit and appropriateness of qualitative designs when helping individuals and organizations design an evaluation approach, or when presenting qualitative findings. Author: Johanna Morariu Type: Opinion (blog, editorial) Date: Jan 18, 2012 Web Link
ProPack III - The CRS Project Package: A Guide to Creating a SMILER M&E System The approach to M&E described in this guide is called SMILER. It is a comprehensive and practical approach to developing a project monitoring system that incorporates processes for learning based on robust evidence. It has been written for CRS project managers, technical staff, and M&E staff to guide their work with partners and communities by describing how to develop an M&E system in which data are systematically collected, reported, and used to make project decisions.
Author: Susan Hahn, Guy Sharrock Type: Workbooks & Guides Date: Jun 15, 2010 Download (1.18 MB)
Software for Nonprofit Evaluation and Case Management Many systems exist to manage the wealth of data nonprofits collect. As part of our evaluation consulting work, Innovation Network team members are often asked to recommend systems to our nonprofit clients. The purpose of this document is to share with the field our most recent scan of existing software (as of January 2010). We hope this information is helpful to you, and we encourage you to contact us with feedback. Author: Johanna Morariu, Innovation Network, Inc. Type: Workbooks & Guides Date: Jan 31, 2010 Point K Pick Download (86.51 KB)
The 6 D's of Needs Assessments This handy one-page tipsheet outlines the six D's that a successful Needs Assessment must address:
- Deficit
- Develop
- Describe
- Desires
- Duplication
- Demand
Author: Community Solutions Type: Tipsheets & Paper Tools Date: Jan 1, 2012 Download (1.47 MB)
The Challenge of Assessing Policy and Advocacy Activities The paper is designed to outline an approach to policy change evaluation grounded in the experience of experts and foundation colleagues. (See Appendix A for the research methodology.) This paper first posits three key priorities in evaluating policy change work, drawn from interviews with grantees and staff from The California Endowment on their needs concerning policy change evaluation. It also discusses the challenges inherent in monitoring and assessing these types of grants.
Author: Commissioned by The California Endowment. Researched and Written by Blueprint Research and Design, Inc. Type: Research & Reports Date: Oct 1, 2005 Download (2.84 MB)