Browse Data Collection Resources
-
Picturing Your Data is Better Than 1000 Numbers: Data Visualization Techniques for Social Change Are you intrigued by infographics and how they could improve your communication strategy? Are you interested in what it takes for an organization to systematically use data? Or are you maybe even drowning in data and looking for someone to throw you a life-saving suggestion for software and other tools? Johanna Morariu, Beth Kanter, and Brian Kennedy presented a panel on data and information visualization at the 2012 Nonprofit Tech Conference. This video is a recording of the panel.
Author: Johanna Morariu Type: Presentation Slides Web Link -
Portfolio Evaluation vs. Grant Evaluation In this webinar Johanna Morariu and Ehren Reed discuss four levels of evaluation: grant-level, portfolio-level, foundation-level, and issue-level. The presentation addresses the pros and cons of these four levels of evaluation, and when one level may be more appropriate than another. Also included are considerations for right-sizing your evaluation approach: design, data collection, analysis, and reporting differences between grant and portfolio evaluations. Author: Johanna Morariu and Ehren Reed Type: Websites & Online Tools Date: Feb 22, 2012 Web Link -
Power, Participation, and State-based Politics: An Evaluation of the Ford Foundation's Collaborations that Count Initiative The Applied Research Center (ARC) conducted a two-year participatory evaluation to provide an account of the Ford Foundation's "Collaborations That Count Initiative." The report identifies areas in which the 11 statewide collaborations succeeded, and draws attention to ways in which support to collaboration might be more effectively provided in the future. Author: Applied Research Center Type: Research & Reports Date: Apr 21, 2004 Download (1.85 MB) -
Program Development and Evaluation This site provides a comprehensive set of resources on planning and implementing an evaluation. Some of their tools require a free login. Author: University of Wisconsin - Extension Type: Websites & Online Tools Date: Jan 18, 2008 Web Link -
Program Evaluation Guide A general introduction to evaluation, this guide walks the reader through the process of answering the following seven basic questions:
1. What do we want to evaluate?
2. What is the purpose of the evaluation?
3. What type of evaluation do we want to use?
4. What information do we need to answer our questions?
5. How do we get the information?
6. How will we analyze the information?
7. How will we use and share the results?
Author: Katie Cangemi and Maggie Litgen Type: Workbooks & Guides Date: Dec 31, 2011 Download (195.46 KB) -
Program Evaluation: Assessing and Measuring Your Program’s Performance Innovation Network's slideshow (from a presentation to the 2007 Nonprofit Technology Conference in Washington, D.C.) about the importance of planning for effective program evaluation, using our online tools, and leveraging online communities for knowledge-sharing. (1.15 MB .pdf file; may take some time to download.) Author: Innovation Network, Inc. Type: Presentation Slides Date: Jan 18, 2008 Download (1.15 MB) -
Proofiness: The Dark Arts of Mathematical Deception and the Evaluation Profession Johanna Morariu describes how she explains the merit and appropriateness of qualitative designs when helping individuals and organizations design an evaluation approach or when presenting qualitative findings. Author: Johanna Morariu Type: Opinion (blog, editorial) Date: Jan 18, 2012 Web Link -
ProPack III - The CRS Project Package: A Guide to Creating a SMILER M&E System The approach to M&E described in this guide is called SMILER. It is a comprehensive and practical approach to developing a project monitoring system that incorporates processes for learning based on robust evidence. It has been written for CRS project managers, technical, and M&E staff to guide their work with partners and communities by describing how to develop an M&E system in which data are systematically collected, reported and used to make project decisions.
Author: Susan Hahn, Guy Sharrock Type: Workbooks & Guides Date: Jun 15, 2010 Download (1.18 MB) -
Public Will Building: What it Means and How to Evaluate It [Handout] Strategic Partner Julia Coffman (of the Center for Evaluation Innovation) and Innovation Network Director Ehren Reed led a discussion on Public Will Building for The Connecticut Health Foundation's Leadership Fellows. Author: Julia Coffman and Ehren Reed Type: Research & Reports Date: Dec 14, 2011 Web Link -
Rapid Evaluation The purpose of this guide is to introduce the basic concepts and methods used in rapid evaluations (REs), and to demonstrate how this approach can be applied to the various stages of program development and implementation.
It covers important terms and definitions, when to use REs, the advantages and disadvantages of REs, and the five elements of an RE, and provides additional reading resources.
Author: The International Training and Education Center for Health (I-TECH) Type: Workbooks & Guides Date: Jan 1, 2008 Download (272.44 KB) -
Remarks made at the Environmental Evaluators’ Network Forum: NAVIGATING EVALUATIVE COMPLEXITY IN THE AGE OF OBAMA The author draws on her vast evaluation experience, especially in federal evaluation, to confront issues of complexity in evaluation. She offers the idea of using comprehensive checklists, and supplies her own example.
An excerpt:
Author: Eleanor Chelimsky Type: Opinion (blog, editorial) Date: Jun 8, 2010 Point K Pick Download (81.2 KB) -
Report: "Ten Considerations for Advocacy Evaluation Planning: Lessons Learned from KIDS COUNT Grantee Experiences" The Annie E. Casey Foundation and Organizational Research Services, Inc. detail ten lessons learned from an evaluation of five KIDS COUNT grantees that began in 2007. The evaluation was designed to test some of the ideas presented in "A Guide to Measuring Advocacy and Policy", a report produced by AECF and ORS in 2006. Author: Organizational Research Services Type: Research & Reports Date: Jan 1, 2009 Download (289.93 KB) -
Sample Size Calculator A handy online calculator for determining the appropriate sample size for a given population. This page also includes a discussion of sample size and other factors that affect the confidence interval of your data collection process. Author: Creative Research Systems Type: Websites & Online Tools Date: Jan 18, 2008 Web Link -
Sample Size Calculator This online sample size calculator can help you rapidly estimate how large your sample needs to be to compensate for different margins of error and confidence levels. Free and easy to use, the page also includes good introductory information on how to apply the resource. Author: Raosoft Type: Websites & Online Tools Date: Jan 1, 2004 Web Link -
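Calculators like the two above typically apply the standard normal-approximation formula for estimating a proportion, with a finite population correction. A minimal sketch in Python of that common calculation (the function name and defaults here are illustrative, not taken from either tool):

```python
import math

def sample_size(population, margin_of_error=0.05, confidence=0.95, p=0.5):
    """Estimate the sample size needed to survey a proportion.

    Uses the normal approximation n0 = z^2 * p(1-p) / e^2, then a
    finite population correction, as most online calculators do.
    p=0.5 is the conservative (largest-sample) assumption.
    """
    # z-scores for common confidence levels
    z_scores = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}
    z = z_scores[confidence]
    # Sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite population correction shrinks n for small populations
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Population of 1,000 at 95% confidence and a ±5% margin of error
print(sample_size(1000))  # → 278
```

For a population of 1,000 this yields 278 respondents, which matches the figure such calculators typically report; note the correction matters most for small populations, while for very large ones the result approaches 385.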
Seeing the Forest (Beyond the Trees): Learning Across the Experiences of Seven Advocacy Evaluators [Slides] Advocacy and policy change evaluation continues to evolve and mature, from a fledgling field a few years ago to the flourishing field of today. Evaluators are advancing as well, developing an increasingly robust collective understanding about what works for advocacy evaluation. In this session a diverse group of seven advocacy evaluators explored and synthesized observations drawn from an array of real-world experiences. Panelists spoke to targeted questions, weaving in their wealth of experience and examples. Author: Johanna Morariu, Jara Dean-Coffey, Tom Kelly, Claire Hutchings, David Devlin Foltz, Robin Kane, Jared Raynor, Anne Gienapp Type: Presentation Slides Date: Oct 19, 2013 Web Link -
Social Research Update: Photo-Interviewing for Research Rosalind Hurworth offers insight into multiple uses of photography in evaluation. Author: Hurworth, Rosalind Type: Research & Reports Date: Mar 1, 2003 Download (156.06 KB) -
Software for Nonprofit Evaluation and Case Management Many systems exist to manage the wealth of data nonprofits collect. As part of our evaluation consulting work, Innovation Network team members are often asked to recommend systems to our nonprofit clients. The purpose of this document is to share with the field our most recent scan of existing software (as of January 2010). We hope this information is helpful to you, and we encourage you to contact us with feedback. Author: Johanna Morariu, Innovation Network, Inc. Type: Workbooks & Guides Date: Jan 31, 2010 Point K Pick Download (86.51 KB) -
Sourcebook for Evaluating Global and Regional Partnership Programs: Indicative Principles and Standards The purpose of the indicative principles and standards contained in this Sourcebook is to help improve the independence and quality of program-level evaluations of GRPPs in order to enhance the relevance and effectiveness of the programs. The principal audiences for the Sourcebook are the governing bodies and management units of GRPPs, as well as professional evaluators involved in the evaluation of these programs. Author: Independent Evaluation Group (IEG) Type: Workbooks & Guides Date: Jan 1, 2007 Download (833.37 KB) -
State of Evaluation - Russian translation The Project: Nonprofits hear a lot of talk about evaluation these days—metrics and measurements, indicators and impact, efficiency and effectiveness. Everyone, from donors to board members, seems to want evaluation results. But there is a big knowledge gap around evaluation practice: What are nonprofits really doing to evaluate their work? How are they really using evaluation results? What support are they getting? What else do they need?
Author: Innovation Network Type: Research & Reports Date: Feb 3, 2011 Download (6.78 MB) -
State of Evaluation 2012: Evaluation Practice and Capacity in the Nonprofit Sector The State of Evaluation 2012 report marks the second time Innovation Network has surveyed the U.S. nonprofit field to learn about evaluation practices and capacities! To learn more about the project, visit www.stateofevaluation.org. Author: Johanna Morariu, Katherine Athanasiades, and Ann K. Emery of Innovation Network, Inc. Type: Research & Reports Date: Oct 1, 2012 Point K Pick Download (2.63 MB) -
State of Evaluation in the Social Sector Measurement, evaluation, and learning are hotter than ever in the social sector. Foundations and nonprofits are focused on answering the question, "What difference are we making?" And the field of evaluation has advanced in promising ways, developing meaningful evaluation approaches to better fit the latest philanthropic and nonprofit strategies. Author: Johanna Morariu and Will Fenn Type: Templates & Samples Date: Mar 21, 2013 Web Link -
State of the Field: Updated, Longitudinal Findings about Nonprofit and Philanthropic Evaluation Practices and Capacities [Slides] The State of Evaluation project provides valuable insight to all those who work in and with the nonprofit sector. The project is designed to collect longitudinal data to document evaluation trends in the U.S. nonprofit sector, including how nonprofits staff evaluation, how evaluation is funded, why evaluation is undertaken, how evaluation results are used, and much more. This year marks the beginning of longitudinal data and analysis, drawing from the first iteration of the project in 2010. Author: Johanna Morariu Type: Presentation Slides Date: Oct 27, 2012 Web Link -
Statistical Confidence in a Survey: How Many is Enough? An informative article about the "myths, misinformation, and must-haves" of sample sizes and confidence intervals. Author: Van Bennekom, Fred Type: Newsletters & Periodicals Date: Oct 1, 2003 Web Link -
Statistics Every Writer Should Know Simple explanations of basic statistics written for reporters and writers, but useful for everyone who needs to work with statistics and evaluation reporting. Author: Niles, Robert Type: Websites & Online Tools Date: Jan 18, 2008 Web Link -
Statistics Tutorial Offered at a number of international workshops, this PowerPoint tutorial aims to help you develop a 'gut feeling' for key statistical concepts, concentrating on meaning rather than formulae. Author: Dix, Alan Type: Presentation Slides Date: Jan 18, 2008
Web Link