Browse Capacity Building Resources
Creative Ways to Solicit Youth Input: A Hands-On Guide for Youth Practitioners To understand how our programs are doing, improve the quality of those programs and report to funders and other stakeholders, we collect information from a variety of sources: staff, parents, and the youth themselves. This manual provides ideas for other, creative ways to get input from youth.
Part One outlines a process you can use to plan your evaluation questions. Next, we present some key ideas to consider when implementing your evaluation, analyzing the results and reporting them.
Author: Public Profit Type: Workbooks & Guides Date: May 1, 2012
Download (1.69 MB)
Current Advocacy Evaluation Practice Framing Paper Written for the Advocacy Evaluation Advances convening in January 2009, this paper summarizes the current state of advocacy evaluation practice. The paper identifies four evaluation design questions, then offers common responses to each: Who will do the evaluation? What will the evaluation measure? When will the evaluation take place? And what methodology will the evaluation use?
Author: Julia Coffman Type: Research & Reports Date: Jan 31, 2009 Download (297 KB)
Dabbling in the Data: A Hands-On Guide to Participatory Data Analysis Many quality improvement trainings either ignore data interpretation or offer limited guidance about how to dig into the data. This means that whoever speaks up first sets the agenda for the group—hardly a rigorous or fruitful process. Educators and children and youth professionals are increasingly interested in utilizing data to support continuous quality improvement, but few resources are available to help practitioners know what to do. Author: Public Profit Type: Research & Reports Date: Jan 1, 2014 Download (1.3 MB)
Data and Information Visualization Throughout the Evaluation Life Cycle for Participatory Evaluation and Evaluation Capacity Innovation Network shared approaches and examples of how to incorporate innovative data and information visualization techniques throughout each stage of the evaluation life cycle to support participatory evaluation and build evaluation capacity. In the planning and design phase mind mapping can be used to promote brainstorming and idea generation. In the data collection stage, evaluators can use creative visuals to improve stakeholder understanding of and participation in data collection, and evaluators can adhere to good design principles to create effective data collection instruments. Author: Johanna Morariu, Myia Welsh, Veena Pankaj, Melissa March Type: Presentation Slides Date: Nov 3, 2011 Web Link
Data Collection Tips: Build On What’s Out There Innovation Network's own tipsheet, "Build on What's Out There" discusses how to use existing tools to create your own data collection instruments, followed by a list of databases of instruments that are available on the Web. Author: Innovation Network, Inc. Type: Tipsheets & Paper Tools Date: May 15, 2005 Download (163.99 KB)
Data Collection Tips: Survey Development Innovation Network's tips to avoid some common issues that nonprofits face when constructing a survey. Author: Innovation Network, Inc. Type: Tipsheets & Paper Tools Date: May 15, 2005 Download (198.88 KB)
Data Placemats: A DataViz Technique to Improve Stakeholder Understanding of Evaluation Results [Slides] At the American Evaluation Association 2012 Annual Conference, Veena Pankaj describes various ways to improve stakeholder engagement, as well as ways to increase stakeholder understanding of evaluation results. Author: Veena Pankaj Type: Presentation Slides Date: Oct 25, 2012 Web Link
Designing a Results Framework for Achieving Results: A How-To Guide A results framework serves as a key tool in the development landscape, enabling practitioners to discuss and establish strategic development objectives and then link interventions to intermediate outcomes and results that directly relate to those objectives. This publication provides how-to guidance for developing results frameworks. Author: Independent Evaluation Group (IEG) Type: Workbooks & Guides Date: Sep 1, 2012 Download (834.16 KB)
Developing a Monitoring and Evaluation Process for Capacity Building and Empowerment Can participation and empowerment in M&E be a reality in large scale projects and programmes? How can qualitative change be assessed in a participatory and empowering way which is also reliable and credible? This paper uses INTRAC's Central Asia programme (building NGO capacity in Kazakhstan and Kyrgyzstan) as a case study. Author: International NGO Training and Research Centre (INTRAC) Type: Research & Reports Date: Nov 6, 2002 Download (102.5 KB)
Developing Effective Coalitions: An Eight Step Guide This step-by-step guide to coalition building helps partnerships launch and stabilize successfully. It supports advocates and practitioners in every aspect of the process, from determining the appropriateness of a coalition to selecting members, defining key elements, maintaining vitality, and conducting ongoing evaluations.
Author: Prevention Institute Type: Workbooks & Guides Date: Jan 1, 2002 Download (419.65 KB)
Developmental Evaluation This presentation by Ricardo Wilson-Grau at the Michigan Association for Evaluation's annual conference is based on the concept of Developmental Evaluation (DE) elaborated by Michael Quinn Patton over the past 20 years and now crystallized in his book, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. The slides give an overview of DE, explain the basic theory behind it, describe when DE can be used in evaluation, and outline the differences between DE and traditional evaluation. Author: Ricardo Wilson-Grau Type: Presentation Slides Date: May 3, 2000 Download (5.05 MB)
Drawings as a Method of Program Evaluation and Communication with School-Age Children This article discusses using drawings as a means for obtaining children's perceptions in the evaluation process. Author: Evans, William and Reilly, Jackie Type: Research & Reports Date: Oct 16, 2008 Web Link
Dynamic Dozen: Delivery. Tips from top presenters from the American Evaluation Association This study consisted of interviews with a dozen of the top AEA presenters to get their secrets about how to make and deliver great presentations. Their comments were grouped into three stages of presenting: message, design, and delivery. This report focuses solely on Delivery.
Author: Anjanette Raber Type: Workbooks & Guides Date: Aug 1, 2012 Download (489.15 KB)
Dynamic Dozen: Design. Tips from top presenters from the American Evaluation Association This study consisted of interviews with a dozen of the top AEA presenters to get their secrets about how to make and deliver great presentations. Their comments were grouped into three stages of presenting: message, design, and delivery. This report focuses solely on Design, that is, the intentional composition of slides. While the context for their talks spanned long and short presentations, and included different types of audiences and purposes, their insights can be used or modified by evaluators for their own presentations at the AEA annual conference and elsewhere.
Author: Anjanette Raber Type: Workbooks & Guides Date: Aug 1, 2012 Download (431.09 KB)
Dynamic Dozen: Message. Tips from top presenters from the American Evaluation Association This study consisted of interviews with a dozen of the top AEA presenters to get their secrets about how to make and deliver great presentations. Their comments were grouped into three stages of presenting: message, design, and delivery. This report focuses solely on Message, that is, the mindful planning of a structured presentation. While the context for their talks spanned long and short presentations, and included different types of audiences and purposes, their insights can be used or modified by evaluators for their own presentations at the AEA annual conference and elsewhere.
Author: Anjanette Raber Type: Workbooks & Guides Date: Aug 1, 2012 Download (555.08 KB)
Echoes from the Field: Proven Capacity-Building Principles for Nonprofits This October 2001 report summarizes the findings of a study produced in collaboration with the Environmental Support Center and funded by the David and Lucile Packard Foundation. The study identified nine principles critical to effective capacity building services: self-determination, trust, readiness, ongoing learning, team and peer learning, sensitivity to different learning styles, awareness of organizational culture, interrelatedness of organizational elements, and timeframe. Author: Innovation Network, Inc. Type: Research & Reports Date: Oct 1, 2001
Download (185.83 KB)
Effective Advocacy Evaluation: The Role of Funders Johanna Morariu and Kathleen Brennan of Innovation Network produced this article for The Foundation Review to discuss the role of grantmakers in advocacy evaluation. The authors provide several recommendations based on their research into the practices of both advocacy grantmakers and grantees. Author: Johanna Gladfelter Morariu and Kathleen Brennan, Innovation Network Type: Research & Reports Date: Oct 1, 2009 Download (728.77 KB)
eNonprofits Benchmark Study: Measuring Email Messaging, Online Fundraising, and Internet Advocacy Metrics This report provides a snapshot of key metrics and benchmarks for nonprofit e-mail communications, online fundraising, and online advocacy. According to the authors, "Organizations will be able to use this study to begin to understand how to look at and analyze their own online communications data. It will provide context and comparisons for organizations doing their own ongoing reporting."
Author: M & R Strategic Services and the Advocacy Institute Type: Research & Reports Date: Mar 10, 2012 Download (2.25 MB)
Essentials of Survey Research and Analysis This workbook for community researchers includes chapters on survey basics, types of data, questionnaire formats, constructing questions and answers, assessing reliability, identifying research problems and defining solutions, sampling, coding and data organization, formatting and testing, storing data, data entry and quality control, analysis, and reporting. Author: Polland, Ronald J. Type: Workbooks & Guides Date: Jan 1, 2005 Download (260.35 KB)
Evaluability Assessment to Improve Public Health Policies, Programs, and Practices This article describes how evaluability assessment has benefited public health and could do so in the future. The authors describe the rationale, history, and evolution of evaluability assessment, outline the steps in the method and distinguish it from related concepts, and illustrate five ways evaluability assessment can benefit public health.
Author: Laura C. Leviton, Laura Kettel Khan, Debra Rog, Nicola Dawkins, and David Cotton Type: Research & Reports Date: Jul 1, 2010 Download (380.53 KB)
Evaluating Advocacy: A Model for Public Policy Initiatives This presentation, drawing on work done with the Coalition for Comprehensive Immigration Reform ("CCIR"), was given by Innovation Network staff at the 2006 American Evaluation Association Conference. The presentation discusses advocacy evaluation in general, some inherent challenges that apply more strongly to advocacy evaluation than to evaluation of traditional service programs, and some practical planning and evaluation structures developed as a result of the work with CCIR. Author: Innovation Network, Inc. Type: Presentation Slides Date: Nov 1, 2006 Download (1.42 MB)
Evaluating Complexity: Propositions for Improving Practice This report asks civil society organizations to challenge the assumptions behind traditional evaluation models as they take steps toward evaluating complex initiatives and initiatives in complex environments.
Author: FSG Type: Research & Reports Date: Nov 17, 2014 Download (2.85 MB)
Evaluating Foundation-Supported Capacity Building: Lessons Learned This study of lessons learned from evaluations of philanthropic capacity-building programs used a national database of 473 programs, and a survey and interviews with 87 funders (82 foundations or foundation collaboratives, and five foundation-supported intermediaries) to answer two questions:
(1) How do foundations that support nonprofit capacity building evaluate their grantmaking and direct service activities?
(2) What lessons can be learned from evaluation, both to improve these programs and to justify the investments made in them?
Author: Thomas E. Backer, Jane Ellen Bleeg & Kathryn Groves Type: Research & Reports Date: Jan 1, 2010 Download (152.27 KB)
Evaluating Networks for Social Change: A Casebook In response to the growing interest of grantmakers and network builders, this casebook profiles nine evaluations that address key questions about network effectiveness while expanding what is known about assessment approaches that fit with how networks develop and function.
Author: Center for Evaluation Innovation Type: Research & Reports Date: Jul 1, 2014 Download (1.12 MB)
Evaluating Public Policy Grantmaking: A Resource for Funders This publication asserts that funders can determine appropriate performance measures by identifying the incremental steps that lead to policy change. The author argues that many factors necessary for policy change can be measured quantitatively and/or qualitatively, including civic participation, public perceptions, community networks, policymaker support, and organizational capacity. The report also asks funders to bear in mind that while a particular policy objective may not have been achieved, their support may have laid the groundwork for future victories. Author: Snowdon, Ashley Type: Research & Reports Date: Jun 1, 2004 Download (333.45 KB)