Coalition Assessment: Approaches for Measuring Capacity and Impact (February 2014)
Why assess coalition capacity? How should a coalition be assessed? How can coalition assessment data be analyzed and used?
Innovation Network has been evaluating coalitions since 2006, beginning with the Coalition for Comprehensive Immigration Reform, a national effort to secure congressional passage of comprehensive immigration reform legislation.
This new publication provides practitioners and funders with insights into the coalition assessment process along with concrete examples and lessons we’ve learned from our own work.
Findings from our State of Evaluation 2012 survey of a national sample paint a fascinating picture of how U.S. nonprofits use evaluation. The State of Evaluation 2012 report marks the second time Innovation Network has surveyed the U.S. nonprofit field to learn about evaluation practices and capacities.
Advocacy Funding Strategy: Lessons Learned for Funders of Advocacy Efforts & Evaluations (May 2012)
Based on our recent work with The Colorado Trust, former Innovation Network Director Ehren Reed prepared a strategic learning evaluation report in conjunction with Nancy Csuti, The Colorado Trust's Director of Research, Evaluation, and Strategic Learning.
This May 2012 report provides a dual set of insights from the perspectives of both grantmakers and independent evaluators on the role of learning and accountability in grant delivery and implementation.
Innovation Network developed these three introductory evaluation documents as part of Building Nonprofit Capacity to Evaluate, Learn, and Grow Impact, a workshop we presented in partnership with Grantmakers for Effective Organizations' Scaling What Works initiative.
Evaluation Capacity Building: Funder Initiatives to Strengthen Grantee Evaluation Capacity and Practice (June 2011)
We are sharing lessons learned from our consulting practice through a series of white papers. In this publication, Myia Welsh and Johanna Morariu reflect on various strategies and techniques Innovation Network has used to build evaluation capacity among groups of grantees. The paper also offers recommendations for funders considering an evaluation capacity building initiative.
Participatory Analysis: Expanding Stakeholder Involvement in Evaluation (April 2011)
Participatory evaluation involves engaging a program's stakeholders in the evaluation process—making them active participants, rather than passive observers. We often use participatory data analysis as part of our overall participatory evaluation approach.
In this publication, Veena Pankaj, Myia Welsh, and Laura Ostenso share some techniques we have used to give evaluation participants a more active role in the review and analysis of their evaluation data. They also discuss the benefits and challenges of the participatory analysis process.
This workbook, developed by Innovation Network, offers an introduction to the processes and concepts of the logic model. It can be used alone or in conjunction with the Logic Model Builder at the Point K Learning Center.
Nonprofits hear a lot of talk about evaluation these days—metrics and measurements, indicators and impact, efficiency and effectiveness. Everyone wants evaluation results, but there is a knowledge gap around evaluation practice: What are nonprofits really doing to evaluate their work? How are they really using evaluation results? What support are they getting? What else do they need?
These are the questions we sought to answer in our State of Evaluation project. There are many excellent studies available that examine evaluation practice at a particular time or in a particular region. Ours is the first nationwide project that systematically and repeatedly collects data from U.S. nonprofits about their evaluation practices.
We hope the survey results will build understanding:
For nonprofits, to see how their evaluation practices compare to their peers.
For donors and funders, to better understand how they can support evaluation practice throughout the sector.
For evaluators, to have more context for the existing evaluation practices and capacities of their nonprofit clients.
Pathfinder: A Practical Guide to Advocacy Evaluation (2009)
These publications for the advocacy evaluation field draw on our research and consulting experience. Pathfinder encourages readers to adopt a “learning-focused evaluation” approach, which prioritizes using knowledge for improvement. Three tailored editions—one each for advocates, evaluators, and funders—offer a guided pathway through the advocacy evaluation experience:
The Pathfinder series was commissioned by The Atlantic Philanthropies.
Articles Featured in The Foundation Review (2009)
The Foundation Review is the first peer-reviewed journal of philanthropy, written by and for foundation staff, boards, and the practitioners who work with them to implement programs. Its third issue, released in the summer of 2009, focused on advocacy and policy change.
Speaking for Themselves: Advocates' Perspectives on Evaluation (August 2008)
From the beginning of our advocacy evaluation work, we heard many perspectives from funders and evaluators on the particular challenges of evaluating advocacy, but we weren't hearing anything from the people working on the advocacy front lines.
In August 2008, Innovation Network released Speaking for Themselves: Advocates' Perspectives on Evaluation. Produced with the support of the Annie E. Casey Foundation and The Atlantic Philanthropies, the report examined the current state of advocacy strategy and evaluation practice.
We have also presented findings from these white papers and research reports through conference presentations, panels, brown bags, webinars, and more. Materials from conference presentations and panels are available on our Where We've Been page. Webinar recordings and blog posts are available on our Multimedia page.
For more information about our research, or to discuss ways we can work together, contact Director Johanna Morariu at 202-728-0727 x103 or jmorariu [at] innonet [dot] org.