Strengthening Partnerships for Education Through Collaborative Community Action and Collective Impact

Current systems are not meeting the needs of different communities across the Denver Metro area, especially our educational systems, where challenges range from the early identification of young learners’ needs to the persistence of equity gaps in educational attainment and completion.

In Colorado, fewer than half of children receive developmental screenings to identify potential social, emotional, or behavioral challenges or developmental delays, which, if unidentified, can result in serious challenges that affect all areas of their lives. College enrollment and completion rates in Colorado demonstrate that equity gaps remain: degree attainment for Hispanic and African American adults in Denver, at 29% and 39% respectively, is significantly lower than the 64% for white adults[1]. These issues, among many others, lead us to the question: what would it take to change these statistics and create a more equitable education system in Colorado?

During our October Social Innovators Breakfast, we had the opportunity to meet and learn from three great organizations, which shared their experiences and lessons in achieving their goals through collaborative community action (CCA) and collective impact (CI). The panelists included:

  • Diana Higuera, Executive Director and Founder of the Rocky Mountain Welcome Center (RMWC), whose mission is to foster intercultural learning, understanding and integration among immigrants, refugees and Colorado residents through different programs and partnerships.
  • Eileen Auer Bennet, Executive Director of Assuring Better Child Health & Development (ABCD), a statewide nonprofit focused on improving the lives of Colorado children through early identification of developmental needs.
  • Therese Ivancovich, Executive Director of The Denver Education Attainment Network (DEAN), a collective impact initiative focused on increasing educational attainment and closing the attainment gap for students in Denver.

The panelists shared what brought them to the CCA/CI space, talked about how CCA/CI has evolved their work, discussed how they measure impact, and gave advice on starting or growing a CCA/CI initiative. We are grateful to our three panelists, and we are sharing the lessons we gathered from these organizations, which you can use to drive your own collective impact initiative. No matter what stage an initiative is at, these are some of the skills it must have:

Commitment – ensure leaders and partners are committed to the vision and overall goal of the initiative.

Be Nimble – change the initiative’s direction, if necessary, and bring partners along the way.

Build Trust – develop trust within an initiative to not only create partner buy-in, but also build confidence between partners if the direction has to shift.

Do Your Homework – know who is at the table and what their motivations are.

Do What You Are Best At – know what your strengths are and focus on those. Let other partners do what they are best at.

We are passionate about bringing a systems lens to all of our work and often share resources and ideas for how to find and act on leverage points, use systems mapping to help change the game, and use experimentation to help drive social change. Additionally, we have many free tools and resources available if you are considering or already involved in a collaborative community action or collective impact initiative, including our full report When Collective Impact Has an Impact.


Do you have other lessons to share? Is there a topic you would like to see us explore in this blog? Tell us in the comments! Stay up to date on Spark’s latest news by following us on social media and subscribing to our newsletter today!

[1] Erase Equity Gaps. (2017). Colorado Department of Higher Education.

Related Publications: When Collective Impact Has an Impact

Join Us at AEA 2019!

With the American Evaluation Association (AEA) conference coming up November 11-16 in Minneapolis, MN, our team has been thinking about the conference theme, “Paths to the Future of Evaluation.” We constantly challenge ourselves to ensure that our values are present in our work, that our practices produce actionable learning, and that we share our experiences to build the evaluation field and, more importantly, to test our perspectives.

We’re excited to present with our evaluation colleagues and share innovative practices that lead to meaningful change in ever-changing contexts. See below for an overview of our three sessions, and check out the AEA website to find more.

First Spark-Facilitated Session

Date & Time: Thursday November 14, 2019 | 5pm – 5:45pm

Title: Managing Complexities of Community Development: A Spotlight on Evaluators’ Creativity

Spark Staff: Rebecca Ochtera, Former Associate Director

Session Colleagues: Cheryl Kelly, Kaiser Permanente Colorado; Stanley Varnhagen, University of Alberta; Timothy Marc Pearson, Wyoming Survey & Analysis Center

The Partners in Evaluation & Research Center is evaluating nine agencies funded to reduce health disparities and transform places through social, economic, political, and physical changes. The evaluation uses a logic model approach guided by an established equity-oriented framework that emphasizes the importance of addressing upstream determinants of health using comprehensive, multi-sectoral, and systems-level approaches. The evaluation is assessing whether the policy, system, and environmental changes increased the availability (e.g., more affordable housing), accessibility (e.g., access to vouchers), and acceptability (e.g., more residents with quality housing) of social and economic resources. Because the impacts on individual factors will not be fully realized until several years after the changes occur, the evaluators are creating evidence tables that document the likely impact the social changes will have on individuals. This session will discuss findings on how this evaluation uses an equity-oriented framework to assess the potential impact on health disparities.

Second Spark-Facilitated Session

Date & Time: Friday November 15, 2019 | 11:30am – 12:15pm

Title: Tracking Policymaker Champion Development: A New Tool to Support Policy Advocacy Evaluation and Capacity Building

Spark Staff: Joby Schaffer, Senior Researcher

Session Colleagues: Nathan Madden, Ewing Marion Kauffman Foundation; Cherie Collins Sims, MEDA

Direct policymaker engagement is a key feature of many advocacy strategies, and evaluators often aim to assess whether and how an advocacy organization contributed to a policymaker’s development into an issue champion. While various tracking tools are available, many are difficult to implement, do not provide detailed insights that enable strategic learning, or lack a means of rolling up detailed tracking to tell the story of an organization’s policymaker engagement efforts. This session will showcase a new tool developed with support from the Entrepreneur’s Policy Network (EPN), an initiative led by the Ewing Marion Kauffman Foundation to help entrepreneur support organizations (ESOs) engage in advocacy. Through a series of user-design sessions with the ESOs participating in the EPN, the tool for tracking policymaker engagement was refined to better meet the needs of advocacy organizations while preserving features that enable the evaluation to capture insights about policymaker engagement for reporting purposes.

This panel will provide attendees with a diverse set of perspectives on this new tool, offering insights into how their respective groups are likely to respond to similar tracking tools. Each panelist will present on their experiences with the tool, including what it enables, what challenges arose in implementing or developing it, and what supports were essential to their continued use. The panelists will be available to answer questions from the audience, giving attendees a unique opportunity to engage not only other evaluators but also a funder and an advocacy organization using the tool.

Third Spark-Facilitated Session

Date & Time: Friday November 15, 2019 | 5:45pm – 6:30pm

Title: Evaluating community engagement with a lens towards adaptive learning: Lessons drawn from two multi-site, state-wide initiatives addressing health equity

Spark Staff: Rebecca Ochtera, Former Associate Director

Session Colleagues: Veena Pankaj, Innovation Network, Inc.

There is a growing desire within philanthropy to change policies and systems that promote inequities in society. As a result, foundations are increasingly funding initiatives that involve community members in the problem-solving process. While this involvement is more likely to generate sustainable and innovative solutions, it also underscores the importance of integrating adaptive approaches into evaluation practice to ensure we capture the process and the lessons emerging from the work.

Facilitators will present two multi-site initiatives addressing health inequity through community engagement and use these examples to generate discussion around adaptive evaluation approaches. Facilitators will highlight methods used for emergent learning and demonstrate how data collected through these approaches influenced work at both the community and funder level. Through dialogue, participants will gain insights on community engagement evaluation practices and how to strengthen their evaluation strategy to fit the changing nature of complex community engagement initiatives.

Community Navigator Work

Joby Schaffer, MA, Associate Researcher at Spark Policy Institute

Last month, in partnership with the Denver Foundation’s M. Julie Patiño, Barclay Jones, and LaDawn Sullivan, Joby Schaffer wrote an article featured in The Foundation Review.

The article, Community Navigation as a Field of Practice: Reframing Service Delivery to Meet the Needs of Communities’ Marginalized Populations, calls out lessons learned through the Basic Human Needs Navigator Learning Community to improve the work of community navigators in connecting underprivileged populations with service providers.

To learn more about community navigation, the Navigator Learning Community, and the lessons learned during the five-year journey, read the full article available here.

For more information on public health and education opportunities, visit

Embracing Values in Evaluation Practice

Research has traditionally defined rigor as obtaining an unbiased estimate of impact, suggesting the need for experimental or quasi-experimental methods and objective, quantitative measures in order to obtain trustworthy results.

I’ve spent the past few months as a member of Colorado’s Equitable Evaluation Collaboratory, which aims to examine the role evaluation plays in supporting or inhibiting progress toward equity and to identify opportunities to integrate equitable evaluation principles into practice. In particular, I’ve reflected on how the research tradition has shaped evaluation’s working orthodoxies, including the notions that “credible evidence comes from quantitative data and experimental research” and that “evaluators are objective.”

On the surface, these statements don’t appear particularly problematic, but dig a little deeper and we begin to see how value judgments are an integral part of how we practice evaluation. The types of projects we take on, the questions we ask, the frameworks we use, the types of data we collect, and the ways we interpret results are all deeply rooted in what we value. As an evaluator focused on use, I aim to make these practice decisions in partnership with my clients; however, suggesting that I, or any evaluator, do not play an active role in making these decisions discounts our inherent position of power.

Now that I’ve tuned into the orthodoxies, I see them everywhere, often dominating the conversation. In a meeting last week, a decision-maker was describing the path forward for making a controversial policy decision. He wanted to remove subjectivity and values from the conversation by developing guidelines rooted in “evidence-based practice” and turned to me to present the “facts.”

As a proponent of data-driven decision making, I value the role of evidence; however, there is a lot to unpack behind what we have declared – through traditional notions of rigor – “works” to improve health and social outcomes. Looking retrospectively at the evidence, and thinking prospectively about generating new knowledge, it’s time to ask ourselves some hard questions, including:

  • What interventions do we choose to study? Who developed them? Why did they develop them?
  • What have we (as a society) chosen not to investigate?
  • What population have we “tested” our interventions on? Have we looked for potentially differential impacts?
  • What outcomes do we examine? Who identified these impacts to be important?
  • Who reported the outcomes? Whose perspective do we value?
  • What time period do we examine? Is that time period meaningful to the target population?
  • Do we look for potentially unintended consequences?

As we begin to unpack the notion of “what works,” we begin to see the decision points, the values, and the inherent power and privilege in what it means to be an evaluator. It is time we owned the notion that what we choose to study and how we choose to measure success are not objective; rather, they are inherently subjective. And importantly, our choices communicate values.

So how do we begin to embrace our role? As a step forward, I have started including a discussion of values, both mine and my clients’, at the beginning of a project and clarifying how those values will influence the evaluation scope and process. Explicitly naming the importance of equity during the evaluative process has helped keep the goals of social change and social justice front and center. Naming values helps stakeholders acknowledge their power and provides a lens through which to make decisions.

Equitable evaluation is an expedition into the unknown, requiring a transformation in how we conceptualize our role as evaluator. Having taken my initial steps into the Upside Down, I look forward to the many unknowns.

In what way do you see values showing up in your evaluative work?