
Strengthening Partnerships for Education Through Collaborative Community Action and Collective Impact

Current systems are not working to meet communities' needs across the Denver metro area, especially our educational systems, where challenges range from the early identification of young learners' needs to the persistence of equity gaps in educational attainment and completion.

Less than half of Colorado children receive developmental screenings to identify potential social, emotional, or behavioral challenges or developmental delays, which, if unidentified, can result in serious challenges that affect all areas of their lives. College enrollment and completion rates in Colorado demonstrate that equity gaps remain, with degree attainment for Hispanic and African American adults in Denver significantly lower than that of white adults: 29% and 39%, compared to 64%[1]. These issues, among so many others, lead us to the question: what would it take to change these statistics and create a more equitable education system in Colorado?

During our October Social Innovators Breakfast we had the opportunity to meet and learn from three great organizations that shared their experiences and learnings in achieving their goals through collaborative community action (CCA) and collective impact (CI). The panelists included:

  • Diana Higuera, Executive Director and Founder of the Rocky Mountain Welcome Center (RMWC), whose mission is to foster intercultural learning, understanding and integration among immigrants, refugees and Colorado residents through different programs and partnerships.
  • Eileen Auer Bennet, Executive Director of Assuring Better Child Health and Development (ABCD), a statewide nonprofit focused on improving the lives of Colorado children through early identification of developmental needs.
  • Therese Ivancovich, Executive Director of The Denver Education Attainment Network (DEAN), a collective impact initiative focused on increasing educational attainment and closing the attainment gap for students in Denver.

The panelists shared what brought them to the CCA/CI space, talked about how CCA/CI has evolved their work, discussed how they measure impact, and gave advice on starting or growing a CCA/CI initiative. We are grateful to our three panelists, and we are sharing a reflection on the learnings we gathered from these organizations that you can use to drive your own collective impact initiative. No matter what stage an initiative is at, these are some skills we learned an initiative must have:

Commitment – ensure leaders and partners are committed to the vision and overall goal of the initiative.

Be Nimble – change the initiative's direction if necessary, and be able to bring partners along the way.

Build Trust – develop trust within an initiative to not only create partner buy-in, but also build confidence between partners if the direction has to shift.

Do Your Homework – know who is at the table and what their motivations are.

Do What You Are Best At – know what your strengths are and focus on those. Let other partners do what they are best at.

We are passionate about bringing a systems lens to all of our work and often share resources and ideas on how to find and act on leverage points, how systems mapping can help change the game, and how experimentation can help drive social change. Additionally, we have many free tools and resources available if you are considering or already involved in a collaborative community action or collective impact initiative, including our full report When Collective Impact Has an Impact.

 

Do you have other lessons to share? Is there a topic you would like to see us explore in this blog? Tell us in the comments! Stay up to date on Spark's latest news by following us on social media and subscribing to our newsletter today!

[1] Colorado Department of Higher Education. (2017). Erase Equity Gaps. Available: http://masterplan.highered.colorado.gov/goal-2-erase-equity-gaps/

Related Publications: When Collective Impact Has an Impact


Three Tips for Making Network Analysis Actionable for Your Social Impact Project

Joby Schaffer, MA, Associate Researcher at Spark Policy Institute

Many of our partners have adopted what Jed Miller and Rob Stuart called "Network-Centric Thinking." They recognize that long-term sustainable progress on today's social problems rarely comes from the efforts of a single organization. Rather, progress requires a strategy involving networks of organizations with the aim of producing network effects.

However, the strategist's and evaluator's task of connecting network strategy to network effects to final outcomes is often difficult, not least because networks are embedded in complex, adaptive systems in which cause-and-effect relationships are rarely straightforward. Moreover, because quantitative social network analysis (SNA) is often new to many social impact organizations, it is easy to get bogged down in superficial findings to the detriment of more actionable insights.

There are now a large number of resources on designing network analyses for complex evaluations (see some of our favorites below), but we've found three tips particularly useful for ensuring a network analysis yields actionable insights. In short, a design for evaluating a network should:

  • Start by adopting a framework for how network structure leads to network effects;
  • Avoid the lure of only using quantitative SNA; and  
  • Design your network analysis with future data collections in mind: connecting change in the network to outcomes is one of the most powerful insights you’ll uncover.  

Get a Framework

Our partners often make use of theories of change, systems maps, scenario mapping, power analyses, and other tools to frame the nature of the problem they want to address and to develop strategies to guide their work. For learning partners like us, these tools are often a key part of developing and shaping evaluation questions and hypotheses. However, because network theory is relatively new to most people, the expected impact of network strategies is often underspecified in these documents.

For example:

An initiative may agree that the presence of working relationships among cross-sector partners is an important interim outcome…

       …with the expectation these partnerships will help address an upstream driver of a problem…

       …but they may not fully consider how the strengths and weaknesses of the current network structure alter their chances of activating this "network effect"…

      …which in turn limits their understanding of which actions are needed to advance the network strategy.

Frameworks help to address these problems because they relate network structure to network effects. For example, Peter Plastrik and Madeleine Taylor describe three types of networks [pdf] based on the depth of their connections – connectivity, alignment, and production. If an initiative aims for cross-sector collaboration (production), but the initial network analysis reveals little connectivity between organizations, it's best to engage in more connectivity- and alignment-related network-building tasks before encouraging project collaboration.

Choose a Multi-method Approach

When most people think about network analysis, they think of network maps or strange-sounding network statistics like density or centrality. This is quantitative SNA, and it is an essential tool for describing structural properties of a network. Among other things, an SNA will reveal gaps in the network (e.g., perhaps organizations from a certain sector are underrepresented), show areas of deep or shallow connections (e.g., connectivity among one subset and alignment among another subset), and identify which organizations play important roles in the network (e.g., bringing unique partners to the network).
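
To make these measures concrete, here is a minimal sketch in Python using the networkx library of how the basic structural statistics mentioned above might be computed. The organizations, sectors, and ties are hypothetical placeholders, not data from any real initiative.

```python
import networkx as nx
from collections import Counter

# Build an undirected network: nodes are partner organizations
# (tagged with a sector) and edges are working relationships.
G = nx.Graph()
G.add_nodes_from([
    ("Org A", {"sector": "education"}),
    ("Org B", {"sector": "education"}),
    ("Org C", {"sector": "health"}),
    ("Org D", {"sector": "philanthropy"}),
])
G.add_edges_from([("Org A", "Org B"), ("Org B", "Org C"), ("Org C", "Org D")])

# Density: the share of possible ties that actually exist.
print("Density:", nx.density(G))

# Degree and betweenness centrality flag organizations that hold the
# network together or bridge otherwise disconnected subsets.
print("Degree centrality:", nx.degree_centrality(G))
print("Betweenness centrality:", nx.betweenness_centrality(G))

# Counting nodes by sector reveals gaps, e.g. underrepresented sectors.
print("Sector counts:", Counter(nx.get_node_attributes(G, "sector").values()))
```

Even a small script like this can surface the descriptive findings an SNA offers; the interpretive work of deciding what those numbers mean for the strategy still requires the qualitative methods discussed next.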

However, if used alone, SNA may mask a lot of the network information leaders need to make effective decisions. For example, network strategy often involves developing structures for coordination, including convenings, working groups, and shared measurement systems. While it’s possible to use SNA to wrangle some insights about whether these coordinating efforts lead to more effective partnerships, it’s often more meaningful to hear from participants how these structures influenced their work. In short, interviews are much better at capturing the organizational and inter-organizational effects of the network – innovations, greater efficiencies realized, knowledge and information shared, etc.

Design with the Future in Mind

It is good practice to design any evaluation with pre- and post-intervention measurement in mind. Especially for quantitative SNA, it is worth the upfront time to identify what you hope your network will look like in the future, not just to examine it today. Repeated network maps can show how the network is evolving over time, which is a great way to identify how coordinating efforts are producing network-level effects (e.g., better representation of certain sectors at convening events, connections made between subsets of the network, etc.). Again, adopting a framework can be very useful. Many frameworks explicitly describe the stages of network evolution and provide guidance on how to identify and manage a network in transition.
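
As a simple illustration, and assuming the same roster of organizations is surveyed at each wave, a comparison of two snapshots of a network might look like the following sketch (Python with networkx; the organizations and ties are invented):

```python
import networkx as nx

# Two waves of the same hypothetical network, e.g. before and after
# a year of convenings and working groups.
wave_1 = nx.Graph([("Org A", "Org B"), ("Org B", "Org C")])
wave_2 = nx.Graph([("Org A", "Org B"), ("Org B", "Org C"),
                   ("Org C", "Org D"), ("Org A", "Org C")])

# A change in density indicates whether connectivity is deepening overall.
print("Density change:", nx.density(wave_2) - nx.density(wave_1))

# New ties formed between waves point to where network building may be
# paying off (frozenset ignores the order of the two endpoints).
new_ties = {frozenset(e) for e in wave_2.edges()} - {frozenset(e) for e in wave_1.edges()}
print("New ties:", [tuple(t) for t in new_ties])
```

Pairing this kind of longitudinal comparison with interviews makes it much easier to connect changes in the network to the coordinating efforts, and ultimately the outcomes, an initiative cares about.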

The more social change agents adopt network-centric thinking, the better the chances we’ll make real progress on today’s social problems. We can support this mindset by ensuring our network analyses produce actionable insights. We’ve found these three tips are useful to our work. Based on your experiences, what other tips do you recommend?

New to network thinking or network analysis? Here are a few of our favorite resources.


Summer Spark News: Making the Commitment

Our mission at Spark has long been to develop innovative and research-based solutions to society's complex problems. After a summer that has seen devastating violence and expressions of xenophobia and hate across the country – as well as outpourings of generosity and stories of strength – we see more than ever the need to address racial and structural inequities head-on in order to create meaningful systems change. A few years ago, Spark developed our commitment to diversity, inclusion, and equity. Since then, we have worked to actualize this commitment, ensuring it is not just words, but that it is embedded in our actions, deeds, and culture. To this end, we have developed an organizational statement on equity and inclusion, which you can read in full here.

This month’s newsletter focuses on what we are doing to keep equity and inclusion front and center in all we do. One example is our new all-gender restroom, which became official this month, in an effort to create a more welcoming space for all persons. The newsletter includes actionable and accessible tools to help support that work, including toolkits on equity, using data, and complex decision-making, as well as blogs on mapping tools to support place-based work.

Read the rest of the newsletter. Want to receive more updates like this? You can subscribe to our newsletter here.


Our Commitment to Addressing Inequity

Spark Policy Institute

Complex problems are just that: complex. They stem from complicated interactions among multiple actors, against the backdrop of history, systems, and institutions. Within these interactions, we cannot overlook the way race, ethnicity, gender identity, sexual orientation, country of origin, religion, economic status – all the things that make us unique, and the “isms” they engender – are entangled with inequities. Nor can we overlook the need to address these biases in order to drive the meaningful change we are all looking to create.

Our mission at Spark has long been to develop innovative and research-based solutions to society’s complex problems. After a summer of devastating violence and expressions of xenophobia and hate across the country – as well as outpourings of generosity and stories of strength – we are more committed than ever to addressing structural inequities head-on in order to create meaningful systems change.

A few years ago, we developed Spark’s organizational commitment to diversity and inclusion:

Spark Policy Institute believes diverse perspectives are key to achieving meaningful change. We are committed to fostering an organizational culture where all people are treated fairly; supporting communities with tailored approaches that lead to a successful future; and ensuring all voices are heard, particularly those most affected by the change.

Since then, we have been working to actualize this commitment, ensuring it is not just words; that it is embedded in our relationships, work, and culture. What does this mean in practice? It means we:

  • Recognize the assets, diverse voices, perspectives, and knowledge communities bring to the table in a way that fosters trust, respect, and acceptance.
  • Approach all of our interactions with integrity.
  • Keep equity front and center.
  • Respect lived experience and social identity.
  • Continue to focus on internal and external capacity-building, providing fair and equitable access to culturally-appropriate tools, learning, and support.

While we cannot dismantle centuries of institutional “isms” overnight or in isolation, we can each take steps toward a more just, equitable world. We can show, through words and deeds, that we are committed to driving meaningful systems change. Complex problems take time, resources, and hard work to solve. We believe with diverse voices, innovation, and continued dedication, they can be solved.

And we won’t stop until they are.


Redefining Rigor: Describing quality evaluation in complex, adaptive settings

This blog is co-authored by Dr. Jewlya Lynn, Spark Policy Institute, and Hallie Preskill, FSG. The blog is also posted on FSG’s website: www.fsg.org 

Traditionally, evaluation has focused on understanding whether a program is making progress against pre-determined indicators. In this context, the quality of the evaluation is often measured in part by the "rigor" of the methods and scientific inquiry. Experimental and quasi-experimental methods are highly valued and seen as the most rigorous designs, even when they may hamper the ability of the program to adapt and be responsive to its environment.

Evaluations of complex systems-change strategies or adaptive, innovative programs cannot use this same yardstick to measure quality. An experimental design is hard to apply when a strategy's success is not fully defined upfront and depends on being responsive to the environment. As recognition of the need for these programs grows, and with it the number of complex programs, so does the need for a new yardstick. In recognition of this need, we proposed a new definition of rigor at the 2015 American Evaluation Association annual conference, one that broadens the ways we think of quality in evaluation to encompass things that are critical when the target of the evaluation is complex, adaptive, and emergent.

We propose that rigor be redefined to include a balance between four criteria:

  • Quality of the Thinking: The extent to which the evaluation’s design and implementation engages in deep analysis that focuses on patterns, themes and values (drawing on systems thinking); seeks alternative explanations and interpretations; is grounded in the research literature; and looks for outliers that offer different perspectives.
  • Credibility and Legitimacy of the Claims: The extent to which the data is trustworthy, including the confidence in the findings; the transferability of findings to other contexts; the consistency and repeatability of the findings; and the extent to which the findings are shaped by respondents, rather than evaluator bias, motivation, or interests.
  • Cultural Responsiveness and Context: The extent to which the evaluation questions, methods, and analysis respect and reflect the stakeholders’ values and context, their definitions of success, their experiences and perceptions, and their insights about what is happening.
  • Quality and Value of the Learning Process: The extent to which the learning process engages the people who most need the information, in a way that allows for reflection, dialogue, testing assumptions, and asking new questions, directly contributing to making decisions that help improve the process and outcomes.

The concept of balancing the four criteria is at the heart of this redefinition of rigor. Regardless of its other positive attributes, an evaluation of a complex, adaptive program that fails to take into account systems thinking will not be responsive to the needs of that program. Similarly, an evaluation that fails to provide timely information for making decisions lacks rigor, even if the quality of the thinking and the legitimacy of the claims are high.

The implications of this redefinition are many.

  • From an evaluator’s point of view, it provides a new checklist of considerations when designing and implementing an evaluation. It suggests that specific, up front work will be needed to understand the cultural context, the potential users of the evaluation and the decisions they need to make, and the level of complexity in the environment and the program itself. At the same time, it maintains the same focus the traditional definition of rigor has always had on leveraging learnings from previous research and seeking consistent and repeatable findings. Ultimately, it asks the evaluator to balance the desire for the highest-quality methods and design with the need for the evaluation to have value for the end-user, and for it to be contextually appropriate.
  • From an evaluation purchaser’s point of view, it provides criteria for considering the value of potential evaluators, evaluation plans, and reports. It can be a way of articulating up-front expectations or comparing the quality of different approaches to an evaluation.
  • From a programmatic point of view, it provides a yardstick by which evaluators can not only be measured, but by which the usefulness and value of their evaluation results can be assessed. It can help program leaders and staff have confidence in the evaluation findings or have a way of talking about what they are concerned about as they look at results.

Across evaluators, evaluation purchasers, and users of evaluation, this redefinition of rigor provides a new way of articulating expectations for evaluation and elevating the quality and value of evaluations. It is our hope that this balanced approach helps evaluators, evaluation purchasers, and evaluation users share ownership of the concept of rigor and find the right balance of the criteria for their evaluations.