Measuring the Impact of Communications

Many of us have fond memories of PBS, whether watching a documentary with family or engaging with educational programs in school. PBS is continuing its mission to serve the American public with high-quality programming and services by launching the New Generation Initiative (NGI). This 18-month collaborative pilot spans six PBS stations across the U.S. and encourages child caregivers (including families, friends, and neighbors) to sign up for a text messaging service provided by the Colorado nonprofit Bright by Three. The initiative is funded by the Corporation for Public Broadcasting and the David & Lucile Packard Foundation.

Bright by Three provides an innovative texting program that delivers expert child development tips, activities, and local event information to adults caring for young children in Arizona, Colorado, Fresno, Indianapolis, New York City, North Carolina, and San Antonio, with the goals of:

  • Increasing parent and caregiver engagement; and
  • Strengthening parenting and caregiving skills.

Rocky Mountain PBS is leading the initiative and has engaged Spark Policy Institute to learn how the different PBS sites are collectively having an impact on early childhood education, and on parents, in their communities.

But how do you measure the impact of a public awareness campaign?

Because the pilot must demonstrate accountability to its funders and organizational leaders while also improving and adapting as it is implemented, Spark Policy Institute is using a strategic learning approach. This means finding the balance between what is easy to measure (such as text sign-ups) and what is most useful (whether caregivers shifted their attitudes, strategies, and actions). A systematic review of qualitative data can help surface rich and compelling information. We’re also collectively interpreting data collected in real time to inform the campaign regularly and enable the greatest outcomes in child development.

We look forward to sharing what we learn through this project, and invite you to apply strategic learning in your own work by exploring our Tools for Social Innovators: Strategic Learning Toolkit.

The Case for Developmental Evaluation

This blog is co-authored by Marci Parkhurst and Hallie Preskill from FSG, Dr. Jewlya Lynn from Spark Policy Institute, and Marah Moore from i2i Institute. It is also posted on FSG’s website: www.fsg.org 

In a recent blog post discussing the importance of good evidence in supporting systems change work, evaluation expert Lisbeth Schorr wrote, “To get better results in this complex world, we must be willing to shake the intuition that certainty should be our highest priority…” Rather, she argues, “it is time for all of us to think more expansively about evidence as we strive to understand the world of today and to improve the world of tomorrow.” [Emphasis added]

At the annual American Evaluation Association Conference (AEA) in November, practitioners, funders, and academics from around the world gave presentations and facilitated discussions around a type of evaluation that is specifically designed to meet this need for a more expanded view of evidence. It’s called developmental evaluation, and, as noted by other commentators, it took this year’s AEA conference by storm.

What is developmental evaluation?

Developmental evaluation (DE) “is grounded in systems thinking and supports innovation by collecting and analyzing real-time data in ways that lead to informed and ongoing decision making as part of the design, development, and implementation process.” As such, DE is particularly well-suited for innovations in which the path to success is not clear. By focusing on understanding what’s happening as a new approach is implemented, DE can help answer questions such as:

  • What is emerging as the innovation takes shape?
  • What do initial results reveal about expected progress?
  • What variations in effects are we seeing?
  • How have different values, perspectives, and relationships influenced the innovation and its outcomes?
  • How is the larger system or environment responding to the innovation?

DE can provide stakeholders with a deep understanding of context and real-time insights about how a new initiative, program, or innovation should be adapted in response to changing circumstances and what is being learned along the way.

A well-executed DE will effectively balance accountability with learning; rigor with flexibility and timely information; reflection and dialogue with decision-making and action; and the need for a fixed budget with the need for responsiveness and flexibility. DE also strives to balance expectations about who is expected to adapt and change based on the information provided (i.e., funders and/or grantees).

The case for developmental evaluation

Developmental evaluation (DE) has the potential to serve as an indispensable strategic learning tool for the growing number of funders and practitioners that are focusing their efforts on facilitating systems change. But, DE is different from other approaches to evaluation. Articulating what exactly DE looks like in practice, what results it can produce, and how those results can add value to a given initiative, program, or innovation is a critical challenge, even for leaders who embrace DE in concept.

To help meet the need for a clear and compelling description of how DE differs from formative and summative evaluation and what value it can add to an organization or innovation, we hosted a think tank session at AEA in which we invited attendees to share their thoughts on these questions. We identified four overarching value propositions of DE, each supported by quotes from participants:

1) DE focuses on understanding an innovation in context, and explores how both the innovation and its context evolve and interact over time.

  • “DE allows evaluators AND program implementers to adapt to changing contexts and respond to real events that can and should impact the direction of the work.”
  • “DE provides a systematic way to scan and understand the critical systems and contextual elements that influence an innovation’s road to outcomes.”
  • “DE allows for fluidity and flexibility in decision-making as the issue being addressed continues to evolve.”

2) DE is specifically designed to improve innovation. By engaging early and deeply in an exploration of what a new innovation is and how it responds to its context, DE enables stakeholders to document and learn from their experiments.

  • “DE is perfect for those times when you have the resources, knowledge, and commitment to dedicate to an innovation, but the unknowns are many and having the significant impact you want will require learning along the way.”
  • “DE is a tool that facilitates ‘failing smart’ and adapting to emergent conditions.”

3) DE supports timely decision-making in a way that monitoring and later-stage evaluation cannot. By providing real-time feedback to initiative participants, managers, and funders, DE supports rapid strategic adjustments and quick course corrections that are critical to success under conditions of complexity.

  • “DE allows for faster decision-making with ongoing information.”
  • “DE provides real time insights that can save an innovation from wasting valuable funds on theories or assumptions that are incorrect.”
  • “DE promotes rapid, adaptive learning at a deep level so that an innovation has greatest potential to achieve social impact.”

4) Well-executed DE uses an inclusive, participatory approach that helps build relationships and increase learning capacity while boosting performance.

  • “DE encourages frequent stakeholder engagement in accessing data and using it to inform decision-making, therefore maximizing both individual and organizational learning and capacity-building. This leads to better outcomes.”
  • “DE increases trust between stakeholders or participants and evaluators by making the evaluator a ‘critical friend’ to the work.”
  • “DE can help concretely inform a specific innovation, as well as help to transform an organization’s orientation toward continuous learning.”

Additionally, one participant offered a succinct summary of how DE is different from other types of evaluation: “DE helps you keep your focus on driving meaningful change and figuring out what’s needed to make that happen—not on deploying a predefined strategy or measuring a set of predefined outcomes.”

We hope that these messages and talking points will prove helpful to funders and practitioners seeking to better understand why DE is such an innovative and powerful approach to evaluation.

Have other ideas about DE’s value? Please share them in the comments.

Making planning actionable: lessons from the Colorado Farm to School Task Force

“Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it’s the only thing that ever has.”

This is one of my favorite quotes from Margaret Mead, a cultural anthropologist known for her holistic understanding of human adaptation and the interconnectedness of all aspects of human life. Mead believed in our innate capacity to learn from each other to create change, viewing diversity as a resource that allows for greater exchange of ideas and experiences.

One of the best parts of my job at Spark is the opportunity to partner every day with thoughtful, committed groups of people who create systems-level change on society’s most complex problems. One of those groups is the Colorado Farm to School Task Force. In its early stages, the Task Force teamed up with Spark and embarked on an intensive Strategic Roadmapping session. Strategic Roadmaps start with defining the change we want to see in the world and work backwards to define the smaller changes that will lead to that big change. True to Mead’s observations of interconnectedness, Strategic Roadmaps consider the broader context within which a group is working and focus on the “why” of the work rather than the “how,” allowing for adaptation in a changing environment.

At the end of the day, a Strategic Roadmap is not just a pretty picture (although they certainly look nice!) – it is actionable! In fact, the Task Force revisits their Roadmap quarterly to integrate new learning about the context and environment in which they are working, and to plan their shorter- and longer-term strategies moving forward. One of the most powerful ways the Task Force uses the Roadmap is to identify priorities and evaluate whether emerging opportunities are likely to influence the changes they hope to see in the world. It’s easy for a statewide body to get lost in the large and dynamic field of food systems; the Roadmap is one tool to facilitate strategic action within such an environment. Indeed, the Task Force is a diverse group of citizens committed to changing the world through strategic learning and thoughtful action.

Interested in learning more about adaptive planning like the Strategic Roadmap? Our Adaptive Planning Toolkit guides users through the roadmapping process and provides broader strategies for planning in dynamic environments.

Curious about farm to school and what it means for our students, local economies, and food systems? October is National Farm to School Month, a great time to learn about the organizations working to improve healthy eating in schools across the state and the nation.

Planning for Adaptation

I’ve spent a lot of time over the last decade thinking about, experimenting with, and refining tools for planning in complex, adaptive settings. As we put together Spark’s Adaptive Planning Toolkit, I’ve had the opportunity to reflect on the genesis of the tools and what we have learned over the years.

I have tremendous admiration for all of the partners I’ve worked with who have tackled complex problems with adaptive approaches. That they can work amid such great uncertainty is impressive in and of itself, but the fact that they are willing to approach solving the problems in ways that are, themselves, uncertain and untested is even more laudable.

The stakeholders who came together to prevent another tragedy like the Columbine school shooting not only didn’t know how to integrate the many different service systems to prevent a future shooting; they were also brand new to systems mapping, which was a critical part of developing a plan for change. I remember the walls covered with boxes and lines as participants tried to break down how the system functioned today in order to figure out how it could function tomorrow.

The leaders who formed the core of the Daylight Project, focused on improving access to behavioral health services for deaf and hard of hearing consumers, similarly tackled a complex problem using tools that were untested and new to them. Consumer stories helped inform their work along the way, but so did real-time strategic learning, which included gathering data about their environment and forecasting the likelihood of success for each partner organization they invited to join the effort.

Recently, The Colorado Health Foundation used an adaptive planning process to develop its Consumer Advocacy funding strategy. Using scenario planning tools, mapping of current funding, and even a post-mortem, they went all out with adaptive planning. Unlike the previous examples, by this point Spark, as their partner in crime, had a well-established repertoire of adaptive planning tools. However, as in the first two examples, this approach was still new and outside the organization’s comfort zone, yet they embraced it fully and developed a truly creative, results-focused, and adaptive funding strategy.

I am personally very excited to share our adaptive planning tools. I believe in them. I have seen them help many different types of groups make a meaningful difference on truly difficult problems. I also believe adaptive planning is a work in progress – we have some pieces pulled together, but this is by no means the be-all, end-all of planning in complex settings. I am excited to learn how others are doing adaptive planning and hope you will participate by sharing your stories and helping build our common base of tools for this difficult work.