
Redefining Rigor: Describing quality evaluation in complex, adaptive settings

This blog is co-authored by Dr. Jewlya Lynn, Spark Policy Institute, and Hallie Preskill, FSG, and is also posted on FSG’s website: www.fsg.org

Traditionally, evaluation has focused on understanding whether a program is making progress against pre-determined indicators. In this context, the quality of the evaluation is often measured in part by the “rigor” of the methods and scientific inquiry. Experimental and quasi-experimental methods are highly valued and seen as the most rigorous designs, even when they may hamper the ability of the program to adapt and be responsive to its environment.

Evaluations of complex systems-change strategies or adaptive, innovative programs cannot use this same yardstick to measure quality. An experimental design is hard to apply when a strategy’s success is not fully defined up front and depends on being responsive to the environment. As recognition of the need for these programs grows, and with it the number of complex programs, so does the need for a new yardstick. To meet this need, we proposed a new definition of rigor at the 2015 American Evaluation Association annual conference, one that broadens how we think about quality in evaluation to encompass what is critical when the target of the evaluation is complex, adaptive, and emergent.

We propose that rigor be redefined to include a balance among four criteria:

  • Quality of the Thinking: The extent to which the evaluation’s design and implementation engages in deep analysis that focuses on patterns, themes and values (drawing on systems thinking); seeks alternative explanations and interpretations; is grounded in the research literature; and looks for outliers that offer different perspectives.
  • Credibility and Legitimacy of the Claims: The extent to which the data are trustworthy, including confidence in the findings; the transferability of the findings to other contexts; the consistency and repeatability of the findings; and the extent to which the findings are shaped by respondents rather than by evaluator bias, motivation, or interests.
  • Cultural Responsiveness and Context: The extent to which the evaluation questions, methods, and analysis respect and reflect the stakeholders’ values and context, their definitions of success, their experiences and perceptions, and their insights about what is happening.
  • Quality and Value of the Learning Process: The extent to which the learning process engages the people who most need the information, in a way that allows for reflection, dialogue, testing assumptions, and asking new questions, directly contributing to making decisions that help improve the process and outcomes.

The concept of balancing the four criteria is at the heart of this redefinition of rigor. Regardless of its other positive attributes, an evaluation of a complex, adaptive program that fails to take systems thinking into account will not be responsive to the needs of that program. Similarly, an evaluation that fails to provide timely information for making decisions lacks rigor, even if the quality of the thinking and the legitimacy of the claims are high.

The implications of this redefinition are many.

  • From an evaluator’s point of view, it provides a new checklist of considerations when designing and implementing an evaluation. It suggests that specific, up-front work will be needed to understand the cultural context, the potential users of the evaluation and the decisions they need to make, and the level of complexity in the environment and the program itself. At the same time, it maintains the focus the traditional definition of rigor has always had on leveraging learnings from previous research and seeking consistent and repeatable findings. Ultimately, it asks the evaluator to balance the desire for the highest-quality methods and design with the need for the evaluation to have value for the end user and to be contextually appropriate.
  • From an evaluation purchaser’s point of view, it provides criteria for considering the value of potential evaluators, evaluation plans, and reports. It can be a way of articulating up-front expectations or comparing the quality of different approaches to an evaluation.
  • From a programmatic point of view, it provides a yardstick for measuring not only evaluators but also the usefulness and value of their evaluation results. It can help program leaders and staff have confidence in the evaluation findings, or give them a way of talking about their concerns as they look at results.

Across evaluators, evaluation purchasers, and users of evaluation, this redefinition of rigor provides a new way of articulating expectations for evaluation and elevating the quality and value of evaluations. It is our hope that this balanced approach helps evaluators, evaluation purchasers, and evaluation users share ownership of the concept of rigor and find the right balance of the criteria for their evaluations.


Why you shouldn’t do this on your own: Making Your Stakeholder Engagement Process Successful

 Learning from the Coordinated Chronic Disease Project

During my time in the public sector, I observed many stakeholder engagement processes that went really well and led to meaningful change. Unfortunately, I also observed processes like this:

Participants arrive. They have been told it’s an opportunity to provide input to an important planning process. After listening to a 20-minute presentation, audience members sign up to share their input. In three-minute comments, they rush to get to their main points, largely focusing on their strongly held views. As the staff listen, they feel exhausted by the idea of bridging all these conflicting priorities. The information is mostly left unused in the final plan.

This week’s blog highlights a real-life example of how to put your stakeholder engagement process successfully into action so you never have to sit through or participate in a process like the one described above.

Tips to Make your Stakeholder Engagement Efforts Successful

In my last blog, “I thought I could do this on my own: Why engaging stakeholders throughout your initiative is so important,” I shared what stakeholder engagement is and why it is important. I also offered four tips to make your stakeholder engagement process successful: defining your stakeholders early in the process, developing a stakeholder engagement plan, developing a communication plan, and using a high-quality facilitator. Please keep your eye out for our upcoming checklist, which has a bit more detail about each tip and how to put them into action.

Making it Happen – The Coordinated Chronic Disease State Framework

In 2012, Spark implemented a stakeholder engagement process to develop the Coordinated Chronic Disease State Framework, an initiative led by the Colorado Department of Public Health and Environment (CDPHE). Here’s how we did it:

The stakeholders were identified. We worked collaboratively with CDPHE to identify a broad range of stakeholders at the state and local level. The stakeholders included local public health, higher education, health care providers and associations, community organizations, state agencies, advocacy organizations, provider and family members, board members, funders, and researchers.

A stakeholder engagement plan was developed and implemented. We used a two-pronged approach, hosting seven community forums and convening a State Advisory Team. Over 125 stakeholders attended forums in Montrose, Frisco, Denver, Sterling, La Junta, Alamosa, and Durango. They and the Advisory Team gained a deeper understanding of a coordinated chronic disease approach, provided input on themes and approaches from the community forums, and prioritized strategies to include in the framework.

A communication plan was developed and implemented. We partnered with the CDPHE Health Communications Unit to develop messages and materials to reach our stakeholders. A monthly newsletter was distributed, meetings were broadcast and archived online, a webpage was created on CDPHE’s website, and messages were sent out through Twitter and Facebook.

All meetings were facilitated thoughtfully. Our staff facilitated the community forums and State Advisory Team meetings. Our approach to facilitation established trust and engaged all members. For community forum participants, this was their experience:

Participants arrive, having been told that the state is seeking to develop a coordinated approach to chronic disease programming. After listening to a presentation on CDPHE’s chronic disease efforts and a cross-walk of state chronic disease plans, participants self-select into small groups. The groups discuss their vision for coordinating chronic disease programming and identify action steps in five domain areas (community-clinical linkages, health systems, policy and environmental changes, education and communications, and data surveillance). Each group reports its discussion out to the large group. Before adjourning, participants are told how their input will be used by CDPHE and the State Advisory Team.

The Take-Away

Not every stakeholder participation process is going to look just like my example here. Every situation is different, and every set of stakeholders in a particular issue will have their own challenges to face. But I hope that by telling you this story, about how we engaged stakeholders in developing the Coordinated Chronic Disease State Framework, you might see not only how the change you seek might be advanced by engaging your stakeholders thoughtfully, but also how to accomplish that engagement.

Resources
  • Community Tool Box Stakeholder Engagement Tools: The Community Tool Box is a big fan of participatory processes, which means involving as many as possible of those who are affected by or have an interest in any project, initiative, intervention, or effort. In this section, they discuss how to find and involve the right stakeholders and respond to their needs.
  • Brochures on Public Involvement, Environmental Protection Agency: Because extensive mandates require public involvement in environmental processes, the EPA provides many tools on its website for engaging a broad range of stakeholders. In particular, the brochures are relevant to engaging the public on any issue. They provide steps and information on budgeting for public involvement, identifying people to involve, technical assistance, outreach, using public input, evaluating public involvement, improving public meetings, and overcoming barriers.