Chapter 13

Developmental Evaluation


Overview of Developmental Evaluation

Innovative processes like co-design require a new approach to monitoring and evaluation.

Traditional monitoring, evaluation, and learning (MEL) is not suitable

Performance-based and predictive causal MEL approaches are poorly suited to assessing results and impact in such circumstances.

Program design is not controlled by one person

A “design and implement” mindset will not align with the core principles of co-design:

  • A complex and dynamic system with varied civic space contexts and challenges
  • Varied and evolving strategies across regional hubs (structure, staff, governance)
  • A flexible range of support from the Helper Hub that shifts over time based on need
  • Evaluation questions to assess the outcomes of the network’s diverse and localized solutions to defend civic space could not be pre-determined.

Tools + Methods

Best Practices

Building block: Orientation of the DE consultant
Description: Undertake investigative work to build a deeper understanding of the I4C programme/project in terms of its goals, the composition of the community, the problem/opportunity, resources, stakeholders, and the broader context.
Specific methods: Face-to-face and Skype meetings (bi-weekly); access to G: drive documents.

Building block: Building relationships
Description: Form working partnerships of two types: (a) with support hub members, to get on the same page and share roles, or to deepen elements of the work/tools that team members are already good at or interested in; (b) with hub leaders, so the evaluator can be their sparring partner in shaping the journey of their hubs. Key to this is finding the areas of their interest so they stay motivated, and then slowly pushing towards the evaluator’s interests.
Specific methods: Conduct one-to-one meetings with hub leaders, joined by helper hub point persons. Ask three main questions at the start of the meeting: composition; goals (short, medium, long term); challenges/opportunities. Also draw on the learning journeys (to speak with Juanita on 1st June) and the hub action plans (presented at the Civicus meeting on 29th May) developed in BKK. Distil out the important DE elements for the learning frameworks.

Building block: Developing a learning framework
Description: Find out early about each hub’s composition, goals, and challenges/opportunities through one-to-one meetings. Co-create with each hub a ‘fit for purpose’ learning framework to guide development by mapping key challenges and opportunities, highlighting potential areas for DE learning, and identifying feedback mechanisms and modalities.
Specific methods: Draft the hub learning framework (second meeting), based on the first meeting; this is a key part of the DE methodology. Envisage two main layers, the hub-level layer and the global layer, feeding into each other. Use the event (Tuesday) to further develop these.

Building block: Orienting the group (support hub members and 6 hub leaders)
Description: Support hub leaders and stakeholders to surface and test their assumptions, articulate and refine their models, extend their understanding, and cultivate a culture that supports learning. These activities help groups to develop and maintain an adaptive orientation in complex and unknown territory.
Specific methods: Ongoing, but framed around the learning framework. A clear methodology to come out in the draft methodology piece circulated before, during and after an event.

Building block: Data collection and observing
Description: Study each of the 6 hubs’ unfolding situations in order to help the group identify leverage points, assess their efforts, and stay true to the core intent and principles of their initiative. Collect data as well as listen, and identify (1) key developmental moments; (2) group dynamics; (3) structure; (4) action/inaction; and (5) threats and opportunities.
Specific methods: For developmental moments, use tools such as Outcome Mapping/Outcome Harvesting and social network analysis (SNA); use NVivo to analyse data effectively and to map group dynamics, network structure, action/inaction, network development, and threats and opportunities at agreed intervals (see the SNA sketch after this table). Use this to feed back to the hubs.

Building block: Pattern building
Description: Sense-making is largely participatory in developmental evaluation: DEs work with the group to help them identify patterns, integrate new information, and consider the implications of what they are seeing.
Specific methods: Use NVivo analysis to read patterns over time (e.g. at 6-month intervals) and develop facilitating questions for the hub leaders’ monthly meetings. Come up with simpler methods of seeing these patterns for non-NVivo users.

Building block: Ongoing interventions
Description: Get embedded in each hub initiative as a member of the team so as to help shape the work by (1) asking questions; (2) facilitating discussion; (3) suggesting tools; (4) sourcing or providing information; (5) modelling; (6) pausing the action; (7) reminding; and (8) connecting.
Specific methods: Ask questions on network content (civic space innovation), based on data largely analysed through the NVivo tool and also more directly.

Building block: Tools development
Description: Suggest new tools such as Outcome Mapping, Outcome Harvesting and AIIM, but in such a way that the core direction of travel remains the same and only the relevant elements of the tools are suggested.
Specific methods: Develop adapted tools to make things easier for each hub, and listen to what hubs are already using so as to add value to what they know as the main approach.

Building block: Overarching analysis
Description: Given that there are 7 (regional) + 1 (helper) hubs in different contexts, think through the overarching themes/issues across them and devise some I4C thought lines on civic space.
Specific methods: Spend a good amount of time (saved from different budget lines) on this analysis; possibly produce a briefing/discussion paper out of it.
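
The “Data collection and observing” building block above mentions social network analysis as a way to map network structure and spot leverage points. Below is a minimal sketch of what that step could look like in Python using the networkx library; the hub names and collaboration ties are hypothetical placeholders, not actual I4C data.

```python
# A minimal SNA sketch, assuming collaboration ties between hubs have
# already been coded (e.g. exported from NVivo as simple actor pairs).
# All names and ties below are hypothetical placeholders.
import networkx as nx

# Hypothetical ties: (actor A, actor B) means A and B collaborated.
ties = [
    ("Hub Africa", "Helper Hub"),
    ("Hub Asia", "Helper Hub"),
    ("Hub Africa", "Hub Asia"),
    ("Hub MENA", "Helper Hub"),
    ("Hub LAC", "Hub MENA"),
]

G = nx.Graph()
G.add_edges_from(ties)

# Degree centrality: who has the most direct ties (a rough proxy for
# how embedded an actor is in the network).
for actor, score in sorted(nx.degree_centrality(G).items(),
                           key=lambda kv: kv[1], reverse=True):
    print(f"{actor}: degree centrality {score:.2f}")

# Betweenness centrality: who brokers between otherwise disconnected
# parts of the network (useful for spotting leverage points).
for actor, score in nx.betweenness_centrality(G).items():
    print(f"{actor}: betweenness {score:.2f}")
```

Re-running this kind of analysis at the agreed intervals would show how the network’s structure develops over time, which can then be fed back to the hubs alongside the NVivo coding of group dynamics.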

What if we do parts online and offline?

The flow of building blocks in the previous section combines online and offline tools. Depending on your needs, you can combine both or choose one. 

A few additional tips to keep in mind:

  • Whereas development/innovation is about experimenting with new solutions to complex and often systemic problems, evaluation of these interventions is about critical, reflective assessment of the information produced by those innovations and their environment. The aim is to know which specific elements of the innovation (content or process) worked, and under what circumstances.
  • Adaptive learning is the hallmark of innovation: the context in which innovation happens is always changing and unpredictable, so what one does to move towards the desired change must change as well. Developmental Evaluation feeds adaptive learning with evidence from data generated in real time during the innovation process, so that the next action(s) can be evidence-informed rather than intuitive or a pure guess (a minimal sketch of such an evidence log follows this list).
  • The focus is on the change process itself: what the innovation is about or seeks to change, the agents of change (individuals and/or organisations), and the context in which the change or innovation is happening.
  • The Developmental Evaluator is a critical friend or peer (not a judge!) who uses the benefit of not being fully involved in the daily life of the intervention to stand back and use data to bring new insights into how change is happening, how organisations are making it happen, and how such changes might be explained and scaled up. The evaluator’s main source of data is what the implementing organisations themselves are finding.
  • Share actively with peers to triangulate perspectives and gain new insights from others who may be innovating on the same issue in different contexts; this supports motivation, co-creation of new knowledge, and the building of a community of change makers or innovators beyond the specific context in which one is working. Where organisations are working in the same civic space, this sharing can also create the critical mass needed to advocate for expanding that space.
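
As a concrete illustration of the second tip above, here is a hypothetical sketch of how key developmental moments and the evidence-informed next actions they suggest could be logged. The record fields and example values are illustrative assumptions, not a format prescribed by this toolkit.

```python
# A hypothetical developmental-moment log, assuming the evaluator wants
# a simple structured record so that each next action is grounded in
# observed evidence rather than a guess. Fields and values are
# illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class DevelopmentalMoment:
    hub: str            # which hub the observation comes from
    observed_on: date   # when the moment was observed
    observation: str    # what happened (the key developmental moment)
    implication: str    # what it suggests about the change process
    next_action: str    # the evidence-informed follow-up

log: list[DevelopmentalMoment] = []

log.append(DevelopmentalMoment(
    hub="Hub Asia",
    observed_on=date(2018, 6, 1),
    observation="Members began sharing security tools unprompted",
    implication="Peer exchange is emerging without Helper Hub push",
    next_action="Facilitate a cross-hub session to amplify the exchange",
))

# Review the logged moments per hub before each monthly leaders meeting.
for moment in log:
    print(f"[{moment.observed_on}] {moment.hub}: {moment.observation} "
          f"-> {moment.next_action}")
```

Keeping even a lightweight log like this makes the later pattern-building step easier, since the same records can be tallied or coded across hubs and intervals.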
