By Helen Derbyshire, Julian Barr and Steve Fraser
Lots of development programmes and organisations are grappling with the challenge of how to be more effective by being more adaptive. How do we modify our usual ways of working to better influence and learn from complex processes of change in uncertain and constantly changing contexts? How do we do this in ways that don’t stifle the very processes of change we’re trying to support?
In SAVI – a DFID-funded social accountability programme in Nigeria – we have been trying to work in politically smart, adaptive ways since the start of the programme eight years ago. Our approach evolved from our personal experience of previous DFID-funded programmes in Nigeria, as well as from key staff members’ long-term involvement in successful home-grown development in Nigeria. It has continued to evolve over the life of the programme.
In the coming months, we will face new challenges, as we try to carry forward our learning into a successor programme – DFID’s new Public Sector Accountability & Governance Programme in Nigeria – which starts in early May.
Given the growing interest in adaptive ways of working, now seems like a good time to share some of the tools, processes and learning we have developed to date.
Our latest paper “Moving Targets, Widening Nets” describes how we go about defining, monitoring and learning from incremental and adaptive change. We have always tried to do this in ways that satisfy DFID’s demand for results, whilst at the same time supporting our partners (civil society groups, media organisations and state-level parliaments) to drive local reform in politically smart and adaptive ways. The paper includes links to monitoring tools we have evolved over several years – and which we will continue to improve and adapt in the new programme.
Our aim in sharing our approach and tools is not in any way to set ourselves up as an example of ‘best practice’, but to show how a level of politically smart adaptation can be achieved in the context of managing a DFID programme – with all the target-setting, reporting and budget requirements that typically entails. Working in more adaptive ways is not easy. It involves professional and commercial risk, as well as constantly swimming against the tide – challenging conventions and expectations internally and externally. It means admitting to and learning from what doesn’t work, not just from what does. But over time, and with support from understanding DFID staff and external reviewers, we have made some headway.
In the context of increasingly active donor encouragement for adaptive programming, our learning and tools may be useful to design teams and programmes now grappling with the challenges of putting these principles into practice – hopefully some helpful food for thought.
We hope this paper will trigger debate between practitioners, as well as between policy makers and programme designers, and we hope to benefit from your views, experience and learning in return. What processes and tools have you developed that are helping you to work in more adaptive ways? What lessons have you learned in this regard?
For those of you who don’t have time to read the whole paper, these are some of our main conclusions on supporting and measuring incremental and adaptive change:
- Create as much flexibility as you can for partners to respond to opportunity and learn by doing. Too often in social accountability programmes, partners’ hands are tied by pre-planned activities, targets and rigid top-down results frameworks. We encourage staff and partners alike to learn by doing and to progress through regular learning loops. In simple terms, this means: developing a plan of action, based on analysis of context and capacity, focused on a realistic and achievable short-term goal; putting the plan into action; reflecting on what has been done and achieved; and using this analysis to inform the next stage. Our expectation is that initial achievement of comparatively low-level results will gradually build partners’ confidence, credibility and networks, enabling them to take on bigger challenges and achieve higher-level results with greater impact on citizens’ lives over time.
- Create structured space for reflection. Busy people often don’t prioritise time for reflective practice. Create structured space for staff and partners to stand back from their day-to-day work, reflect honestly on what they have done, see the bigger picture, and plan on the basis of their shared learning.
- Own your results framework, take control of it and make it work for you. The SAVI Results Framework – like our Theory of Change and monitoring tools and processes – has been shaped by SAVI staff through continual processes of reflection and adaptive learning, and is now in its 14th official iteration since the start of the programme. Indicators at all levels have been modified to meet changing DFID requirements and to reflect and shape the evolving programme – extending the principle of adaptive programming to the M&E system itself. Whilst results frameworks come in for a lot of criticism, owning your results framework and using it in a flexible and reflective way as part of learning means it can assist, rather than obstruct, adaptive programming.
- Recognise and use the ability of the results framework to measure process results and qualitative change. Our Results Framework includes a basket of indicators which capture both qualitative process-based changes and the tangible results which derive from them. The results framework sets out what the programme will achieve; our separate and complementary Theory of Change maps out how we expect change to happen. This helps us to measure qualitative change in quantitative ways – using composite numerical indices to measure partners’ progress through the stages of the Theory of Change.
- Outcome harvesting, or retrospective recording of results, is a good way of capturing results that cannot be predicted in advance. In governance programmes such as SAVI, it is hard to predict where change will happen, particularly in terms of government responsiveness. To accommodate this, we use an open-ended “concrete change” outcome indicator. This commits us to influencing a target number of governance improvement results (defined as tangible examples of state government responsiveness to their citizens, influenced by SAVI partners), without predicting in advance exactly what or where they will be. We use “Results Evidence Sheets” to capture and tell the structured, evidenced ‘back story’ to these results – linking results, in terms of government action, to the intermediate processes of change that brought them about.
Concerns are sometimes raised that adaptive programming is incompatible with situations where programmes have to satisfy rigorous donor requirements for predictable planning and regular reporting. Our experience shows that innovative approaches to measuring programme processes and results can help to make learning and adaptation central to a successful mainstream development programme. We will be refining our approach and continuing to share our learning in our successor programme – and we look forward to learning from your experiences too.
Helen Derbyshire, Results Communications Lead, SAVI; Julian Barr, Technical Director, SAVI, and Non-Executive Director, Itad Ltd.; and Steve Fraser, Deputy Team Leader (Technical), SAVI