What our past programmes can teach us about future programme design
21 October 2021

Working in programme monitoring and evaluation provides an excellent vantage point for organisational learning. You collect monitoring data to see how implementation is going, assess to what extent results have been achieved and reflect on which underlying assumptions may or may not actually be valid. And when your programme relies on research uptake and stakeholder engagement as foundational building blocks, feeding those learnings back in to understand what is and isn't working is essential to success.
Here at Cenfri, we've recently completed two funded 5-year programmes, which aimed to:
- strengthen the integrity and risk management roles of the financial sector and facilitate remittance flows within and into Africa; and
- mobilise the data market and catalyse high-quality, client-centric data generation and use.

In both of these programmes, we supported various public and private sector stakeholders with data and insights as well as implementation and technical assistance, with the ultimate aim of advancing financial sector development and improving financial inclusion.
Engaging with the relevant players in the space was one of the first steps towards contributing productively through these programmes. So, as they came to an end, we took the opportunity to pause and reflect on our work with both key stakeholder groups: public and private. What happened as we worked with them? What barriers did we encounter along the way? Did they find our insights useful, and were they able to leverage them productively in their work?
Fortunately, we were able to look back over the full 5 years of the programmes through the record of learnings we had documented to keep our funders well informed. This helped us surface the enablers and constraints behind what drove demand for our research, as well as the stakeholders' ability and willingness to engage over the course of the programmes. A few key points emerged:
- Research demand: We found that published evidence was a crucial building block for driving change and engaging stakeholders. Having robust research and insights to put on the table helped open doors for moving implementation work forward: it provided a solid grounding for the work and gave us a base from which to begin and facilitate discussions. Simple, data-rich outputs were particularly effective, as they presented key facts and figures in a way that was relevant to our stakeholders and easy to engage with.
- Private sector: We found that, specifically in private sector work, more was often needed to move the research itself forward. To produce relevant insights, we often needed to engage iteratively over a longer period than planned at the start of the programme, particularly around access to data and permission to conduct research. Many of the stakeholders we worked with were extremely protective of information (e.g. to rigorously protect their IP). Without information sharing, however, it was hard for us to understand their data gaps and the research directions and interventions that would work best for them. We found that working towards specific, mutually agreed use cases, with milestones along the way, helped hold stakeholders' interest in the research process and better connected our research to longer-term value for the business.
- Public sector: With regulators, we found that trust was an essential precursor to engagement. Building it took time, but without that investment, work would stall. Trust helped with buy-in from regulators at the national level and opened doors for us to amplify our work at the regional level and beyond, in the fora where these regulators participated. This included being introduced as a trusted partner and being invited, by the stakeholders we supported through technical assistance, to present to regional and standard-setting bodies.
None of these points surprised the team, having lived these programmes for the past 5 years, but they were a useful starting point for reflecting on what we could do better. As an organisation that values learning from experience, we're now working to apply these insights to improve our future work.
One area where this has already helped is in thinking about how we might design similar programmes differently, knowing what we know now. So, here are four design tips we'll be keeping in mind for our next programme, which may also help you if you're embarking on designing a programme of your own:
- Build in time for trust-building and data acquisition: We often encountered delays at the start of projects. Two main recurring causes were not setting aside enough time for trust-building to get people on board with the project work, and not planning enough time to acquire the data we needed for the research. In our future programme design, we will acknowledge that time needs to be specifically dedicated to building stakeholder relationships, especially early on and with new partners, and make sure this time ends up in the project plan. We will also plan for longer lead times on data because, to achieve real market change and growth, it's essential that we can access the data we need. This will make project plans more realistic and give us much-needed flexibility to iterate around datasets, so that we have the opportunity to explore gaps and missing data with partners and get what the research requires. We're also taking this learning into future programmes by aiming to sign contractually binding arrangements with the providers we work with that make clear from the start, as far as possible, what our research will need, so that we can better achieve and demonstrate the impact of our work.
- Take context into account, early and often: We saw time and again in our programme learnings that contextual factors make or break what our interventions can achieve. In future, we aim to add a situation-analysis step at the start of any project with a new stakeholder and to design flexible interventions that specifically take their context (e.g. time and resourcing constraints) into account. This flexible approach will help us understand as much as we can about the context going into an intervention, and pave the way to actively build trust by supporting the stakeholder where they are at that moment.
- Be realistic about what we can hope to change within the scope and timeframe of the programme: In both programmes, we were ambitious in what we hoped to achieve, especially considering that many longer-term programme outcomes fell outside our direct sphere of influence. In the end, we did well in achieving our programme targets; where we fell short, the results often depended heavily on partners rolling out products we had provided guidance and inputs into. The timeframes for these rollouts were outside our control, and as the programmes were ending in 2020/21, many were unfortunately delayed by the Covid-19 pandemic. Being very clear in the programme's Theory of Change about what work falls within the programme's direct influence, and what falls outside it, would have helped us better account for the increased risk as the context rapidly changed.
- Revisit the workplan and ToC to regularly reflect on what's working and what's not: In both programmes, the work-planning process structured the main check-in points around progress and informed the next period's activities. However, our analysis of the learnings, as well as our experience, suggests this was not frequent enough. Building in more internal reflection and learning points throughout the year would have helped us better assess whether sufficient progress had been made and respond if major stumbling blocks came up. It would also have provided a platform to formally share learnings across the different programme workstreams, to inform course corrections and actively apply our learnings to our next steps. We are actively considering using these kinds of check-in points in future programmes to also make final decisions on whether, and under what circumstances, to continue pursuing work with a specific provider.
All four of these tips can help pave the way for a more adaptive, responsive and realistic way of working within our programmes. Applying them internally is an important step in continuing our organisation's journey of learning, and we look forward to seeing how it helps us deliver on our ambition of continuing to provide useful insights and implementation support to all the stakeholders we work with.