Global Report on Results
One of the most important factors for achieving impact is the ability to learn – both from others and from its own successes and failures. Throughout the current strategic period NCA has continued to reflect on its own practice. Most of this reflection is triggered by input from various stakeholders, shifts in trends in the international development regime and close dialogue with partners and back donors. This reflection constitutes a major part of the organisational learning and has led to various strategic decisions in NCA recently.
One main reflection is the need for NCA to continue concentrating its efforts both geographically and thematically in order to develop more holistic programming, scale up activities and have a larger impact on the societies we aim to change for the better.
A second reflection is the need to better document the inputs and outcomes from strengthening NCA’s core civil society partners, mainly faith-based actors. Most of these have a huge potential to enhance their role as change agents in their societies, but this will require NCA to better play its role as accompanier, challenger and facilitator. This is the core of NCA’s added value. It comes at a cost that needs to be better analysed, but NCA is convinced that it potentially has great impact.
A third reflection is the need to work more systematically on building the capacity of faith-based partners. This report documents well the potential these structures and organisations have as change agents, but NCA believes there is still huge untapped potential.
These reflections have all informed NCA’s new Global Strategy (from 2016) and led to a series of strategic decisions aimed at positioning the organisation for even greater results in the next strategic period.
Four years after the introduction of the global programmes, it is time for NCA to take stock of its experiences so far with the concept to prepare for future programming. An internal review carried out in 2014 revealed that a majority of the respondents saw clear benefits from having global programmes and expressed that they wanted to keep the concept. The global programmes were seen to increase programme quality and improve accountability and partner relations. Most respondents felt that the global programmes had helped NCA to become more focused and that there was an increased sense of belonging to a global team. The global programmes have also contributed to improving NCA’s tracking and communication of results. Among the challenges identified were a perceived lack of flexibility - as all NCA’s international work has to be defined within the matrix of the 12 global programmes - and increased bureaucracy related to reporting. There is a sense within the organisation that we want to keep the global programme system, but that there should be fewer programmes with more flexibility to adapt to local context.
NCA recognises that the concept of global programmes is to some extent contradictory to a focus on documenting country level impact, as the global programme set-up focuses on what NCA can achieve as a common denominator (global outcomes) across all countries where we work, rather than on country level impact as such. From 2016, the global programme concept will be carried forward, but slightly adjusted in order to better capture country level impact. There will also be fewer global programmes than the current 12, and only programmes with a clear potential for upscaling of NCA’s work - including upscaling our work with and through religious actors - will be carried forward.
Another series of lessons learned is related to the global programmes’ corresponding set of global outcomes and indicators. Each global programme contains a “menu” of global outcomes and one global indicator – a so-called Selected Output Indicator (SOI). Each NCA country programme was asked to select global outcomes from the relevant menus for each global programme and develop their country level programmes within the frame of these global outcomes.
As for the SOIs, these were mainly a statistic measuring the number of rights-holders involved in or benefitting from a specific activity and were meant to give an indication of some selected quantitative data at output level within each global programme (e.g. the number of households with access to renewable energy). The SOIs were not meant to be comprehensive. In addition, each country programme selected context-based outcome indicators. This represented a significant change in the way NCA plans and tracks results, and it was believed that it would lead to an increased attention to the importance of working towards wider impact.
Among the benefits of working with SOIs is that it has contributed to creating more awareness within the organisation of the importance of developing good indicators as a means of tracking and reporting on results. NCA also believes that the SOIs are a good tool for communicating some aspects of our work to donors and that they have contributed to making our annual Global Report more interesting. On the other hand, the SOI experience has taught NCA a great deal about challenges linked to systematic data collection, storage and analysis. NCA will carry forward some form of global key statistics à la SOI, but will provide clearer definitions and more explicit guidelines as to what is and is not to be counted. Emphasis will be placed on developing more systematic data collection, monitoring and storage systems and on providing adequate training to staff.
Another important source of internal reflection and learning is evaluation practice. Between 2011 and 2014, NCA conducted 75 evaluations of different sorts in 22 countries, as well as one regional and one global evaluation.
In an effort to enhance the quality, usefulness and accessibility of evaluations, NCA developed an evaluation policy for its international programmes, which was adopted and rolled out in 2014. This policy formalised a series of measures, such as establishing steering groups, using standard terms of reference when undertaking evaluations, and requiring the provision of a management response to each evaluation. The evaluation policy stresses the importance of annually analysing all NCA evaluations by applying a set of criteria ranked from 1 (poor) to 5 (excellent) used to score and make qualitative judgments – a fruitful learning exercise carried out for the 2013 and 2014 evaluations.
While the evaluation policy was not fully implemented in 2014 (less than half of the evaluations applied it), the quality of those that did follow it was considerably higher. For example, evaluations guided by a steering group ranked noticeably higher than those without one. Moreover, evaluations with steering groups scored higher in all categories (i.e. quality of terms of reference; methodology; contextual analysis; evaluation criteria; cross-cutting issues; and findings and recommendations).
The positive impact that this policy has had on organisational learning is also reflected in a more systematic application of NCA’s management response to evaluation findings and recommendations. From 2011 to 2013, around 40% of all country offices responded to the evaluations. In 2014, this increased to 77%.
NCA country offices made a concerted effort to move from lessons learned to lessons implemented by transforming findings and recommendations into concrete changes. Some of these lessons have enriched programme design, implementation and monitoring. Other lessons have resulted in the enhancement or revision of partnership approaches, encouraged increased community ownership of programme interventions and collaboration with duty-bearers, or enabled improved analysis of programme impact. Country office management teams have for the most part relied on evaluation findings for strategic and management decisions during programme implementation, for example when restructuring programmes.
Although there have been many improvements in how NCA conducts and learns from its evaluations during this strategic period, more will be done in the future to bolster institutional learning and evidence-based programming. This includes facilitating the full implementation of the evaluation policy and enhancing NCA’s culture of learning, including promoting horizontal knowledge sharing across the organisation.