Let us focus on assessing what changes and why, and on using that information for decision-making.

Imagine that you find your three-year-old with a proud grin on her face, wielding a marker pen. Her baby sister sits nearby… her face carefully decorated, forehead to chin, with permanent ink! You think: What happened? Why? And then perhaps a few seconds later: What have I learned? What am I going to do?

What happened and why are exactly the questions we need to focus on when managing market systems development (MSD) programmes. Then we must determine what we have learned and how we will adapt our programme in response. No need for academic debates around attribution versus contribution.

“Can’t we just say that we contributed, attribution is really hard to do?”
This is the wrong question. What you really want to know is: Did you make a difference? If so, how? Sometimes that is pretty easy to figure out. Sometimes it takes a bit more effort.

An agricultural programme supported a dairy company to improve its cheese-making process by introducing a quality assurance system. Two years later, a few more dairy companies had also installed that system. The programme assessed how and why they had done so: one company had bought the same equipment from the same foreign supplier; another had hired the same consultant to design its system. The programme searched for alternative causes for the change and found none. There you have your attribution: the change that occurred is due to your intervention.

It can be much more challenging than this. There can be many direct and indirect factors that, together, caused a change. But if we give up on investigating why and how these changes occurred, we haven't learned, and we don't know what to do next. So we need to carry out contribution analyses to understand causality. Contribution analysis helps us to systematically unpack cause and effect by gathering and analysing evidence that both supports and challenges a theory of change.
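To make that concrete, here is a minimal sketch, in Python, of one way a team might record the evidence for and against a single link in a theory of change. This is not from the original post; the class, fields, and example entries are hypothetical, loosely echoing the dairy story above.

```python
# A minimal sketch (not from the original post) of how a team might record
# contribution-analysis evidence. All names and entries are hypothetical,
# loosely echoing the dairy example above.
from dataclasses import dataclass, field

@dataclass
class CausalLink:
    """One step in a theory of change, with evidence for and against it."""
    description: str
    supporting_evidence: list[str] = field(default_factory=list)
    challenging_evidence: list[str] = field(default_factory=list)
    alternative_explanations: list[str] = field(default_factory=list)

    def summarise(self) -> str:
        return (
            f"{self.description}: "
            f"{len(self.supporting_evidence)} supporting, "
            f"{len(self.challenging_evidence)} challenging, "
            f"{len(self.alternative_explanations)} alternative explanation(s)"
        )

link = CausalLink(
    description="QA system at partner dairy -> adoption by other dairies",
    supporting_evidence=[
        "Company B bought the same equipment from the same foreign supplier",
        "Company C hired the same consultant to design its system",
    ],
    # alternative_explanations left empty: the programme searched and found none
)

print(link.summarise())
```

The value of structuring the evidence this way is that the challenging evidence and alternative explanations get recorded alongside the supporting evidence, rather than being quietly dropped.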

An export promotion programme reported that their promotional activities contributed to a 5 per cent increase in exports. That's great, but also meaningless. To what extent was that increase due to their support to a dozen exporters? To what extent was it due to changes in customs procedures? To what extent was it due to changes in market demand in the importing countries? It's probably a combination of the three. The programme sought to understand the significance of each of these factors. Many questions guide such an investigation. They are not always easy to answer, but they are very important.
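As a toy illustration only, the arithmetic of such a decomposition might look like the sketch below. The shares are invented placeholders; the post does not give the programme's actual figures or method.

```python
# A toy decomposition of the 5 per cent export increase discussed above.
# The shares below are invented for illustration; the post does not give
# the programme's actual figures.
total_increase = 5.0  # per cent increase in exports

hypothetical_shares = {
    "programme support to a dozen exporters": 0.40,
    "changes in customs procedures": 0.35,
    "changes in import-market demand": 0.25,
}

# In this simple model the shares should account for the whole increase.
assert abs(sum(hypothetical_shares.values()) - 1.0) < 1e-9

for factor, share in hypothetical_shares.items():
    points = share * total_increase
    print(f"{factor}: {share:.0%} of the increase ({points:.1f} percentage points)")
```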

“If we can’t prove attribution, we can’t report anything.”
This is more than a missed opportunity; it loses sight of the programme's purpose. The push to report only attributable impact numbers can actively harm some programmes. That hurts.

A vocational skills development programme supported a few training centres to improve their curricula and teaching methods to meet the demand from the emerging IT sector. Employment rates for graduates were high. But the programme also noted that many graduates, after a few years of employment, founded or joined start-ups - a healthy sign for the IT sector. Was that attributable to their intervention? Probably not. Was it important to understand, important to consider, important to report? Yes, definitely.

Let’s be realistic - systems are complex. Influencing them is an art in itself. Assessing system changes is often not straightforward. And investigating our contribution requires a culture of honest enquiry.

Let’s be pragmatic too. The guidance paper, A Pragmatic Approach to Assessing System Change, builds on what programmes are already doing to assess and understand system change. It provides guidance on how to sensibly assess system changes and how to use that information to adapt strategies and interventions.

It’s not about whether we can attribute changes to our interventions, or whether it’s better to assess our contribution to changes. These are not two different animals. They are part of a continuum. We should aim to assess causality.

But how does a programme report impact that is sometimes attributable and sometimes contributed to?

We need to have a discussion on what and how to report, given the different layers of change that programmes aim to influence.

We are curious to hear your suggestions on how donors and programmes should deal with the numbers issue. The floor is all yours.

1 comment

  • Flexibility and raising donor awareness

    Thanks for these insights. I agree that the pressure to assign attribution is misleading in market systems approaches. We are better off leaning into how to better understand the system and how it is changing. Keeping in mind the theory of change, ensuring room for adaptive management and reporting flexibility, adopting a growth mindset, and building partnerships with donors that understand all of the above are all important.

    Also, there's something wrong with your contribution analysis link.

    Jillian Baker (2 years, 10 months ago)