Are our market systems analyses fit for purpose? A recent review by the ILO Lab suggests we need to shift focus from paper to practice.

In 2014 the ILO undertook a critical review into the quality of its value chain analysis reports. The idea was to see how ‘systemic’ they were.

The report found that only five per cent went deep enough into the market system to examine supporting functions or rules. Most had stopped at mapping core value chain performance. We wrote about the implications of these findings in a BEAM blog - Analyse this: getting better at understanding how market systems work.

Almost five years later, it’s time for the sequel. We re-ran the review on a new sample of value chain and market systems analyses. We also expanded the scope to cover both analyses commissioned by the ILO and those where the ILO had collaborated with external partners.

We found significant improvements: 77 per cent of reports now reached a level of understanding adequate to identify the underlying reason(s) why the market system was underperforming.

This came as no surprise, thanks in part to the Lab Project, which started in 2014 and was the first initiative within the ILO – or indeed anywhere – with a dedicated focus on market systems development for decent work. The Lab has supported analysis from Afghanistan to Zambia, in sectors ranging from the global garment supply chain to local construction, tourism and agriculture.

But the review wasn’t all good news. We added an extra assessment criterion to see to what extent each analysis had been used to shape the focus of interventions, collecting data through a series of key informant interviews with ILO staff. Surprisingly, and quite worryingly, we found that almost two thirds (64 per cent) of reports had little or no effect on eventual programme or intervention design. Only a quarter of reports (24 per cent) could be considered to have a high degree of use – where the majority of interventions were based on the findings of the analysis. Reports are doing a good job of identifying systemic constraints, but their findings and recommendations are too often ignored.

What’s going wrong? To understand exactly why the reports were not used, we mined institutional knowledge and the experience of those involved in each report to come up with a list of reasons:

  • While the donor was interested in pursuing a market systems approach, the implementing organisation was not
  • No funding was secured for the programme
  • Government priorities changed such that the programme switched focus to another sector
  • The programme decided to pursue a more ‘direct delivery’ approach and roll out a standardised training course
  • The programme did not have a core market systems development (MSD) focus. While there was interest in using the findings, the programme ultimately did not have the flexibility to change its design/logframe
  • The programme team was not comfortable taking a facilitative approach, or came on board after the analysis, meaning they felt little ownership over the findings

There were a number of confounding issues. Most reports were far too long, at an average length of 63 pages. This makes them difficult to digest, and risks wasting even more time and effort if the eventual report is not used.

Some analyses were conducted for programmes where interventions were already underway, or for programmes which did not have a core market systems focus – making it much harder for the analysis to have influence.

A growing movement within the MSD community seeks to embrace more complexity-aware approaches that have very little (or no) upfront analysis. Of course, analysis – the fundamental task of understanding how and why the market system is not working – should always be an iterative exercise that emphasises ‘learning by doing’.

In our experience, however, a degree of upfront analysis is still necessary to provide a steer to the programme team – many of whom may be unfamiliar with the market systems approach and require some structure rather than working from a ‘blank slate’.

We believe, therefore, that MSD should remain fundamentally analysis-driven – but that excessively long reports and analytical exercises are likewise not helpful.

The challenge is to find the ‘sweet spot’ between no analysis at all and overly detailed analysis. The question is, how much is enough?

The Lab is keen to engage with partner organisations to codify a ‘leaner’ approach to market systems analyses (MSA). This could help mitigate the risk that the number and quality of MSA reports increase faster than their utility in improving development practice.
