Using Human Centred Design (HCD) to understand the challenges faced by a programme's beneficiaries 

There is a canon of literature detailing the step-by-step process of planning and implementing a market systems development (MSD) programme. The focus is on a single imperative: overcoming market constraints and designing suitable, market-based solutions. By overcoming these constraints, MSD programmes hope to catalyse systemic change.

There are different toolkits and approaches to identify these constraints and to design solutions, but as one manual states, “there are no standard tools to explore these systemic market constraints and opportunities. The diagnostic process [relies] more on focused interaction with a limited number of selected informants.”1 

Evaluating SPRING's programme to change the lives of adolescent girls

A recent programme we evaluated, SPRING, used a tool that might be a useful addition to the MSD toolkit. The tool helped the programme clearly understand the challenges faced by its beneficiaries, adolescent girls, and contributed to the design of appropriate interventions and solutions.

SPRING was a five-year accelerator that worked with businesses in nine countries to provide life-changing products and services to adolescent girls. It used Human Centred Design (HCD) as its core methodology. HCD will not replace well-established MSD tools and methodologies, but it might supplement them and contribute to better intervention design. Without going into all the detail of HCD, let us summarise its three main steps and apply them to an MSD context.

1. Data collection beyond the landscape report - ask the adolescent girls!
Firstly, collect the data. Many MSD programmes already know (or think they have identified) their market constraints. The studies detailing the constraints that underpin the programme's rationale are often summarised in the business case. But is this thinking regularly shared beyond the readers of the business case? How many programme partners or beneficiaries get to engage with it, comment on it and refine it?

In SPRING, the businesses used a landscaping report, the equivalent of the business case, as a foundation for their data-gathering process. But then they went into the field and gathered their own data. They valued the direct input, comments, thoughts and opinions of their ultimate end users: the adolescent girls.

HCD asks you to dig deeper, collecting information from those who will ultimately use or benefit from your intervention. Ask what they think of the idea, the process or the end product. This will help you design an intervention that overcomes the challenge, and it might pinpoint additional issues that have not yet been identified.

One SPRING business, which provided on-call motorbike taxi services, identified the safety of its girl passengers as a constraint and offered helmets as a free additional service. On investigation, however, its clients identified driver attitude and service as what needed improvement, and they did not support the proposed helmet solution.

2. Design the pilot intervention - but invite ideas from all stakeholders first
The second step is to design a prototype. But don't just jump straight in; first generate as many ideas as possible about potential prototypes, even the crazy, off-the-wall ones. Use new ideas to stimulate further thinking. Then narrow the selection and choose ideas that suit your circumstances, your programme, your budget and your timeframe.

There are many ways to stimulate thinking. SPRING encouraged businesses to brainstorm with their own teams and to collaborate with other businesses, some of them competitors, some working in totally different fields. Invite ideas from a range of stakeholders: farmers, agents, aggregators, institutions and authorities. Broaden your sources beyond the 'selected informants' identified above. This helped encourage diverse viewpoints on the constraint, approaches to the problem from different angles, and alternative solutions.

Many MSD programmes develop prototypes; they call them pilot interventions. Often these test both the intervention or process and the relationship between the programme and the private sector partner. Many pilots are abandoned, perhaps because of a failure in the intervention design or a weakness in the relationship between the partners. Too often the effort ends there, gathering dust.

One SPRING company provided its target girls with smartphones so that they could access its product. But after finding that many of the phones were sold on, it realised it had misread the market constraint. The company broadened its pool of ideas for solutions, evaluated what had gone wrong, and then moved to the third step.

3. Evaluate the pilot, revisit your target group, reassess, improve
The third step of the process focuses on iteration. Evaluate why the prototype or pilot was not as successful as hoped, then make improvements. Go back to the data from the first step, and perhaps back to the source of the data. Ask your target group's opinion. How might things have been implemented differently? How might the design be improved? Don't assume that because you've gathered data once, you've heard the complete story.

In SPRING, 'failing fast' was a mantra. Businesses were given two months to design and implement a prototype. It wasn't expected to succeed completely; it was designed to be replaced by an improved version. After testing the product, gathering data on its use and redesigning it, the business relaunched the product. And, if necessary, it did so again, and again.

For example, a SPRING microfinance business designed a group savings format for adolescent girls. But the girls lacked the maturity to manage the group and its contributions effectively, so the business iterated and matched the girls' groups with older women as mentors.

In another example, an agricultural company revised its intervention and strengthened its relationship with smallholder farmers by facilitating low-cost health insurance for those same farmers.

HCD as a tool to allow a programme to choose its course...

MSD programmes depend on market activity and feedback. But while the market might react immediately (in the example above, the savings group might simply fail), the root cause is often not fully communicated, or percolates back to the programme only slowly. Some programmes rework the intervention; others declare a failed pilot and move on.

In some cases, might it be better to continue investing in established relationships – with target communities, and with existing, potential partners? HCD might be a tool programmes can use to choose this course. 

Not every SPRING business succeeded in its venture. But two years after the programme closed, two-thirds of the businesses continue to serve the needs of adolescent girls. Not all use their original prototype, but, using HCD, they continue to empower girls: the ultimate SPRING mandate.

1 HEKS, "Market Systems Development: Guideline to plan and facilitate market system changes", Zurich, 2015.

Join us for our 23rd November 2021 webinar, 'Empowering women and girls through private sector engagement: lessons from SPRING'.
