What have we learned about Payment by Results (PBR) programmes from verifying one?

After 19 verification rounds, the WASH Results Monitoring and Verification team shares its suggestions for how to design future PBR programmes.

Verification in action: MV team member Martha Keega assesses a latrine in South Sudan

Verification is at the heart of the WASH Results Programme. Suppliers only get paid if we, the Monitoring and Verification (MV) team, can independently verify the results they report. Usually we can: results are reported by Suppliers, verified by us, and Suppliers are paid by DFID to an agreed schedule. However, all Suppliers have received deductions at least once, which, although painful for everyone, is testament to the rigour of the verification process. Overall, the system is working and the results of the programme are clear. But the demands of verification are also undeniable, leading to some aspects of it being labelled “Payment by Paperwork”, and, like any process, it could be improved.

In January 2016 the team* came together to reflect on what we have learned so far from conducting 19 rounds of verification across the three Suppliers. Our discussions focused on verification but inevitably touched on wider issues around the design of a PBR programme. Here we share some suggestions for the design of future PBR programmes, from a verification perspective.

  1. Ensure targets and milestones reflect high level programme objectives
  2. Be clear on targets and assumptions about their measurement
  3. Think carefully about enabling alignment with local government and other WASH stakeholders
  4. Reconsider the 100% PBR mechanism to avoid verification inefficiencies
  5. Consider payments for over-achievement of outcomes, but not of outputs
  6. Include provision for a joint Supplier and Verifier inception phase that will streamline verification
  7. Consider pros and cons of relying more on Supplier-generated evidence as opposed to independent evidence generation

1. Ensure targets and milestones reflect high level programme objectives
The WASH Results Programme has ambitions with regard to equity, gender and disability, and overall health benefits, that are not universally built into the targets and payment milestones agreed between DFID and Suppliers. As a consequence, these ambitions are not explicitly incentivised. Any future programme should think carefully about how its design, especially the targets set in the tender and agreed with Suppliers, upholds objectives based on good practice within the sector.

2. Be clear on targets and assumptions about their measurement
We have found that when payment decisions ride on whether targets have been met, the devil is in the detail. During implementation, some discrepancies have emerged over targets and how to achieve them. Discussions have taken place about minimum standards for latrines (the DFID or JMP definition?) and hygiene targets (what does ‘reach’ mean?). In addition, there was occasionally a lack of clarity on how achievement of targets would be measured.

When working at scale, assumptions about the average size of a household in a particular area, or the best way of measuring the number of pupils in a school, become subject to intense scrutiny. This is quite a departure from how programmes with different funding mechanisms have worked in the past, and the level of detailed evidence required may come as a shock to Suppliers and Donors alike. In response, we suggest that future programmes provide clear guidance on the technical specifications relating to targets and guidelines for evidencing achievements.

3. Think carefully about enabling alignment with local government and other WASH stakeholders
One concern we discussed at the meeting was that the design of the WASH Results Programme does not sufficiently incentivise alignment with local government. We suspect that this was a result of the scale of the programme and the tight timelines, but also of the demands of verification. The need to generate verifiable results can disincentivise both the pursuit of “soft” outcomes, such as collaboration, and working with government monitoring systems.

We suggest that PBR programmes think carefully about how to incentivise the devolution of support services from programme teams to local governments and other sector stakeholders during the life of the programme, for example by linking payments to these activities. They should also consider how programme design could encourage long-term strengthening of government monitoring systems.

4. Reconsider the 100% PBR mechanism to avoid verification inefficiencies
The merits or otherwise of the 100% PBR mechanism in the WASH Results Programme are subject to much discussion; we considered them from a verification perspective. We believe that, in response to the 100% PBR mechanism, some Suppliers included input- and process-related milestone targets to meet their internal cash flow requirements. In some cases, this led to verification processes that required high levels of effort (i.e. paperwork) with relatively few benefits.

We suggest that people designing future PBR programmes consider non-PBR upfront payments to Suppliers, to avoid the need to set early input and process milestones, and run a substantial inception phase that includes paid-for outputs for Suppliers and Verifiers. In the implementation phase of the WASH Results Programme, payment milestones have been mainly quarterly, requiring seemingly endless rounds of verification that put pressure on all involved, particularly Supplier programme staff. In response, we suggest that payments over the course of a programme be less frequent (and so possibly larger), requiring fewer verification rounds and allowing greater space between them. This may have implications for the design of the PBR mechanism.

5. Consider payments for over-achievement of outcomes, but not of outputs
The WASH Results Programme does not include payment for over-achievement. Over the course of the programme, some Suppliers have argued that over-achievement should be rewarded, just as under-achievement is penalised. As Verifiers, we agree that paying for over-achievement of outcomes would be a positive change in a future PBR design. However, there were concerns among our team that encouraging over-achievement of outputs could have unintended consequences, such as inefficient investments or short-term efforts to achieve outputs without sufficient attention to sustainability and the quality of service delivery.

6. Include provision for a joint Supplier and Verifier inception phase that will streamline verification
It is broadly accepted that the WASH Results Programme would have benefited from a more substantial inception phase with the Verification Team in place at the start. Our recommendations about how an inception phase could help streamline and strengthen verification are as follows:

  • Key inception outputs should include a monitoring and results reporting framework agreed between the Supplier and the Verification Agent. Suppliers and Verifiers could be paid against these outputs to overcome cash flow issues.
  • The inception phase should include Verification Team visits to country programmes to establish an effective dialogue between the Verifiers and Suppliers early on.
  • If Suppliers evidence their achievements (as opposed to independent collection of evidence by the Verification Agent – see below), the inception phase needs to include assessment of, and agreement on, what constitutes an adequate results reporting system and process.
  • A ‘dry’ verification round should be run at the beginning of the verification phase, in which payments are guaranteed to Suppliers irrespective of target achievement, so that early verification issues can be sorted out without escalating stress levels.

7. Consider pros and cons of relying more on Supplier-generated evidence as opposed to independent evidence generation
In the WASH Results Programme, Suppliers provide evidence against target achievements, which is subsequently verified by the Verification Team (we will be producing a paper soon that outlines how this process works in more detail). Is this reliance on Supplier-generated evidence the best way forward? What are the pros and cons of this approach as compared with independent (verification-led) evidence generation?

Indications are that the PBR mechanism has improved Suppliers’ internal monitoring systems and shifted the internal programming focus from the finance department to the monitoring and evaluation department. However, relying on Suppliers’ internal reporting systems has required some Suppliers to make substantial changes to their existing reporting systems, and the MV team has faced challenges in ensuring standards of evidence, particularly in relation to surveys.

We have some ideas about pros and cons of Supplier-generated evidence as opposed to evidence generated independently, but feel this can only be fully assessed in conversation with the Suppliers. We plan to have this conversation at a WASH Results Programme Supplier learning event in March. So, this is not so much a suggestion as a request to watch this space!

Coming up…

WASH Results Programme Learning Event: On 7 March 2016 Suppliers, the e-Pact Monitoring & Verification and Evaluation teams, and DFID will convene to compare and reflect on learning so far. Key discussions at the event will be shared through this blog.

Verification framework paper: an overview of how the verification process works in the WASH Results Programme. This will present a behind-the-scenes look at verification in practice and provide background for future lessons and reflections that we intend to share through our blog and other outputs.

* About the MV Team: In the WASH Results Programme, the monitoring, verification and evaluation functions are combined into one contract with e-Pact. In practice, the ongoing monitoring and verification of Suppliers’ results is conducted by one team (the MV team) and the evaluation of the programme by another.  The lessons here are based on the experience of the MV team although members of the Evaluation team were also present at the workshop. Read more about the WASH Results Programme.
