Engaging the private sector to overcome the learning crisis is all the rage in global education. This is giving rise to heated debates, as evident from responses to a paper just published on Liberia’s high-profile ‘Partnership Schools for Liberia’ (PSL) experiment.

The authors of the Liberia report conclude: ‘After one year, public schools managed by private operators raised student learning by 60 percent compared to standard public schools. But costs were high, performance varied across operators, and contracts authorized the largest operator to push excess pupils and under-performing teachers into other government schools.’

While the learning gains are important, this does not sound like an overwhelming success, particularly given the low base of learning and the fact that, in some cases, the programme resulted in a shuffling of students between schools. Overall, there is no evidence that the programme increased enrolment in a context where only around 38% of children of official school age are in school. As most out-of-school children are from the most disadvantaged backgrounds, these children had no chance to benefit from the programme. Moreover, the learning gain of 0.18 standard deviations is relatively modest according to J-PAL’s assessment, which suggests that a 0.1 standard deviation improvement is typically considered a small effect.
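To make the headline figure concrete, here is a minimal sketch of the arithmetic behind it: a treatment effect expressed in standard deviations only becomes a ‘percent more learning’ figure relative to how much the comparison group learned over the same period. The 0.30 SD control-group gain below is an illustrative assumption for the sketch, not a number taken from the report.

```python
# Sketch: converting an effect size in standard deviations into a
# "percent more learning" headline. The control-group gain of 0.30 SD
# is an illustrative assumption, not a figure from the PSL report.

def percent_more_learning(treatment_effect_sd: float, control_gain_sd: float) -> float:
    """Extra learning in treatment schools, as a percentage of the
    learning that comparison schools achieved over the same period."""
    return 100 * treatment_effect_sd / control_gain_sd

effect = 0.18        # PSL treatment effect, in standard deviations
control_gain = 0.30  # assumed learning gain in comparison schools

print(round(percent_more_learning(effect, control_gain)))  # 60
```

The same effect size would read as 90% more learning if comparison schools gained only 0.20 SD, which is why a percentage headline on its own is hard to interpret without the underlying figures.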

It is important nonetheless to unpack the reasons for the learning gains, to the extent possible, to draw lessons for Liberia and other related contexts. At best, a randomized controlled trial of the kind that has been undertaken can give us important information on whether an experiment has worked or not. In the case of the PSL experiment, so many other things happened (as is the case in the real world) that we cannot actually say whether the experiment as originally designed worked. As such, the experiment cannot tell us whether it was the contracts that had the effect on learning or something else.

And to the extent that contracting was an important reason, the experiment cannot explain why and how this worked, notably what effect it had on practices in the classroom. The reality is that each of the contractors probably had different approaches to their work with teachers in the classroom, with varied pathways to learning – some potentially more effective, sustainable, and educationally-relevant than others.

Based on the findings, I propose some key policy areas that deserve attention:

Money matters: A recognized flaw in the design of the PSL experiment was that there was only enough money to pay the private contractors an additional $50 per child, while the government schools in the comparison group did not receive these additional funds. As such, the comparison between PSL and other government schools was skewed from the outset by different resource availability. On top of this, most of the private contractors were able to leverage more money for their schools, some raising substantial sums. This could suggest that an important benefit of engaging private contractors in education is that they can help to fill the financing gap. It also indicates that, contrary to some of the debates in education, money does matter. In this case, private contractors improved learning with (significantly) more resources. Whether this is ultimately cost-effective remains open to ongoing debate.

Political will matters: As evidence from the Liberia experiment highlights, the form of contracting, and adherence to contracting, matters. The authors note that one of the contractors (achieving low learning gains) did not complete the contracting process, and another (achieving high learning gains) was selected outside the competitive process. As such, neither of these contractors completed the due diligence, nor did they receive the $50 per student allocated as part of the programme. This in turn gave them scope to implement the reform differently from what was intended – in one case, firing teachers and reducing student numbers, with potential knock-on effects for other government schools. They did so while raising significant additional resources from other sources, so they were more than able to compensate for not receiving the $50 per student.

If contractors are able to circumvent the system in this highly-controlled and visible experiment that had the direct backing of the Minister of Education and was being closely watched by the President, what would happen if the model were to go to scale without such stringent government oversight?

Teachers matter: There is no doubt that, in countries where teacher absenteeism is high, it needs to be reduced. The Liberia randomized controlled trial has provided robust evidence that teachers in the privately-operated schools were more likely to be in school and in the classroom. The 20 percentage point difference in absenteeism is impressive. However, the model has not yet solved the problem, as 40% of teachers in the privately-contracted schools are still absent.

Importantly also, the study is unable to tell us why there has been an improvement. It proposes a link between contracting and better management which, in turn, has led to a reduction in teacher absenteeism. However, like many studies of this kind, it is unable to untangle the reasons behind the reduction in teacher absenteeism; nor can the experiment actually tell us whether management has improved (and if so, how), or whether there is an association between this and teacher absenteeism.

An issue receiving less attention is that PSL schools had first pick of better-trained new graduates. This change in the composition of the teaching force was responsible for almost half of the increase in learning outcomes. Overall, the evidence implies that teachers do matter for boosting learning – both in terms of being in the classroom and because of who the teachers are.

What happens in the classroom matters: While private contractors might prove more successful than the government at leveraging finance, does this make them better pedagogues? We don’t know the answer to this question one way or the other. It is noticeable, however, that questions in education circles about how school leadership and other forms of support to teachers can improve classroom practices are not a key area of attention in the PSL debate.

Successful examples of classroom-based reforms are apparent in DFID’s Girls’ Education Challenge programme. Some of the providers in this programme are a version of ‘PPPs’, with non-state providers working through government schools. Take the example of Camfed, a non-governmental organization with around 25 years’ experience of working in government schools in countries in sub-Saharan Africa to support marginalized girls both to make the transition to secondary school and to learn. Its programme in government secondary schools in Tanzania supported by the Girls’ Education Challenge has not only been successful in keeping girls in school but also increased learning. This increase in learning has been substantial, improving by over one standard deviation, or by 250 percent, compared with standard government schools within around one year – a much greater gain than the 60% improvement by PSL.

Camfed has achieved this by building up strong relationships over many years with the governments of the countries in which it works. The focus of its success lies less in the nature of the relationship or ‘contract’ with government than in identifying ways to tackle the multidimensional barriers that marginalized girls face in school access and learning. This incorporates targeted financing to meet girls’ school-going costs in tandem with pedagogical measures to improve the welfare and learning of marginalized children. One of Camfed’s innovations is the involvement of Learner Guides (female secondary school graduates from marginalized backgrounds, many of whom were previously supported by Camfed), who are expected to support girls in improving their self-esteem and aspirations.

There is much to learn from the PSL experiment. In going forward, a comprehensive assessment of the benefits and challenges that a model like PSL can bring will be vital to inform the Ministry of Education in Liberia’s reform process, with wider lessons for children around the world.

Disclosure: The author was a member of the Oversight Group of the PSL evaluation, and an external member of the Girls’ Education Challenge Steering Committee.

14 Responses to “What matters for education reform? Lessons from the Partnership Schools for Liberia experiment and beyond”

Surely the private sector will proclaim the results as a great success, to be expanded to other low-income countries. Somehow people don’t scrutinize the data. But effect sizes expressed as fractions of a standard deviation are difficult to interpret empirically, so it’s good to make these comprehensible. How many words per minute did the students read before and after the grade? What was the difference, compared with other countries? For example, students in the US gain about 25 words per minute per year. How much did Liberian students gain?

The other question is why teach the complex English spelling as a way to learn reading. The Liberian government has better options. Liberian Creole is spelled transparently, and students could start with that. They can also start reading in African languages, such as Kpelle. They can simultaneously learn oral English, until they are fluent in transparent orthographies and can learn the English complexities.

There have been dialogues with the government on this option, but there is always a status issue. So instead of teaching all children transparent orthographies inexpensively, the government opted for this complicated private-public partnership. It’s unfortunate.

Hi Helen – thanks for your comment. Appendix H (graph on page 84) includes information on words per minute in the sample. It would be interesting to hear your interpretation of this. Your point on language is an important one. Ultimately, bringing in the private sector alone to support government schools cannot fix the problem, if the problem is to do with these kinds of issues. This isn’t to say that involving the private sector cannot bring benefits, but perhaps it is a sticking plaster in the current context if more fundamental changes are needed, as you suggest?

Pauline, I looked at the data annex, and the content is very hard to interpret. PAL may be showing its statistical prowess, but this does not help. Results are in terms of standard deviation units rather than words per minute or some other concrete metric. Students should be reading at least 45 words per minute in order to make sense of a text. So it’s unclear whether the benefits of the private schools translate into an ability to learn information from texts.
Reported data include the Stallings classroom snapshot, so we see info on what the teacher was doing. Private teachers were more likely to be in the school and in the class, and more likely to engage in ‘active’ vs. ‘passive’ instruction. So, students are getting more instructional time, and they have on average a few more books. They should be performing better, therefore.
Some providers also replaced most teachers, and one of them has 60% new teachers. We all want to see bad teachers go, but the crux is qualified replacements. And will they stay? Young people may teach for a while and leave for better opportunities. The program is being evaluated for short-term effects, and it’s unclear what the long-term effects will be. Supervision matters a great deal, and it’s also expensive. The data are unclear about the incidence of supervisors.
The data do not show how the complex English spelling is taught and how much reading practice students get. This is what matters. And frankly, there are very good interventions in Liberia for that. A few years ago Marcia Davidson spent 2-3 years for USAID developing a very detailed reading program with scripted lessons for Liberian English that gave very good outcomes. Why has that fallen into disuse and when did private providers develop a similar program? How do they teach English spelling? All that is unclear.
Sometimes money and public relations obscure the substance. A sleek exterior discourages people from asking questions.

Using a larger screen, I looked at the data annex more carefully. Scores of variables are reported, some more relevant than others.

The schools given to private providers seem to be well below the Liberian average; TIMSS sample test items and words per minute suggest that. The USAID scores were 8-10 wpm higher, either as treatment or as control, than the scores of the private providers (or their controls); see p. 82, table H.1, for example. This makes scores hard to compare with those of earlier measures that were more representative of the country.

Contractor schools seem to have better student attendance, but they were chosen to be 4 minutes away from the road rather than 24. Contractors constrained the numbers of students in classes and did not want to go to remote areas. If government schools had the same options, they would be performing much better.

Some variables related to memory and consolidation had effects. For example, student attendance was much higher for some contractors than others; attendance of 30% vs. 56%, for example, would raise expectations of greater long-term retention, assuming the class taught some of the links in the chain needed to read. But many differences with control schools in various situations were statistically but not practically significant. Do 2 extra words per minute justify a management change when students read only 15 wpm? And the counterfactual is really not known; contractors in most cases did not report it.

Overall, the benefits and hassles of using contractors to teach literacy through English are confused and modest. The government will get much better outcomes by teaching reading automaticity in the Roman script through local languages or Creole English transparently spelled. Maybe these results will nudge them in that direction.

Many thanks, Helen. Some important policy issues to consider here. The RCT team are hoping to continue tracking for the next couple of years, so there is hopefully an opportunity for them to look into some of the issues you raise.

This hits the nail on the head as regards our attitude to educational development in the so-called “third world”, especially Africa. The education of our young people is not taken as seriously as it should be by almost everyone across the education value chain: from the politicians to the policy implementers, down to the teachers and other stakeholders. This is quite ironic, because it should be our greatest priority, seeing that the overall backwardness of the nations in this bracket is due mainly to backwardness in education. Until this is realized, and corresponding attention and appropriate action are taken to redress the present poor attitude towards educational advancement, the situation will not improve, and these nations will continue to remain backward.
It is our passionate desire, and our cry to all concerned, to give everything they can of their time, energy, resources, and passion to the cause of improving the educational climate, which would lead to relatively high educational standards.

Pauline, thanks for inviting me to comment. You raise some pertinent points, in particular regarding the limits of what an RCT, and especially the design of this one, can and cannot tell us. WHY have certain providers done better? WHY have there been relative gains? We don’t know.

That matters. It matters for replication. It matters in making the decision to scale-up or not. And it matters in trying to figure out what measures we can institute in all schools.

As you point out, and as we have discussed, relative gains need to be translated to absolute terms, and those absolute terms need to match desired learning outcomes. That matters if we want all our children (and I do mean ‘our’ children) to learn, and learn well.

The relative funding differences between government schools and among the different private providers are certainly an issue that needs further exploring. Money matters, especially in contexts that are severely under-funded.

Not sure why you say 60% more learning is insignificant. 0.18 standard deviations (sd) is clearly closer to 0.2 sd (a moderate effect) than to 0.1 sd (a weak effect). Also, not sure why you are giving up on PSL after a single year. Given that some school operators saw truly large effects, equating to more than an additional year of learning, I’d be more optimistic.

Some clarifications. Nowhere did I say 60% is insignificant. Rather, I was pointing out that there are other programmes that have achieved much larger gains within a year, so it is important to put it in this context.

Also, nowhere in my blog do I suggest giving up on PSL. But, like the evaluators, I would suggest that there is not yet enough evidence to support substantial scaling up. Indeed, having information for more than one year at the current scale would be valuable, as the evaluators propose.

Rather, my blog is about considering other forms of evidence alongside the important results identified by the RCT, including the conditions under which a programme like PSL can work. This includes looking beyond the contractual arrangements to the practices within classrooms.

By the way, am I correct that you are Vice President, Measurement and Evaluation, Bridge International Academies (if @stevecmeasures is your twitter profile)?