Musings on research, international development and other stuff


Babies suffer through a lot of bogus treatments for the sake of placebo-induced parental reassurance.

So, as regular readers know, I have recently become a mum. As I mentioned in my last post*, I was really shocked by how much pseudoscience is targeted at pregnant women. But four months after the birth, I have to tell you that it is not getting any better. What I find most concerning is just how mainstream the use of proven-not-to-work remedies is. Major supermarkets and chemists stock homeopathic teething powders; it is common to see babies wearing amber necklaces to combat teething; and I can’t seem to attend a mother and baby group without being told about the benefits of baby cranial osteopathy.

I find this preponderance of magical thinking kind of upsetting. I keep wondering why on earth we don’t teach the basics of research methodologies in high schools. But then sometimes I question whether my attitude is just yet another example of parents being judgey. I mean, other than the fact that people are wasting their money on useless treatments, does it really matter that people don’t understand research evidence? Is worrying about scientific illiteracy similar to Scottish people getting annoyed at English people who cross their hands at the beginning, rather than during the second verse, of Auld Lang Syne: i.e. technically correct but ultimately unimportant and a bit pedantic?

I guess that I have always had the hypothesis that it does matter; that if people are unable to understand the evidence behind medical interventions for annoying but self-limiting afflictions, they will also find it difficult to make evidence-informed decisions about other aspects of their lives. And crucially, they will not demand that policy makers back up their assertions about problems and potential solutions with facts.

But I have to admit that this is just my hypothesis.

So, my question to you is: what do you think? And furthermore, what are the facts? Is there any research evidence which has looked at the links between public ‘science/evidence literacy’ and decision making? I’d be interested in your thoughts in the comments below.


* Apologies by the way for the long stretch without posts – I’ve been kind of busy. I am happy to report though that I have been using my time to develop many new skills and can now, for example, give virtuoso performances of both ‘Twinkle Twinkle’ and ‘You cannae shove yer Granny’.†,‡

† For those of you unfamiliar with it, ‘You cannae shove yer Granny (aff a bus)’ is a popular children’s song in Scotland. No really. I think the fact that parents feel this is an important life lesson to pass on to their children tells you a lot about my country of birth…

‡ Incidentally, I notice that I have only been on maternity leave for 4 months and I have already resorted to nested footnotes in order to capture my chaotic thought processes. This does not bode well for my eventual reintegration into the world of work.

Readers, I am delighted to introduce my first ever guest post. It is from my colleague Max – who can be found lurking on twitter as @maximegasteen – and it concerns the recent Pritchett/Sandefur paper. Enjoy! And do let us know your thoughts on the paper in the comments.

The quest for internal validity can sometimes go too far… (Find more fab evaluation cartoons on freshspectrum.com)

Development folk are always talking about “what works”. It usually crops up in research proposals: a line saying “there are no silver bullets in this complex area”, followed a few paragraphs later by a strong call that “we need to know what works”. It’s an attractive and intuitive rhetorical device. I mean, who could be against finding out ‘what works’? Surely no-one* wants to invest in something that doesn’t work?

Of course, like all rhetorical devices, “what works” is an over-simplification. But a new paper by Lant Pritchett and Justin Sandefur, Context Matters for Size, argues that this rhetorical device is not just simplistic but actually dangerous for sensible policy making in development. The crux of the argument is that the primacy of methods for neat attribution of impact in development research, and donors’ giddy-eyed enthusiasm when an RCT is dangled in front of them, can lead to some potentially bad decisions.

Pritchett and Sandefur highlight cases where, on the basis of some very rigorous but limited evidence, influential researchers have pushed hard for the global scale-up of ‘proven’ interventions. The problem with this is that while RCTs can have very strong internal validity (i.e. they are good at demonstrating that a given factor leads to a given outcome), their external validity (i.e. the extent to which their findings can be generalised) is often open to question. Extrapolating from one very different context, often at small scale, to another context can be very misleading. They go on to use several examples from education to show that estimates using less rigorous methods, but from the local context, are a better guide to the true impact of an intervention than a rigorous study from a different context.

All in all, a sensible argument. But that is kind of what bothers me. I feel like Pritchett and Sandefur have committed the opposite rhetorical sin to the “what works” brigade – making something more complicated than it needs to be. Sure, it’s helpful to counterbalance some of the (rather successful) self-promotion of the more hard-line randomistas’ favourite experiments, but I think this article swings too far in the opposite direction.

I think Pritchett and Sandefur do a slight disservice to people who support evidence-informed development (full disclosure: I am one of them) by suggesting that they would blindly apply the results of a beautiful study from across the world to the context in which they work. At the same time (and here I will enter ‘doing a disservice to the people working in development’ territory) I would love to be fighting colleagues on the frontline who are trying to ignore good-quality evidence from the local context in favour of excellent-quality evidence from elsewhere. But in my experience I’ve faced the opposite challenge, where people designing programmes put more emphasis on dreadful local evidence to make incredible claims about the potential effectiveness of their programme (“we asked 25 people after the project if they thought things were better and 77.56% said it had improved by 82.3%” – the consultants masquerading as researchers who wrote this know who they are).

My bottom line on the paper? It’s a good read from some of the best thinkers on development. But it’s a bit like watching a series of The Killing – lots of detail, a healthy dose of false leads/strawmen but afterwards you’re left feeling a little bit bewildered – did I have to go through all that to find out not to trust the creepy guy who works at the removal company/MIT?

Having said that, it’s useful to always be reminded that the important question isn’t “does it work (somewhere)” but “did it work over there and would it work over here”. I’d love to claim credit for this phrase, but sadly someone wrote a whole (very good) book about it.

*With the possible exception of Lyle Lanley, who convinced everyone with a fancy song and dance routine to build a useless monorail in The Simpsons.

It seems like higher education is having a bit of a ‘moment’ in the development world just now. More people than ever are enrolling in universities, and new modes of delivery such as Massive Open Online Courses (usually referred to by the wonderful acronym ‘MOOCs’) have the potential to transform how post-secondary learning takes place. The High Level Panel’s emphasis on data has focussed attention on the need to strengthen in-country analytical capacity (although it seems that not everyone agrees on how this should best be done!) and indeed there is growing recognition that achievement of development goals in all sectors will require a higher education system which is able to deliver knowledge and human capital. Meanwhile DFID has set up a Higher Education Taskforce to consider a future policy position on higher education.

Tying in with this flurry of interest, the Association of Commonwealth Universities will be launching its Beyond 2015 Campaign – asking whether higher education is ready to contribute to future development goals. They are calling for inputs from a range of stakeholders and, since this is one of my (many!) soap-box issues, I thought I would take the opportunity to throw in a few thoughts and suggestions of my own…

1. Don’t get too seduced by ‘technological fix’ arguments.
The argument for higher education is sometimes made on the basis that an increase in research will lead to new and better technologies which will make the world a better place. Now, there’s some truth in this – many of the greatest technological developments have come from academia – however, I think it is also misleadingly simplistic. The changes needed to end poverty are complex, deeply political and unlikely to be ‘fixable’ with technological breakthroughs. And indeed many exciting technological fixes are under-used due to political barriers. I think the major benefit that higher education can give to society is increased human capital. A major part of this is through vocational training – to produce the nurses, doctors, engineers and teachers of the future. But higher education can also increase the ability of people in all professions to investigate, question and think critically. Such skills are crucial to build societies which grapple with seemingly intractable problems – and demand a better response from their governments.

2. Focus on the organisation…
I know that this is not an original point – but it bears repeating. No amount of funding for research or higher education will lead to sustainable change if the institutions providing it are not well set up and managed. This applies to ‘traditional universities’ – but also to new modes of higher education which may not rely on a physical presence. Support for higher education may need to focus on some of the underlying issues which are crucial, but sometimes not sexy enough to get attention! This includes efficient and transparent finance and accounting systems, effective campus bandwidth management, responsive IT support, well-resourced and proactive libraries etc. etc.

3. …but don’t forget the individuals!
There has been a gratifying increase in attention on organisational capacity strengthening in recent years. But occasionally this has given individual capacity building schemes – particularly ones which remove participants from their home institutions – a bad name. Don’t get me wrong – my ideal situation would be that we have world-class higher education institutions in developing countries so that future talent can be nurtured there. But while we are getting there, we don’t want to lose the potential of lots of talented young people who are seeking an excellent education. Plus, the strengthened organisations of tomorrow are going to need well-educated people to staff them. For this reason, my personal view is that well-targeted individual scholarship schemes which enable talented young people to study at a world-class university and ensure that their new-found skills benefit their own country can be a useful part of efforts to strengthen higher education.

4. Figure out links between research and higher education agendas – and avoid turf wars.
Some projects which are funded as ‘research capacity building’ could equally be described as higher education programmes – and vice versa. I am completely comfortable about this so long as the people funding each talk to each other. The two agendas are so intrinsically linked – and there is no lack of work to do – so I hope we can agree to work together on this one.

I am really looking forward to the discussions on higher education over the next few months – and in particular to hearing the findings of DFID’s Task Force. However it will be important that we don’t let the excitement about higher education distract us from the really pressing needs in other areas of education. As I have discussed before, the state of primary and secondary education remains abysmal in far too many parts of the world – and we will need to focus on all sectors of education if we are to achieve the vision set out in the High-Level Panel report.

One of the major aims – and indeed major successes – of the Millennium Development Goals has been to increase the number of kids going to school. At first glance, it appears wonderful that school enrolment went from 50% to 66% between 1995 and 2010. But the worrying thing is that getting more kids into schools does not necessarily mean that they are learning more. In fact, a recent report from the Centre for Global Development reveals that the levels of educational attainment amongst children in developing countries are worryingly low. The report draws on large global datasets but a few examples which stood out for me include:

In India, 60% of grade 8 children are unable to use a ruler to measure a pencil, while only 27% of those who finish primary school can carry out tasks (such as reading a passage of text and telling the time) that are expected to be achieved by the end of the second year of school.

In Tanzania and Uganda, less than half of children aged between 10 and 16 have basic literacy and numeracy skills.

In Malawi, almost 80% of sixth graders score below the international minimum standard for reading proficiency.

An average eighth grader in Ghana achieves a test score in maths and science equivalent to that of the lowest 0.2% of US students.

I have worked in international development for many years but I still find these figures truly shocking. I can’t help wondering if a huge amount of the work we do in capacity building at organisational and institutional levels might be unnecessary if only people were getting a decent standard of education from the outset. I am reminded that at the International Conference on Evidence-Informed Policy Making in Nigeria last year, one of the main conclusions was that if policy makers are ever going to be able to make use of research evidence, they need to have much better levels of basic education.

So how can we improve learning? The report talks a lot about the use of assessment to improve educational outputs – not so much because assessment drives learning but because poor assessment results drive people to reform the system. At present it seems that many people in developing countries are not aware of how poor the education system is and are therefore not demanding reform (see for example the graph on the left – taken from the report). But precisely what type of reform would work is less clear.

There is evidence that teacher incentives can improve both attendance and effort – for example this systematic review suggests that having teachers on fixed term rather than permanent contracts increases attainment. However the results are patchy and a recent study from Kenya showed that contract teachers only improved attainment when they were hired by NGOs rather than the government. Of course, getting the teachers to actually turn up is important (!) but getting teachers to promote a friendly learning environment (e.g. encouraging kids to ask questions) is equally crucial and equally challenging. A number of studies (see for example here) have shown that changing the culture of teaching to a more learner-centred approach is very difficult. I have experience of this myself – I used to teach learner-centred pedagogy to capacity building trainers. The work was great fun, but hard; people’s experience of learning is very personal and deep-seated and it can be quite scary for people to break free from this.

Overall, my conclusion from reading this report is that we know shockingly little about how to improve education. We have failed to invest in good-quality research on how to support people to learn, and academic pedagogy has been dominated by pseudoscience. Education research has lagged behind other areas of development research for too long – and getting our kids to actually learn is just too important to neglect.

My German is a little bit, erm, grammatically challenged – but unfortunately, having a German husband is not equivalent to having a live-in German teacher. The problem is that German is so natural to him that he has forgotten how he learnt it, and he finds it difficult to respond to my questions about why you have to say something a certain way.

The assumption that someone who knows something will know how to teach it to someone else crops up all the time in the field of international development. People recognise that there is a gap in capacity, then they identify someone who has that capacity, and then they organise for that person to go and “pass on” their capacity. I think there is an assumption that it will work a little bit like this…

The problem of course is that people WITH capacity (knowledge, skills, attitudes in whatever area) might be really rubbish at supporting others to develop that capacity.

For example, I often hear of training programmes for academic researchers in developing countries which make use of senior academics, usually from the north, as trainers. In my experience, one major challenge for junior researchers is critical thinking skills; some people are very adept at learning new facts and theories but really struggle to synthesise information, to draw out meaning from it and to critically engage with it. These are skills which many of us who have grown up in a highly questioning environment have acquired without thinking about it. But for those who have gone through an education system that has relied on rote learning and discourages questioning, they can be a big challenge – and this can clearly be a major problem for aspiring academics. Now, I don’t doubt that the senior academics from the north who are brought in as trainers have bags of critical thinking skills – but what I am not so sure about is whether they are always well qualified to pass these skills on to others.

In fact, I think that with many capacity building projects – particularly those which aim to influence behaviours and attitudes – we need to think more carefully about how we can support people to learn. How do you break down an area of capacity, like critical thinking, and facilitate a process that allows someone to develop it? How do you support capacity building in a way that doesn’t bore or patronise people? (I love Nathan Chiume’s description of capacity building as “a euphemism for cramming 30 ppl in a room 4 a few days and trying to kill them with power-points and flipcharts and group work”.) And how can you support local actors to act as facilitators of learning – rather than parachuting experts in from the north? These are not simple problems… but they are really important ones.

The good news is that there are people out there who have been thinking about this kind of thing for a while – they are called… teachers! Well, to be more precise, those who train and study teachers – the educational psychologists, pedagogues, instructional scientists and educationalists. There are lots of them out there (I follow some of them on twitter and they seem to be very nice people) and I think it would be great if we in the international development community joined up with them a little more and found out what they could, you know, teach us.

Update: in response to a comment below I give a more specific example from my experience – would be interested to know if others have experienced similar.