Some thoughts on Triple P and evidence based practice

Over the last few weeks, thanks to the power of Twitter (and particularly @LMarryat) I became aware of some journal articles and blog posts about the Triple P parenting support programme. I must confess to a possibly not entirely unbiased interest in this: before my current academic position I worked as a health visitor in an authority which invested heavily in the Triple P programme, and over a period of a few years we were all trained up and expected to deliver the programme at every possible opportunity.

Triple P is marketed as an evidence-based programme which provides support for parents of 0-18s who have issues with various aspects of behaviour. It is provided at several levels, and the support can be delivered to individual families or as group work. As a practitioner, I was trained at level 3, the basic parenting support package for individual families. This meant I was trained to give advice and support around all sorts of parenting issues – food refusal, tantrums, toilet training, home safety, whining, hitting, and so on – through the structured programme, supported by tipsheets and a DVD (although I must confess to never using the DVD myself). For some families it seemed to be really helpful, for others less so – to be honest the materials seemed as good as anything else I’d seen, but not especially better, and there were some aspects of them that really jarred. Triple P is based in Australia, so all the materials are developed there, and part of the deal is that the purchasing authority does not deviate from them, photocopy them or adapt them in any way. In particular I got a right bee in my bonnet about the Home Safety tipsheet – in an inner-city area of multiple deprivation, high-rise flats and the like, it really got on my wick that I had to hand families a sheet telling them to make sure their children didn’t go near the swimming pool in the garden unattended. That’s the most extreme example, but I think it typifies the issues that many of us had with the actual materials. The other thing that made me cross was that we weren’t allowed to get the sheets translated for our families who spoke minimal or no English – there were some sheets in a few languages, but not enough for everyone who needed to use them across the city, and not in all the languages we came across in our practice, in a city which was one of the major reception cities for dispersed asylum seekers.
This meant that it was really hard to offer the same level of support to families who were just as much in need of help as the English-speaking population. We also had other issues which in a way were not Triple P’s fault and were more to do with the way management basically forced us to deliver the programme whether we wanted to or not – it became an exercise in numbers, in filling in forms and ticking boxes, and it is one of the main reasons why I finally decided it was time to leave. It was getting to the point where I felt I had no choice but to offer families a Triple P intervention, regardless of whether I thought it was what they needed, because of the pressure (and there really was pressure) to meet targets and figures. What also irritated was that Triple P was rolled out across the city at the same time as all the staff had to endure a two-year pay freeze, and when we complained about the pressure to deliver Triple P we were basically told that as so much money had been spent bringing it in, tough, we had to do it anyway. So when I say that I’m not entirely unbiased, perhaps it would be more accurate to say that I’m really quite bitter about it! (grrrr!)

One of the things that we were often told, again and again, was that Triple P was so great because it was “evidence-based”. I didn’t have access to university libraries at the time so it wasn’t easy to access articles, only abstracts, but I did a Google Scholar search and noticed that pretty much every article about Triple P featured the name of Dr Matt Sanders, the University of Queensland researcher who developed Triple P and who remains closely involved in the materials, promotion and training development for the programme. I wasn’t able to get hold of the full articles due to paywalls, but this discovery did leave me with some questions about the evidence and the risk of conflicts of interest.

More recently, studies have started to emerge which call the evidence base for Triple P into question. Mostly they highlight that the claims made for the evidence to date can be questioned because of the small sample sizes of the studies involved. There is also a recent trial with larger numbers, not involving the Triple P team, which compared Triple P level 4 with two other parenting programmes in Birmingham (Little et al 2012) and found no discernible effects from Triple P. Following this, a systematic review and meta-analysis by Scottish researchers in BMC Medicine (Wilson et al 2012) raised questions about the current evidence base (as outlined above – small sample sizes, potential conflicts of interest), which was also blogged about at PLOSOne by Dr James Coyne here. There then followed a response from Dr Matt Sanders et al here, and Dr James Coyne & Dr Linda Kwakkenbos also responded to both article and response here. [And as an aside, hooray for open access!] There is also a blog post by Dr Pedro De Bruyckere here which summarises the discussion and issues, and which provides a link (unfortunately in Dutch, not one of my languages sadly) to a recently defended PhD which found no significant effect of Triple P interventions. I must say from my experience as a practitioner I absolutely agree with his concern about the amount of money involved, which I do think affected how we as professionals were expected to push the programme.

What is worrying about that final blog post is the discussion about the difficulty in publishing null or negative results of research. This seems like the ideal point to publicise the Alltrials campaign, which is highlighting how much publicly-funded clinical research is not reported at all, meaning that trial results are lost to future researchers. Please do sign the petition if you have not already done so! There are all sorts of reasons why research might be buried, but regardless I agree that it is vital that all clinical trial results are published so that their findings are available not only to inform future research but also to inform current clinical practice. There can be no excuse, particularly in these straitened times, for publicly funded research to be buried. What has shocked me though – I guess I must be naive, it hadn’t actually occurred to me before that this could happen – was the issue of publishers declining null/negative research results. That is a really worrying development that must be resisted.

[Edited to add]

One thing which saddened me as a practitioner was that, in discussions with my health visiting colleagues, it emerged that there was another parenting programme (whose name escapes me at the moment) that they had been trained in and were using before the drive to use Triple P. All of them were unanimous about how much they liked it and how much difference they thought it was making, but when the decision was made to plump for a programme which would be rolled out across the city, it lost out because it did not have the requisite “evidence base”. I just think that is such a lost opportunity – I appreciate that in a city where parenting and anti-social behaviour is often an issue on a large scale a community approach needs to be taken, but why could they not have done some good-quality research, perhaps even comparing this programme with Triple P, to start to build an evidence base? I wish they had been brave enough to do that – they might well have kept the goodwill of their staff (which was sorely lacking with the way Triple P was introduced) and seen some great results. I can think of one researcher (not a million miles away) who would have loved to have been involved with the qualitative aspect of that sort of study.

20 responses to “Some thoughts on Triple P and evidence based practice”

This is a wonderful, heartfelt evaluation of triple P and the absurdity of rigidly applying it to low income families. The image of having to warn low income minority persons to keep their children away from the unattended family swimming pool will stick with me. Thanks!

I’ve taken the liberty of posting this on Facebook and tweeting about it. Follow me on Twitter: @CoyneoftheRealm

Triple P is being funded with public dollars in California as an evidence-based program that either “prevents mental illness from becoming severe and disabling” or “reduces the duration of untreated mental illness”. But does it do anything? The reason I ask is that the state auditor is auditing public mental health spending (largely due to our efforts). I would like to point out Triple P as non-evidence-based and/or ineffective. If anyone has a one-pager that puts together the information showing the program is of questionable utility, it could help us get CA dollars flowing to the most seriously ill. tx

Thank you both very much for your responses, and for your original work. I am glad that (belatedly) the Triple P evidence base seems to be catching up with the Triple P publicity. I don’t want to throw the baby out with the bathwater – as I said, there were some families who did find the approach very helpful, and I would have liked to have had it as *part* of my armoury as a practitioner. Used appropriately, it certainly had its place. But having to ditch everything else to use it really grated; on the one hand we were being told that of course we were autonomous practitioners, and on the other hand we were being hauled before our managers for not meeting our Triple P targets.

Thanks Jackie! It’s nice to have some rigour around. The Wilson et al paper was almost scary in parts:

“No studies were registered with national or international trials registries.”

“No papers reported a pre-specified principal outcome measure”

“All eligible papers appeared to be co-authored by a Triple-P affiliated author, apart from one”

And Sanders’ reply to them seems to have a lot of handwaving jargon in it. Which, taken together with the insistence on not deviating at all from the official materials, not even to interpret them for non-English speakers, and the sickly-sugary-sweet pictures on the website, sort of screams “cult” or “scam” at me. Maybe I’m just a natural-born conspiracy theorist.

It all sounds depressingly reminiscent of the situation in parts of my own field (education), where the difference between advocacy and evaluation has been so blurred as to have practically disappeared. The rule of thumb for educational interventions and Big New Ideas seems to be that they all start out by looking as if they’re working, whether because extra money and enthusiasm are being thrown at them or because they’re targeting situations that would regress to the mean in any case. It’s only over years of practice in sub-optimal conditions that one can identify the handful of innovations that stay the course.

One of the common features of most educational charlatanry is that when the Big New Idea doesn’t work in practice, the response is always that teachers are implementing it badly or half-heartedly, so we must take away more autonomy from teachers and prescribe what they do ever more tightly. I don’t know whether this applies to Triple-P, but shall we say that the moment I see a requirement not to deviate from the approved script in any way, I get suspicious…

You’ve realised who I am? That’s more than I realise some mornings… I suspect that posting comments under my nom de guerre is bad etiquette, but I guess you can work out why I prefer craven anonymity when I want to sound off about education — I wouldn’t want my august and munificent employers to take affront and invoke the Generalised Naughtiness clause in my contract…

Hi Jackie – Thanks for blogging with care about this – I particularly appreciated the research links and will look through them more for some of my teaching on work with families. The Birmingham article is of particular interest to me as I was part of the big wave of Triple P training maybe 8 years ago.

I was trained in Level 4 Group Teen Triple P and found it useful up to a point, but I think I adapted it far more than the rules said I should have done. I did have some discussion with the trainer about ways it could be made more accessible for our inner-city families, while keeping close enough to the core points which were helpful for some parents. But I did have to explain how it was all rather Australian – I think it connected more to a soap opera than to real life, swimming pools and all. Also, being a creative sort, I did add some role play as a way of trying out some of the ‘calm down, I can see you’re upset, let’s talk about it’ techniques. This tended to involve me flapping about like a stressed teenager and the parents trying to deal with me. This was well received, and on balance, I think some of the parents really connected with the material and found it helpful. But some of them couldn’t handle such a structured programme and just didn’t turn up.

I am sceptical about trying to medicalise a parenting programme – the very different personalities of trainers and participants mean that delivering identical ‘doses’ of intervention with supposedly measurable ‘outcomes’ seems much more geared towards those making money out of expensive training and materials than towards offering the best service to parents. The idea that anyone can be trained to deliver a packaged programme, rather than relying on properly qualified and experienced helping professionals, seems horribly representative of the managerialist privatising of services, where frontline staff are paid less and deskilled and external companies take the profits, rather than maintaining a network of skilled staff who can adapt to any challenge.

I share the report’s concerns about the data collection (and must hold up my hand – maybe I contributed to the variable quality of the data; I certainly filled in forms which will have been included in this report). I really hope those issues can be ironed out, because I think it’s really important that there is a robust, large-scale body of data, as that is what will contribute to the ongoing debate around the efficacy of Triple P. If this issue is not sorted out but the end results seem to suggest that Triple P isn’t especially effective, the Triple P industry can point to the quality of the data to dismiss the results/concerns. The same of course applies if it appears to be a fantastic success – that finding could also be disputed if the data collection problems aren’t sorted out.

Apart from the bit about this blog post being a response to the 1st year evaluation in Glasgow (it wasn’t, it was a response to the research articles linked to in the blog post; the evaluation came out about the same time as this blog but that was coincidental) I’d say this is a pretty fair and balanced piece.

Given the emails I have received, I think there are other negative trials out there that either the investigators have been reluctant to submit for publication or they have submitted their papers only to have them rejected as negative trials. Let’s get this data out of the file drawers so policymakers can see it.

Agreed, negative findings are just as reportable (and arguably more important) and should be in the public domain. And in my attempt to keep this post as a repository of links to articles on Triple P in the same place, here is the Kirby & Sanders (2013) paper you just linked to on Twitter: http://www.sciencedirect.com/science/article/pii/S0005796713001873 This one shows a positive finding (unsurprising, given the second author), but yet again with a relatively small sample size. *sigh*

I facilitate level 4 (group, both Indigenous and non-Indigenous) and level 5 (enhanced and pathways) in a community in Western Sydney, Australia. I was trained in 2009 and offer the Level 4 groups up to 8 times per year.

I have a mix of middle-class, disadvantaged, mandated, Indigenous, non-Indigenous, and CALD parents, coming to the groups without a problem. Some of the parents do not have a functional level of literacy. All of the participants talk about the small and big ways Triple P has helped their families. I am still in contact with some of the parents who attended my groups up to five years ago, and the changes are sustainable.

I think that Triple P works so well for me and the community I work in because I work for a community development organisation. We foster belonging, connection, and community. People are welcome to come back to me when they are struggling, and I will do a 1:1 session with them to refresh their Triple P skills. We walk alongside our community members, regardless of how fast or slow they are going. I have had the privilege of working with parents who have lost the care of their children due to child protection issues, and through doing Triple P and doing the hard work on themselves and their parenting skills they have been granted custody of their children.

I also tailor Triple P to the community I work in. I haven’t changed the content, but I have changed the language and I emphasise certain aspects of the programme based on how parenting is perceived and practised in our community.

You can dispute the legitimacy of the research that makes Triple P an evidence-based programme until the cows come home. However, I have seen first-hand the changes brought about by Triple P. I have never seen or heard of any parent going backwards in their parenting as a result of attending Triple P. I am working my darnedest to reach the 20% tipping point that Dr Matt Sanders talks about to effect real change in parenting in the community I work in.

I have been trained in other parenting groups, and do offer those to the community consistently as well. I am immensely grateful for Triple P and applaud our governments’ efforts to make Triple P the parenting standard.