Background. The Consolidated Standards of Reporting Trials (CONSORT) statement was developed to improve the quality of reporting of clinical trials, yet adherence to it in published trials may be suboptimal. We evaluated the compliance of randomized controlled trials (RCTs) published in the Journal of the American Medical Association (JAMA) and the British Medical Journal (BMJ) in 2013 with the CONSORT statement 2010.

Methods. A PubMed search for RCTs published in JAMA and the BMJ in 2013 was done. Scores were assigned to each sub-item of CONSORT by one of four authors, and disputes were resolved by mutual consensus. The total score for each RCT was calculated and converted to a percentage total score (PTS). Scores were expressed as median (range). The median scores between journals and between types of RCTs were compared using the Mann–Whitney U test.

Results. There were 97 RCTs (69 in JAMA and 28 in the BMJ) comprising parallel (75), cluster (14) and non-inferiority (8) design studies. The overall median (range) PTS of all RCTs was 82% (59.4%–97.1%). JAMA had an overall median (range) PTS of 81.6% (59.4%–97.1%) and the BMJ 84% (65.2%–92.2%); the difference was not statistically significant (p=0.25). Between trial designs, the highest PTS was seen with parallel designs (which included parallel, crossover and factorial designs), with a median (range) of 85.1% (68.4%–90.2%), followed by cluster randomized trials at 82.8% (65.2%–92.2%) and non-inferiority trials at 78.6% (72.7%–85.7%). There was no significant difference between the three trial designs (p=0.48).

Conclusion. A wide range in PTS (59.4%–97.1%), even in high impact journals, indicates poor compliance of reported trials with CONSORT.

How to cite this article:Susvirkar A, Gada P, Figer B, Thaker S, Thatte UM, Gogtay NJ. An assessment of the compliance of randomized controlled trials published in two high impact journals with the CONSORT statement. Natl Med J India 2018;31:79-82
Introduction

Randomized controlled trials (RCTs) are considered the gold standard in evidence-based medical practice. Clear, lucid and complete reporting of these studies is as important as their conduct. The Consolidated Standards of Reporting Trials (CONSORT) statement, first published in 1996,[1] provides a checklist of essential items that authors should use while reporting their studies. It was revised in 2010[2] and extensions of the statement to non-inferiority/equivalence[3] and cluster randomized trials[4] were published in 2012. Though CONSORT is endorsed by almost 600 journals,[5] the World Association of Medical Editors (WAME)[6] and the International Committee of Medical Journal Editors (ICMJE),[7] several studies have reported lack of adherence of published papers to CONSORT. These studies are largely in ‘specialty areas’ such as oncology,[8] plastic surgery,[9] otorhinolaryngology[10] or address only a specific study design such as non-inferiority.[11] Since journals that are multispecialty and with a wide readership base are likely to impact the practice of evidence-based medicine and health policy to a larger extent, we assessed adherence to CONSORT and its extensions among RCTs published in two high impact multidisciplinary journals and also compared compliance between the journals.

Methods

The Institutional Review Board deemed the protocol exempt from review (EC/OA-54/2015). The Journal of the American Medical Association (JAMA, impact factor for 2013: 30.387),[12] published from the USA, and the British Medical Journal (BMJ, impact factor for 2013: 16.378),[13] published from the UK, both endorse CONSORT. We chose these two journals for study as they are multidisciplinary, enjoy a high weekly readership of more than 0.1 million each[14],[15] and have high impact factors.

RCTs were categorized as cluster, non-inferiority/equivalence and parallel group studies. For the purpose of this study, crossover and factorial study designs were classified under ‘parallel’. Each RCT was assessed for compliance with all 37 sub-items of the CONSORT checklist. RCTs with cluster and non-inferiority designs were additionally assessed for compliance with the corresponding extensions to the CONSORT checklist.[3],[4] A completely reported sub-item/extension was given a score of 1; a sub-item that was not reported was scored zero. If a sub-item/extension was partially reported, it was given a score of 0.5. Exceptions to this rule were sub-item 1b and its extensions, each of which was divided into four subdivisions, with reporting of each subdivision earning a score of 0.25. A total score was calculated for each RCT. Since the applicable sub-items varied for each RCT, total scores were converted into percentage total scores (PTS). Four investigators independently scored all the RCTs. Discrepancies between the scores assigned were resolved by discussion until a consensus was reached.
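The scoring scheme above can be expressed as a short sketch. This is illustrative only — the authors scored the RCTs manually, not with software — and the sub-item labels and scores in the example are hypothetical, not study data.

```python
def percentage_total_score(item_scores):
    """Convert per-sub-item CONSORT scores into a percentage total score (PTS).

    item_scores maps each *applicable* sub-item/extension to its score:
    1 for completely reported, 0.5 for partially reported, 0 for not
    reported. Sub-item 1b's four subdivisions contribute 0.25 each, so
    every entry still has a maximum score of 1.
    """
    if not item_scores:
        raise ValueError("at least one applicable sub-item is required")
    # Each applicable sub-item contributes at most 1, so the maximum
    # possible total equals the number of applicable sub-items.
    return 100.0 * sum(item_scores.values()) / len(item_scores)


# Hypothetical example: five applicable sub-items for one RCT
scores = {"1a": 1, "1b": 0.75, "2a": 1, "3a": 0.5, "9": 0}
print(percentage_total_score(scores))  # 65.0
```

Because the denominator is the number of applicable sub-items rather than a fixed 37, PTS values remain comparable across trial designs with different applicable checklists.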

Scores for each journal were expressed as median (range), and normality was tested with the Kolmogorov–Smirnov test. The total PTS between the journals, and each study design between journals, was compared using the Mann–Whitney U test. The PTS between parallel, cluster and non-inferiority trials was compared using the Kruskal–Wallis test followed by Dunn’s post hoc test. For each sub-item, the proportion of RCTs completely reporting it, out of the total RCTs for which it was applicable, was calculated and compared between the two journals using Fisher’s exact test. All analyses were done at the 5% significance level using two-sided tests on GraphPad InStat 3.0 (San Diego, USA).
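The analysis plan can be illustrated in code. SciPy is used here merely as a stand-in for GraphPad InStat, and every PTS value and count below is invented for illustration — none of it is study data.

```python
from scipy import stats

# Hypothetical PTS values for each journal (not study data)
jama_pts = [81.6, 59.4, 97.1, 84.0, 78.2, 88.5]
bmj_pts = [84.0, 65.2, 92.2, 80.1, 86.3]

# Between-journal comparison: two-sided Mann-Whitney U test
u_stat, p_journals = stats.mannwhitneyu(jama_pts, bmj_pts,
                                        alternative="two-sided")

# Between-design comparison: Kruskal-Wallis test
parallel = [85.1, 68.4, 90.2, 87.0]
cluster = [82.8, 65.2, 92.2]
noninferiority = [78.6, 72.7, 85.7]
h_stat, p_designs = stats.kruskal(parallel, cluster, noninferiority)

# Per-sub-item compliance: Fisher's exact test on a 2x2 table of
# (completely reported, not completely reported) counts per journal
odds_ratio, p_item = stats.fisher_exact([[40, 29], [20, 8]])

print(p_journals, p_designs, p_item)
```

Non-parametric tests are the natural choice here because PTS values are bounded percentages that need not be normally distributed, and the sample per group is small.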

Results

The search yielded 4666 articles, of which 1286 were published in JAMA and 3380 in the BMJ. A total of 113 RCTs were published of which 16 were excluded as these were post hoc analyses or continuation of previous studies. Thus, 97 studies were eventually analysed. The proportion of RCTs in JAMA was 69/1286 (5.36%) and in the BMJ, 28/3380 (0.82%). Of the 69 RCTs published in JAMA, 57 were parallel design, 4 cluster-randomized and 8 non-inferiority studies. Of the 28 RCTs in the BMJ, 18 were parallel and 10 cluster-randomized. The list of articles assessed is enclosed as a supplementary appendix (see www.nmji.in).

Analysis between journals

The overall median (range) PTS of all RCTs was 82% (59.4%–97.1%). JAMA had an overall median (range) PTS of 81.6% (59.4%–97.1%) and the BMJ 84% (65.2%–92.2%). This difference was not statistically significant (p=0.25). There was no difference between the journals in compliance with the main checklist (parallel, including crossover and factorial designs; p=0.23) or with the cluster randomized trial extension (p=0.7).

Analysis between trial designs

The highest PTS was seen with parallel (which included parallel, crossover and factorial designs) with a median (range) of 85.1% (68.4%–90.2%) followed by cluster randomized trials 82.8% (65.2%–92.2%) and non-inferiority trials 78.6% (72.7%–85.7%). There was no significant difference between the three trial designs (p=0.48). Compliance between the two extensions (cluster and non-inferiority) was not significantly different.

Compliance of sub-items

Eight sub-items, namely 1b, 3a, 3b, 6b, 9, 10, 17b and 24, showed poor overall compliance, being completely reported in less than 50% of RCTs. Nine sub-items, namely 1a, 2a, 2b, 6a, 12a, 15, 20, 22 and 25, showed good overall compliance, being completely reported in more than 95% of RCTs. Sub-items 19 and 23 were significantly better reported in JAMA, while sub-item 24 was significantly better reported in the BMJ [Table 1].

Discussion

We found that JAMA and the BMJ, two high impact factor journals, had an overall compliance score of about 80% for reporting RCTs. Although earlier studies have evaluated adherence to CONSORT, these focused on trials published in specialty journals or on particular types of RCTs,[8],[9],[10],[11],[16] whereas our study specifically looked at multispecialty journals and all types of RCTs. Although the proportion of RCTs published differed between JAMA (6%) and the BMJ (1%), there was no difference in the PTS between them. The two journals assessed for CONSORT compliance in this study are similar in that both are multispecialty journals, published by the American and British Medical Associations since 1883 and 1840, respectively,[14],[17] and both endorse the CONSORT statement.

One of the methods aimed at improving transparency and completeness in reporting RCTs has been endorsement of the CONSORT statement by editors worldwide.[18],[19] This has had an impact on reporting, as seen in a Cochrane systematic review[20] which showed that 25 of 27 CONSORT-related items were more completely reported in endorsing journals than in non-endorsing ones, with 5 of these sub-items being significantly different. Our audit found that the quality of reporting of RCTs varied across study designs, with poorer reporting of non-inferiority and cluster randomized controlled trials. Similar observations have been made previously.[21],[22] One possible reason could be that authors are either unaware of the extensions or simply do not refer to them.

Several sub-items showed poorer compliance relative to others. These included the description of trial design, important changes to methods after trial commencement, changes to trial outcomes after trial commencement, random allocation and allocation concealment, details about who generated the random allocation sequence, presentation of both absolute and relative effect sizes for binary outcomes, and details about where the full trial protocol can be accessed. While it is essential to improve compliance with all sub-items, the sub-item pertaining to ‘important changes to methods after trial commencement’ is particularly important. A study by Wandalkar et al.[23] found significant variation between protocols registered with a clinical trials registry and the final published manuscripts. Hence, an explicit statement that ‘no changes were made to the study protocol after trial commencement’ would be useful. In our study, only 2 of 97 (2.1%) articles actually had this statement.

Sub-items that were better reported included identification of the study as a randomized trial in the title, scientific background and rationale, objectives, inclusion criteria, interventions, outcome measures, results of main outcome measures, limitations, interpretation, trial registration and funding sources. The instructions to authors on the websites of both journals explicitly stress the reporting of certain items, and we postulate that this emphasis is the reason these items were better reported.[24],[25]

Considering that RCTs are the gold standard of evidence-based medicine, reporting that lacks methodological rigour can lead to biased conclusions. There is a need to educate authors about the various reporting guidelines and to develop ‘writing aid tools’.[26]

Our study is limited by the fact that we assessed articles from only two journals, and from a single year. The distribution of RCTs between the journals was not similar, with poorer representation of non-inferiority trials in the BMJ. Seven studies in our sample were continuations of older studies; although these were scored, we did not refer to their earlier publications, which could have led to some underestimation of their scores. In conclusion, the wide range in compliance (59.4%–97.1%) in two widely acclaimed journals indicates a need to improve compliance with the CONSORT guidelines among all stakeholders.