Abstract

We evaluate effects of a performance contract (PC) implemented in Delaware in 2001 and participation in quality improvement (QI) programs on waiting time for treatment and length of stay (LOS) using client treatment episode level data from Delaware (n = 12,368) and Maryland (n = 147,151) for 1998 – 2006. Results of difference-in-difference analyses indicate waiting time declined 13 days following the PC, after controlling for client characteristics and historical trends. Participation in the PC and a formal QI program was associated with a decrease of 20 days. LOS increased 22 days under the PC and 24 days under the PC and QI programs, after controlling for client characteristics. The PC and QI program were associated with improvements in LOS and waiting time, although we cannot determine which aspects of the programs (incentives, training, monitoring) resulted in these changes.

Evidence of quality improvement from PC in behavioral health is mixed. Analyses of the Maine PC identified improved performance on some measures for programs with higher than average revenue from the PC (Commons, McGuire, & Riordan, 1997). However, a control group was not included and later analyses incorporating a control group found changes may have been due to misreporting of client severity (Lu & Ma, 2006). Shen identified a problem with cherry-picking of clients under the Maine PC (Shen, 2003), but in a more comprehensive examination of the Maine treatment system, Lu and Ma found much of the selection was explained by better matching of clients to treatment modalities (Lu, Ma, & Yuan, 2003). In an initial evaluation of the 2001 Delaware PC, McLellan and colleagues identified improvements in utilization rates and “systematic increases” in client participation in treatment following the PC (McLellan, Kemp, Brooks, & Carise, 2008).

Building on the initial program-level evaluation by McLellan and colleagues, we here examine the impact of the change in payment system from a global payment to a performance contract for Delaware outpatient substance abuse treatment programs using more detailed client level data. We examine the effect of participation in the PC and participation in parallel quality improvement (QI) initiatives, adding a comparison group of programs from another state not subject to the PC. We exploit the sequential implementation of the PC and QI initiatives to estimate effects of the PC only and the PC in conjunction with QI initiatives on waiting time for treatment (WT) and length of stay (LOS) while controlling for case mix and historical trends.

2.2.1 Intervention: The Delaware Performance Contract

Prior to implementing the PC, Delaware paid AOD programs using a global budget: a fixed monthly installment, one-twelfth of an annual amount determined prospectively through negotiations between the program and the state (Kemp, 2006). In 2000, DSAMH began to incorporate considerations of quality in purchasing. The move to the PC represented an attempt to incentivize provision of quality care without dictating specifically how care is provided.

Delaware’s PC has been described previously (McLellan, et al., 2008); we highlight key concepts. Under the PC, at least one evidence-based practice had to be used; beyond this, the state did not dictate how to provide services. State funding was increased 5% with the introduction of the PC. Organizations reported to the state by entering information into a spreadsheet and were paid monthly based on three performance measures: capacity utilization, client participation and treatment completion. Initially, programs were given a six-month hold-harmless period during which performance data were collected and submitted but programs’ payments were not affected. Beginning in January 2002, programs continued to be paid monthly, but each allotment was either incremented or reduced based upon their measured performance in the previous month. It is important to note that turnaround time for reimbursement was designed to be very fast in order to strengthen the incentive’s impact.

Previous analyses did not find any evidence that programs engaged in “cherry-picking” of clients more likely to meet the performance measures; rather severity of the population seemed to increase over time following the PC (McLellan, et al., 2008). We examined our own data for changes in client severity over time and came to a similar conclusion that the population did appear to become more severe over time (analyses not shown).

2.2.2 Performance Measures

Performance measures, established in the contract between the state and the AOD treatment organizations, are the foundation of the PC. The treatment programs reported monthly to the state by entering information into a spreadsheet designed for the PC. The three performance measures are defined as follows:

Capacity Utilization: a program level measure of the proportion of the program’s capacity used in a month. The numerator is the number of clients enrolled in the program during the month. The denominator is negotiated between the program and the state and is based on the program’s size, geographic location and costs (McLellan, et al., 2008). It is important to note that because the denominator was negotiated it is not an entirely objective measure. It is possible that a savvy program director might negotiate a smaller denominator in order to make it easier for their organization to achieve the rate. It was not possible to examine whether this occurred with the available data.

Active Participation: a client level measure of attendance at treatment. Clients met this measure by attending a specified minimum number of treatment sessions each month. The minimum number of sessions required for a client to meet this criterion declined over time in a clinically sensible manner and is shown in Table 1, Treatment Participation Requirements.

Program Completion: a measure of retention in treatment to completion. It was defined as active participation in treatment for a minimum of 60 days, completion of the major goals of the treatment plan and submission of four consecutive weekly urine samples free of alcohol and illegal drugs. Programs were eligible for $100 bonus payments for each client completing the program. However, this measure was subject to budget constraints so once a program reached its annual budgetary maximum (between $3,000 – $9,000) the program did not receive program completion payments for future clients. This measure is not analyzed because the bonus was not available for every client due to budgetary constraints. Complete data on this measure were not available from the state because programs stopped tracking this information once they reached the annual maximum.

2.2.3 Payments

Delaware’s PC required programs to maintain a 90% capacity utilization rate to receive the full monthly payment. Both penalties and rewards were included. If an organization did not meet the utilization target, its payment was reduced accordingly, by up to 50%. If an organization met the utilization target and its clients met its active participation targets, the organization was eligible for bonus payments up to 5% beyond the contract amount. Table 1 shows target rates and payments for the utilization and active participation measures.
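The payment rule can be sketched as a simple function. The actual tier schedule is in Table 1; the linear scaling between the 50% floor and the full payment, and the flat 5% bonus, are illustrative assumptions based only on the bounds described above.

```python
def monthly_payment(base, utilization, participation_met, target=0.90):
    """Illustrative sketch of the PC payment rule. The real tier
    schedule is tabulated in Table 1; here payment simply scales down
    linearly below the 90% utilization target, floored at 50% of the
    base amount, and earns a 5% bonus when both the utilization and
    active participation targets are met."""
    if utilization >= target:
        # Full payment; bonus of up to 5% if clients met participation targets.
        payment = base * (1.05 if participation_met else 1.0)
    else:
        # Penalty region: payment reduced "accordingly, by up to 50%".
        payment = base * max(0.5, utilization / target)
    return payment
```

Under this sketch, a program at 95% utilization whose clients also meet the participation targets would receive 105% of its base allotment, while a program far below target would receive half.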

2.2.4 Quality Improvement Initiatives

Following PC implementation, Delaware outpatient treatment programs began participating in two QI programs: The Network for the Improvement of Addiction Treatment (NIATx) in 2004 and Advancing Recovery in 2006. NIATx is based on the rapid-cycle change concept and offers four goals: reduce waiting time, reduce no-shows, increase admissions and increase continuation of care (McCarty, Gustafson, Capoccia, & Cotter, 2009; McCarty et al., 2007). Advancing Recovery is designed to help programs work with state agencies to change administrative and clinical systems and implement evidence-based practices. In our analyses it is important to account for participation in QI programs to distinguish and estimate their effects separately from the PC.

2.3 Data

Analyses are based on matched treatment and comparison groups and rely on administrative data from Delaware and Maryland as well as personal interviews with program CEOs. State agencies provided admission level data including demographic information, primary drug used and frequency of use for all adult clients treated in publicly funded outpatient AOD treatment programs between 1998 and 2006 in Delaware (n=12,368) and Maryland (n=147,151). The Delaware data used in this analysis were not part of the performance contract data, and therefore we do not expect programs had an incentive to misreport data. Delaware did oversee programs’ data collection efforts under the PC, so we expect the data collected after the PC was implemented to be more accurate. All data were collected by the states through standard reporting systems. Data collection and coding procedures in both states were examined, and discharge dates were calculated using standard methods. Interviews, approximately one hour in length and focusing on how programs responded to the PC, were conducted by the first author with CEOs of the four Delaware organizations providing outpatient AOD services. Findings from these interviews informed interpretation of the quantitative analyses. The study was approved by Brandeis University’s Institutional Review Board.

2.4 Creation of the matched sample

Sample construction was based on one-to-one matching of 12,368 Delaware clients with comparison Maryland subjects. We stratified the Maryland and Delaware databases into cells based on eight factors – admission year, gender, race, marital status, employment, type and frequency of primary drug, and previous treatment for alcohol or drugs – and randomly selected Maryland clients from each cell to match the number of Delaware clients in the cell. The substance use variables were included to eliminate any differences in measures of severity or type of use between the states. Previous treatment was included because it has been shown to be an important predictor of treatment outcomes.

For matching we employed the SAS procedure SurveySelect (SAS Institute Inc, 2009). We excluded 691 Maryland admissions due to negative or otherwise unreasonable LOS values. This left 146,460 admissions for selecting the matched sample. We excluded 926 Delaware admissions (7.5%) because of missing data on one or more of the stratifying variables. The remaining Delaware sample consisted of 11,442 admissions. Table 2 provides a description of the matched sample. As expected, the samples are almost identical on all characteristics, with the exception of the proportions of clients reporting no substance use within 30 days before admission, 41% in Delaware versus 44% in Maryland (p<.001). Although this is not a small difference, the sample is otherwise well-matched and this is not likely to drive results. In addition this variable is included in the regression models to control for the difference.
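The cell-based matching described above can be sketched in code. The actual matching used SAS PROC SURVEYSELECT; the Python sketch below is illustrative, and the column names are hypothetical stand-ins for the states' actual field names.

```python
import numpy as np
import pandas as pd

# Hypothetical column names for the eight stratifying factors.
STRATA = ["admit_year", "gender", "race", "marital_status", "employment",
          "primary_drug", "use_frequency", "prior_treatment"]

def match_by_cell(de, md, strata=STRATA, seed=0):
    """One-to-one matching: within each stratification cell, randomly
    draw as many comparison (Maryland) admissions as there are
    treatment (Delaware) admissions in that cell."""
    rng = np.random.default_rng(seed)
    md_groups = md.groupby(strata)
    drawn = []
    for key, de_cell in de.groupby(strata):
        try:
            md_cell = md_groups.get_group(key)
        except KeyError:
            continue  # no comparison admissions available in this cell
        n = min(len(de_cell), len(md_cell))
        drawn.append(md_cell.sample(n=n, random_state=rng))
    return pd.concat(drawn, ignore_index=True)
```

Because sampling is done cell by cell, the matched comparison sample reproduces the treatment sample's joint distribution over the stratifying factors by construction.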

2.5 Dependent Variables

Client waiting time (WT) is defined as the number of days between the day the individual first contacted the program and the admission day. This variable was included in the Maryland data and was calculated in the Delaware data. Just over 1.5% of admissions in Delaware were missing one or more of the data elements required to calculate WT and were excluded from the analysis. Twenty-six admissions in Delaware had admission and discharge on the same date and were also excluded from the analysis. None of the Maryland admissions had missing WT.

Client length of stay (LOS) was calculated by subtracting admission date from discharge date and adding one day so that clients admitted and discharged on the same day have a LOS of one day. Records missing LOS were excluded from the analysis, resulting in the exclusion of 304 records, 2.8% of the Delaware admissions. Records with LOS greater than or equal to 338 days, the 95th percentile of the Delaware data, were also excluded (n=1135).
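The LOS construction and exclusions can be sketched as follows; the `admit_date` and `discharge_date` column names are hypothetical, and the 95th-percentile cutoff is computed from the data rather than hard-coded at 338 days.

```python
import pandas as pd

def compute_los(df, admit="admit_date", discharge="discharge_date"):
    """LOS = discharge date - admission date + 1 day, so clients admitted
    and discharged on the same day have LOS of 1. Drops records with
    missing LOS and records at or above the 95th percentile of LOS."""
    df = df.copy()
    df["los"] = (df[discharge] - df[admit]).dt.days + 1
    df = df.dropna(subset=["los"])          # missing discharge date
    cutoff = df["los"].quantile(0.95)       # 338 days in the Delaware data
    return df[df["los"] < cutoff]
```

Trimming at the 95th percentile limits the influence of implausibly long stays, which in administrative data often reflect clients never formally discharged.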

Often there are concerns regarding reporting of discharge dates. It is possible a client might show multiple admissions and discharges within a short period of time, which would appropriately be grouped as a single episode of care for analysis. The Delaware data indicated that readmissions within 30 days of discharge occurred 220 times, representing fewer than 2% of the Delaware admissions. Identifying readmissions in the Maryland data was not possible without a unique client identifier, but since Delaware records were not combined it would have been inappropriate to combine them in Maryland.

2.6 Independent variables

Independent variables include client demographics, employment status, criminal justice involvement, and type and severity of drug use. These variables were chosen because they represent factors likely to influence treatment access and LOS. In addition, indicator variables for state and time period were included in the model. The post period is broken into two sections: the PC only time period (2002 – 2003) and the PC and QI period (2004 – 2006). Thus, within Delaware, the indicator for the earlier time period (2002–2003) is a marker for treatment in a PC only program, and the indicator for the later time period (2004–2006) is a marker for treatment in a program under both PC and QI initiatives.
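The state, period, and interaction indicators described above can be constructed as below; `state` and `admit_year` are hypothetical column names standing in for the administrative fields.

```python
import pandas as pd

def add_did_indicators(df):
    """Difference-in-difference design variables: a Delaware indicator,
    two post-period indicators, and their interactions. Within Delaware,
    the interactions mark the PC-only and PC-plus-QI periods."""
    df = df.copy()
    df["delaware"] = (df["state"] == "DE").astype(int)
    df["pc_only"] = df["admit_year"].between(2002, 2003).astype(int)
    df["pc_qi"] = df["admit_year"].between(2004, 2006).astype(int)
    df["de_x_pc_only"] = df["delaware"] * df["pc_only"]
    df["de_x_pc_qi"] = df["delaware"] * df["pc_qi"]
    return df
```

The two interaction columns are the variables of interest in the regressions: they are nonzero only for Delaware admissions in the respective post periods.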

2.7 Statistical Models

Analyses employed multilevel linear regressions, modeling WT and LOS as linear functions of state, time, interactions between state and time, client demographics, employment, educational status, previous treatment, criminal justice involvement, and type and frequency of substance used and were completed using SAS version 9.2 (SAS Institute Inc, 2009). Interactions between the indicator for Delaware admissions and the two time period indicators are the main variables of interest, as they represent for Delaware clients the effects of PC only and PC in conjunction with QI initiatives. The linear regression models were solved using generalized estimating equations, which account for clustering of clients within programs.
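The model just described corresponds to a standard difference-in-difference specification, which can be written as:

\[
Y_{ij} = \beta_0 + \beta_1 \mathrm{DE}_j + \beta_2 \mathrm{Post}^{02\text{--}03}_i + \beta_3 \mathrm{Post}^{04\text{--}06}_i + \beta_4 \left(\mathrm{DE}_j \times \mathrm{Post}^{02\text{--}03}_i\right) + \beta_5 \left(\mathrm{DE}_j \times \mathrm{Post}^{04\text{--}06}_i\right) + \gamma' X_{ij} + \varepsilon_{ij}
\]

where \(Y_{ij}\) is WT or LOS for client \(i\) in program \(j\), \(X_{ij}\) is the vector of client characteristics, and \(\beta_4\) and \(\beta_5\) are the coefficients of interest, capturing the effects of the PC alone and of the PC in conjunction with QI initiatives; the GEE working correlation structure accounts for the clustering of clients within programs.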

2.8 Sensitivity Tests

The comparison group was included to try to control for temporal trends not due to the payment system change; however, it is important to keep in mind that SA treatment programs in the two states may be very different. In addition, with a relatively short baseline WT, Maryland had little room to change in response to temporal trends. Therefore, despite client matching, because of differences in baseline WT between Delaware and Maryland (Table 3) and variation across states in SA treatment, the comparison group may not serve as intended. We therefore conducted a sensitivity analysis of WT and LOS based solely on Delaware’s admissions.

Mean waiting time and Length of Stay (LOS) in days by admission year for adult clients in Delaware and Maryland outpatient AOD treatment, 1998 – 2006

The Delaware PC implemented the active participation criteria to incentivize programs to reach out to poorly attending clients and re-engage them in treatment. Under the prior global budget payment system it was not unusual for outpatient programs to keep clients officially on the rolls even though they had stopped attending treatment. Outpatient programs may choose not to discharge a client because they know a readmission is likely and that a discharge means redoing paperwork. In addition, counselors may hide behind “full” caseloads, avoiding new patients even though visit frequency is very low. Thus the shift to the PC represents a change in incentives for reporting discharge dates that cannot be teased apart from the real change in LOS.

In the LOS analysis the pre-period is limited to 2001 – after the change in reporting incentives went into effect. By limiting the pre-period we identify change in LOS due to changes in treatment patterns without confounding from changes in reporting incentives. Although Maryland programs have a similar incentive not to discharge clients, there was no change in this incentive in Maryland over the study period. Therefore Maryland programs likely report longer than actual LOS, which would bias our estimates toward the null and may make it more difficult to identify improved LOS in Delaware.

3. RESULTS

3.1 Waiting time

Table 3 shows WT for outpatient treatment in Delaware and Maryland by admission year. In 2001 in Delaware mean WT was 22 days; down from 34 days in 1998. In 2001 in Maryland average WT was 7 days and had been increasing since 1998 when WT averaged 2.9 days.

Results of multivariate regression analyses are displayed in Table 4. Model 1 examines the change in WT based on the entire sample, including both Delaware and Maryland clients. The interaction coefficient for Delaware admissions in 2002 – 2003 is −13.3 (p < .0001), indicating the PC reduced WT by 13 days. The interaction coefficient for Delaware admissions in 2004 – 2006 is −20.0 (p < .0001), indicating that the PC in conjunction with QI initiatives reduced WT 20 days. Although there are likely many differences between Delaware and Maryland SA treatment programs, sensitivity analyses excluding Maryland data show similar results.

Estimates of the Effect of the Delaware PBC on Waiting Time in Outpatient Alcohol and Other Drug Treatment

Model 2, providing results of our sensitivity analysis which does not include Maryland data, gives −9.09 as the parameter estimate for 2002–2003, indicating that clients admitted to AOD treatment in Delaware in 2002–2003 waited nine fewer days for treatment than Delaware clients admitted between 1998 and 2001. The equivalent parameter estimate for 2004 – 2006 is −16.71, indicating almost a 17 day improvement in this later time period. These estimates do not control for temporal trends.

Findings from personal interviews with CEOs of Delaware AOD treatment programs indicate that the financial incentives in the PC played a significant role in clinicians’ and managers’ day-to-day activities. For example clinicians began to actively remind clients to attend treatment sessions and administrators tried to oversubscribe clients to group treatment sessions in order to meet the active participation criteria. In addition, the interviews indicated that financial pressure from the PC incentivized CEOs to attend carefully to the QI programs which they recognized could help the organizations improve WT and, therefore, improve the program’s performance on the PC.

3.2 Change over time in LOS

As shown in Table 3, average LOS in Delaware outpatient AOD treatment in 2001 was 102 days. This increased by 13% to almost 116 days by 2006. This increase contrasts sharply with the slight fluctuation in LOS for clients in Maryland programs which equaled 116 days for clients admitted in 2001, declined to 106 days for clients admitted in 2003 and increased to almost 116 days for clients admitted in 2006.

In multivariate analyses we examined change in LOS in Delaware from 1998 – 2006 using 1998 – 2001 as the pre-period and did not identify state-dependent changes in LOS over time (results not shown). However, the data do not allow us to tease apart changes in reported LOS brought about by the PC’s change in incentives for reporting discharges from changes in actual LOS brought about by the PC. Model 1, shown in Table 5, limits the pre-period to 2001 only, after the change to active auditing of attendance for clients in treatment, and identifies an increase in LOS of 24 days in 2002–2003 (p = .02) and 22 days in 2004–2006 (p = .01). Because of the many potential and unmeasured differences between Delaware and Maryland, we also examined change in LOS in Delaware over time without the Maryland comparison group [Model 2]. This model shows LOS increased by 16 days in 2002 – 2003 (p = .11) and by 20 days in 2004 – 2006 (p = .01).

Estimates of the Effect of the Delaware PBC on Length of Stay in Outpatient Alcohol and Other Drug Treatment

4. DISCUSSION

Seven of eight programs serving public clients in Delaware began participating in the QI programs in 2004. Two of the main foci of the QI programs are to decrease WT and improve treatment continuation (McCarty, et al., 2007). Interestingly, and likely due to the effects of the PC incentives which had been in place since 2001, WT was already declining and LOS was increasing before the introduction of the QI programs.

The continued decrease in WT in Delaware and increase in LOS from 2004 through 2006 is most likely attributable to a combination of the PC and QI, or may be the result of additional experience with the PC. In interviews conducted with CEOs of AOD treatment organizations in Delaware, several stated that the PC provided the financial motivation to carefully attend to the QI programs. These findings suggest that the PC resulted in shorter WTs for treatment and longer LOS, and that the combination of the PC and QI had an even more significant effect on both measures. Both the interviews and the data support the idea that the willingness to adopt some of the QI procedures was likely due in part to the potential for financial incentives created by the PC. In other words, the PC may have set the occasion for program willingness to engage in any new clinically sensible behavior that would help them meet their financial goals.

It is likely a number of factors influenced improvements in LOS and it was not possible to specify which of the many factors in operation were most responsible for observed changes. Perhaps clients who enter treatment when they are ready stay longer in treatment. In addition, because of the financial pressure of the PC, programs were actively working to improve LOS by reminding clients of appointments and incentivizing them to attend treatment.

Because Delaware discharge data are audited by the state and Maryland data are not, Maryland data are likely biased toward longer LOS, biasing our estimates toward the null. We can therefore be confident that LOS increased dramatically in Delaware, since we saw an increase relative to Maryland despite the differences in reporting incentives.

Regardless of the mechanism by which WT and LOS improved, the key is that PC provided the incentives and program accountability which set the stage for improvements. Ultimately clients benefit because longer LOS is associated with improved outcomes.

4.1 Study Limitations

This analysis is based on a PC in Delaware –one of the smallest states in the nation; PC may work differently in larger states. However, if, as in Delaware, the sponsor achieves provider buy-in, then success may be replicated in larger states or in counties. It is important to learn as much as possible about the PC implemented in Delaware as a number of other states are considering or beginning to implement similar incentive programs.

We examined AOD treatment episode data, which can be unreliable because clients frequently discontinue treatment without a formal discharge and therefore are not available to provide required information. Fortunately, we relied almost exclusively on data collected at admission, with discharge date the one exception. Discharge date may be of better quality in Delaware because, with the implementation of the PC, the state began formal auditing of patient records for active participation, as it was used in PC payments (Kemp, 2006). This incentive did not exist in Maryland during the comparison time period, and one would expect Maryland programs to be less conscientious in removing clients from their rolls. The effect of this discrepancy is to bias the between-state LOS comparison toward the null: LOS in Maryland may be recorded as longer than it really is, and our estimate of increased LOS in Delaware may be understated. Thus, though there were limitations in our ability to verify these key data, we are confident of our conclusions because any systematic but unmeasured effect would have produced a failure to see a difference. Put differently, given that there was almost certainly a bias toward reporting longer LOS in Maryland, the significantly longer LOS reported in Delaware under auditing conditions suggests that the PC had a robust effect.

There are likely many differences between Delaware and Maryland SA treatment programs and although we matched clients based on their demographic characteristics, potential differences remain unmeasured. Although using a single state as a comparison is not an ideal control group, the magnitude of the effect combined with interview findings suggests the PC did influence WT and LOS. Sensitivity analyses also suggest an association between the PC and shorter WT and longer LOS.

4.2 Conclusion

Although financial incentives have been suggested as a way to improve quality of medical care (Institute of Medicine, 2001, 2006), few studies have identified improvements associated with financial incentives (Christianson, et al., 2008; Mehrotra, et al., 2009; Petersen, et al., 2006; Rosenthal & Frank, 2006). Building on a previous evaluation of the Delaware PC (McLellan, et al., 2008), this study identified improvements to treatment: shorter waiting time and increased LOS following implementation of the PC. The focus of AOD treatment programs may explain some of the success of the Delaware PC in the absence of similar findings for medical or behavioral health. Financial incentives for hospitals and medical groups often include a number of conditions, while outpatient AOD treatment services are more focused, dealing primarily with addiction to alcohol and drugs. In addition, the design of the Delaware PC – with relatively immediate (monthly) and financially significant consequences to the organization for clinically relevant changes in patient behavior – is likely important in the success of this PC. Future research should examine more closely the design of successful PC systems.

This study suggests there may be a synergistic effect between performance contracting and quality improvement programs and the implementation of both types of programs in concert should be considered. It will be important to control for or estimate the effect of other QI programs in place at the same time as PC. In addition, future PC programs should be designed with a control group in mind from the beginning to more accurately estimate the effect of the PC.

The move to PC was positive for the purchaser – more individuals received services for a 5% increase in expenditures. Specifically, there were more than 650 additional admissions in 2006 as compared to 2001. Although programs could endeavor to improve utilization, reduce WT and increase LOS without a PC, this had not occurred in Delaware prior to the PC. The Delaware PC set new expectations for programs, held programs accountable and provided leeway in achieving expectations. Programs responded to the PC in different ways – by participating in QI, by incentivizing clients to attend treatment, and by incentivizing clinicians. By creating accountability and a market for innovation where previously there were no explicit financial incentives, the PC and QI programs are associated with longer length of stay and shorter waiting time for treatment. We cannot determine which aspects of the programs, for example, the financial incentives, additional attention and monitoring, or training in quality improvement, resulted in these improvements, and future research should examine these aspects of the program in more detail.

Acknowledgments

This research was supported by NIDA Grant 5F31DA22822-2 and NIDA Grant P50 DA010233. Preliminary findings were presented at the Addiction Health Services Research Meetings October 28 – 30 2009, San Francisco, CA and October 25 – 27 2010 in Lexington, KY and at the Academy Health Research Meeting, June 28 – 30 2009 in Chicago, IL. We thank Meredith Rosenthal for helpful comments on this paper. Our thanks are extended to Jack Kemp, former Director of the Division of Substance Abuse and Mental Health in Delaware, the Delaware Division of Substance Abuse and Mental Health and the Maryland Alcohol and Drug Abuse Administration for providing the data.
