Leah Backhus, MD, MPH, FACS

Bio


Leah Backhus trained in general surgery at the University of Southern California and in cardiothoracic surgery at the University of California, Los Angeles. She practices at Stanford Hospital and is Chief of Thoracic Surgery at the VA Palo Alto. Her surgical practice consists of general thoracic surgery, with special emphasis on thoracic oncology and minimally invasive techniques. She is also involved in research with the Thoracic Surgical Health Services Research group and has grant funding through the Department of Veterans Affairs. Her current research interests are imaging surveillance following treatment for lung cancer and cancer survivorship. She is a member of the National Lung Cancer Roundtable of the American Cancer Society, serving as Chair of the Task Group on Lung Cancer in Women. She also serves as a professional member of the Patient-Centered Outcomes Research Institute (PCORI) Advisory Panel on Improving Healthcare Systems. As an educator, Dr. Backhus is the Associate Program Director for the Thoracic Track Residency and serves on the ACGME Residency Review Committee for Thoracic Surgery, the accrediting body for all cardiothoracic surgery training programs in the US.

Abstract

To determine whether surgeon selection of instrumentation and other supplies during video-assisted thoracoscopic lobectomy (VATSL) can safely reduce intraoperative costs. In this retrospective, cost-focused review of all video-assisted thoracoscopic surgery anatomic lung resections performed by 2 surgeons at a single institution between 2010 and 2014, we compared VATSL hospital costs and perioperative outcomes between the surgeons, as well as costs of VATSL compared with thoracotomy lobectomy (THORL). A total of 100 VATSLs were performed by surgeon A, and 70 were performed by surgeon B. The preoperative risk factors did not differ significantly between the 2 groups of surgeries. Mean VATSL total hospital costs per case were 24% greater for surgeon A compared with surgeon B (P = .0026). Intraoperative supply costs accounted for most of this cost difference and were 85% greater for surgeon A compared with surgeon B (P …)

Abstract

Lung resections carry a significant risk of complications, necessitating the characterization of perioperative risk factors. Unhealthy alcohol use represents one potentially modifiable factor. In this retrospective cohort study, the largest to date of lung resections in the Veterans Health Administration (VHA), we examined the association between unhealthy alcohol use and postoperative complications and mortality. Veterans Affairs Surgical Quality Improvement Program data recorded at 86 medical centers between 2007 and 2011 were used to identify 4,715 patients who underwent lung resection. Logistic regression models, adjusted for demographics and comorbidities, were fit to assess the association between unhealthy alcohol use (report of >2 drinks per day in the 2 weeks preceding surgery) and 30-day outcomes. Among the 4,715 patients who underwent pulmonary resection, 630 (13.4%) reported unhealthy alcohol use (>2 drinks/day). Overall, postoperative complications occurred in 896 (19.0%) patients, including pneumonia in 524 (11.1%). The mortality rate was 2.6%. In adjusted analyses, complications were significantly more common among patients with unhealthy alcohol use (odds ratio [OR], 1.42; 95% confidence interval [CI], 1.15-1.74), including, specifically, pneumonia (OR, 1.69; 95% CI, 1.32-2.15). No statistically significant association was identified between unhealthy alcohol use and mortality (OR, 1.27; 95% CI, 0.75-2.02).
In secondary analyses stratified by smoking status at the time of surgery, drinking more than 2 drinks per day was associated with postoperative complications in patients reporting current smoking (OR, 1.51; 95% CI, 1.18-1.91) but not in those reporting no current smoking at the time of surgery (OR, 1.23; 95% CI, 0.79-1.85). In this large VHA study, 13% of patients undergoing lung resection reported drinking more than 2 drinks per day in the preoperative period, which was associated with an increased risk of postoperative complications. Unhealthy alcohol use may be an important target for perioperative risk-mitigation interventions, particularly in patients who report current smoking.
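The adjusted odds ratios above come from multivariable logistic regression; the underlying odds-ratio arithmetic can be illustrated with a minimal sketch. The 2x2 counts below are hypothetical (chosen only to be consistent with the reported totals of 630 drinkers among 4,715 patients and 896 complications), and the log (Woolf) confidence-interval method shown is a standard textbook approach, not the study's adjusted model.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a 95% CI via the log (Woolf) method.
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical split: of 630 drinkers, say 160 had complications;
# of 4,085 non-drinkers, say 736 did (totals match the abstract).
or_, lo, hi = odds_ratio_ci(160, 470, 736, 3349)
```

An OR whose CI excludes 1 (as for complications and pneumonia above) is what the abstract reports as statistically significant.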

Abstract

The landscape of care for early-stage non-small cell lung cancer continues to evolve. While some of the developments do not seem as dramatic as what has occurred in advanced disease in recent years, there is continuous improvement in our ability to diagnose disease earlier and more accurately. We have an increased understanding of the diversity of early-stage disease and of how to better tailor treatments to make them more tolerable without compromising efficacy. The International Association for the Study of Lung Cancer and the Journal of Thoracic Oncology publish this annual update to help readers keep pace with these important developments. Experts in the care of early-stage lung cancer patients have provided focused updates across multiple areas, including screening, pathology, staging, surgical techniques and novel technologies, adjuvant therapy, radiotherapy, surveillance, disparities, and quality of life. Sources of information include large academic meetings, the published literature, and novel unpublished data from other international oncology assemblies.

Abstract

While lepidic-predominant lung adenocarcinomas are known to have better outcomes than similarly sized solid tumors, the impact of smaller noninvasive foci within predominantly solid tumors is less clearly characterized. We tested the hypothesis that lung adenocarcinomas with even a small ground-glass opacity (GGO) component have a better prognosis than otherwise similar pure solid (PS) adenocarcinomas. The maximum total and solid-component diameters were determined by preoperative computed tomography in patients who underwent lobar or sublobar resection of clinical N0 adenocarcinomas without induction therapy between May 2003 and August 2013. Survival between patients with PS tumors (0% GGO) and those with a minor ground-glass (MGG) component (1%-25% GGO) was compared by Kaplan-Meier and Cox analyses. A total of 123 patients met the inclusion criteria, comprising 54 PS (44%) and 69 MGG (56%), whose mean ground-glass component was 18 ± 7%. The solid-component tumor diameter was not significantly different between the groups (2.3 ± 1.2 cm vs. 2.5 ± 1.3 cm, P = .2). Upstaging to pN1-2 was more common for the PS group (13% [7/54] vs. 3% [2/69], P = .04), but the distribution of pathologic stage was not significantly different between the groups (PS 76% stage I [41/54] vs. MGG 80% stage I [55/69], P = .1). Having an MGG component was associated with markedly better survival in both univariate analysis (MGG 5-year overall survival 86.7% vs. PS 64.5%, P = .001) and multivariable survival analysis (hazard ratio, 0.30; P = .01). Patients with resected cN0 lung adenocarcinoma who have even a small GGO component have markedly better survival than patients with PS tumors, which may have implications for both treatment and surveillance strategies.
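The Kaplan-Meier comparison described above rests on the product-limit estimator, which can be sketched in a few lines. The toy follow-up data below are invented for illustration; the study's actual analysis also included Cox regression adjustment.

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival curve.
    times: follow-up times; events: 1 = death observed, 0 = censored."""
    deaths = Counter(t for t, e in zip(times, events) if e)
    removals = Counter(times)  # deaths + censorings leaving the risk set
    s, at_risk, curve = 1.0, len(times), []
    for t in sorted(removals):
        d = deaths.get(t, 0)
        if d:
            s *= 1 - d / at_risk  # conditional survival at this event time
            curve.append((t, s))
        at_risk -= removals[t]
    return curve

# Toy data: five patients; deaths at years 1, 2, 3; censoring at 2 and 4.
curve = kaplan_meier([1, 2, 2, 3, 4], [1, 0, 1, 1, 0])
```

Comparing two such curves (e.g. MGG vs. PS) with a log-rank test is what yields the P = .001 type of result reported above.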

Abstract

It remains unclear whether a dominant lung adenocarcinoma that presents with multifocal ground-glass opacities (GGOs) should be treated by local therapy. We sought to address survival in this setting and to identify risk factors for progression of unresected GGOs. We retrospectively reviewed 70 patients who underwent resection of a pN0, lepidic adenocarcinoma and who harbored at least 1 additional GGO. Features associated with GGO progression were determined using logistic regression, and survival was evaluated using the Kaplan-Meier method. Subjects harbored 1 to 7 GGOs beyond their dominant tumor (DT). Mean follow-up was 4.1 ± 2.8 years. At least 1 GGO progressed after DT resection in 21 patients (30%). In 11 patients (15.7%), this progression prompted resection (n = 5) or stereotactic radiotherapy (n = 6) at a mean of 2.8 ± 2.3 years. Several measures of the overall tumor burden were associated with GGO progression (all P values < .05); GGO size >1 cm (odds ratio, 4.98; 95% confidence interval, 1.15-21.28) was among the only factors independently associated with GGO progression. Survival was not negatively influenced by GGO progression (100% with vs 80.7% without; P = .1) or by progression-prompting intervention (P = .4). At 4.1-year mean follow-up, 15.7% of patients with unresected GGOs after resection of a pN0 DT underwent subsequent intervention for a progressing GGO. Some features correlated with GGO growth, but neither growth nor need for an intervention negatively influenced survival. Thus, even those at highest risk for GGO progression should not be denied resection of a DT.

Abstract

Although residential segregation has been implicated in various negative health outcomes, its association with kidney transplantation has not been examined. Age- and sex-standardized kidney transplantation rates were calculated from the Scientific Registry of Transplant Recipients, 2000-2013. Population characteristics, including segregation indices, were derived from 2010 U.S. Census data and the U.S. Renal Data System. Separate multivariable Poisson regression models were constructed to identify factors independently associated with kidney transplantation among Blacks and Whites. Median age- and sex-standardized kidney transplantation rates were 114 per 100,000 for Blacks and 38 per 100,000 for Whites. Overall, 16.1% of the U.S. population lived in counties with high segregation. There was no difference in kidney transplantation rates across the levels of segregation among Blacks and Whites. Factors other than residential segregation may play roles in kidney transplantation disparities. Continued efforts to identify these factors may be beneficial in reducing transplantation disparities across the U.S. Using the Scientific Registry of Transplant Recipients and U.S. census data, we aimed to determine whether residential segregation was associated with kidney transplantation rates. We found no association between residential segregation and kidney transplantation rates.

Abstract

Modifications in recipient and donor criteria and innovations in donor management hold promise for increasing rates of lung transplantation, yet the availability of donors remains a limiting resource. Imaging is critical in the work-up of both donor and recipient, including identification of conditions that may portend poor posttransplant outcomes or necessitate modifications in surgical technique. This article describes the radiologic principles that guide selection of patients and surgical procedures in lung transplantation.

Abstract

Little is known about symptom assessment around the time of lung cancer diagnosis. The purpose of this pilot study was to assess symptoms within 2 months of diagnosis and the frequency with which clinicians addressed symptoms among a cohort of veterans (n = 20) newly diagnosed with lung cancer. We administered questionnaires and then reviewed medical records to identify symptom assessment and management provided by subspecialty clinics for 6 months following diagnosis. Half (50%) of the patients were diagnosed with early-stage non-small-cell lung cancer (NSCLC), stage I or II. At baseline, 45% of patients rated their overall symptoms as severe. There were no significant differences in symptoms among patients with early-stage NSCLC, late-stage NSCLC, or small-cell lung cancer. Of the 212 clinic visits over 6 months, 70.2% occurred in oncology. Clinicians most frequently addressed pain, although assessment differed by clinic. Veterans with newly diagnosed lung cancer report significant symptom burden. Despite ample opportunities to address patients' symptoms, variations in assessment exist among subspecialty services. Coordinated approaches to symptom assessment are likely needed among patients newly diagnosed with lung cancer.

Abstract

The importance of imaging surveillance after treatment for lung cancer is not well characterized. We examined the association between initial guideline-recommended imaging surveillance and survival among patients with surgically resected early-stage non-small-cell lung cancer (NSCLC). A retrospective study was conducted using Surveillance, Epidemiology, and End Results-Medicare data (1995-2010). Surgically resected patients with stage I and II NSCLC were categorized by imaging received during the initial surveillance period (4-8 months) after surgery. The primary outcome was overall survival. Secondary treatment interventions were examined as intermediary outcomes. Most (88%) patients had at least one outpatient clinic visit, and 24% received an initial computed tomography (CT) scan during the first surveillance period. Five-year survival by initial surveillance imaging was 61% for CT, 58% for chest radiography, and 60% for no imaging. After adjustment, initial CT was not associated with improved overall survival (hazard ratio [HR], 1.04; 95% confidence interval [CI], 0.96-1.14). On subgroup analysis restricted to patients with demonstrated initial postoperative follow-up, CT was associated with a lower overall risk of death for stage I patients (HR, 0.85; 95% CI, 0.74-0.98) but not for stage II (HR, 1.01; 95% CI, 0.71-1.42). There was no significant difference in rates of secondary interventions by type of initial imaging surveillance. Initial surveillance CT is not associated with improved overall or lung cancer-specific survival among early-stage NSCLC patients undergoing surgical resection. Stage I patients with early follow-up may represent a subpopulation that benefits from initial surveillance, although this may be influenced by healthy-patient selection bias.

Abstract

Failure to rescue is defined as death after an acute inpatient event and has been observed among hospitals that perform general, vascular, and cardiac surgery. This study aims to evaluate variation in complication and failure to rescue rates among hospitals that perform pulmonary resection for lung cancer. Using the Society of Thoracic Surgeons General Thoracic Surgery Database, a retrospective, multicenter cohort study was performed of adult patients with lung cancer who underwent pulmonary resection. Participating hospitals were ranked by their risk-adjusted, standardized mortality ratio (using random effects logistic regression) and grouped into quintiles. Complication and failure to rescue rates were evaluated across the 5 groups (very low, low, medium, high, and very high mortality hospitals). Between 2009 and 2012, 30,000 patients were cared for at 208 institutions participating in the database (median age, 68 years; 53% women; 87% white; 71% underwent lobectomy; 65% had stage I disease). Mortality rates varied over 4-fold across hospitals (3.2% vs 0.7%). Complications occurred more frequently at hospitals with higher mortality (42% vs 34%, P < .001). However, this 22% relative variation in complication rates was dwarfed by the 4-fold variation in failure to rescue rates (6.8% vs 1.7%, P < .001) across hospitals. Variation in hospital mortality seems to be more strongly related to rescuing patients from complications than to the occurrence of complications. This observation is significant because it redirects quality improvement and health policy initiatives to more closely examine and support system-level changes in care delivery that facilitate early detection and treatment of complications.
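Failure to rescue as used above has a simple operational definition: among patients who develop a complication, the proportion who die. A minimal sketch (with invented toy data, not Society of Thoracic Surgeons records) separates it from crude complication and mortality rates:

```python
def hospital_metrics(patients):
    """patients: list of (had_complication, died) tuples for one hospital.
    Returns (complication rate, mortality rate, failure-to-rescue rate),
    where failure to rescue = deaths among patients with a complication."""
    n = len(patients)
    with_comp = [died for had, died in patients if had]
    complication_rate = len(with_comp) / n
    mortality_rate = sum(died for _, died in patients) / n
    ftr = sum(with_comp) / len(with_comp) if with_comp else 0.0
    return complication_rate, mortality_rate, ftr

# Toy hospital: 10 patients, 4 complications, 1 death (among the 4).
patients = [(True, True)] + [(True, False)] * 3 + [(False, False)] * 6
comp_rate, mort_rate, ftr = hospital_metrics(patients)
```

Two hospitals with identical complication rates can thus have very different mortality if their failure-to-rescue rates differ, which is the study's central observation.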

Abstract

Current guidelines consider the absence of a dependable social support system an absolute contraindication to lung transplantation, yet there are varying degrees of social support among those selected for transplantation. We sought to characterize the relationship between a patient's self-reported primary caregiver and long-term outcomes after lung transplantation. We conducted a retrospective cohort study of all lung transplant recipients ≥18 years of age who had undergone an initial transplant (2000 to 2010). Cox regression was used to explore the relationship between type of caregiver and the long-term risk of death and chronic graft failure while adjusting for potential confounders. There were 452 patients undergoing lung transplantation over the study period who met the inclusion criteria. Five types of primary caregivers were identified, with spouse (60%; 270 of 452) being the most common. Compared with a spousal caregiver, overall survival was significantly worse for patients who identified an adult child (hazard ratio [HR] 2.04, 95% confidence interval [CI] 1.15 to 3.60) or sibling (HR 3.79, 95% CI 2.48 to 5.78) as their primary caregiver. In addition, risk for long-term graft failure was increased significantly (HR 3.34, 95% CI 1.58 to 7.06) among patients with sibling caregivers. Type of primary caregiver selected before transplantation was associated with long-term outcomes. These results may reflect the long-term support requirements and/or competing responsibilities of other caregiver types. Interventions to increase support for at-risk patients may include identifying additional caregivers during the pre-transplant assessment. Because lung allocation is designed to maximize graft potential, risk stratification for listing patients should include type of caregiver, considered as critically as major organ dysfunction.

Abstract

Current guidelines recommend routine imaging surveillance for patients with non-small cell lung cancer (NSCLC) after treatment. Little is known about surveillance patterns for patients with surgically resected early-stage lung cancer in the community at large. We sought to characterize surveillance patterns in a national cohort. We conducted a retrospective study using the Surveillance, Epidemiology, and End Results (SEER)-Medicare database (1995-2010). Patients with stage I/II NSCLC treated with surgical resection were included. Our primary outcome was receipt of imaging between 4 and 8 months after the surgical procedure. Covariates included demographics and comorbidities. Chest radiography (CXR) was the most frequent initial modality (60%), followed by chest computed tomography (CT) (25%). Positron emission tomography (PET) was the least frequent initial imaging modality (3%). A total of 13% of patients received no imaging within the initial surveillance period. Overall adherence to National Comprehensive Cancer Network (NCCN) guideline-recommended CT surveillance was 47%; however, rates of CT increased over time, from 28% to 61% (p < 0.01). Reduced rates of CT were associated with stage I disease and with surgical resection as the sole treatment modality. Imaging after definitive surgical treatment for NSCLC predominantly used CXR rather than CT. Most of this imaging is likely for surveillance, and in that context CXR has inferior detection rates for recurrence and new cancers. Adherence to guideline-recommended CT surveillance after surgical treatment is poor, but the reasons are multifactorial. Efforts to improve adherence to imaging surveillance must be coupled with greater evidence demonstrating improved long-term outcomes.

Abstract

Optimizing evidence-based care to improve quality is a critical priority in the United States. We sought to examine adherence to imaging guideline recommendations for staging in patients with locally advanced lung cancer in a national cohort. We identified 3,808 patients with stage IIB, IIIA, or IIIB lung cancer using the national Department of Veterans Affairs (VA) Central Cancer Registry (2004-2007) and linked these patients to VA and Medicare databases to examine receipt of guideline-recommended imaging based on National Comprehensive Cancer Network and American College of Radiology Appropriateness Criteria. Our primary outcomes were receipt of guideline-recommended brain imaging and positron emission tomography (PET) imaging. We also examined rates of overuse, defined as combined use of bone scintigraphy (BS) and PET, which current guidelines recommend against. All imaging was assessed during the period from 180 days before to 180 days after diagnosis. Nearly 75% of patients received recommended brain imaging, and 60% received recommended PET imaging. Overuse of BS and PET occurred in 25% of patients. More advanced clinical stage and later year of diagnosis were the only clinical or demographic factors associated with higher rates of guideline-recommended imaging after adjusting for covariates. We observed considerable regional variation in recommended PET imaging and in overuse of combined BS and PET. Receipt of guideline-recommended imaging is not universal. PET appears to be underused overall, whereas BS demonstrates continued overuse. Wide regional variation suggests that these findings could be the result of local practice patterns, which may be amenable to provider education efforts such as Choosing Wisely.

Abstract

Complications after pulmonary resection lead to higher costs of care. Video-assisted thoracoscopic surgery (VATS) for lobectomy is associated with fewer complications, but lower inpatient costs for VATS have not been uniformly demonstrated. Because some complications occur after discharge, we compared 90-day costs of VATS lobectomy versus open lobectomy and explored whether differential health care use after discharge might account for any observed differences in costs. A cohort study (2007-2011) of patients with lung cancer who had undergone resection was conducted using MarketScan, a nationally representative sample of persons with employer-provided health insurance. Total costs reflect payments made for inpatient, outpatient, and pharmacy claims up to 90 days after discharge. Among 9,962 patients, 31% underwent VATS lobectomy. Compared with thoracotomy, VATS was associated with lower rates of prolonged length of stay (PLOS) (3.0% versus 7.2%; p < 0.001), 90-day emergency department (ED) use (22% versus 24%; p = 0.005), and 90-day readmission (10% versus 12%; p = 0.026). Risk-adjusted 90-day costs were $3,476 lower for VATS lobectomy (p = 0.001). Differential rates of PLOS appeared to explain this cost difference. After adjustment for PLOS, costs were $1,276 lower for VATS, but this difference was not significant (p = 0.125). In the fully adjusted model, PLOS was associated with the highest cost differential (+$50,820; p < 0.001). VATS lobectomy is associated with lower 90-day costs, a relationship that appears to be mediated by lower rates of PLOS. Although VATS may lead to lower rates of PLOS among patients undergoing lobectomy, observational studies cannot verify this assertion. Strategies that reduce PLOS will likely result in cost savings that can increase the value of thoracic surgical care.

Abstract

A regional quality improvement effort does not exist for thoracic surgery in the United States. To initiate the development of one, we sought to describe temporal trends and hospital-level variability in outcomes and costs of pulmonary resection in Washington (WA) State. A cohort study (2000-2011) of surgically treated lung cancer patients was conducted. The WA State discharge database was used to describe outcomes and costs for operations performed at all nonfederal hospitals within the state. Over 12 years, 8,457 lung cancer patients underwent pulmonary resection across 49 hospitals. Inpatient deaths decreased over time (adjusted p-trend=0.023), but prolonged length of stay did not (adjusted p-trend=0.880). Inflation-adjusted hospital costs increased over time (adjusted p-trend<0.001). Among 24 hospitals performing at least 1 resection per year, 5 hospitals were statistical outliers in rates of death (4 lower and 1 higher than the state average), and 13 were outliers with respect to prolonged length of stay (7 higher and 6 lower than the state average) and costs (5 higher and 8 lower than the state average). When evaluated jointly for rates of death and costs, there were hospitals with fewer deaths/lower costs, fewer deaths/higher costs, more deaths/lower costs, and more deaths/higher costs. Variability in outcomes and costs over time and across hospitals suggests opportunities to improve the quality and value of thoracic surgery in WA State. Examples from cardiac surgery suggest that a regional quality improvement collaborative is an effective way to act meaningfully and rapidly upon these opportunities.

Abstract

Lung volume reduction surgery (LVRS) provides palliation and improved quality of life in select patients with end-stage chronic obstructive pulmonary disease (COPD). The effect of previous LVRS on lung transplant outcomes has been inadequately studied. We report our experience in the largest single-institution series of these combined procedures. The records of 472 patients with COPD undergoing lung transplantation or LVRS between 1995 and 2010 were reviewed. Outcomes of patients undergoing transplant after LVRS were compared with outcomes of patients undergoing transplant or LVRS alone. Survival was compared using log-rank tests and the Kaplan-Meier method. Demographics, comorbidities, and spirometry were similar at the time of transplantation. Patients who had undergone lung transplant after LVRS had longer transplant operative times (mean, 4.4 vs 5.6 hours; P = .020) and greater hospital length of stay (mean, 17.6 vs 29.1 days; P = .005). Thirty-day mortality and major morbidity were similar. Posttransplant survival was reduced for transplant after LVRS (median, 49 months; 95% confidence interval [CI], 16-85 months) compared with transplant alone (median, 96 months; 95% CI, 82-106 months; P = .008). The composite benefit of combined procedures, defined as a bridge from LVRS to transplant of 55 months plus posttransplant survival of 49 months (total, 104 months), was comparable with survival of patients undergoing either procedure alone. Lung transplant after LVRS entails minimal additional perioperative risk. The reduced posttransplant survival in patients undergoing combined procedures is in contradistinction to reports from other, smaller series. When determining the best surgical treatment for patients with more severe disease, the benefit of LVRS before transplant should be weighed against the consequence of reduced posttransplant survival.

Abstract

The goals of this study were to examine the real-world effectiveness of PET in avoiding unnecessary surgery for newly diagnosed patients with non-small cell lung cancer. A cohort of 2,977 veterans with non-small cell lung cancer between 1997 and 2009 was assessed for use of PET during staging and treatment planning. The subgroup of 976 patients who underwent resection was assessed for several outcomes, including pathologic evidence of mediastinal lymph node involvement, distant metastasis, and 12-month mortality. We anticipated that PET may have been performed selectively on the basis of unobserved characteristics (e.g., providers ordered PET when they suspected disseminated disease). Therefore, we conducted an instrumental variable analysis, in addition to conventional multivariate logistic regression, to reduce the influence of this potential bias. This type of analysis attempts to identify an additional variable that is related to receipt of treatment but not causally associated with the outcome of interest, similar to randomized assignment. The instrument here was calendar time. This analysis can be informative when patients do not receive the treatment that the instrument suggests they "should" have received. Overall, 30.3% of patients who went to surgery were found to have evidence of metastasis uncovered during the procedure or within 12 months, indicating that nearly one third of patients underwent surgery unnecessarily. The use of preoperative PET increased substantially over the study period, from 9% to 91%. In conventional multivariate analyses, PET use was not associated with a decrease in unnecessary surgery (odds ratio, 0.87; 95% confidence interval, 0.66-1.16; P = 0.351).
However, a reduction in unnecessary surgery (odds ratio, 0.53; 95% confidence interval, 0.34-0.82; P = 0.004) was identified in the instrumental variable analyses, which attempted to account for potentially unobserved confounding. PET has now become routine in preoperative staging and treatment planning in the community and appears to be beneficial in avoiding unnecessary surgery. Evaluating the effectiveness of PET appears to be influenced by potentially unmeasured adverse selection of patients, especially when PET first began to be disseminated in the community.
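The instrumental variable idea described above, with calendar time as the instrument, can be sketched with the simplest IV formula: the Wald estimator for a binary instrument, which divides the instrument's effect on the outcome by its effect on treatment uptake. This is a didactic simplification of the study's analysis, and the data below are invented.

```python
def wald_iv(z, x, y):
    """Wald IV estimator with a binary instrument.
    z: instrument (e.g. late vs early calendar period),
    x: treatment received (PET performed), y: outcome (unnecessary surgery)."""
    def mean(vals):
        vals = list(vals)
        return sum(vals) / len(vals)
    dy = mean(yi for zi, yi in zip(z, y) if zi) - mean(yi for zi, yi in zip(z, y) if not zi)
    dx = mean(xi for zi, xi in zip(z, x) if zi) - mean(xi for zi, xi in zip(z, x) if not zi)
    return dy / dx  # outcome change per unit of treatment induced by z

# Invented data: PET use rises from 25% to 75% across periods while the
# unnecessary-surgery rate falls from 50% to 25%.
z = [0] * 4 + [1] * 4
x = [1, 0, 0, 0, 1, 1, 1, 0]
y = [1, 1, 0, 0, 1, 0, 0, 0]
effect = wald_iv(z, x, y)
```

The estimate is valid only under the IV assumptions the abstract alludes to: the instrument shifts treatment but affects the outcome through no other path.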

Abstract

Treatment optimization for centrally located lung cancers requires special considerations for determining resectability and patient selection. Evaluation involves an experienced multidisciplinary team performing careful clinical and invasive-disease staging to identify the best management approach and ascertain the need for multimodality therapy. Preoperative imaging alone is often inaccurate in its ability to determine whether the patient is at an advanced clinical T stage that might preclude curative surgical resection. Therefore, other modalities are often necessary to complete the clinical staging. In the absence of irrefutable evidence of unresectability, however, surgical exploration should be undertaken with curative intent. Long-term outcomes can be favorable in select patients, and most of the procedures, including complex reconstructions, can be performed with acceptable morbidity and mortality.

Abstract

Lymphovascular invasion (LVI) is considered a high-risk pathologic feature in resected non-small cell lung cancer (NSCLC). The ability to stratify stage I patients into risk groups may permit refinement of adjuvant treatment recommendations. We performed a systematic review and meta-analysis to evaluate whether the presence of LVI is associated with disease outcome in stage I NSCLC patients. A systematic search of the literature was performed (MEDLINE/EMBASE, 1990 to December 2012). Two reviewers independently assessed the quality of the articles and extracted data. Pooled hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated with a random effects model. Two end points were independently analyzed: recurrence-free survival (RFS) and overall survival (OS). We analyzed unadjusted and adjusted effect estimates, resulting in four separate meta-analyses. We identified 20 published studies that reported the comparative survival of stage I patients with and without LVI. The unadjusted pooled effect of LVI was significantly associated with worse RFS (HR, 3.63; 95% CI, 1.62 to 8.14) and OS (HR, 2.38; 95% CI, 1.72 to 3.30). Adjusting for potential confounders yielded similar results, with RFS (HR, 2.52; 95% CI, 1.73 to 3.65) and OS (HR, 1.81; 95% CI, 1.53 to 2.14) both significantly worse for patients exhibiting LVI. The present study indicates that LVI is a strong prognostic indicator of poor outcome for patients with surgically managed stage I lung cancer. Future prospective lung cancer trials with well-defined methods for evaluating LVI are necessary to validate these results.
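Random-effects pooling of hazard ratios, as described above, is commonly done with the DerSimonian-Laird estimator: study-level log hazard ratios are weighted by the inverse of their variance plus a between-study variance term. A minimal sketch, assuming each study reports an HR with a 95% CI (the toy inputs are invented, not the 20 studies in the review):

```python
import math

def pool_hazard_ratios(hrs, cis, z=1.96):
    """DerSimonian-Laird random-effects pooling.
    hrs: HR point estimates; cis: (lower, upper) 95% CIs used to back out SEs.
    Returns (pooled HR, CI lower, CI upper)."""
    logs = [math.log(h) for h in hrs]
    ses = [(math.log(u) - math.log(l)) / (2 * z) for l, u in cis]
    w = [1 / s ** 2 for s in ses]                      # fixed-effect weights
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))  # heterogeneity
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(hrs) - 1)) / c)          # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in ses]          # random-effects weights
    pooled = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return math.exp(pooled), math.exp(pooled - z * se), math.exp(pooled + z * se)

# Toy example: two identical studies should pool to their common HR.
hr, lo, hi = pool_hazard_ratios([2.0, 2.0], [(1.0, 4.0), (1.0, 4.0)])
```

When studies disagree more than sampling error allows, tau² grows and the pooled CI widens, which is the point of the random-effects choice noted in the abstract.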

Abstract

Lung cancer mortality rates may vary with access to specialty providers and local resources. We sought to examine the effect of access to care, using the density of lung cancer care providers, on lung cancer mortality among blacks and whites in the United States. We examined U.S. county-level data for age-adjusted lung cancer mortality rates from 2003 to 2007. Our primary independent variable was the per capita number of thoracic oncologic providers, adjusting for county-level smoking rates, socioeconomic status, and other geographic factors. Data were obtained from the 2009 Area Resource File, the National Center for Health Statistics, and the County Health Rankings Project. Providers of lung cancer care were unevenly distributed among U.S. counties. For example, 41.4% of the U.S. population reside in counties with fewer than 4 thoracic surgeons per 100,000 people, 23.4% in counties with 4 to 15 surgeons per 100,000 people, and 35.3% in counties with more than 15 surgeons per 100,000 people. Geographically, 4.3% of whites compared with 11.2% of blacks lived in high lung cancer mortality zones. Lung cancer mortality did not vary by density of thoracic surgeons or oncology services; however, higher primary care provider density was associated with a lung cancer mortality reduction of 4.1 per 100,000 for whites. Variation in provider density for thoracic oncology in the United States was not associated with a difference in lung cancer mortality. Lower mortality associated with higher primary care provider density suggests that equitable access to primary care may lead to reduced cancer disparities.

Abstract

Lung cancer is the leading cause of cancer deaths in the United States. Despite many advances in treatment, surgery remains the preferred treatment modality for patients presenting with early stage disease. Imaging is critical in the preoperative evaluation of these patients being considered for a curative resection. Advanced imaging techniques provide valuable information, including primary diagnostics, staging, and intraoperative localization for suspected lung cancer. Knowledge of surgical implications of imaging findings can aid both radiologists and surgeons in delivering safe and effective care.

Abstract

To examine the relationship between race and lung cancer mortality and the effect of residential segregation in the United States. A retrospective, population-based study using data obtained from the 2009 Area Resource File and the Surveillance, Epidemiology, and End Results program. Each county in the United States. Black and white populations per US county. A generalized linear model with a Poisson distribution and log link was used to examine the association between residential segregation and lung cancer mortality from 2003 to 2007 for black and white populations. Our primary independent variable was the racial index of dissimilarity. The index is a demographic measure that assesses the evenness with which whites and blacks are distributed across census tracts within each county. The score ranges from 0 to 100 in increasing degrees of residential segregation. The overall lung cancer mortality rate was higher for blacks than whites (58.9 vs 52.4 per 100,000 population). Each additional level of segregation was associated with a 0.5% increase in lung cancer mortality for blacks (P < .001) and an associated decrease in mortality for whites (P = .002). Adjusted lung cancer mortality rates among blacks were 52.4 and 62.9 per 100,000 population in counties with the least (<40% segregation) and the highest (≥60% segregation) levels of segregation, respectively. In contrast, adjusted lung cancer mortality rates for whites decreased with increasing levels of segregation. Lung cancer mortality is higher in blacks and highest in blacks living in the most segregated counties, regardless of socioeconomic status.
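The index of dissimilarity described above has a standard closed form: half the sum, over census tracts, of the absolute difference between each tract's share of the county's black population and its share of the white population. A minimal sketch follows; the four-tract county in the example is hypothetical, not drawn from the study data:

```python
def dissimilarity_index(black_by_tract, white_by_tract):
    """Index of dissimilarity: evenness of two groups across tracts, scaled 0-100."""
    B = sum(black_by_tract)   # county total, black population
    W = sum(white_by_tract)   # county total, white population
    d = 0.5 * sum(abs(b / B - w / W)
                  for b, w in zip(black_by_tract, white_by_tract))
    return 100 * d  # 0 = identical tract distributions, 100 = complete segregation

# Hypothetical four-tract county with complete segregation:
print(dissimilarity_index([100, 100, 0, 0], [0, 0, 100, 100]))  # 100.0
```

The index can be read as the share of either group that would have to move to a different tract to make the two distributions identical.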

Abstract

Lung cancer is a leading cause of death in the United States and among veterans. This study compares patterns of diagnosis, treatment, and survival for veterans diagnosed with non-small cell lung cancer (NSCLC) using a recently established cancer registry for the Veterans Affairs Pacific Northwest Network with the Puget Sound Surveillance, Epidemiology, and End Results cancer registry. A cohort of 1715 veterans with NSCLC was diagnosed between 2000 and 2006, and 7864 men were diagnosed in Washington State during the same period. Demographics, tumor characteristics, initial surgical patterns, and survival across the two registries were evaluated. Veterans were more likely to be diagnosed with stage I or II disease (32.8%) compared with the surrounding community (21.5%, p = 0.001). Surgical resection rates were similar for veterans (70.2%) and nonveterans (71.2%) older than 65 years with early-stage disease (p = 0.298). However, veterans younger than 65 years with early-stage disease were less likely to undergo surgical resection (83.3% versus 91.5%, p = 0.003). Because there were fewer late-stage patients among veterans, overall survival was better, although within each stage group veterans experienced worse survival compared with community patients. The largest differences were among early-stage patients, with 44.6% 5-year survival for veterans compared with 57.4% for nonveterans (p = 0.004). The use of surgical resection among younger veterans with NSCLC may be lower compared with the surrounding community and may be contributing to poorer survival. Cancer quality-of-care studies have primarily focused on patients older than 65 years using Medicare claims; however, efforts to examine care for younger patients within and outside the Department of Veterans Affairs are needed.

Abstract

Surgical manipulation of lung cancers may increase circulating tumor cells and contribute to metastatic recurrence after resection. Cyclooxygenase 2 is overexpressed in most non-small cell lung cancers and upregulates the cell adhesion receptor CD44. Our goal was to examine the effects of perioperative cyclooxygenase blockade on the metastatic potential of circulating tumor cells, CD44 expression, and adhesion of cancer cells to extracellular matrix. Human non-small cell lung cancer cells (A549) were injected through the lateral tail vein in an in vivo murine model of tumor metastasis with three randomized treatment groups: no treatment, perioperative selective cyclooxygenase 2 inhibition (celecoxib) only, and continuous celecoxib. Lung metastases were assessed at 6 weeks by a blinded observer. For in vitro experiments, cells were treated with celecoxib, and expression of CD44 was determined by Western blotting. Extracellular matrix adhesion was assessed by Matrigel (BD Labware, Bedford, Mass) assay. In vivo lung metastases were significantly decreased relative to control by both perioperative and continuous celecoxib (P = .0135). There was no significant difference in the number of metastases between the continuous and perioperative treatment groups. In vitro adhesion to the extracellular matrix was significantly inhibited by celecoxib in a dose-dependent manner (P < .01). A549 cells expressed high levels of CD44, which was upregulated by interleukin 1beta and downregulated by celecoxib. Celecoxib significantly reduced the establishment of metastases by circulating tumor cells in a murine model. It also inhibited CD44 expression and extracellular matrix adhesion in vitro. Perioperative modulation of cyclooxygenase 2 may be a novel strategy to minimize metastases from circulating tumor cells during this high-risk period.

Abstract

We reviewed our experience with adult living lobar lung transplant (LL) recipients to assess whether size and shape mismatch of the donor organ to the recipient predisposes to the development of pleural space problems (PSP). Eighty-seven LL were performed on 84 adult recipients from 1993 through 2003. Seventy-six patients had cystic fibrosis. Patient records were examined for PSP, defined as air leak or bronchopleural fistula for more than 7 days, pneumothorax, loculated pleural effusion, or empyema, in the 68 patients for whom complete data were available. There were 24 PSP identified, for an overall incidence of 35%. The most common PSP was air leak/bronchopleural fistula, accounting for 38% of PSP. The second most common was loculated pleural effusion (21% of PSP). Empyema was uncommon (2 patients, 3% of total patients) in our series despite the large population of cystic fibrosis patients. In 4 of these patients, computed tomography-guided drainage was used for loculated effusions after chest tube removal. Three LL patients underwent surgery for persistent air leak and required muscle flap repair; one of these required subsequent omental transfer. Two LL patients required decortication for empyema. Many patients with PSP could be managed without further surgical intervention (14/24 patients). Donor-recipient height mismatch was not significantly different between PSP and non-PSP patients (p = 0.53). The incidence of PSP in LL recipients is similar to that reported in the literature for cadaveric transplant recipients. The relatively small lobe in the potentially contaminated chest cavity of cystic fibrosis recipients does not significantly predispose to the development of empyema despite immunosuppression. Many PSP can be managed nonoperatively, although early aggressive intervention for large air leaks and judicious chest tube management are essential for a good outcome.

Abstract

The cyclooxygenase 2 enzyme has become a therapeutic target in cancer treatment. Cyclooxygenase 2 blockade with selective inhibitors increases apoptosis and decreases the metastatic potential of lung cancer cells. Some of the antitumor effects of these inhibitors may occur through both cyclooxygenase 2-dependent and -independent pathways. Our goal was to investigate these pathways using celecoxib (a selective cyclooxygenase 2 inhibitor) and 2,5-dimethyl celecoxib, a structural analog modified to eliminate cyclooxygenase 2 inhibitory activity while potentially maintaining antineoplastic properties. 2,5-dimethyl celecoxib was synthesized in the Department of Chemistry at the University of Southern California. With the use of non-small cell lung cancer cells (A549), prostaglandin E2 production was quantified by enzyme-linked immunosorbent assay to assess cyclooxygenase 2 activity. Cell proliferation was assessed by 3-(4,5-dimethylthiazol-2-yl)-5-(3-carboxymethoxyphenyl)-2-(4-sulfophenyl)-2H-tetrazolium, inner salt assay. Cell migration was assayed using transwell inserts, which were Matrigel coated for invasion experiments. Gelatin zymography was used to assess matrix metalloproteinase activity. 2,5-dimethyl celecoxib did not inhibit interleukin-1beta-stimulated prostaglandin E2 production, whereas celecoxib did, even at low doses. Both celecoxib and 2,5-dimethyl celecoxib decreased tumor cell viability and proliferation, with IC50 values of 73 and 53 micromol/L, respectively. Both drugs were also potent inducers of apoptosis, and both inhibited tumor cell migration and invasion. This was associated with down-regulation of matrix metalloproteinase activity. 2,5-dimethyl celecoxib is a structural analog of celecoxib that lacks cyclooxygenase 2 inhibitory activity but exhibits significant antineoplastic properties comparable to those of celecoxib. This suggests that the antineoplastic activities of celecoxib are, at least in part, cyclooxygenase independent and that therapeutic strategies can be developed without the side effects of global cyclooxygenase 2 blockade.

Abstract

Cyclooxygenase-2 plays a role in growth, apoptosis, angiogenesis, and metastasis in lung cancer. Inhibition of cyclooxygenase-2 with celecoxib has been shown to inhibit tumor growth. We evaluated the effect of increasing doses of celecoxib in a murine model of human lung cancer. Human lung adenocarcinoma cells (A549) were implanted in the left upper lobe of the lung in mice with severe combined immunodeficiency syndrome. Mice were randomly assigned to 4 groups at implantation (n = 10 per group): control, 125 mg/kg chow, 500 mg/kg chow, and 1000 mg/kg chow. After 3 weeks, mice were killed, and a blinded observer measured total tumor volume. The dose effect of celecoxib was examined in vitro by studying cell proliferation, expression of cyclooxygenase-2 (mRNA and protein), and production of prostaglandin E2 in unstimulated and interleukin 1beta-stimulated cells. All 40 mice survived for 3 weeks with no observed toxicities. Total tumor volume was inhibited in each celecoxib group (P = .0038, Welch analysis of variance): 206.7 ± 119.5 mm³ (control group), 41.4 ± 54.0 mm³ (low-dose group), 34.5 ± 39.3 mm³ (medium-dose group), and 27.3 ± 53.6 mm³ (high-dose group). In vitro, celecoxib was effective at inhibiting production of prostaglandin E2, even in stimulated cells, although little effect was seen on cyclooxygenase-2 protein levels. Inhibition of proliferation was evident only at doses that exceeded those used in the animal model. Inhibition of cyclooxygenase-2 with low-dose celecoxib restricted the growth of lung cancer in this model. This might be mediated by prostaglandin E2. Higher doses of celecoxib afforded no additional benefit. Chronic therapy with low-dose cyclooxygenase-2 inhibition has the potential to influence tumor progression in non-small cell lung cancer.

Abstract

Stationary manometry is the gold standard for the evaluation of patients with suspected esophageal motility disorders. The videoesophagram has not been objectively compared with stationary manometry in the evaluation of esophageal motility disorders. Two hundred two patients with foregut symptoms underwent stationary manometry and videoesophagram. Radiographic assessment of esophageal motility was done by video recording of five 10-cc swallows of barium. Abnormal esophageal body function was defined by stasis of barium in the middle third of the esophagus on at least four swallows, or stasis in the distal third on at least three swallows. Stationary manometry was performed using a five-channel water-perfused system. Contraction amplitudes <25 mm Hg in either of the last two channels, or the presence of simultaneous or interrupted waves in 10 per cent or more of swallows, were considered abnormal. Sixty-two patients had abnormal manometry; 34 of these also demonstrated abnormal videoesophagrams, for an overall sensitivity of 55 per cent. The positive predictive value was 53 per cent, specificity was 79 per cent, and negative predictive value was 80 per cent. Sensitivity was greatest in patients with achalasia (94%) and scleroderma (100%) and in patients presenting with dysphagia (89%). Sensitivity was poor for nonspecific esophageal motility disorders. A videoesophagram is relatively insensitive in detecting motility disorders. It seems most useful in detecting patients with esophageal dysfunction for which surgical treatment is beneficial, and in patients presenting with dysphagia.
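The accuracy figures above follow from a standard 2x2 contingency table with abnormal manometry as the reference standard. The counts in the sketch below are back-calculated from the reported percentages (202 patients, 62 with abnormal manometry, 34 of whom had abnormal videoesophagrams) and are therefore approximate reconstructions, not values stated in the paper:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-accuracy measures from 2x2 contingency table counts."""
    return {
        "sensitivity": tp / (tp + fn),  # abnormal studies correctly detected
        "specificity": tn / (tn + fp),  # normal studies correctly called normal
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Reconstructed counts: 34 true positives and 28 false negatives among the
# 62 abnormal manometry studies; tn/fp chosen to match the reported 79%
# specificity among the remaining 140 studies.
m = diagnostic_metrics(tp=34, fp=30, fn=28, tn=110)
print({k: round(v, 2) for k, v in m.items()})
# {'sensitivity': 0.55, 'specificity': 0.79, 'ppv': 0.53, 'npv': 0.8}
```

These reconstructed counts reproduce the reported sensitivity (55%), specificity (79%), positive predictive value (53%), and negative predictive value (80%).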