Monthly Archives: September 2010

Each year, about 75,000 bright, motivated and talented students graduate from U.S. high schools with an “undocumented” U.S. residence status. This situation provides these students with little hope of higher education or legal employment in the country they call home. The Development, Relief and Education for Alien Minors (DREAM) Act would give these students (who meet certain criteria) a path to participation in U.S. colleges, military, employment, and eventually citizenship.

Regrettably, after a nine-year journey through Congress, the DREAM Act stalled most recently in the Senate on September 21, 2010. Attached to a defense bill, the act fell four votes short of ending a filibuster on the issue.

Proponents of the DREAM Act have mostly emphasized the potential positive impact on the U.S. economy. Advocates have pointed to the DREAM Act as a means to secure a return on the investment the government has already made in "free" K-12 education for undocumented students. Unfortunately, this strategy has largely been unsuccessful because Americans do not see any immediate gain for themselves. Although a stronger economy is a worthy goal, for most Americans it weighs less than the prospect of sudden competition from newly legalized immigrants for jobs and college admissions.

Several groups have successfully opposed the DREAM Act, primarily on the basis that it will encourage more illegal immigration and lead to increased competition for jobs and education. The Federation for American Immigration Reform (FAIR) has referred to the DREAM Act as “amnesty” for illegal aliens, claiming that it rewards parents who violated immigration laws through their children, and provides an incentive for more illegal immigration.

In order to garner enough support to pass the DREAM Act, we must shift the perception away from the economy and immigration and toward education as a civil right. The 1954 Supreme Court case Brown v. Board of Education affirmed our nation's commitment to ensuring the right to education, even for the most marginalized members of society. In this respect, times have changed only in that the "standard" level of education acceptable for members of our society is increasingly a college degree.

At present, there is considerable public momentum behind education reform. Consider the recently released film Waiting for "Superman" and its preview on Oprah, Obama's Race to the Top program encouraging innovation in state school systems, and NBC's Education Nation, a nationally broadcast discussion about improving the U.S. education system. The failures of the "system" to produce high levels of student achievement have attracted mainstream interest.

Riding the coattails of this movement would likely prove successful for the DREAM Act. In the absence of the DREAM Act, the talent of undocumented students is wasted. Currently, only 5-10 percent of undocumented high-school graduates go to college. This is undoubtedly reducing our nation’s overall graduation rate and can be viewed as a systemic failure that can be remedied by comprehensive education change.

Viewing the DREAM Act as a solution to failures in the education system also aligns the fate of undocumented students with that of "our own" children. Such symbolic framings are often very successful, particularly with the middle class, who most strongly oppose the Act on the basis of increased competition for jobs and education. This broader framing could also win stronger support from powerful interest groups such as the National Education Association, teachers' unions, and family-focused organizations.

Cosatu, the trade union in question, will tell you it’s about money. They’re asking for an 8.6% raise and an R1000/month increase in housing allowance. After rejecting an offer of a 7.5% raise and R800/month housing allowance increase, Cosatu, which represents teachers, nurses, and a number of other civil servants, agreed to resume delivery of essential services for three weeks while continuing negotiations. After three weeks, though, school will be out and patients uncared for again.

Strikes in South Africa are de rigueur; about this time last year, postal workers went on strike for over a month, freezing mail delivery. Local taxi strikes, which essentially halt local travel and are often violent, are too common to enumerate or attract news attention. Now employees of Pick 'n' Pay, a large grocery chain, are going on strike too, and the mine strikes show no sign of ending. South Africa's economy will not be taken seriously internationally as long as these continue; it's up to workers, government, and businesses to start working together to find middle ground if they're serious about economic stability and development.

The Cosatu strike, though, is too important to wait. The strike has halted essential social services, including schooling—which comes on top of a longer than usual winter break to accommodate the World Cup—and health services, resulting in many being turned away from clinics, AIDS patients going off their medicine, and long-term TB patients no longer receiving care. Nurses and teachers are better paid than many South Africans in a country where un- and underemployment are rife and much existing employment is informal or unskilled. Then again, they are also often supporting an extended family, and though standards of living are lower than in the West, employment in a skilled profession usually brings economic stability rather than wealth. One wonders, though, if other perks—like better facilities and more equipment that would directly benefit students and patients—ought to have been part of the demanded package.

The strike has political implications for the relationship between the unions and the ANC. Unions thought that President Zuma’s inauguration last year would bring them a renewed power in government negotiations, and found instead that little changed. Rumblings of a strike have been going on since June. Striking has left the government hamstrung and certainly made apparent how important the striking workers are to the operation of the country, though in a nation so used to gaps in service delivery (see the 2008 power shortages), the pressure is somewhat stifled. The rejection of the government’s offer indicates that the unions are not interested in compromise, which will probably result in the government either meeting demands or making some other kinds of concessions. As a power play, striking may be pretty effective in demonstrating strength in the short-term, though as a tactic for improving education and healthcare in South Africa, it leaves a lot to be desired.

Are the strikes in South Africa about politics or money?

Welcome to the Sanford Journal of Public Policy (SJPP) website and blog. This is a new forum for SJPP staff members and guest bloggers to engage with current issues in public policy. Our posts will reflect the diversity of opinions and interests among the students at the Sanford School of Public Policy, and we hope this diversity will be matched with an equally broad readership of students, academics, and practitioners from a range of policy areas.

The SJPP, in its second year of publication, hopes that you find this blog and the other website content interesting and informative. Your feedback is welcome in the comments and at sjpp@duke.edu.

Book Review: The Next American Century: How the U.S. Can Thrive as Other Powers Rise
Posted on September 8, 2010 by Editor

by Nina Hachigian and Mona Sutphen

Reviewed by Meaghan Monfort‡

Nina Hachigian and Mona Sutphen met while colleagues at the National Security Council under the Clinton Administration. Hachigian is currently a Senior Fellow at the Center for American Progress specializing in U.S. foreign policy, U.S.-China relations, and international institutions. She received her B.S. from Yale University and her J.D. from Stanford University. Sutphen, a former U.S. diplomat, is now serving in the Obama Administration as Deputy Chief of Staff. She earned a B.A. from Mount Holyoke College and her M.Sc. from the London School of Economics. With such diverse experience in and exposure to foreign affairs, these authors are especially qualified to recommend policies for the U.S. to thrive in the 21st century.

Hachigian and Sutphen wrote The Next American Century from 2006 to 2007, during the final years of the Bush Administration. The 2010 paperback edition includes a preface by Hachigian written in light of President Obama's election in 2008. Eight chapters guide the reader through the challenges facing the U.S. in an age of rising powers.

Hachigian and Sutphen challenge a growing tendency in U.S. foreign policy discourse to view the rise of other global powers as a threat to American power and interests. Of specific interest are China, Russia, India, Japan, and Europe, which the authors refer to as “the pivotal powers” [10]. These pivotal powers “have the resources to support or thwart U.S. aims, to build the world order or disrupt it” [10]. South Africa, Brazil, and Iran do not make the list because, the authors argue, they do not meet this definition. South Africa’s military is modest and its influence lies mainly on the African continent, Brazil’s aims are not yet global despite its impressive resources, and Iran’s economy is too small. These states, however, could make the list in the future.

While all five pivotal powers are attended to in the text, the authors discuss at great length concerns over the growing influence of pivotal power number one—China. Hachigian and Sutphen also distinguish among the five powers in terms of their mentalities. China, Russia, and India “believe in their historical destiny as great powers, yet are simultaneously preoccupied with a raft of daunting internal challenges” [132]. On the other hand, Japan and Europe are set apart because they are already major global powers cooperating with the U.S., and “both are also less enamored with the trappings of traditional demonstrations of power, which sets them apart from the other three” [133].

Given the timing of the book and the authors' political affiliations, one might have expected a more scathing denouncement of the Bush Administration's foreign policies. Instead, their work is refreshingly prescriptive and future-oriented. The authors outline how the U.S. can grow its innovative economy, maintain its role as a global leader, protect American lives, and advance the liberal world order. They direct our attention away from nebulous and indirect dangers and toward the threat of international terrorism. In a persuasive table, the authors compare the threats presented by China and radical Islamic terrorists as evidenced by: (1) number of Americans they have killed on U.S. soil; (2) belief in free trade and capitalism; (3) level of ideological expansionism; and (4) announced policy toward the liberal world order, among others. They conclude that "China is a much more ambiguous potential foe than are terrorists, the true threat of today" [17]. The U.S. should, as a result, focus on improving capabilities to target international terrorism instead of building up weapons systems and capabilities for highly unlikely, large-scale ground wars of the past.

The Next American Century successfully tackles the most common arguments for why the U.S. should feel threatened by the rise of pivotal powers. In its full form, the book is an encyclopedia of talking points on why global power is not a zero-sum game: international trade is good and should be encouraged; China is at least 40 years away from being a real military threat; dependency on foreign oil is not exceedingly dangerous; and it does not matter if foreign firms own American ones. The U.S., they argue, cannot insist on keeping other powers poor and weak in the name of self-preservation. Hachigian and Sutphen now have a like-minded leader in Washington, a sign of growing agreement with the ideas and aims of this book. They aptly quote President Obama's inaugural address: "Our power alone cannot protect us" [xiii].

Perhaps the biggest criticism of this work is that it appears a bit unrealistic at times and too quickly dismisses the challenges of ideological differences. Illiberal politics in Russia and China are real roadblocks for the United States. Market transactions, which will no doubt drive our collaboration with the pivotal powers, are not immune to ideological battles. And cooperation in the market cannot be a cure-all for U.S. foreign policy. Google’s difficulties with Chinese internet censorship laws are a case in point. Additionally, while most agree on the need to keep and create jobs in the U.S. and attract investment from abroad, doing so is all the more challenging when other states cannot or do not regulate wages or adequately enforce tax laws.

Hachigian and Sutphen have created an impressive manual of policy prescriptions for the U.S., both domestically and internationally. Their policy recommendations—manage the national debt; construct collaborative relationships with the pivotal powers; revamp the education system to maintain leadership in innovation; put additional protections in place for workers; foil terrorist plots with smart technology and international cooperation; reduce greenhouse gas emissions; and promote liberal norms and democracy abroad—will not be easy to accomplish. But Hachigian and Sutphen are optimistic, and they have outlined a coherent and sensible strategy for moving forward.

‡Meaghan Monfort is a Master of Public Policy candidate at the Sanford School of Public Policy, Duke University. She earned her Bachelor’s degree in International Relations and Religion from Syracuse University in 2008. As a Pickering Foreign Affairs Fellow, Monfort has worked at the Department of State in the Bureau of Political-Military Affairs and in the Political Section of U.S. Embassy Moscow, Russia.


An Analysis of Risk Factors in the North Carolina Teaching Fellows Program

Alesha Daughtrey‡

Abstract
This study is a first attempt to quantify attrition rates for participants in the North Carolina Teaching Fellows Program and to identify risk factors for attrition. This study finds that Fellows are more likely than other teacher candidates to enter and remain in the classroom. Access to a professional mentor is associated with large and significant reductions in attrition risk for Fellows. Factors such as teaching in a high-needs school and teaching in districts geographically proximate to Fellows' home districts were also associated with a smaller, but significant, decrease in attrition risk. These results suggest mentoring and other instructional supports, as well as attention to placement in first teaching positions, are critical to improving retention of Fellows—particularly in high-needs schools.

Introduction
Public schools in every state, including North Carolina, struggle with problems of inadequate teacher supply. North Carolina needs more than 10,000 new teachers each year due to enrollment growth, class size reduction initiatives, retirements, and high teacher attrition rates.[1] Attrition is a particular problem for novice teachers. Sixty percent of all teacher candidates choose not to enter the classroom at all after completing their professional training.[2] By the fifth year of teaching, about half of novice teachers have left the profession.[3] Shortages of well-qualified teachers could put school quality at risk and push schools afoul of federal and state requirements to place properly credentialed and effective teachers in every classroom. If most “leavers” depart from high-needs schools with large concentrations of economically disadvantaged or minority students, attrition might also contribute to educational inequities within districts. This study is a first attempt to quantify attrition rates for participants in the North Carolina Teaching Fellows Program and to identify risk factors for attrition.

Background
The North Carolina Teaching Fellows Program (NCTFP) was established by the state’s General Assembly in 1986 to recruit talented high school students into the teaching profession and provide them with intensive preparation for school-based leadership and success in the classroom. In exchange, Fellows must repay their scholarships with four years of full-time teaching service in the North Carolina public schools. This arrangement is designed to provide the state with a more stable and well-prepared supply of public school teachers. The NCTFP, then, could be an effective response to teacher supply challenges—if Fellows are more resistant to attrition risks than other novice teachers.

The problem of teacher attrition and turnover
Schools’ struggles to find well-qualified teachers have popularly been described as a “teacher shortage,” but this phrase reflects an inaccurate depiction of the staffing problems experienced by public schools. In fact, preparation programs nationwide train more than enough teachers annually to fill current positions. Through the 1980s and 1990s, traditional certification programs multiplied, with an accompanying jump of about 50 percent in the number of teachers trained.[4] As of 2000, there were six million trained teachers in the United States—about double the number needed to fill positions nationwide.[5] The teacher shortage problem, then, is based not just on the low numbers of teachers prepared to enter classrooms, but in large part on the high number of teachers leaving them. North Carolina’s teacher attrition rate has averaged 12.8 percent over recent years.[6] Although this rate is well below the national rate of 16.8 percent,[7] it is still problematic for a state in one of the fastest-growing parts of the country.

Highest attrition rates in early and late career teachers
Changes in teacher labor markets have contributed to the teacher shortage. Well-educated women and minorities once had few employment opportunities aside from teaching. However, as more prestigious and high-paying careers have opened to these groups, many of these teachers have opted away from the profession.[8]

Typically, teacher attrition patterns follow a U-shaped distribution, with most departures coming either early or late in teachers' careers.[9] Almost a quarter of North Carolina's public school teachers have 20 or more years of experience, presaging a "teacher bust" as these individuals reach eligibility for retirement.[10] Yet retirements are still outpaced by all other reasons for leaving the profession by five to one.[11] Indeed, attrition is most common for early career teachers.[12], [13], [14] Finding ways to retain the best-prepared and most effective novices could be a critical component of solving the challenges of teacher supply, especially in high-needs schools and districts or hard-to-staff positions.

Causes of attrition among novice teachers
Reducing high novice attrition rates may be one means of assuring a greater supply of well-prepared teachers in high-needs schools, but doing so requires an understanding of why novices leave. Several studies in other states have indicated that in general, teachers prefer to teach in schools with fewer high-needs students (i.e., students who are low-performing, have special needs, or are low-income or minority).[15], [16] North Carolina does not gather data from exiting teachers about whether issues related to school composition or other school-based factors affected their career decisions. However, novices may have to wait several years before becoming eligible for a preferred school or position, or may even be displaced to a less desirable school by more senior teachers, because transfers among schools and teaching assignments within schools are commonly granted on the basis of seniority. Some novices, then, might choose to leave teaching altogether rather than remain in an especially difficult position.

Teacher labor markets also tend to be highly localized, with most teachers preferring to teach in schools and districts close to where they grew up.[17] Even when teachers choose to locate farther from home geographically, the pull of home manifests itself in other ways; they predominantly choose areas that are similar to the areas in which they grew up.[18]

School-level supports may be among the most important factors in retaining novice teachers. Novices often feel overwhelmed by and underprepared for the demands of real classrooms, especially in high-needs schools.[19] Prior research suggests that high-needs schools exhibit higher transiency and attrition rates for teachers of all experience levels.[20], [21] Where novices are provided with access to induction programs or mentors to ease their transition into the profession, they are far more likely to remain beyond their first year[22] and to generate achievement gains among their students.[23] North Carolina novices are supposed to receive an Initial Licensure Program (ILP) mentor to guide them through their first three years in the classroom,[24] but since each district oversees its own ILP, quality and availability of mentors may vary widely. National surveys of teachers also show that poor leadership and insufficient support from school administrators are central reasons why many teachers leave their schools.[25], [26], [27]

Low teacher salaries are often cited as the cause for departure from the profession. However, poor school conditions and insufficient teacher support seem more important in teachers’ decisions to leave high-needs schools than their compensation.[28] Even so, teacher pay in North Carolina remains about four percent under the national average and well below that of state employees in other fields with equivalent experience and education levels, despite recent modest increases in teacher salary schedules.[29], [30]

Financial and educational costs of teacher attrition
Attrition is a particular problem for high-needs schools. These schools often have difficulty attracting qualified candidates to begin with, since they are perceived as having the least support and the most problems with student behavior and violence.[31] Once hired, teachers are twice as likely to leave high-poverty schools as compared to low-poverty schools and equally likely to leave schools with relatively high proportions of minority students.[32], [33] The less experienced teachers whom these schools are typically forced to hire as replacements generate lower value-added gains for their students.[34], [35] Thus, teacher attrition can undercut educational quality, particularly in schools that most need well-qualified, effective teachers.

Beyond the intangible costs in educational quality and equity, attrition is also a costly problem from a financial perspective. Hiring and retraining expenses are estimated at a national total of between $2.6 billion[36] and $7.3 billion[37] annually. Since high-needs schools have more severe attrition problems, these costs fall disproportionately on the districts that can least afford them. Given the scope of the problems triggered by attrition in the profession, minimizing teacher attrition could be a cost-effective way for public schools to resolve both a human resources and an educational quality challenge.

The role of the North Carolina Teaching Fellows Program
The North Carolina Teaching Fellows Program was primarily designed to increase the number of well-trained teachers entering the classroom, but elements of the program also correspond with research-based best practices for reducing attrition. State-allocated resources provide Fellows with more intensive clinical preparation for the real world of teaching than their peers in the state's traditional or alternate-route certification programs. Fellows also receive additional support, mentoring, and professional development through all four years of undergraduate study.[38] The intensity of Fellows' preparation and the program's emphasis on classroom experience should theoretically render participants more resistant than their peers to the real-world stresses of novice teaching.

Fellows are not guaranteed specific teaching placements or assignments that might further cushion any "entry shocks" from the demands of the profession. However, because of the NCTFP's rigorous academic requirements and selection process, it is considered a prestigious program and its graduates are highly sought after by districts statewide. Thus, NCTFP alumni are likely to be a special group of novices, with a greater degree of control over where and what they teach than most early-career teachers.

Finally, Fellows have a strong financial disincentive to leave prior to completing a fourth year of teaching, since they must otherwise pay back their scholarship loan. This factor should provide additional motivation to remain in the teaching profession, at least for as long as their service payback period lasts.

Data and methods
Data sources and key variables
The primary data source used was the North Carolina Teaching Fellows Program master file, which provides information on Fellows' demographic and personal characteristics, as well as data about their academic preparedness and participation in professional development opportunities. Information about the schools and districts in which particular Fellows taught during their service payback period comes from a second NCTFP data file on Fellows' teaching career records. All NCTFP data are based on self-reports given as part of the students' applications to the program, or on written responses to annual audit surveys of Fellows' academic status (during college) and teaching status (during their payback periods). Data on the test scores and demographic composition of the schools Fellows taught in or attended, as well as current statistics on statewide teacher attrition, were publicly available from various reports produced annually by the North Carolina Department of Public Instruction.

Methods
Non-licensure rates for each cohort (1987-2004) were calculated and compared to test for variation over time in the NCTFP’s success in fully preparing Fellows to enter the classroom. In order to focus attrition analyses on only those Fellows who completed their professional preparation for teaching, attrition was defined as having occurred only among Fellows who successfully completed their undergraduate education, received teaching licensure, and went on to complete—fully or partially—a teaching service or monetary payback to the NCTFP. Thus, Fellows who failed to complete all program or degree requirements while in college, who were deceased before completing their payback periods, or for whom full personal, high school or college data were not available were not included in the analyses. The study also excluded cohorts graduating after 2000 because they did not have time to complete their service payback by 2004 (the last year of available data). To avoid any distortions in the program data due to NCTFP start-up issues, the first three cohorts were also dropped. This yielded a smaller sample of 3,440 Fellows, or about 51 percent of the full sample.

Constructing a life table for a pooled group of all ten cohorts provides a general picture of the timing of attrition within the NCTFP and controls for any distorting variation in attrition rates among the cohorts. Life tables were also assembled for each cohort separately to reveal any changes or trends in the timing of attrition across cohorts.
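As a rough illustration of how such a life table is built (the counts below are invented for illustration and are not NCTFP data), each year's conditional hazard and the cumulative survival proportion can be computed as:

```python
def life_table(at_risk, leavers):
    """Per-year conditional hazard and cumulative survival, life-table style.

    at_risk[t]: Fellows still in service payback at the start of year t
    leavers[t]: Fellows who terminate payback during year t
    """
    rows, surv = [], 1.0
    for n, d in zip(at_risk, leavers):
        hazard = d / n          # conditional probability of leaving this year
        surv *= 1 - hazard      # probability of surviving through this year
        rows.append((hazard, surv))
    return rows

# Invented counts for Years 0-3 (NOT the actual NCTFP figures):
at_risk = [1000, 885, 840, 810]
leavers = [115, 45, 30, 40]
for year, (h, s) in enumerate(life_table(at_risk, leavers)):
    print(f"Year {year}: hazard = {h:.3f}, cumulative survival = {s:.3f}")
```

Pooling all cohorts into one such table smooths out cohort-to-cohort fluctuations, while per-cohort tables expose trends in the timing of attrition over time.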

To test whether particular personal and school characteristics are correlated with attrition, a Cox proportional hazards regression model was estimated:

hj(t) = h0(t) exp(xjβ)

where hj(t) is the hazard that a given Fellow j terminates her service payback at time t, conditional on her having remained in active service payback up to that point; h0(t) is the baseline hazard common to all Fellows; and xj is a vector of independent variables likely to affect the Fellow's decision to leave. These variables include the proportion of low-income students in the school in which she teaches; the proportion of minority students in the school; the school's test scores; her own test scores; her GPAs in both high school and college; her college or university; her ethnicity; the socioeconomic status of her own family of origin; and the presence of a mentor during her initial years in the classroom.
To discover the extent to which known risk factors for novice attrition were associated with actual attrition among Fellows, a series of dummy variables indicating the presence of these factors was added to the dataset. These included variables for high concentrations of students who performed poorly on state standardized tests, who received free or reduced-price lunch (used as a proxy for socioeconomic status), or who were non-white minorities. A dummy variable for proximity to home indicated whether Fellows taught in their home county or an adjoining one. Finally, to test whether Fellows' decisions to leave the classroom might operate differently over time, a logistic regression using the same variables as the Cox model was also estimated.
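As a minimal sketch of the hazard specification above (the coefficients and covariate values here are invented for illustration; they are not estimates from the NCTFP data), the hazard for one Fellow can be computed as:

```python
import math

def cox_hazard(h0_t, x, beta):
    """Cox proportional hazard: h_j(t) = h0(t) * exp(x_j . beta)."""
    return h0_t * math.exp(sum(xi * bi for xi, bi in zip(x, beta)))

# Hypothetical dummy covariates for one Fellow, as in the text:
# [teaches in a high-needs school, teaches near home county, has an ILP mentor]
x = [1, 0, 1]
# Invented coefficients: a positive value raises attrition risk, a negative one lowers it.
beta = [0.30, -0.15, -0.60]

h0 = 0.10  # illustrative baseline hazard at some year t
print(cox_hazard(h0, x, beta))  # 0.10 * exp(0.30 - 0.60) ≈ 0.0741
```

Under this model, exp(β) for a single dummy variable is the multiplicative change in attrition risk associated with that factor; the invented mentor coefficient of -0.60 above would correspond to roughly a 45 percent reduction in risk, since exp(-0.60) ≈ 0.55.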

Limitations of the data
The lack of school-level and classroom-level information about Fellows’ teaching experiences does not allow for controls for variation in teaching experiences within individual schools. The omission of controls for classroom-level effects may bias estimates, although the direction of this potential bias is unclear. Data about the quality of ILP mentors assigned is also not available.

Importantly, because of omitted variables, none of the results of the models can demonstrate a causal relationship between Fellows’ attrition and the characteristics of Fellows or the schools in which they teach and learn. Fellows’ career decisions, like those of other novice teachers, are influenced by a number of other variables omitted in this data (e.g., fluctuations in the overall economy, perceived career opportunities in other fields, withdrawals from the labor force, childbearing, or other changes to family structure). However, associative links between attrition and specific characteristics of Fellows or their schools can still be useful to the NCTFP and other teacher preparation programs, in terms of identifying the best leverage points for reducing attrition among graduates.

Results
Preparing Fellows to enter the classroom: Non-certification trends
Comparison of non-certification rates for the entering Fellows cohorts of 1987 through 2004 shows that among 6,704 graduated Fellows, 1,080 of them—or 16.1 percent—did not successfully complete their teaching licensure. However, about 10 percent of these "non-certified" Fellows (111 in total) did go on to complete at least one year of service payback to the NCTFP. This fact suggests that these Fellows may have entered the classroom without having completed all licensure requirements (such as passing Praxis certification exams) or with licensure applications still pending with the state, and did become licensed soon thereafter. The timing of licensure for some Fellows may therefore have biased these non-certification numbers upward.

Figure 1: NCTFP non-certification rates, by cohort (1987-2004)

Source: Author’s tabulation of NCTFP data

As shown in Figure 1, non-certification rates for Fellows have declined over time, from a high of nearly 23 percent to just over 13 percent for the most recently graduated cohort. Each of the past six cohorts has had a non-certification rate under the mean rate for the program. Since the very highest non-certification rates occurred in the early years of the program, it is possible that start-up issues for the program contributed substantially to non-certification rates, providing further justification for the exclusion of these cohorts from the models.

Attrition trends
Life table analysis (see Table 1) illustrates attrition patterns among Fellows over their four years of teaching experience. The point at which Fellows have graduated, been licensed, and are preparing to enter the classroom is considered Year 0 in this analysis. Approximately 73 percent of licensed Fellows complete four years of teaching service payback. While an overall attrition rate of 27 percent might seem high, it is important to note that the greatest attrition occurs in Year 0, before Fellows have even entered the classroom. National estimates of teacher attrition prior to entering the classroom are approximately 60 percent; against that benchmark, the NCTFP Year 0 attrition rate of 11.5 percent compares very favorably.
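As a sketch of the life-table calculation behind Table 1, the snippet below computes year-by-year conditional attrition (the hazard) and cumulative survival for a hypothetical cohort. The cohort size and departure counts are invented for illustration, chosen only so that Year 0 attrition (11.5 percent) and four-year completion (73 percent) match the figures cited above:

```python
# Life-table (survival) calculation for a hypothetical cohort of 1,000
# licensed Fellows. Departure counts are invented for illustration and
# are NOT the NCTFP data, though they reproduce the cited 11.5 percent
# Year 0 attrition and 73 percent four-year completion rates.
cohort = 1000
departures = {0: 115, 1: 50, 2: 40, 3: 65}  # leavers observed in each year

at_risk = cohort
survival = 1.0
for year in sorted(departures):
    hazard = departures[year] / at_risk   # conditional attrition rate
    survival *= 1 - hazard                # cumulative probability of remaining
    at_risk -= departures[year]
    print(f"Year {year}: hazard = {hazard:.3f}, survival = {survival:.3f}")
```

In this toy cohort the Year 3 hazard exceeds the Year 2 hazard, mirroring the uptick pattern discussed in the attrition trends that follow.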

The average overall attrition rate across cohorts drops considerably, to 16.1 percent, if we consider only those Fellows who leave after having entered the classroom – i.e., only attrition in Years 1 through 3 – as the state of North Carolina does in calculating its attrition rates. (See Figure 2.)

Figure 2: NCTFP attrition rates after entrance to the classroom, by cohort (1987-2000)

Source: Author’s tabulation of NCTFP data

Figure 2 also shows some variation in attrition rates across cohorts. A certain amount of variability can be expected purely as a matter of chance. It is also important to note that the data for more recent cohorts (1998 and later) are right-censored due to the structure of the program. Fellows have seven years after graduation to complete their four years of service payback, so these recent cohorts are still inside the seven-year window and may complete service paybacks at a later date. Attrition rates for these cohorts are thus likely to be biased upward.

Attrition rates among Fellows also vary by the year of teaching experience, as shown previously in Table 1 and again in Figure 3 below. As previously discussed, attrition is highest in Year 0, before Fellows’ entry into teaching, then drops during Years 1 and 2 before rising again in Year 3. This Year 3 uptick departs from the relatively steady decline in attrition across all years that the literature suggests, as novice Fellows acclimate to their classroom careers and near completion of the service paybacks that discharge their debts to the state.

Figure 3: NCTFP attrition rates, by year in which departure occurred (1990-2000 cohorts)

Source: Author’s tabulation of NCTFP data

Factors associated with Fellows’ decisions to leave the classroom
A number of personal and school characteristics have been shown to affect attrition among novice teachers in general. A Cox proportional hazards regression model is used here to test whether, and to what degree, these factors may influence a Fellow’s decision to leave the profession. Table 2 shows results from the full model; these results are discussed in greater detail below.

Table 2: Change in the hazard ratio of attrition among Fellows

Hazard ratios in the far right column of Table 2 indicate the extent to which each covariate in the model affects the risk of attrition. A ratio equal to one indicates a covariate with no effect on attrition risk, a ratio greater than one indicates increased risk, and a ratio less than one indicates decreased risk. For example, reading across the top row of Table 2, female Fellows are 8.4% less likely than males to leave in any given year (a hazard ratio of roughly 0.92, meaning they face about 92% of the attrition risk of their male counterparts), a modest effect that is not statistically significant.
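The percent-change reading of a hazard ratio is simply (ratio − 1) × 100, as a minimal helper illustrates. The 0.916 value below is inferred from the 8.4 percent figure in the text, since Table 2 itself is not reproduced here:

```python
# Converting Cox-model hazard ratios into percent changes in risk
# relative to the reference group.
def percent_change(hazard_ratio: float) -> float:
    return (hazard_ratio - 1) * 100

# 0.916 is inferred from the "8.4% less likely" figure discussed above.
print(percent_change(0.916))  # about -8.4: an 8.4 percent risk reduction
print(percent_change(1.30))   # about +30: a 30 percent risk increase
```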

Presence of mentors
The variable exhibiting the highest correlation with Fellows’ attrition is whether they were mentored. Fellows who reported having an ILP mentor assigned to them had 97 percent less risk of attrition than Fellows who did not receive a mentor. Although substantial effects for mentoring might be expected in light of prior research on novices generally, this is still a large effect. The presence of a mentor also appears to be functioning independently of all other variables; estimates change in magnitude but not in sign or significance when the mentor variable is isolated.

Effects of high-needs schools
Contrary to what would be expected from prior research on teacher attrition, teaching in high-needs schools appears to be associated with decreased risk of attrition for Fellows. High concentrations of economically disadvantaged students did not significantly impact attrition risk. Teaching in low-performing schools is associated with a 17 percent decrease in attrition hazard and teaching in high-minority schools with a 27 percent decrease, both of which are significant at the p<.01 level.

Preferences for proximity to home or familiar environments
A possible explanation for Fellows’ resilience in high-needs environments is that they may have attended similar schools themselves as high school students. They may therefore have a preference to match characteristics of the schools in which they teach with those of their hometown schools. However, controlling for whether Fellows attended high-needs schools themselves did not significantly affect their rates of attrition.

While Fellows do not seem to have a specific preference for matching characteristics of their teaching environments with those they experienced as students, they do exhibit a preference for returning to their hometowns. Fellows who taught in the same county where they had attended high school, or in an adjoining county, showed a 22 percent decrease in attrition hazard. This finding was robust to controls for returning to the same high school they had attended as students, and for whether they taught in school environments similar to that of their own high schools. In this respect, Fellows’ preferences for home reflect the preferences of novice teachers overall.

Other factors related to attrition risk
In general, individual and family characteristics of Fellows were not associated with their attrition hazard. Interestingly, a Fellow’s own socioeconomic status and race showed no significant correlation with hazard of attrition when teaching in schools with high concentrations of economically disadvantaged and minority students. This finding reinforces the likelihood that Fellows are not matching their preferences for teaching environments with those of their home high schools, which are likely to serve students from backgrounds similar to the Fellows’ own.

Prior research suggests that teachers’ own cognitive abilities or academic achievements do not correspond with their level of success in the classroom, except at the secondary level, but there is currently no information on how these factors might affect their persistence in the profession. Fellows’ academic profiles appear not to correlate significantly with any change in hazard of attrition, except for their cumulative college GPAs. Each 0.10-point increase in GPA was associated with an 18 percent decrease in attrition risk, when controlling for Fellows’ prior academic performance.
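Because Cox-model effects are multiplicative on the hazard, the per-0.10-point ratio compounds across larger GPA differences. A brief sketch, where the 0.82 ratio is inferred from the 18 percent figure above rather than taken from Table 2:

```python
# Proportional-hazards effects compound multiplicatively. The ~18 percent
# decrease per 0.10 GPA point implies a hazard ratio of ~0.82 per step
# (inferred from the text, not taken from Table 2).
HR_PER_TENTH = 0.82

def implied_hazard_ratio(gpa_diff: float) -> float:
    """Implied hazard ratio for a GPA difference of `gpa_diff` points."""
    steps = gpa_diff / 0.10          # number of 0.10-point increments
    return HR_PER_TENTH ** steps

# A Fellow with a GPA 0.30 points higher would face roughly
# 0.82 ** 3, or a bit over half, the attrition hazard.
print(round(implied_hazard_ratio(0.30), 2))
```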

Changes in effects over time
Results of logistic models were roughly equivalent to those for the Cox model presented in Table 2. The few detectable differences in magnitude were extremely modest and not significant. Thus, the correlations between the above factors and Fellows’ attrition decisions appear not to differ in any meaningful way from year to year over the span of their service payback period.

Discussion
No data are available to indicate definitively why so many Fellows would not have fulfilled the certification requirement of the NCTFP. The most likely explanation—apart from issues related to timing of licensure—is that some Fellows realized during their training that the teaching profession was not a good career fit for them. It is hard to argue that any teacher preparation program should seek to place unwilling or under-motivated teachers in classrooms. Thus, this could be viewed as “good attrition” that may in fact improve efforts to train and place committed, effective teachers in public school classrooms.

The patterns of rising and falling attrition rates across cohorts also suggest Fellows’ responsiveness to changing economic conditions. Recall that there is a four-year lag between a cohort’s entrance into the NCTFP and its entry into the workforce. Fellows’ retention appears to follow a slightly countercyclical pattern: more Fellows opt to complete service payback when the economy is less robust and the security of a teaching position is attractive in comparison with other opportunities for recent graduates. Conversely, in good economic times, Fellows’ attrition might increase because they perceive more, and better paid, opportunities in other fields. This explanation would account for the relatively lower attrition among Fellows in the 1990-1991 cohorts, who entered the workforce when the economy was still recovering from the recession of the early 1990s, and the relatively higher attrition among cohorts a few years later, who graduated in the midst of the technology boom. However, until the most recent cohorts have had time to complete their service requirements, it will be difficult to know whether this pattern is borne out by the data.

The fact that Fellows’ attrition risk does not decline in a linear fashion between Year 0 and Year 4 of service payback appears counterintuitive. One possible explanation is that the year-by-year accrual of service payback, coupled with the fact that initial teaching licenses must be renewed after the third year,[39] offers a perverse incentive. If Fellows can pay down the bulk of their debts through teaching during their initial licensure period, they could then opt to make a cash payoff of the much-reduced balance—without going to the additional trouble of renewing their teaching licenses and with the possibility of higher earnings in another field.

Another reasonable explanation is that starting salaries for first-year teachers—currently around $30,000—are similar to or higher than those of other first-year professionals. However, because teachers’ salaries rise more slowly in the first few years and do not peak until teachers are mid-career, the marginal financial benefit of remaining in the profession erodes fairly quickly. The fact that Fellows have higher than average achievement may make them attractive candidates for entry-level positions outside education, and therefore more likely than other novice teachers to leave the profession for financial reasons.

It is likely that effects observed for presence of an ILP mentor in these models are not attributable to mentoring alone. Mentoring programs, though mandated by the state of North Carolina for novice teachers, are implemented at the local level. District administrators, and sometimes even school-level administrators, are responsible for identifying and training master teachers to become mentors, matching them with novice teachers and ensuring the success of those relationships. The quality and efficacy of mentoring programs and relationships is therefore highly dependent on a variety of district- and school-level factors. Districts and schools with strong leadership, an emphasis on cooperation and collaboration among staff, and experienced teachers who are eager to assist new colleagues (formally or informally) are more likely to be able to find appropriate mentor/mentee matches for Fellows and other novices.

Furthermore, the same qualities that foster strong mentoring programs may also be associated with working conditions that are conducive to retaining novice teachers, even in the absence of any mentoring programs. Thus, estimates of hazard reduction as a result of access to mentors are probably overstated. Data on school or district leadership styles, working conditions in the schools in which Fellows teach, and how well formal and informal professional supports function are not included in the currently available data, and thus cannot be controlled for here. Still, given the significance and magnitude of these effects, it is reasonable to conclude that mentoring relationships are a highly important factor in the retention of Fellows and other novice teachers. These results provide evidence in favor of providing all novice teachers with ready and regular access to mentors of consistent and high quality.

The reversal of usual attrition risk factors for novice Fellows teaching in high-needs schools is somewhat puzzling. It is possible that Fellows might have primarily low-needs students in their classrooms, even if they are teaching in schools with predominantly high-needs students. However, at least in the middle and upper grades, it is common practice to assign more advanced (and therefore often lower-needs) classes to the most senior teachers. Therefore Fellows, as novice teachers, are more likely to receive higher proportions of high-needs students than the average for their schools. This would tend, if anything, to bias hazard estimates negatively. Actual attrition risk based on these factors might thus be even lower than indicated here, if we were able to examine classroom rather than school composition.

It is also possible that Fellows differ from most other teachers and novices, either in their motivations as teachers or in their preparation for (and thus satisfaction with) more challenging teaching positions. Because the NCTFP requires a commitment to teaching in the public schools, Fellows are well aware that they are not likely to teach predominantly advantaged students. The NCTFP emphasizes in-school preparation for Fellows, starting with classroom observations and school visits in their first year of college, so they are likely to be better acclimated than most other novice teachers to the realities of challenging public school environments. As a result, Fellows may have more pedagogical and classroom management tools appropriate to high-needs schools than other novice teachers.

Individuals drawn to the NCTFP may also have a preference for teaching in high-needs schools, hoping to make a difference in the educational outcomes of the least-advantaged students. Unfortunately, this hypothesis cannot be tested with available data. Nonetheless, these findings suggest that careful attention during candidate recruitment to motivations for entering teaching, as well as thorough preparation for teaching in high-needs schools, is likely to serve any teacher preparation program well in terms of reducing attrition among its graduates.

Implications for policy and practice
Strengthening support systems for novices: a key to teacher retention
This study finds that the primary factors associated with successfully retaining novice teachers are those related to positive and supportive working conditions within schools, including the presence of formal mentors. Fellows’ relatively low rates of attrition in comparison to other novice teachers also suggest that the level of additional preparation that Fellows receive—whether as part of experiences in schools during their four years of college or in extra professional development and enrichment programs—may render them more resilient as they face the pressures of full-time teaching and thereby could improve their retention. The results presented here are in line with those of previous studies examining novice teachers, suggesting that policies and practices that enhance school-level support systems for Fellows and other novices could substantially reduce attrition in the first few years of their careers.[40], [41]

Managing preferences to teach close to home
Like other novices, Fellows exhibit a significant preference for teaching close to home. Such preferences among teachers may pose challenges for placing novice teachers in districts with a particularly high demand for new teachers. Targeting Fellows’ teaching service placements for high-demand districts may therefore require unique strategies. In particular, program administrators should consider: 1) advocating for pay incentives to shift Fellows’ preferences towards teaching in high-demand districts; 2) intensifying recruitment for the Fellows program in the districts or geographic areas with the greatest projected demand for new teachers; or 3) ensuring, at a minimum, an even geographic distribution of students selected for the Fellows program.

Predicting retention for individual Fellows
This study finds that among the numerous characteristics of Fellows and their families of origin examined, only one is associated with significant changes in attrition hazard: cumulative college GPA. Based on prior research among novice teachers, it seems fair to surmise that GPA is functioning as an indicator of increased cognitive ability or preparedness in the subject area (especially at the secondary level), either of which might increase retention of Fellows. However, no other element of Fellows’ academic records, in high school or early in college, is similarly associated with attrition. This suggests that such characteristics might not prove useful for identifying and targeting those Fellows most likely to complete teaching service payback.

Conclusion
The above findings regarding the importance of mentoring and proximity to home for Fellows are very much in line with what is known about novice teacher retention generally; any teacher preparation program would be well served by attending to these concerns. More interesting and unusual is that Fellows seem to be significantly more resilient in high-needs, hard-to-staff schools than other teachers, particularly early-career teachers. In this regard, the implications for other preparation programs are less clear. However, research on urban teacher residency (UTR) programs in Boston, Chicago and elsewhere suggests that their graduates are also more immune to these risk factors for attrition than other novices.[42], [43] These programs share NCTFP’s ties to a specific geographic region (in the case of UTRs, specific metropolitan areas) and an intensively clinical focus that is paired with one to two years of high-quality coursework on content, methods, and classroom management. It seems possible that these elements constitute best practices for preparing not only high-quality but “attrition-resistant” teachers. Additional study of these issues would be helpful to identify which program elements are most important to reduce attrition, but these preliminary findings provide a promising avenue for future policies and practices to combat novice attrition.

‡Alesha Daughtrey received her Master of Public Policy in 2009 from the Sanford School of Public Policy, Duke University and her Bachelor’s degree in English (with secondary teaching licensure) from the University of North Carolina at Greensboro in 1997. Daughtrey currently leads research initiatives at the Center for Teaching Quality, a national nonprofit focused on policy, research, and teacher leadership to improve student learning and advance the teaching profession.

Alternative Responses to Climate Change: An Inquiry into Geoengineering

Posted on September 8, 2010 by Editor

by Aaron Ray‡

Abstract
References to geoengineering, loosely defined as the use of advanced technology to mitigate or adapt to the effects of climate change, have recently emerged in the popular press and academic literature. Governments, international organizations, and the scientific community are beginning to regard geoengineering seriously as a tool against global climate change. This paper examines the ways in which geoengineering has been defined, evaluates a number of leading proposals, considers some scientific critiques of geoengineering, and highlights some of the ethical, philosophical, and political implications of these proposals.

Introduction
Human alteration of the environment is not new. Although human transformation of the natural environment predates the Industrial Revolution, the potential of human behavior to alter the climate on a global scale has accelerated with the development of industrial civilization.[1] Within the debate about the veracity and implications of climate change and the costs and benefits of potential responses, a new line of argument has gained prominence. Journalist Graeme Wood articulated the potential of geoengineering: “What is new is the idea that we might want to deform the Earth intentionally, as a way to engineer the planet either back into its pre-industrial state, or into some improved third state.”[2] An examination of the implications of these proposals is necessary.

Defining geoengineering
Before evaluating various geoengineering proposals, it is essential to establish an understanding of what is meant by “geoengineering.” Writing more than a decade ago, Thomas Schelling observed that geoengineering was such a new concept that it lacked a definition.[3] He defined geoengineering as having three characteristics: “global, intentional, and unnatural.”[4] This means that human activities such as building dams and clearing land for agricultural use, while having potentially profound local or regional impacts on humans and the environment, should not be considered geoengineering because they are not global in impact. The second requirement, intentionality, excludes the burning of fossil fuels and other activities whose effects are not intended to alter the natural global environment.[5] David Keith, director of the Energy and Environmental Systems group at the University of Calgary, agrees that, because it is not intentional, pollution itself does not constitute geoengineering.[6] The final requirement an activity must meet to be labeled geoengineering is that it must be unnatural. For instance, adding metallic reflectors to the atmosphere, as some suggest, is an unnatural act and therefore a form of geoengineering.

For the purposes of this paper, I will use Schelling’s primary definition of geoengineering as something global, intentional, and unnatural.

Evaluating geoengineering proposals
Geoengineering proposals can be divided into two major classes: those that manipulate the amount of carbon in the atmosphere and those that manipulate insolation, the amount of solar radiation reaching the earth. Proposals in both classes merit evaluation under Schelling’s definition.

Manipulating carbon
Carbon capture and sequestration has become part of the mainstream conversation about mitigating climate change. This technique involves capturing carbon dioxide (CO2) at the point of combustion and preventing it from entering the atmosphere, possibly by injecting it into underground geologic formations. A related proposal is to directly remove CO2 from the air via carbon scrubbers. Such a proposal differs from traditional carbon capture and sequestration because it captures CO2 out of the ambient air, not at the point of combustion.[7] Schelling has expressed support for research into carbon capture and sequestration, finding that, despite its high costs, this technique could prove valuable to mitigating the negative effects of climate change. He further stated that carbon capture “probably has no adverse effects, [and] will probably not scare anybody or provoke religious objections. If it works it may be exceptionally valuable.”[8]

Physicist Freeman Dyson of the Institute for Advanced Study at Princeton has suggested that one way to remove CO2 from the atmosphere is to plant fast-growing trees on large amounts of marginal land. He qualifies his proposal as a temporary way to reduce CO2 levels, explaining that planting trees will simply buy time for scientists and policymakers to devise a more permanent solution: a shift from fossil to renewable fuels.[9]

Another method of manipulating CO2 levels is ocean fertilization, which involves the application of iron particles or other substances to the ocean’s surface to encourage phytoplankton growth. During their growth cycle, the phytoplankton capture atmospheric CO2. When they die, these phytoplankton then carry that CO2 to the ocean floor. This proposal could increase the absorption of carbon by the ocean.[10] Ocean fertilization is already being tested and commercialized. In addition to a research expedition Bruce Frost describes in Nature, Kirsten Jerch reports in the Bulletin of the Atomic Scientists that two technology companies in California are planning to perform ocean iron fertilization to sell as carbon offsets.[11],[12] Alternately, an Australian company, Ocean Nourishment Corporation, is planning to use nitrogen rather than iron to increase phytoplankton photosynthesis.[13]

Manipulating insolation
The second class of proposals involves manipulating insolation, which is the amount of solar radiation reaching the earth. Schelling encourages us to consider that, if climate change “is defined as radiation balance, people may think the problem is not just too much CO2 but also too little sulfur aerosols, too little reflective cloud cover, too little albedo.”[14] Thus defined, climate change could be “fixed,” not only by reducing the amount of CO2 released into or present in the atmosphere, but also by reducing the amount of solar radiation reaching lower levels of the atmosphere.[15] Two potential methods for manipulating insolation include aerosol injection and the use of solar reflectors.[16]

The idea for manipulating insolation derives from natural environmental mechanisms. For example, the 1991 eruption of Mount Pinatubo in the Philippines resulted in a reduction in global temperatures by half a degree Celsius in subsequent years.[17] In theory, it is possible to mimic this effect by releasing particles into the atmosphere that would reflect solar radiation back into space rather than allowing it to warm the planet. Tom Wigley of the National Center for Atmospheric Research claims that adding aerosols to the stratosphere, similar to the Mount Pinatubo eruption, will likely present only minimal climate risks.[18]

Aerosol injection is not the only strategy for reducing incoming solar radiation. An array of mirrors or solar shades could be placed into orbit around the earth to reflect solar radiation back into space. Edward Teller of the Lawrence Livermore National Laboratory reviewed a variety of methods for reducing insolation, including sulfur molecules and metallic reflectors. He concluded that modulating insolation is technically feasible and cost effective.[19] Soviet planners also considered this proposal and found it to have the advantage of being adjustable. If mirrors could be controlled, the amount of radiation could be adjusted over time and even directed toward specific regions.[20] This element of control might then be used to modulate the differential effects of reduced insolation on disparate regions.

Table 1 evaluates the preceding proposals using Schelling’s definition of geoengineering as global, intentional, and unnatural.

Table 1: Evaluating Geoengineering (GE) Proposals[21(a-f)]

Scientific critiques of geoengineering
Skeptics of geoengineering warn of the dangers of these proposals, raising two main concerns. First, critics warn that we cannot foresee geoengineering’s consequences and therefore should not risk unintended negative outcomes. Second, critics argue that there are known negative consequences of implementing geoengineering proposals.

Unintended consequences
A primary critique of geoengineering is that it carries the risk of serious unintended consequences. Jeff Kiehl, senior scientist at the National Center for Atmospheric Research, writes, “a basic assumption to this approach is that we, humans, understand the Earth system sufficiently to modify it and ‘know’ how the system will respond.”[22] The complex nature of the environmental systems being altered makes them difficult to manage effectively, even without considering possible human mistakes.[23] Climatologist Alan Robock questions the ability of scientists to model and evaluate the potential impacts of geoengineering proposals, warning that scientists cannot understand the complex interactions involved or predict the impacts of these proposals.[24] An editorial in Scientific American acknowledges that aerosol injection is “the best-studied proposal,” but also identifies significant expected dangers of this method, including that “sulfates would slow or reverse the recovery of the ozone layer; they might also reduce global rainfall, and the rain that did fall would be more acidic,” in addition to other potentially unforeseeable risks.[25] Fears of unintended consequences are also being raised about proposals aimed at reducing incoming solar radiation.[26]

Despite these concerns, some scientists argue that the risk of unintended consequences must be evaluated relative to the risks of doing nothing. Michael MacCracken, chief scientist at the Climate Institute, insists that “geoengineering is far less risky than proceeding ahead to a 4-6 C global warming as we seem to be likely to experience if negotiations keep going as they are.”[27] In short, any consideration of geoengineering proposals must occur within a wider context that includes the potential costs and benefits of multiple alternatives.

Known risks
The known risks of geoengineering are numerous, according to its critics. For example, “the Pinatubo eruption played an important role in the record decline in land precipitation and discharge, and the associated drought conditions in 1992.”[28] Hydrological effects and reduced precipitation are not the only risks. According to Alan Robock, “aerosol particles in the stratosphere serve as surfaces for chemical reactions that destroy ozone;” and therefore, adding aerosol to the atmosphere could exacerbate the ozone depletion caused by the release of chlorofluorocarbons.[29] Concerns about the risks of geoengineering led the American Geophysical Union to release a position statement on geoengineering that recommends caution because any manipulation of the Earth’s environment could result in unpredictable and potentially damaging outcomes.[30]

Critics also worry that a single-minded focus on temperature obscures other negative environmental effects of fossil fuel combustion. Climate change is only one environmental consequence of current patterns of energy production and consumption. For example, the ocean is currently “30 percent more acidic than it was before the Industrial Revolution.” If this acidification continues, it will endanger the entire oceanic biological chain, from coral reefs to humans.[31] Supporters of emissions reductions instead of geoengineering conclude that changing production and consumption choices can address many environmental challenges, while geoengineering proposals are limited solely to manipulating temperature.

Ethical and philosophical dimensions of implementing geoengineering solutions
In addition to the scientific and political questions surrounding geoengineering, there are ethical and philosophical issues that require contemplation to inform decisions about whether and how to use technology to alter the environment globally. In this section I explore concerns about distributive and procedural justice, human attitudes toward technology, and the relationship between humans and nature. Each of these issues bears on the implementation of geoengineering as policy.

Ethical considerations
One ethical dimension of this debate is the asymmetrical impact of climate change and geoengineering on nations and individuals. Martin Bunzl, a climate change policy expert at Rutgers University, notes that like climate change itself, geoengineering projects could cause shifts in climate that would be distributed unevenly across the globe. He warns that “roughly 10% of the World’s population might be worse off even if the other 90% was better off.”[32] This asymmetry raises questions of both distributive and procedural justice. The distributive question concerns who should bear the costs and benefits of geoengineering. Addressing this question requires a procedural mechanism through which the nations of the world can decide how to apportion benefits and harms. However, at the moment, we lack a credible means of deciding global questions of distributive justice. Dean suggests that scientists and engineers may be able to predict some of the consequences of geoengineering proposals, but do not have the authority to decide whether those risks are acceptable.[33] Identifying who does have this authority is exceedingly difficult. Dean thus concludes that it would be better to preempt these problems by reducing emissions of heat-trapping gases, thereby avoiding the need for geoengineering techniques.[34] While that may indeed be the best approach, it has thus far proved all but politically infeasible at the pace and scale that may be necessary. The lack of a binding international agreement to reduce GHG emissions calls into question the willingness or ability of national governments to achieve the reductions Dean proposes.

Philosophical considerations
One of the philosophical elements of the debate over geoengineering is the way in which it reveals human attitudes toward technology. The editors of Scientific American frame the issue as one of fundamental belief in the promise of technology: “If technology got us into this mess, maybe technology can get us out of it.”[35] This argument posits that GHG emissions are the consequence of a progression of technological developments that harnessed the energy stored in fossil fuels to drive economic development and rising standards of living. Surely, the reasoning goes, continued technological advancement can allow for continued economic development even while mitigating the negative consequences of growth. Keith suggests that “geoengineering implies a countervailing measure or a ‘technical fix’; an expedient solution that uses additional technology to counteract unwanted effects without eliminating their root cause.”[36]

The view that continued technological advancement will allow for unchecked energy consumption is found not only among proponents of geoengineering, but also among those who favor the use of renewable energy technologies that permit continued energy consumption with fewer known environmental costs. The hope is that the right technology can enable us to continue to consume in our current fashion—or even increase aggregate energy consumption as developing nations modernize—without incurring the negative environmental consequences of the past. A similar argument was made at the dawn of the nuclear age. Looking back on his attitude toward nuclear energy, Alvin Weinberg, former head of the Oak Ridge National Laboratory, recalled that “…you had uranium in the rocks, in principle, an inexhaustible source of energy—enough to keep you going for hundreds of millions of years. I got very, very excited about that, because here was an embodiment of a way to save mankind.”[37] Nuclear energy turned out to have monetary costs and concerns about safety and waste that have since tarnished its promise. That nuclear energy does not provide clean and free energy might be a warning to those who believe that any technological innovation can deliver humans from the fundamental economic condition of scarcity.

We often find hope in the promise of technological innovation to release us from the limits of scarcity or to mitigate the effects of previous technological developments. Reliance on this transformational power of technology may have much in common with systems of religious belief. Writer David Noble argues that technological development is a religious endeavor involving the merging of technology and faith.[38] Noble summarizes the historical arc of faith in technology from steam power to geoengineering with this observation: “We demand deliverance. This is apparent in our virtual obsession with technological development, in our extravagant anticipations of every new technical advance – however much each fails to deliver on its promise.”[39] It may be that geoengineering represents only the latest technology we have adopted out of faith in its promise, with no guarantee that it will deliver.

A final philosophical implication of geoengineering relates to the relationship between humans and the planet we inhabit. Keith raises one of the fundamental questions of philosophy: should we maximize the benefits to mankind by exploiting all the tools at our disposal, or should we preserve nature by minimizing our interference with it?[40] The first position is one of active management, the drive to extract the most utility from a set of finite resources. The second position is that of minimal impact, using as few of those resources as necessary to satisfy our basic needs.

The promise of technological development is integral to the active management position: we can use technological innovations to extract more utility from the finite resources available to us while minimizing the environmental costs. Geoengineering proposals clearly tend toward the active management position. We have tools to manipulate the environment so as to maintain or create the optimal conditions for human growth and development. Active management is a fundamentally human-centric proposition: the value of any natural resource stems from its capacity to provide for human welfare. Keith addresses this point by asking: “Is human welfare the sole consideration, or do we have a duty to protect natural systems independent of their utility to us?”[41] It is not at all clear whether we in fact have such a duty, or even from where such a duty might emanate.

Some of the opposition to both continued GHG emissions and geoengineering proposals is rooted in the minimal impact position. This view argues not for new technology that allows for continued energy consumption, but rather for changes in behavior to reduce consumption. Burning fossil fuels alters the natural state of the planet, as do aerosol injection and ocean fertilization. Schelling’s inclusion of the dimension of naturalness in his definition of geoengineering comes into play here. Opponents of geoengineering see active management as inherently unnatural. Lest the critics of geoengineering get too comfortable, Keith reminds us that “the Earth is already so transformed by human actions that it is, in effect, a human artifact.”[42] How can one now distinguish between what is natural and what is unnatural? This challenge arises often in relation to environmental issues, including questions of habitat management and invasive species. We are unlikely to resolve this issue, but awareness of the question serves to inform this and other debates involving human choices that affect the environment.

Political implications of geoengineering
There are important political questions that arise when considering whether and how geoengineering proposals should be implemented. First, from where does the authority to experiment with or implement geoengineering proposals emanate? Second, what is the relationship between geoengineering projects and emissions reduction efforts? Third, what are the costs of geoengineering proposals relative to emissions reduction? And fourth, how and by whom would the application of geoengineering technology be controlled? I will examine each of these questions in the hope of identifying the political implications of developing this capability.

Authority to experiment and/or implement
One of the questions raised by the prospect of geoengineering is that of authority: Who has the authority to manipulate the climate, and from where is that authority derived? Current international negotiations leave control over emissions regulation with individual nations. As a result, no international body currently exists with the authority to govern and control geoengineering.[43] However, since geoengineering is, by definition, global in impact, it does not fit the present nation-centric framework. Schelling argues it is essential to identify or create an authority to approve the experimentation needed to explore the potential consequences of these proposals. He finds the creation of this body to be even more important than actually conducting geoengineering experiments, because an international body is critical for establishing the parameters by which any future geoengineering program would be judged and implemented.[44] Whether or not one supports continued research into geoengineering projects, there is a clear need for some authorizing body to be responsible for regulating both experimentation and possible implementation. Keith concurs with the other experts in seeing the need for an international body to provide a forum for democratic debate on the potential global impacts of geoengineering projects.[45] Implicit in Keith’s analysis is the view that, for decisions with global consequences, there should be a global, democratic process of adjudication.

Geoengineering and emissions reductions
The second political element at issue is the relationship between geoengineering projects and emissions reductions efforts. One of the primary arguments for geoengineering is its simplicity relative to the political complexity of negotiating and enforcing GHG emissions reductions. Schelling writes that pursuing geoengineering has the potential to transform GHG policy from its current complicated state, replete with regulations, to a simpler approach based on balancing costs among nations.[46] Because geoengineering schemes do not require nations to reduce emissions or economic output, they would allow the debate to focus only on how to pay for the technology. This process may be further simplified if the proposals turn out to be as cost effective as their proponents suggest.

Schelling argues that one of the virtues of geoengineering is its ability to mitigate climate change while “not depending on the behavior of populations, not requiring national regulations or incentives, [and] not [being] dependent on universal participation.”[47] This is in contrast to reducing CO2 emissions, which involves a decentralized regulatory approach that requires changing behavior in many realms, including personal energy consumption, transportation, and energy production. He also suggests that emissions reduction depends upon policies that governments are incapable of implementing due to a lack of expertise, resources, or political will.[48] Geoengineering would alleviate the need for national governments to undertake the interventions necessary to achieve emissions reductions.

Teller agrees that, despite concerns about finding an acceptable means of authorizing their use, geoengineering proposals present a much simpler solution to the threat of global warming than the current practice of developing international consensus on how to quickly and dramatically reduce usage of fossil fuels.[49] Given the lack of progress thus far toward a binding international agreement to cut emissions, and the difficulty of adopting national programs that change personal behavior, a simple solution is attractive.

However, the ease of implementation of geoengineering projects presents a concern with regard to the relationship between geoengineering and emissions reductions. Critics of geoengineering have argued that just having geoengineering as an option could weaken the commitment to limit emissions in time to avert the most serious warming. Kerr reminds us that, if “serious efforts to cut back greenhouse gas emissions were failing, a stopgap approach would become more attractive.”[50] If emissions reductions fail to materialize, the prospect of having an alternative method to prevent the worst effects of climate change seems prudent. Yet, geoengineering represents exactly the type of stopgap measure that would likely deflate political will to change international consumption patterns or make the energy infrastructure more sustainable.[51]

Schelling advocates for further research to determine the feasibility and effectiveness of geoengineering proposals. At the same time, he acknowledges the risk of undermining mitigation efforts when he warns that geoengineering proposals may be so attractive that they steal the focus away from emissions reductions, and that this change in focus could have serious negative repercussions.[52] It is this prospect that concerns critics of geoengineering when they warn of the harmful environmental consequences of continued GHG emissions, which could occur even if geoengineering projects control global temperature change. MacCracken also supports research into geoengineering proposals, but cautions scientists and policymakers never to view geoengineering as a replacement for actual efforts to mitigate the negative impacts of climate change, particularly because no nation has the information needed to shift its entire focus to implementing geoengineering proposals.[53]

Relative cost of geoengineering and emissions reduction
Schelling contends that one advantage of geoengineering projects is that, unlike mitigation, they only require us to decide what to do and how to pay for it.[54] In fact, low cost is one positive aspect of geoengineering often cited by its proponents. Teller, in surveying solar reflection enhancement proposals, finds that the total cost would probably be no more than $1 billion per year—“an expenditure that is two orders of magnitude smaller in economic terms than those underlying currently proposed limitations on fossil-fired energy production.”[55] This can be compared to the Stern Review’s conservative estimate of mitigation costs at 1% of global GDP.[56] Skeptics, however, do not accept this analysis. Robock argues that the costs of geoengineering projects are much higher than what the international community currently spends on renewable energy.[57] Robock also criticizes the multibillion-dollar subsidies given by the U.S. government to the coal, oil, gas, and nuclear industries, while little support goes to alternative energy producers.[58] He implies that if funds planned for geoengineering were directed instead toward developing renewable energy generation technology, we could reduce emissions and avoid global warming without the risks of geoengineering.

Schelling suggests that the low cost of geoengineering proposals means that we can dispense with national and international negotiations altogether, because some geoengineering projects can be performed by “exo-national” programs—programs not confined to national borders.[59] The idea is that wealthy individuals, corporations, or foundations have the resources to implement geoengineering proposals without the need for national or international agreement. Wood is more skeptical of the desirability of exo-national implementation of geoengineering proposals. He suggests that “the scariest thing about geo–engineering, as it happens, is also the thing that makes it such a game changer in the global warming debate: it’s incredibly cheap.” He is concerned by the number of multibillionaires in the world who could potentially take immediate action to singlehandedly reverse climate change.[60] Geoengineering projects may in fact be easier and cheaper to implement than national and international emissions reductions. But that very ease of implementation, combined with the risk of adverse consequences, heightens the need to regulate geoengineering's use.

Regulating use of geoengineering technology
The development of geoengineering technology means that nations and international organizations must now consider whether and how to restrict individuals and nations from implementing unauthorized projects. MacCracken suggests that the international community needs to establish a policy consensus under which individual countries are not allowed to implement solar management programs without authorization from an international body.[61] Yet even an international agreement reached through democratic means is unlikely to produce unanimity. If a decision is made at the national or international level that geoengineering projects, particularly those with the greatest risk of adverse consequences, should not be implemented, how can rogue nations, corporations, or individuals be prevented from going ahead with programs on their own? One can imagine a low-lying nation facing inundation, frustrated with the lack of international progress on emissions reductions, sponsoring a program of aerosol injection without international approval.

This scenario introduces the risk of global negative consequences stemming from the actions of a single individual, corporation, or nation. Wood argues that governments would need to step in to prevent a rogue entity from taking independent action, either by enforcing current regulations or developing new ones.[62] The challenge, though, is how to regulate a technology that is, according to its proponents, so economically and technologically feasible. Wood foresees some international effort to “monopolize the technology and prevent others from deploying it, through diplomatic and military means… Such a system might resemble the way the International Atomic Energy Agency (IAEA) now regulates nuclear technology.”[63]

The example of the IAEA may be an unfortunate one, at least from the perspective of those concerned about the misuse of geoengineering technology. International attempts by this agency to regulate the spread of nuclear weapons technology have failed. The nuclear club has grown since its birth at the end of World War II, and international efforts at arms control have been unable to prevent India, Pakistan, or North Korea from developing nuclear weapons technology. Wood explains that regulating geoengineering technology may be even more difficult than regulating nuclear technology, because many geoengineering proposals are far easier and cheaper to implement than nuclear weapons are to develop.[64] Furthermore, one could argue that the global scale of geoengineering projects makes their adverse consequences potentially greater than the threat of a single nuclear device.

Conclusion
As the debate over whether, how, and how much to reduce GHG emissions develops, geoengineering proposals are likely to draw increasing interest due to their technical, political, and economic feasibility. The consideration of these proposals is a necessary part of the scientific and political process regarding global warming. Experimentation with various mechanisms for manipulating climate may enrich our understanding of the complex interactions driving global climate change. Much more needs to be known about the potential adverse environmental consequences of these proposals before they are implemented. The political challenges presented by geoengineering may outweigh the environmental risks. Some legitimate international means of authorizing implementation, adjudicating costs and risks, and regulating and controlling the use of this technology would have to be developed. The most immediate political risk for those advocating emissions reductions is the possibility that the existence of geoengineering alternatives could reduce the willingness of nations to agree to binding emissions targets. Finally, exploring our attitudes toward geoengineering can help elucidate our complicated relationship with technology and the environment.

‡ Aaron Ray is studying Environmental and Regulatory Policy at the Georgetown Public Policy Institute. He earned a Master of Arts in Secondary Education from the University of New Mexico and a Bachelor of Arts in Philosophy, Political Science, and Religion from Linfield College. He taught Philosophy at Dine College, the tribally-controlled college of the Navajo Nation. He also taught history, government, and economics at Crownpoint High School in New Mexico and the César Chávez Schools for Public Policy in Washington, D.C. His interests include sustainable development, land and resource management, and the ethics and philosophy of public affairs.
