
Monday, February 25, 2013

More heat than light: The vexing complexities of the drone debate

by Bradley Jay Strawser

I. UNWARRANTED CONFIDENCE

"Cynicism is what passes for insight among the mediocre." Joe Klein delivered this gem while discussing how difficult it has become for Washington journalists to write a positive story on a politician these days.[1] Something similar could be said regarding the debate over the morality, legality, and prudence of our most recent weapon of war: the unmanned aerial drone. Critics and supporters alike tend to oversimplify the moral complexities that any reasonable assessment of drones should acknowledge. Worse, both critics and supporters often take a rigid position one way or the other with drones – enthusiastic embrace or passionate condemnation – without admitting to the deep-seated moral tension found at the heart of this fractious issue. The overconfident claims of moral surety on either side of the drone debate should give us pause.

Perhaps such conclusions are understandable. After all, each side in the debate can lay claim to a piece of the truth about drones. Given the stakes, it makes sense that we find ourselves wanting to say something – to rightly shine a light on the tragedies wrought by drone warfare or to rightly praise a weapon that has the ability to be far more accurate than alternatives, thereby saving innocent lives. There is, however, a troubling paucity of consistent data on the drone strikes themselves and a considerable lack of transparency from the U.S. government regarding its drone operations. Reaching an absolutist position on either side of the drone divide is thus both too quick and too simplistic given the issue’s complexities and unknowns. When it comes to unmanned weapons, we far too often hear vociferous condemnation or unqualified justification, when nuance and an admittedly frustrating ambivalence would be more apt. For both critics and defenders of drones alike we could say, parsing Klein, that “overconfidence is what passes for discernment among those who should be more apprehensive.”

To highlight this difficulty, it is worth noting that my own work on drones has been accused of falling into the very kind of one-sided certainty I am here criticizing, because some have portrayed me as a staunch, unflinching defender of drones.[2] But this is a false portrayal; I view myself as neither a pro-drone advocate nor an anti-drone detractor. Again, any comprehensive position on drones must account for the many moral complexities, both good and bad, in both theory and practice, that this new weapon system portends.

II. THE DIFFERENCE BETWEEN POLICY AND PRINCIPLE

One cause of confusion in this debate stems from a failure to recognize a crucial distinction: the theoretical analysis of the morality of drones as separate from discourse over the morality of actual policies carried out today. Some complain that it is useless to investigate whether drones pose any intrinsic moral problems in the abstract, or have any inherent moral gains in theory, apart from how they are actually being used. As one commenter memorably put it, “Agreeing with the drone wars ‘in theory’ is like agreeing with the Iraq war ‘in theory.’”[3] Rather, this view insists, we should look solely at the ways in which these weapons are presently being used, and base our moral conclusions on those facts alone. Nick Scott gives this kind of argument in his critique of my work. Writing for Foreign Policy, Scott argues that “the abstract moral issues surrounding drone strikes are of no importance when divorced from the policy that calls for their usage. Without the context in which U.S. drone policy is executed, there is no meaningful framework through which to examine these abstract questions.”[4]

But is Scott mistaken? I believe he is. It is not only possible to scrutinize drones apart from their actual employment; there are good reasons to do so. It would be rash to dismiss the importance of analyzing drones in the abstract and to focus only on present policy. I see at least three reasons why the present and future debate over lethal drones should maintain this distinction.

First, it is far more difficult to parse all thinking on drones through the lens of U.S. drone operations because of the surprising lack of reliable information on those operations. Very little is known publicly about the details of U.S. drone policy – particularly how and on what basis lethal decisions are made – and good, consistent data on the impacts of the drone strikes themselves is even harder to come by. With the empirical evidence for real-world drone operations being as weak as it is, it is wise not to rest all of our judgments about drones on such infirm ground.

A second reason it is a mistake to think that we must only grapple with the morality of drones within the framework of current U.S. policy can be drawn from the history of warfare. It would be foolish to assume that drones will only ever be used in the way they are being used today, much less to think that they will only ever be used by the U.S. Presuming then that these kinds of weapons are not going away, it is incumbent on us to think seriously about the morally relevant features of drones more broadly than merely how they are used in our present historical context. The reason for doing so is not only to apply those lessons to our present decisions, but to predict the ways in which drones could be used in the future, for both good and bad.

Finally, Scott’s view is mistaken for a third reason. If we believe that there are times when killing can be morally justified, then we are obligated to carry out such actions as justly as is possible. Given that imperative, it is clear that some means and methods of warfare are more just, or more morally objectionable, than others. It is therefore well worth exploring whether certain weapons pose special moral problems or have potential moral advantages, in theory, over alternative weapons. This is especially true for new weapons that have not yet received the scrutiny of history, as is the case with drones.

To help make this point, consider another means of war: nuclear weapons. These weapons have features intrinsic to their nature that make them morally problematic in principle. These include the fact that nukes are (for all intents and purposes) impossible to use in accordance with traditional just war theory constraints. They are not precise weapons with which a just war-fighter can discriminate between innocent civilians and enemy combatants. Rather, the massive scale of destruction these weapons deliver makes them intrinsically indiscriminate. Moreover, it is hard to see how unleashing this awesome force on the world could ever be a proportionate and necessary response to an injustice one hopes to block, once all the concomitant atrocities nuclear weapons bring with them are weighed. I conclude that nuclear weapons are wrong to use in principle.

If one agrees with this conclusion, it helps guide our moral thinking on how we should (or should not) employ nuclear weapons. In fact, in this case, it settles the matter: we should simply not use nukes, regardless of the circumstances. Many hold a similar view for weapons such as poison gas or the practice of torture. The relevant questions for such things are not how and when to use them, but rather how we can best rid the world of them. Alternatively, for those weapons that we think are not intrinsically wrong in principle and could be used justly in some circumstance, a better understanding of the potential moral gains and dangers inherent to such weapons – in the abstract – can aid us in deploying them as justly as possible. This is precisely what I take to be the case with drones.

So what of drones, then? Should drones be considered wrong in principle as, in my view, we should view nuclear weapons? I don’t believe so. I cannot here make this case in full, for that would require its own lengthy discussion which I have offered elsewhere, and even there only partially.[5] The very short and oversimplified version is this: There is nothing inherent in the nature of drones which makes them morally wrong to use, in principle, for an otherwise just cause. In fact, drones offer clear normative advantages by better protecting their operators from harm and by being more accurate in hitting their intended targets than other weapon platforms, which can result, on the whole, in fewer unintended deaths of noncombatants. This means, simply enough, that drones have the potential for moral improvement in the conduct of war when used for a just cause. But such potential, of course, is no guarantee that they will be used justly.

Drones also give rise to a long list of serious ethical concerns over their use. These include fears that drones make war too easy and too tempting for policy makers and thus lower the thresholds against any use of force to dangerous levels. There are concerns that drones generate “blowback” against those who employ them among the populations where they operate. Many worry about the moral implications created by the extreme asymmetry remote warfare creates, while others question the ways such warfare might produce cognitive dissonance in the minds of those who operate drones. And there are many other ethical concerns specific to lethal drone employment too numerous to even attempt an adequate discussion of here. I take all such concerns seriously; each is worthy of legitimate apprehension and deeper moral analysis. However, I ultimately find that these potential moral problems with drones are contingent in nature and could be overcome and, thus, do not make drones wrong to use in principle.

If drones are not intrinsically wrong to use in principle, this then makes them a live option to consider for any lethal action befitting their capabilities. Thus, the first and primary question for current drone operations and policy should be the following:

Should those presently being attacked by drones be confronted at all?

To simplify this question for the moment, let us restrict it solely to the drone operations presently carried out in the Federally Administered Tribal Areas (FATA) region of Pakistan (where the majority of drone strikes over the past few years have occurred). If one thinks that the terrorists and militants entrenched in the FATA, who wage war on American, Pakistani, and NATO forces and kill thousands of innocent Pakistani civilians, should be left to their own devices, then so be it. But then one’s issue is not with drones. Rather, it is with the entire policy of lethal operations in the FATA region. Certainly, if one answers my above question in the negative, then drone operations in the FATA region should be ended. But then so too should any kind of military engagement against the militants.

But if one thinks, as I do, that it is best to do something to thwart the activities of those militants in the FATA region, then our choices are far more difficult, and our engagement with the drone debate requires and deserves far more nuance than the absolutist position offers. For if one answers the question above in the affirmative, then we are left with the choice of how to best fight these adversaries. And among the various options available, I find that a relatively strong (but highly conditional) case can be made that drones are the best option (or least bad option) presently available with which to engage this fight.[6]

The reasons for this are complex and have as much to do with Pakistan’s long failure to secure this region and the politics of Washington and Islamabad as they do with the particularities of drones. This untenable situation leaves very few good options, including the option of doing nothing. As Joshua Foust of The Atlantic writes, “FATA, where most militants live and drone strikes occur, is a political wasteland with little law enforcement – leaving policymakers with few options for pursuing the terrorists that continue to kill thousands of Pakistani civilians (and actively support the insurgency next door in Afghanistan).”[7]

If one thinks some attempt should be made to stop these militants, then we must weigh the available options. U.S. and NATO ground forces could be sent into these regions. Pakistani ground forces could be deployed in large numbers. Or alternative forms of airpower could be employed, such as various manned aircraft. That seems to nearly exhaust the options in terms of fighting the terrorists embedded in the FATA region. There are, of course, non-violent options, including negotiations, and the long-term solution of Pakistan gaining credible political control of the region to bring stability, law, and order. But in the short term, both the governments of the U.S. and Pakistan believe (and I think rightly) that they have a duty to protect their citizens from the violence posed by the militants operating in the region.

Sending in large-scale U.S. ground forces to the FATA region is a non-starter; the political and pragmatic problems are innumerable. Sending in Pakistani ground forces may sound like a potential option, but it has proven to cause far more harm to local civilians than drone strikes do on even the most charitable evidence available. Not only have there been much higher rates of unintended civilian casualties when Pakistani ground forces have been sent into the FATA, but such operations have displaced hundreds of thousands of people. Using alternative means of airpower also gives us worse results than drones in terms of civilian casualty rates and infrastructure damage. Indeed, according to the most exhaustive and careful study of which I am aware that compares the various means used to fight the militants in the FATA region, drones emerge as by far the least bad option in terms of unintended civilians harmed or killed.[8]

It is hard to avoid the conclusion, then, that if we are going to fight the militants in the FATA region, drones may be, presently, the least morally problematic option from among a list of bad options. This certainly seems to be true at least in terms of traditional just war theory jus in bello proportionality considerations. This is not to downplay the tragedies imposed on civilians in the region by the drone campaign. Indeed, this includes not only the horrific cases of civilians being killed in drone strikes, but also the burden of living in a land where drones roam the skies.[9] Still, it seems that these harms are less than the harms brought upon these civilians by other means of force that could be used and have been tried. And it’s also critical to note, of course, that the harms suffered by these civilians at the hands of the militants are extreme as well and are, in fact, far greater than the harms brought about by drones.

Such a conclusion – that drones are the most proportionate means of force presently available with which to fight these militants – may lead many back to the question of whether or not we should fight these militants at all, particularly given the many moral worries on the table about drones themselves. Moreover, perhaps one believes that the proportionality considerations are even more complex than I just laid out. That is, one may think that there are any number of long-term morally problematic issues to using drones that could come back to haunt the U.S., and that these predicted future problems must be incorporated into our present proportionality considerations.[10]

I certainly agree that any and all relevant harms and potential problems must be factored into our best proportionality calculus, even ones for which we only have vague future predictions. Any one of the many contingent moral pitfalls that people raise against drones could be enough such that a reasonable observer might conclude that it outweighs whatever good the drone option delivers. To take just one example from many, consider the concern mentioned above that drone employment is creating long-term “blowback” against the U.S. in the local populations where they operate. This fear was recently articulated by General Stanley McChrystal when he said, “The resentment created by American use of unmanned strikes … is much greater than the average American appreciates…They are hated on a visceral level, even by people who've never seen one or seen the effects of one… [they create the] perception of American arrogance that says, 'Well we can fly where we want, we can shoot where we want, because we can.'”[11] The worry is that by fomenting such hatred, drones do more harm in the long-run, even if they do some good in the short-term.

But such fears rest on empirical questions. And to weigh them properly, we need good data, which we don’t have. So for this particular worry – or any of the other numerous contingent worries that have been raised against drones – it could very well be shown that the addition of this potential harm tips the (already highly complex) proportionality scale decisively against drone use. But, given the present evidence we do have, we can cautiously conclude that using other means of war instead of drones would likely lead to more unintended deaths of innocent civilians. That creates a very high moral burden in favor of drones over alternative weapons that the blowback fear (or any of the other concerns) has to overcome. Moreover, there’s legitimate debate over the basis of the concern itself and the impacts of using drones in a given region.[12] All of this is simply hard to predict with any certainty, much less serve as the basis for decisive judgments on the matter.

There’s an abiding tension here regarding the present lethal use of drones by the U.S. that I hope we all can appreciate. If nothing is done, innocent civilians will surely die at the hands of those whom the drones fight against, both in the FATA region and throughout the world. Alternatively, even the most scrupulously prosecuted war still results in the unintended deaths of innocents in the attempt to block the wrongs just described. The proportionality calculus here is vexing. And, when we admit that many of the independent variables in that calculus are based on shaky evidence and unclear policy and procedures, it should give us even further caution.

This deficiency of good data need not be the case, of course, and it is due in no small part to the lack of transparency by the U.S. government regarding its drone program that I mentioned above. As an important aside on this point, I enthusiastically add my voice to the chorus calling for something that nearly all in this debate can agree on: more transparency from the U.S. regarding its drone policies and operations.[13]

I would ask those who disagree with my conclusions here: what alternative would you have Washington and Islamabad pursue instead of drones? How, if at all, should they try to stop those militants in FATA who kill thousands of Pakistani civilians? Given the weak evidence we do have, every plausible alternative at present would seem to do more harm, or allow more harm, than the present drone campaign. I do not mean this as a rhetorical question; we need greater dialog about exploring better options going forward, as well as a clear way out of this quagmire, to avoid the specter of perpetual war that the use of drones may herald. There are no easy answers here. All of us long for peace and protection of the innocent from harm. Yet determining the best way to accomplish that is a genuinely difficult question, as is whether drones are part of the problem, part of the solution, or, perhaps, a bit of both.

I began this rumination on our present debate over drones by noting that overconfidence is what often passes for discernment among those who should be more apprehensive. So let me stress that my conclusions are uneasy, delivered with trepidation. We are talking about the intentional killing of human beings. War and its horrible consequences are things that should only ever be undertaken with the somber regret of necessity. With the limited insight we have into the drone operations, I certainly do not claim that all drone strikes are morally justified. Indeed, it’s almost surely the case that many drone strikes currently being carried out are wrong. But it’s equally likely that many strikes are justified and that, more broadly, drones hold tremendous potential for moral improvement in how war is fought, in principle. I make such a tentative, heavily conditioned claim not because I am flippant about those civilians who suffer under the present reign of drones, but rather because I am similarly not flippant about the unsavory realities of those adversaries against whom drones fight. Let there be no doubt: there are those who want to kill and wreak havoc upon innocents, and who have been prevented from doing so because they were killed by drones. It is because I take that reality just as seriously as the many moral downsides and pitfalls of drones that I conclude, however apprehensively, that they are, at present, morally permissible weapons of war.

[6] This is not to suggest that the only way to thwart an unjust threat is via killing – far from it. Whenever possible, non-lethal means should be used to block a threat of unjust harm. The constraints of both necessity and proportionality must be met in any morally justified instance of killing to block an unjust harm. In many cases, those conditions likely are met against those adversaries fought in the FATA region. But this will certainly not be true in all cases, or perhaps even in most cases.

[8] Avery Plaw, “Counting the Dead: The Proportionality of Predation in Pakistan,” in Killing by Remote Control: The Ethics of an Unmanned Military, ed. Bradley J. Strawser (Oxford: Oxford University Press, forthcoming).

[9] Raising awareness of this aspect of the harm done by drones was one of the strengths of the recent study commissioned by Reprieve, known commonly as the Stanford/NYU report, “Living Under Drones: Death, Injury, and Trauma to Civilians from US Drone Practices in Pakistan.” http://livingunderdrones.org/. Unfortunately, the report also has many weaknesses and is an example of the continued lack of good, complete, objective data on the impact of drones in the FATA region.

[13] I applaud the (heretofore unsuccessful) efforts of the New York Times to force the U.S. government to disclose the legal justifications behind its drone program. See Jonathan Stempel and Jennifer Saba, “NY Times loses bid to uncover details on drone strikes,” Reuters, January 2, 2013.

Bradley Jay Strawser is an assistant professor in the Defense Analysis Department at the US Naval Postgraduate School in Monterey, California, and a research associate at Oxford’s Institute for Ethics, Law, and Armed Conflict (ELAC) in Oxford, UK. He has written frequently about drones for the press, including for The Guardian and the New York Times. He also has a book forthcoming from Oxford University Press entitled Killing By Remote Control: The Ethics of an Unmanned Military. It is an edited volume on the ethical questions surrounding the employment of UAVs.

To leave a comment, please see the introduction to the DAG-3QD Peace and Justice Symposia, of which this essay is a part, here.