President Obama’s decision also came down to a determination that the CIA was simply better than the Defense Department at locating and killing al-Qaeda operatives with armed drones, according to current and former U.S. officials involved in the deliberations. Even now, as the president plans to shift most drone operations back to the military, many U.S. counter­terrorism officials are convinced that gap in capabilities has not been erased.

WASHINGTON — There were more drone strikes in Pakistan last month than any month since January. Three missile strikes were carried out in Yemen in the last week alone. And after Secretary of State John Kerry told Pakistanis on Thursday that the United States was winding down the drone wars there, officials back in Washington quickly contradicted him.

More than two months after President Obama signaled a sharp shift in America’s targeted-killing operations, there is little public evidence of change in a strategy that has come to define the administration’s approach to combating terrorism.

Most elements of the drone program remain in place, including a base in the southern desert of Saudi Arabia that the Central Intelligence Agency continues to use to carry out drone strikes in Yemen. In late May, administration officials said that the bulk of drone operations would shift to the Pentagon from the C.I.A.

But the C.I.A. continues to run America’s secret air war in Pakistan, where Mr. Kerry’s comments underscored the administration’s haphazard approach to discussing these issues publicly. During a television interview in Pakistan on Thursday, Mr. Kerry said the United States had a “timeline” to end drone strikes in that country’s western mountains, adding, “We hope it’s going to be very, very soon.”

But the Obama administration is expected to carry out drone strikes in Pakistan well into the future. Hours after Mr. Kerry’s interview, the State Department issued a statement saying there was no definite timetable to end the targeted killing program in Pakistan, and a department spokeswoman, Marie Harf, said, “In no way would we ever deprive ourselves of a tool to fight a threat if it arises.”

Micah Zenko, a fellow with the Council on Foreign Relations, who closely follows American drone operations, said Mr. Kerry seemed to have been out of sync with the rest of the Obama administration in talking about the drone program. “There’s nothing that indicates this administration is going to unilaterally end drone strikes in Pakistan,” Mr. Zenko said, “or Yemen for that matter.”

The mixed messages of the past week reveal a deep-seated ambivalence inside the administration about just how much light ought to shine on America’s shadow wars. Even though Mr. Obama pledged greater transparency and public accountability for drone operations, he and other officials still refuse to discuss specific strikes in public, relying instead on vague statements about “ongoing counterterrorism operations.”

Some of those operations originate from a C.I.A. drone base in the southern desert of Saudi Arabia — the continued existence of which encapsulates the hurdles to changing how the United States carries out targeted-killing operations.

The Saudi government allowed the C.I.A. to build the base on the condition that the Obama administration not acknowledge that it was in Saudi Arabia. The base was completed in 2011, and it was first used for the operation that killed Anwar al-Awlaki, a radical preacher based in Yemen who was an American citizen.

Given longstanding sensitivities about American troops operating from Saudi Arabia, American and Middle Eastern officials say that the Saudi government is unlikely to allow the Pentagon to take over operations at the base — or for the United States to speak openly about the base.

Spokesmen for the White House and the C.I.A. declined to comment.

Similarly, military and intelligence officials in Pakistan initially consented to American drone strikes on the condition that Washington not discuss them publicly — a bargain that became ever harder to honor when the United States significantly expanded American drone operations in the country.

There were three drone strikes in Pakistan last month, the most since January, according to the Bureau of Investigative Journalism, which monitors such strikes. At the same time, the number of strikes has declined in each of the last four years, so in that sense Mr. Kerry’s broader characterization of the program was accurate.

But because the drone program remains classified, administration officials are loath to discuss it in any detail, even when it is at the center of policy discussions, as it was during Mr. Obama’s meeting in the Oval Office on Thursday with President Abdu Rabbu Mansour Hadi of Yemen.

After their meeting, Mr. Obama and Mr. Hadi heaped praise on each other for cooperating on counterterrorism, though neither described the nature of that cooperation. Mr. Obama credited the setbacks of Al Qaeda in the Arabian Peninsula, or A.Q.A.P., the terrorist network’s affiliate in Yemen, not to the drone strikes, but to reforms of the Yemeni military that Mr. Hadi undertook after he took office in February 2012.

=======================================

Despite Administration Promises, Few Signs of Change in Drone Wars
Published: August 2, 2013

And Mr. Hadi twice stressed that Yemen was acting in its own interests in working with the United States to root out Al Qaeda, since the group’s terrorist attacks had badly damaged Yemen’s economy.

“Yemen’s development basically came to a halt whereby there is no tourism, and the oil companies, the oil-exploring companies, had to leave the country as a result of the presence of Al Qaeda,” Mr. Hadi said.

Asked specifically about the recent increase in drone strikes in Yemen, the White House spokesman, Jay Carney, said: “I can tell you that we do cooperate with Yemen in our counterterrorism efforts. And it is an important relationship, an important connection, given what we know about A.Q.A.P. and the danger it represents to the United States and our allies.”

Analysts said the administration was still grappling with the fact that drones remained the crucial instrument for going after terrorists in Yemen and Pakistan — yet speaking about them publicly could generate a backlash in those countries because of issues like civilian casualties.

That fear is especially pronounced in Pakistan, where C.I.A. drones have become a toxic issue domestically and have provoked anti-American fervor. Mr. Kerry’s remarks seemed to reflect those sensitivities.

“Pakistan’s leaders often say things for public consumption which they don’t mean,” said Husain Haqqani, Pakistan’s former ambassador to the United States. “It seems that this was one of those moments where Secretary Kerry got influenced by his Pakistani hosts.”

Congressional pressure for a public accounting of the drone wars has largely receded, another factor allowing the Obama administration to carry out operations from behind a veil of secrecy.

This year, several senators held up the nomination of John O. Brennan as C.I.A. director to get access to Justice Department legal opinions justifying drone operations. During that fight, Senator Rand Paul, Republican of Kentucky, delivered a nearly 13-hour filibuster, railing against the Obama administration for killing American citizens overseas without trial.

For all that, though, the White House was able to get Mr. Brennan confirmed by the Senate without having to give lawmakers all the legal memos.

And, in the months since, there has been little public debate on Capitol Hill about drones, targeted killing and the new American way of war.

President Obama says he wants greater transparency for the clandestine killing of terrorists overseas, largely using missiles fired by drones. There has been little public action on this pledge, but if he is serious, he should consider many of the recommendations made this week by a former legal adviser to the State Department, Harold Koh.

Last year, the Pentagon was forced to suspend drone operations in Seychelles, an island nation in the Indian Ocean, after two Reaper drones crashed on the runway at the main international airport, which serves half a million passengers a year.

The overseas accidents could have repercussions in the United States, where the military and the drone industry are pressing the federal government to open up the skies to remote-controlled aircraft.

German chancellor’s drone “attack” shows the threat of weaponized UAVs

Dutch researchers warn that the next time, that drone could explode.

by Sean Gallagher, Sept 18, 2013, 3:49pm MDT

Small unmanned air vehicles like this quadrocopter could be turned into swarms of exploding flying robots, and Dutch researchers say there's not much that can be done right now to stop them.

At a campaign rally in Dresden on September 15, a small quadrocopter flew within feet of German Chancellor Angela Merkel and Defense Minister Thomas de Maiziere, hovering briefly in front of them before crashing into the stage practically at Merkel's feet. Merkel appeared to be amused by the "drone attack," but de Maiziere and others on the stage seemed a bit more unsettled by the robo-kamikaze.

German Chancellor Angela Merkel smiles as a Parrot AR drone comes in for a crash landing during a Christian Democratic Party campaign event September 15.

The quadrocopter, a Parrot AR drone, was operated by a member of the German Pirate Party as a protest against government surveillance and the ongoing scandal over the Euro Hawk drone program—which failed because it could not get certified to fly in European airspace. In a statement, the deputy head of the Pirate Party, Markus Barenhoff, said, "The goal of the effort was to make Chancellor Merkel and Defense Minister de Maiziere realize what it's like to be subjugated to drone observation." The drone was harmless, aside from potential political collateral damage to Merkel's Christian Democratic Party, and the pilot of the drone was released after being briefly held by police.

While Merkel laughed off the drone in Dresden, even a small explosive charge or grenade aboard a similar drone would have been catastrophic—and defending against such attacks is difficult at best. Unmanned Aerial Vehicle (UAV) researchers from TNO Defence Research, an organization in The Netherlands, recently showed the real risk of that sort of attack, demonstrating that terrorists and insurgents could effectively use current commercial and do-it-yourself drones as weapons in a number of scenarios, including one much like the Dresden event.

Video of the drone-hazing of German Chancellor Merkel, Defense Minister de Maiziere, and members of Merkel's Christian Democratic Party team.

It’s a bird! It’s a plane! It’s a flying hand grenade!

In a paper published during the Unmanned Systems 2013 conference last month, Klaas Jan de Kraker and Rob van de Wiel of TNO Defence Research analyzed the threat posed by "mini-UAVs"—small remote-controlled and autonomous drones weighing less than 20 kilograms (44 pounds).

The research was prompted in part by two incidents in 2010: the crashing of a radio-controlled plane into The Netherlands' House of Parliament as a prank, and the FBI's foiling of a plot to attack the Capitol and the Pentagon with explosives-packed, radio-controlled model airplanes.

TNO researchers found that small drones, especially those using autonomous navigation, could be stealthy, accurate, and potentially deadly weapons, and the probability of their use is rapidly increasing. The paper presented the following potential scenarios:

• During a large public event in a stadium, a terrorist launches a Mini UAV, which is equipped with a machine gun, from a building at some distance. He directs the Mini UAV into the stadium and remotely fires the machine gun. In the panic that occurs in the stadium, numerous people are overrun and die.

• During a public speech by a VIP, the VIP is shielded from the audience by bulletproof glass. However, a terrorist deploys a Mini UAV equipped with an explosive, which flies over the shielding glass. The explosive detonates close to the VIP, wounding him fatally.

• During an expeditionary mission, opposing forces launch a Mini UAV toward a compound. When the Mini UAV has reached the center of the compound, it releases a chemical agent. Luckily this only causes some minor physical effects on people that were present and unprotected. However, it causes significant fear among the compound inhabitants.

• During an expeditionary mission, an opposing militant group launches a small swarm of Mini UAVs, each equipped with an explosive, toward an airbase. The Mini UAVs fly toward the fighter jets that are parked on the airbase, and their explosives detonate just above the fighter jets. This significantly damages a number of jets and even destroys one of them.

Because of their size, their low flying altitude, and their relatively low speed, mini-UAVs are particularly hard to detect—especially in an urban environment, the researchers found. Even if they are detected, identifying whether or not they are a threat is still an issue, because it's difficult to determine whether they're armed or just carrying a camera. And because of the short range at which they can be detected, security forces would have only seconds of warning to decide whether to attack the drone or not.

"Detection and classification are very difficult," de Kraker and van de Wiel wrote. "This is not only due to their small size but also to their very low flying altitude and speed, and to clutter that occurs from trees and buildings." Tests of a number of commercial and do-it-yourself mini-UAVs in TNO's anechoic radar room revealed that they had a "bird-size" radar cross-section, and their relatively low speed makes them hard to distinguish from birds even under ideal conditions.

The TNO researchers looked at a number of other ways to detect mini-UAVs, including audio sensors, radio detection of control signals, continuous-wave radar, and infrared. The best results came from mixing radar and infrared—using radar for initial detection and infrared sensors for classification.

Burn them with lasers

Taking down potentially hostile drones once they're detected comes with another set of problems. While radio jamming can be used to interrupt remote-control signals for drones, it might not keep them from reaching their target and would be ineffective against autonomous drones using GPS or GLONASS satellite navigation. Jamming commercial navigation signals could cause autonomous drones to fail to find their target, but could cause other security and safety problems at the same time.

And just shooting down drones in a crowded urban environment could cause more damage than the drones themselves. "Missile systems with small missiles and a suitable guidance mechanism, (rapid fire) guns with suitable ammunition, and machine guns are considered as very effective means for neutralizing Mini UAVs," the researchers wrote, but "downsides may be that missile systems are relatively expensive and that these hard kill systems could generate collateral damage."

The best answer, de Kraker and van de Wiel suggested, might be laser and high-power microwave "directed energy solutions," which could be used to heat the drones up until their batteries or electronics are destroyed. These weapons could be deployed in a truck to provide protection for events at public places with lower risk to people and property on the ground than a chain gun or small missiles.

It has been a trying period for defenders of the drone. Public perception has been shaped in large part by the Obama administration’s use of drones in counterterrorism efforts, and civil liberties advocates have long decried the drone’s seemingly boundless capacity to restrict privacy.

Then there was the blemish for local hobbyists last week, when a drone was said to have crashed near Grand Central Terminal, narrowly missing a pedestrian.

And so, at times on Friday, the forum seemed equal parts acknowledgment of the technology’s perils and a self-affirmation exercise for its proponents, who have cited the potential of drones to improve agriculture practices and monitor endangered species, among other applications.

Killer Robots and the Laws of War
Autonomous weapons are coming and can save lives. Let's make sure they're used ethically and legally.
By Kenneth Anderson and Matthew Waxman
Nov. 3, 2013 6:33 p.m. ET

With each new drone strike by the United States military, anger over the program mounts. On Friday, in one of the most significant U.S. strikes, a drone killed Pakistani Taliban leader Hakimullah Mehsud in the lawless North Waziristan region bordering Afghanistan. Coming as Pakistan is preparing for peace talks with the Taliban, the attack on this major terrorist stirred outrage in Pakistan and was denounced by the country's interior minister, Chaudhry Nisar Ali Khan, who said the U.S. had "murdered the hope and progress for peace in the region."

Recent reports from Amnesty International and Human Rights Watch have also challenged the legality of drone strikes. The protests reflect a general unease in many quarters with the increasingly computerized nature of waging war. Looking well beyond today's drones, a coalition of nongovernmental organizations—the Campaign to Stop Killer Robots—is lobbying for an international treaty to ban the development and use of "fully autonomous weapons."

Computerized weapons capable of killing people sound like something from a dystopian film. So it's understandable why some, scared of the moral challenges such weapons present, would support a ban as the safest policy. In fact, a ban is unnecessary and dangerous.

No country has publicly revealed plans to use fully autonomous weapons, including drone-launched missiles, specifically designed to target humans. However, technologically advanced militaries have long used near-autonomous weapons for targeting other machines. The U.S. Navy's highly automated Aegis Combat System, for example, dates to the 1970s and defends against multiple incoming high-speed threats. Without such systems, a ship would be helpless against a swarm of missiles. Israel's Iron Dome missile-defense system similarly responds to threats faster than human reaction times permit.

Contrary to what some critics of autonomous weapons claim, there won't be an abrupt shift from human control to machine control in the coming years. Rather, the change will be incremental: Detecting, analyzing and firing on targets will become increasingly automated, and the contexts of when such force is used will expand. As the machines become increasingly adept, the role of humans will gradually shift from full command, to partial command, to oversight and so on.

This evolution is inevitable as sensors, computer analytics and machine learning improve; as states demand greater protection for their military personnel; and as similar technologies in civilian life prove that they are capable of complex tasks, such as driving cars or performing surgery, with greater safety than human operators.

But critics like the Campaign to Stop Killer Robots believe that governments must stop this process. They argue that artificial intelligence will never be capable of meeting the requirements of international law, which distinguishes between combatants and noncombatants and has rules to limit collateral damage. As a moral matter, critics do not believe that decisions to kill should ever be delegated to machines. As a practical matter, they believe that these systems may operate in unpredictable, ruthless ways.

Yet a ban is unlikely to work, especially in constraining states or actors most inclined to abuse these weapons. Those actors will not respect such an agreement, and the technological elements of highly automated weapons will proliferate.

Moreover, because the automation of weapons will happen gradually, it would be nearly impossible to design or enforce such a ban. Because the same system might be operable with or without effective human control or oversight, the line between legal weapons and illegal autonomous ones will not be clear-cut.

If the goal is to reduce suffering and protect human lives, a ban could prove counterproductive. In addition to the self-protective advantages to military forces that use them, autonomous machines may reduce risks to civilians by improving the precision of targeting decisions and better controlling decisions to fire. We know that humans are limited in their capacity to make sound decisions on the battlefield: Anger, panic and fatigue all contribute to mistakes or violations of rules. Autonomous weapons systems have the potential to address these human shortcomings.

No one can say with certainty how much automated capabilities might gradually reduce the harm of warfare, but it would be wrong not to pursue such gains, and it would be especially pernicious to ban research into such technologies.

That said, autonomous weapons warrant careful regulation. Each step toward automation needs to be reviewed carefully to ensure that the weapon complies with the laws of war in its design and permissible uses. Drawing on long-standing international legal rules requiring that weapons be capable of being used in a discriminating manner that limits collateral damage, the U.S. should set very high standards for assessing legally and ethically any research and development programs in this area. Standards should also be set for how these systems are to be used and in what combat environments.

If the past decade of the U.S. drone program has taught us anything, it's that it is crucial to engage the public about new types of weapons and the legal constraints on their design and use. The U.S. government's lack of early transparency about its drone program has made it difficult to defend, even when the alternatives would be less humane. Washington must recognize the strategic imperative to demonstrate new weapons' adherence to high legal and ethical standards.

This approach will not work if the U.S. goes it alone. America should gather a coalition of like-minded partners to adapt existing international legal standards and develop best practices for applying them to autonomous weapons. The British government, for example, has declared its opposition to a treaty ban on autonomous weapons but is urging responsible states to develop common standards for the weapons' use within the laws of war.

Autonomous weapons are not inherently unlawful or unethical. If we adapt legal and ethical norms to address robotic weapons, they can be used responsibly and effectively on the battlefield.

Mr. Anderson is a law professor at American University and a senior fellow of the Brookings Institution. Mr. Waxman is a professor at Columbia Law School and a fellow at the Council on Foreign Relations. Both are members of the Hoover Institution Task Force on National Security and Law.

The interplay between science fiction and the real world is a force that has been there for centuries. At one point, it was through writers like H.G. Wells, because the novel was the main vector for entertainment. Then we moved on to movies and TV shows — think of how powerful Star Trek was in influencing where technology would head next. Now it’s gaming. It’s like what happened in those great old episodes of Star Trek, where they envisioned something futuristic like a handheld communicator, and then someone watching in a lab would see it and say, “I’ll make that real.” And now that’s the same for gaming. I was a consultant for the video game Call of Duty: Black Ops II, and I worked on a drone concept for the game, a quadcopter called Charlene. Now defense contractors are trying to make Charlene real. So it flips the relationship. Previously, the military would research and develop something and then spin it out to the civilian sector. Now the military is faced with the challenge of how to spin in technology.

The truth is that no one who buys discounted merchandise on Amazon today will have it delivered by drone, and such deliveries won’t happen for years — if they happen at all. It’s not just that the technology isn’t up to the task yet. It’s not just that federal regulations prohibit such flights over populated areas. It’s that drone delivery doesn’t make economic sense for Amazon, and it will never make sense unless the company completely overhauls its operation.

One example comes to mind: "The feed is so pixelated, what if it's a shovel, and not a weapon?" I felt this confusion constantly, as did my fellow UAV analysts. We always wonder if we killed the right people, if we endangered the wrong people, if we destroyed an innocent civilian's life all because of a bad image or angle.

...

And when you are exposed to it over and over again it becomes like a small video, embedded in your head, forever on repeat, causing psychological pain and suffering that many people will hopefully never experience. UAV troops are victim to not only the haunting memories of this work that they carry with them, but also the guilt of always being a little unsure of how accurate their confirmations of weapons or identification of hostile individuals were.

Our conclusion is that, in conventional war on conventional battlefields, drones are largely just another remote weapon platform. In counterterrorism-on-offense, however, against transnational non-state actor terrorist groups, they do represent something new: first, an offensive, “raiding” capability—the lightest of the light cavalry, deployed against terrorist fighters who, as raiders on offense, have rarely had to confront a counter-raiding capability. But drones offer a very special kind of raiding capability—that is, a “persisting” raiding strategy, to use military historian Archer Jones’ terminology.

An American citizen who is a member of al-Qaida is actively planning attacks against Americans overseas, U.S. officials say, and the Obama administration is wrestling with whether to kill him with a drone strike and how to do so legally under its new stricter targeting policy issued last year.

The CIA drones watching him cannot strike because he's a U.S. citizen and the Justice Department must build a case against him, a task it hasn't completed.

“China is positioning itself so that any country on the planet that, for political or financial reasons, is restricted from purchasing American or allied drones will be able to go to Beijing and get a comparable platform,” said Ian Easton, a research fellow at the 2049 Project Institute security think tank. He co-authored a recent report on China’s drones.

China’s $139 billion defense budget last year was the world’s second biggest, accounting for about 9 percent of global military spending, according to a report last week by IHS Jane’s. It is leading a broader rise in regional military spending, with Australia, India and South Korea also increasing their budgets, which widens opportunities for defense contractors.