Presider: Stewart Baker

Description

Please join Paul Twomey, former president and CEO of the Internet Corporation for Assigned Names and Numbers (ICANN), for a conversation on the challenges posed by the present state of global cyber instability for governance at both the corporate and international levels.


Transcript

STEWART BAKER: OK. If we could get started, I'm Stewart Baker, and I used to work at both NSA and DHS. And so I'll be moderating this discussion.

I first would ask that you turn off -- turn them off; I know it's a pain in the neck -- your BlackBerrys, your iPhones, your cellphones of all descriptions.

This meeting is on the record. That's important if you're going to ask questions. And so it's -- and that's a little unusual for the council.

Our speaker today over lunch is Paul Twomey. Paul distinguished himself in several different governments in Australia, worked in numerous international fora -- APEC, OECD and elsewhere -- and then became what some would call the chief operating officer of the Internet when he became the head of ICANN, the Internet -- whatever it is -- for assigned names and numbers -- (laughter) -- committee, and is now at Argo Pacific as a consultant.

He'll be talking about Internet governance issues in particular, about which he has enormous expertise and lots of scars.

PAUL TWOMEY: Scars. Good. Well, thanks much for that.

"Cyber Instability and Governance" is not a simple topic, and so I'm going to try to use -- in the first 15 minutes, at least, just give you a couple of frameworks I tend to use to think about it and then talk a little bit about what regimes are in place and what regimes of governance I think are likely or not likely. And this is definitely -- not necessarily an optimistic view of the field.

So first of all, you know, what sort of cyberthreats are there? What does actually contribute to cyberinstability? And I think it's worth dividing that up.

Secondly, what do we mean by the Internet? I think that's an important discussion, because there are assumptions about what the Internet is and what it is not.

Thirdly, I want to talk a little bit about international regime formation, for the political scientists in the room, because I think that the political science around international regime formation is quite useful as a framework to think about what norms or regimes may emerge around cyberinstability or cyberstability.

And then finally I'll make a few personal views about what's likely to happen or not, I think.

So let's be clear about what are cyberthreats. They are, first of all, just inadvertent loss or theft. Secondly, they're cybercrime. Thirdly, I would say, there's cybereconomic espionage or cybereconomic crime.

Fourthly, I'd say, what I would call cyberpolitical and -military espionage, as opposed to industrial espionage; cyberpolitical activism, which for some people is not a source of instability but for other people is; and then cyberwarfare or, perhaps more accurately, cyber in warfare, because, I have to say personally, I'm not necessarily the greatest fan of this phrase "cyberwarfare," as if it will stand somehow alone from anything else.

So that -- I'll just be clear about that, because there's a range of cyberthreats, and if one were to read simply the Northeast United States media, you would think it was all cyberwarfare. I think we should be clear about that.

Secondly, let's be clear what the Internet is. The Internet has, you know, multiple layers, but let's say that it's got three.

The first layer is the transit layer. It is the layer by which electrons get sent around the network. And it's often referred to as, you know, the ISPs, and people are aware of that.

Sitting on top of the network are the applications, which in my view are really what people worry about the most. It's my email. It's my websites. It's my databases. It's all the things that I use and that are actually the targets for all those nasty things I mentioned before.

What sits between the two, and actually makes it an Internet, is the mathematical protocols that make all of that work as one single interoperable Internet: the domain name system, the IP addressing system, some 2,000-plus Internet protocols and parameters, et cetera, the RFCs -- that is what forms how the system works.
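That protocol layer can be made concrete with a small illustration (an editorial sketch, not part of the talk): resolving a hostname exercises the domain name system and returns addresses drawn from the IP addressing system described here.

```python
import socket

# Illustrative sketch (not from the talk): the protocol layer in action.
# Resolving a hostname exercises the domain name system and returns
# addresses drawn from the global IP addressing system.
def resolve(name):
    """Return the sorted set of IP addresses a hostname maps to."""
    infos = socket.getaddrinfo(name, None)
    return sorted({info[4][0] for info in infos})

print(resolve("localhost"))  # typically includes '127.0.0.1' and/or '::1'
```

Neither the transit layer (the wires) nor the application layer (the mail client) implements this mapping; it is the shared commons in between.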

And I think the political science of that -- the language we use to describe it -- is important.

I think the way in which the transit stuff works is, to a degree, a little bit like a guild, a medieval guild. It is government-granted licenses. (It's good that I'm fair ?) and anybody who runs an ISP in the room will now criticize me for the rest of the day. But there are aspects about regulation and the role between the prince and the person getting the benefit that I think have got similarities to the guild.

The application layer is very much like a market. It's now very firmly a market. But there is still very much in the middle the commons. It is the operation of the stuff in the middle that actually gives us an Internet. And I want to make that point, very importantly, because even those of us in the discussion this morning would have heard a lot of discussion about -- from governmental types saying we have a problem with Internet security and immediately going to the old parameters of the interception model for telecoms and saying we need to get it fixed by the private sector, which is, by the way, (only talking about ?) the transit layer.

And so we need to be careful in all this discussion, I think, not to lose sight of the fact that there is in between those a commons that gives great benefit to everyone and let's not damage the commons in the discussion between the transit layer and those who worry about the applications.

The third area, I think, that's important then to talk about a little bit is the idea of how you set up a regime, how you set up governance regimes, generally. And I'm going to summarize about 30 or 40 years of political science into four principles. They are, first, that you tend to have an existing epistemic community, often academics, researchers, scientists; secondly, that they tend to implicitly or explicitly agree on a set of values and broad objectives -- and if you want to think about things, think about the Antarctic Treaty; think about potentially even the formation of NATO, in some respects; think about what happened with nuclear doctrine against this test, if you like -- thirdly, that these communities translate their values and goals into a set of normative steps, things that they believe should happen; and finally, and only finally, there's an institution or some institutional framework established to create that.

So one can think of the Bretton Woods process through the same way. A bunch of economists -- they had worked together, came to a common view, Bretton Woods, lots of discussion about what the values should be. We end up with the IMF and the World Bank.

So if you think about that sort of four-step process, that framework, I think, is really interesting, because it has certainly been in place in the technical aspects of the Internet. So from the early 1960s -- really, mid-1960s -- onwards there has been a range of technical people involved in setting up a set of values they think are important. Those values have tended to become norms. They're actually the way in which the Internet works. They're reflected in things like the RFCs. They're reflected in things like the IP addressing system. And then they get their own institutional expression, things like ICANN -- it's a corporation, actually. It's the --

BAKER: Yeah, it's --

TWOMEY: And it's really only because Jon wanted to say "I can" when it was formed.

Now, if you come from the technical community, these things really matter. If you don't, you've probably never heard of them. But they are an existing set of institutional frameworks behind which there is a whole regime. And I think that's important because, again, in the discussion of Internet instability, one wants to be careful about, you know, where regimes are feasible and what regimes are going to take place.

So when I use that framework, then I ask the question about the things that we read about in the papers, about the attacks, the espionage, et cetera. And I say, where do I think there might be opportunities for governance and further governance -- regime governance -- for the Internet? And the areas where I think we're likely to see it are -- do I think we can in the area of inadvertent loss? I don't think it's really an area for a regime. Basically you can't do very much; it's up to individuals and companies to not lose things. When you think about cybercrime, is it an opportunity for a regime? I think yes. There you can see there is a common epistemic community. They've got a set of values, and they would think about institutionalizing it.

Having said that, I think there's a potential for agreement on laws, except I think it'll be culturally or regionally driven. We can see -- we have a European convention. The Commonwealth Heads of Government Meeting in Perth later in October is actually going to look at a proposal for the Commonwealth countries to move on a set of laws. They share a common common-law background, so you can see that would happen. But I think we won't end up with a global convention on crime because of legal differences. Some may disagree with me here, but I think we'll end up with some regional process. And I think you can see that coming in institutions.

I think enforcement becomes patchy partly because you get the problem of people signing up to things and not enforcing them, but secondly because of where the power is. And if I can share just an instance: I've heard many complaints in this town about people in other countries -- some people in Russia, for instance, are often referred to -- why don't they simply prosecute the people we know we've found who do bad things? You know, we think there are people, and we've got a list we've shared with the Russians or with anybody else. Why don't they do stuff?

If you look at the problem from outside the United States, the common complaint is, why can't we get the American institutions to give us the information we need so we can actually prosecute the bad guys? And that's not just official interaction with the American law enforcement community. If you want to actually get your own subpoena -- actions so that you can check the Gmail account of some criminal -- if you're an Australian or a Brit, and they're, you know, with the Canadians and others, the close allies, that can take six months, if you're lucky. If you're a poor enforcer in St. Petersburg doing your job, it can be almost impossible.

So when we talk about international cooperation and when we talk about the issues of regime, I think one of the things that's particularly important in the United States' experience is that so much of the private sector infrastructure that actually is important for legal enforcement sits here or is distributed, and is not easy for the rest of the world to deal with. I'll put that on the table. I think regime change in economic espionage is possible for countries themselves, which might say their own people will stop doing things. I don't think it's likely for proxies or the private sector, frankly. I think that's going to continue to happen. I think any regime around political activism is very unlikely, in a "your (opponent's/opponents ?) my friend" problem.

I think cyberwarfare -- it's certainly possible. But in that situation, I suspect we're in an environment where it's going to be more around the norms than actually getting to institutions, and I think in those discussions around cyberwarfare in particular, the difficulties we're going to have is that there's asymmetry in what people want, to use an example we've discussed.

I think the 1930s debate about air power was an interesting one. The Germans and the British in particular drew different lessons from the summer of 1918. The Germans looked at the Monash doctrine for the Battle of Hamel and the end part of that assault through Amiens and drew lessons about close air support for armored attack, which led to blitzkrieg. The British looked at the Gotha bombers and drew a completely different lesson, about planes being able to fly across the Channel, and talked about strategic warfare, strategic bombing. So by the time you get to 1939, 1940, the force structures are completely different depending upon, first, what your tactical need was -- one was an island and one was a land power -- and secondly, what lessons you learned about the technology.

One could make the argument that the United States, Russia, China and other great powers have different perspectives on this, driven by their different tactical views about where they are and what lessons they can learn from the capacity. And so I think working together on norms is going to be slow. I doubt whether we're going to get to a common view about any sort of regime or treaty or anything like that, because at the moment I think the tactical views of the benefits and how to use this new capacity differ according to people's circumstances.

So I think that was -- that was -- that was my only point. That was sort of opening points and (international ?) comments.

BAKER: I --

TWOMEY: Can I just --

BAKER: Yeah.

TWOMEY: -- just before we finish, one point I wanted to make, which we have not discussed so far in this conference. I've talked here mostly about the state. Most of the infrastructure we've talked about is in private sector hands. But I think this is a very interesting observation. If you think about those three parts of the Internet -- application layer, protocol layer, transit layer -- even in places with a fully state-run national transit layer -- some parts of North Africa or, you know, parts of East Asia; my home country is in the business of building one at the moment as well -- they don't control the application layer, and they don't control the protocol layer. Companies, running their own networks, do. And I actually think that one of the greatest opportunities we actually have for governance around cybersecurity in the modern world is actually at the corporate level rather than necessarily at the state level.

Corporates face the same problems. They are -- like countries, they've got -- they use the technology for different purposes, they've got all sorts of internal structures, they have all sorts of -- you know, diverse or not diverse, integrated or not integrated -- all sorts of discussions. But I think if pressure was -- you know, attention was paid to this area, one of the areas we potentially could move faster on in terms of some sort of binding governance regime that could have a real impact is actually at the corporate level, where they do control the three layers.

BAKER: This is great. Then I think what I'd like to do is sort of dig down first into some of the cyberwar norm ideas, then to cybercrime, and if we get to it, to what I think is a really intriguing idea of using corporate networks as models for building a governance structure.

So the most under-reported air war issue -- historical issue that I think is directly relevant to most of what we discussed this morning is what happened on September 1st, 1939, in Washington. Up to that point, people believed that bombing from the air was shocking, uncivilized and devastating and would happen in a war if we didn't take extraordinary measures to outlaw it, very similar to what we think about cyberwar today.

FDR got up, got the news that the war had begun in Poland, and he sent a cable to Hitler, Mussolini, and the heads of state in Poland, Britain and France, and said: Surely we all agree that bombing cities is a violation of international law and should be abjured like poison gas; won't you join me in promising that the war that has just broken out will be fought without bombing of cities from the air? And remarkably, the Nazis signed on, the Fascists signed on, the Poles, the Brits, the French all signed on, and for months, everybody lived up to that rule. There's a bunch of Luftwaffe orders saying, you know, you bomb a civilian target, you will answer to me and the Fuhrer, and you won't like it.

It is far more of a norm than we ever will have in cyberwar, is my guess. And of course, as everybody in this room knows, it ultimately failed miserably as a method of preventing Europe's cities, and Japan's, for that matter, from being devastated in things -- in attacks that everybody who was watching those attacks had agreed four years earlier were violations, and profound violations, of the law of war.

So I guess that brings me to my question: How can we possibly expect to come up with cyberwar norms that will actually matter when cyberwar breaks out as opposed to when we're talking about it in a nice, comfortable lunch?

TWOMEY: Yeah. (Laughter.) I mean, I think -- I think these things get pursued for self-interest. In actual -- and let's be clear about this. I mean, I think it's important to think about cyber in warfare, because the idea of a stand-alone cyber war --

BAKER: Yeah, it's unlikely.

TWOMEY: -- I find -- it's science-fictionish. That's not to say that long-standing economic espionage doesn't start having aspects of hostile state behavior, but it's not the same thing, necessarily, as formally defined as -- (inaudible) -- as war. Poison gas in the Second World War wasn't used because both sides decided it just was -- they didn't need to, and if they did, it was going to get out of control.

BAKER: And they were moving too fast.

TWOMEY: And anthrax was a bit the same. Right. I think the value of norms in that part of it, frankly, will be an understanding of language and concepts that people agree to, that in their own heads they say we're not going to pass that threshold, we're not going to use that building block.

BAKER: Right.

TWOMEY: And if both parties know or the various parties know what that building block means, what the definition of it means, and you want to have a conversation in Switzerland somewhere which basically says if you don't, we won't, I think that's probably -- and perhaps I'm being too pessimistic, but I do think one of the values of discussion around norms in a practical sense, if you look at the history of conflict, tends to be more about can we quickly agree what we mean together and what are we deciding we're not going to do.

BAKER: Right. And the difficulty -- I mean, what brought this down was the fact that it's actually very hard to know whether you're bombing a civilian target when you've got fighter planes diving at you when you're flying at 20 feet above the ground. And even if you were trying to abide by the rules, it may not look that way to the guys you're bombing, which also, unfortunately, is how cyberweapons --

TWOMEY: Yeah, well, in some respects, maybe there's an advantage that we actually have the tensions we have now.

BAKER: Yeah.

TWOMEY: The difficulty with gas was that in 19 -- was it '15? -- the Germans first used it, the other side had to do a quick catch-up -- the French got it first, right? -- and you don't have a chance to build any doctrine or share any norms with anybody, because you're in the middle of a war and you've got to start using it.

One of the advantages now is a lot of the tools are being used and we actually have a better sense what they could do. I think we're in a space now where we can have a dialogue, at least, about, again, what are the components, what can the norms look like. We understand better -- that would help if there's actually a time of conflict. But am I persuaded that they will therefore result in people not using them "in extremis"? No.

BAKER: So let's try something a little easier, which is cybercrime. And you, I think, accurately identified the frustrations that a lot of U.S. allies have about the way our MLAT process works. The U.S. view on that is, one, it's the law, we're stuck with it, and two, there are a lot of things that are crimes and subject to criminal investigation in foreign countries that we wouldn't necessarily agree to as crimes or to investigate, and we want to have an ability to throttle that, or at least not cooperate in violations of what we consider human rights. And that's what the MLAT process is designed to protect, that plus the -- you know, the brutal reciprocity of law enforcement relationships, which everyone shares.

You could make that faster, probably, but you're not likely to eliminate it. And if the shoe were on the other foot, I am quite confident that there are 27, or at least 25, European governments that would say, well, wait a minute, before we give information to the United States, we want to make sure that it isn't going to be used in a death penalty case, it isn't going to assist the USA Patriot Act, that no Republicans will benefit, whatever the local hobgoblin is.

How do you deal with that problem? Because that is at the core of getting better cooperation in identifying cybercriminals.

TWOMEY: Yeah, I -- I would say this, wouldn't I, because I've spent the last 20 years of my life involved in the formation of multi-stakeholder organizations, but I do think they have some value in the 21st century in -- as mechanisms to bring various (interests and/interested ?) parties together.

And one of the things I've been thinking about in this criminal area is, even just for information sharing at least, perhaps that's something we should look more at. You know, if you made it less state-to-state, if you were able to bring some of the key private sector players into that dialogue -- even if they then decide to follow best practices without having to be told there was a law they were having to enforce -- if you could bring the technical people in so people understand the downsides and upsides of certain questions that get asked, I wonder whether we could improve it. It's never going to solve that problem, but to a degree you potentially could do that in such a way that at least there was a common set of values in such an entity that people had to sign up to, which I think comes back to this whole regime formation, the important role of values and norms that the community agrees to.

I wonder whether or not we should be thinking more about that, because in my previous role I dealt with, you know, literally every country (code operator ?) in the world, with many other governments and police forces. And I found the objectives of many of those people to be very similar. And so, you know, that's another thing about the Internet world: you can divide the world geographically, but if you divide it according to certain interests and positions, people have a lot of similarity. And I think that might be a mechanism we should think more about.

BAKER: So one of the things -- then one of the biggest problems in all of these areas, one of the -- you know, the point about cyberwar and arms that we haven't discussed is the absolute inability to attribute this, which makes any agreement on norms unenforceable. But a lot of enforcement and a lot of attribution actually could be furthered by the people who control the transit layers. They actually have customers; they know who those customers are. In the U.S., ISPs have a lot of restrictions on providing that information in the absence of a court order and the MLAT process, et cetera. A lot of that ties, it seems to me, to this notion of state responsibility, that is to say, the first state responsibility one would imagine is to identify the person who is engaged in an attack abroad, whatever that is. And so one of the questions I have is, to what extent can you drive attribution down at least to, you know, who's using this IP address to make it more automatic and to facilitate states taking more responsibility for at least identifying the people in their territory who are engaged in activities abroad that they shouldn't be?
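The "who's using this IP address" step has a narrow technical analogue. As an editorial sketch (assuming nothing beyond the Python standard library): a reverse DNS lookup maps an address back toward a network operator's naming. It is one coarse, easily absent or misleading input to attribution, not proof of identity.

```python
import socket

# Illustrative sketch: reverse DNS (PTR) lookup as one coarse input to
# attribution -- it suggests which operator's naming an address falls under,
# but records are often missing or misleading; it is not proof of identity.
def reverse_lookup(ip):
    """Return the PTR hostname for an IP address, or None if unmapped."""
    try:
        host, _aliases, _addrs = socket.gethostbyaddr(ip)
        return host
    except (socket.herror, socket.gaierror):
        return None

print(reverse_lookup("127.0.0.1"))  # commonly 'localhost'
```

The gap between this mechanical lookup and naming an actual person is exactly where the legal processes discussed above sit.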

TWOMEY: Well, how about we break that into two parts? I mean, I think -- I think the concept of international -- or of holding -- of holding you accountable, to some degree, to the actions of your own citizens helps in that sort of -- (inaudible) -- you want to prove attribution if you're being accused that your citizens are doing something.

But I also think one of the aspects of public policy that's inevitable is that public policy gets driven by the extreme case. I mean, I used to be -- play a role in Social Security policy in Australia, and it's amazing how much money got spent because of the extreme case. And what tends not to happen at a business level, because of the -- for its efficiency, is that people tend then to say, well, what's -- I'm not going to say lowest common denominator, but what's the low-hanging fruit and what can we do quickly? If you were to have -- and I was just -- and I'm really thinking aloud here -- but if you were to have this sort of a multistakeholder approach to some of the aspects of crime that we can all agree on --

BAKER: Right.

TWOMEY: -- right? -- we've often agreed on what is -- we can all agree on what's theft of intellectual property, we can all agree on those things, then I think you'd probably find people would move on ways of attributing, ways of blocking and ways of (stopping it ?), because we can all agree on that and to agree that -- let's just get that stuff off the table.

The other part which is difficult about attribution, the other stuff about, you know, state-sponsored espionage and -- well, OK. But let's -- when we talk about cyberinstability, there's a whole bunch of stuff we could do, I think, in a new way, and then there's other stuff here that, you know, you get more narrow on, and that's when you get more clearly the role of the state, then I think you can get more on the legal -- traditional legal aspects of, you know -- (always finding ?) concepts that you, Ms. Queen, are responsible for what happens with your subjects, therefore, you know, you should be accountable, tell us what's happening.

BAKER: So let me -- I -- we're going to turn and start letting the audience ask questions in just a second. There was one question I wanted to get to, which is this notion that individual companies, when they run their own network, control all the layers. And we're all on networks of that sort. We don't expect anonymity on those networks; we expect that there are logs of our activities and, unless you're the State Department or the Defense Department, unusual behavior on those activities ought to draw attention. A -- and so you can imagine spreading that kind of security regime beyond the corporation for -- by establishing relationships between corporate networks, including government networks, in which all of the companies that are part of those networks agree on some basic rules about logging and tracking and identifying certain kinds of behavior on their networks, even if it comes from a partner network.

Is that a -- that's -- in some ways that's what I think General Hayden has been talking about when he talks about dot-secure as an approach. What are the opportunities there and what are the risks in trying that approach to security?

TWOMEY: Well, I think there's a strict common-law proposition around liability for negligence that directors throughout the world, and particularly of larger corporates, just have now. And I think many of them don't fully grasp that yet. I think that sort of understanding -- the understanding of risk managers, not just of my own risk but my supply chain, right? -- you just marry those two things together, and they're both very powerful forces to get people to start looking at what happens on their networks and what happens to their ecosystem. They're in a cyberecosystem; they're dealing with suppliers and customers. What sort of things are going on in supplier and customer networks, (or how are ?) we interacting? You get customers doing that; watch the ISPs follow.

BAKER: Yes.

TWOMEY: And I -- you know, the market is an incredibly -- you know, many of the values in the way the Internet has grown are the same values that have driven, you know, liberal -- indeed, libertarian -- views about the market and social and economic interaction. And I think if you, instead of trying to push against those sets of values, you actually take them into hand and find other ways of reinforcing them, it's an incredibly powerful (legal ?) lever.

So, I mean, I am aware of some companies where that process is starting to take place now. And it's incredible, you know, you -- when they actually start looking at their networks and they're really asking these questions of the C-suite. Then they go: How did we get here? And what's this, and why and how? And, you know, get this fixed and -- you know, there's a whole set of discussions that I think we should -- that's what we should be generating. Rather than necessarily, I think, telling people, do this with your suppliers, or telling ISPs to use linkages, I just think --

BAKER: Creating a --

TWOMEY: -- (inspire and make fear ?) at the C-suite and the board, and you'll make a big difference.

BAKER: So, you know, what that -- that suggests that, yeah, there's a -- that's a doable task for government: scare business -- (laughter) -- and maybe create an opportunity -- create a legal framework in which that kind of cooperation can exist and prosper.

TWOMEY: Oh, I think -- I think a lot of the legal framework is there. It -- you know, it just comes down to just the simple rules existing now around liability. And I've had -- I know in some countries they simply ask the question of the directors: What's your present structure for financial auditing and financial controls? And then they say: You got the same thing for your (network ?) information? And the answer, of course, is no. And they say: Why not?

So I think -- you know, and there's going to be some class-action lawyer in Alabama at some stage who's eventually going to say: I can go after somebody for this --

BAKER: East Texas. (Laughs, laughter.)

TWOMEY: So, I think -- you know, I just think that there's not necessarily a role -- I don't see a role necessarily in places like, you know, Washington, London, Canberra, Ottawa or wherever, to say, you know, we need to pass laws to force companies to do that. We just need to say that the law is there, at least in the common law countries, and accountabilities are there and people should be aware of them. There may be then enablement.

BAKER: Yeah.

TWOMEY: Because there are certainly problems about: How do I share data and how do I -- there's, you know, antitrust laws and lots of things emerge in that. That I think is a -- that if we're going to have the discussion, use what we have now and push the values in one direction, and then narrow down where the enablement problems are.

BAKER: Yeah. So let's let the rest of the audience join in. Raise your hand. Wait for the mic, please. When you get the mic, state your name and your affiliation, and then your question.

Do we have some questions? Yes.

QUESTIONER: Hi. Jason Healey, from the Atlantic Council. Paul, one of the big things that's happened this year when it comes to the Internet and cyberspace has been the Arab Spring. And we saw, on all three levels of the Internet that you talked about, how that got blocked by states. I'd be very interested in your perspective, particularly your ICANN perspective, on how that played out and what you think that means for the future of the Internet and governance.

TWOMEY: I think what was interesting -- certainly my understanding of some of the stuff that happened in Egypt, for instance, is that while there were attempts to -- and I'll be quite clear: it is my understanding it was not done by the communications ministry; it came from the security parts of the Egyptian regime -- there were attempts to close the transit layer, and even then potentially play with DNS routing, et cetera -- the second layer had so much stuff going out of the country, they didn't understand it. I mean, they still saw it in terms of the telephone system, right?

And one of our -- one of our colleagues this morning was talking about how, you know, in the private sector now you're running these big, information-based online businesses. You have no idea where the data -- where each bit of the data comes from any one stage. These are completely globalized structures, because Internet protocol does not understand geography; it doesn't have geography in the protocol.

And so I think one of the surprises, at least in my understanding of what happened when the pressures (were brought up ?) in Egypt, was that so much other stuff stopped -- you know, real financial stuff that really mattered stopped -- because people didn't realize how much of the traffic was actually going outside the geographic boundaries.

So again, I think, you know, some people -- some parts of the world have their -- have their networks configured and controlled in different ways. And even those parts that people point the finger at and say that these are really controlled, my experience is they're not as controlled as people think they are.

BAKER: Yes.

QUESTIONER: Jody Westby, Global Cyber Risk. Paul, some members of the cybersecurity research community have been studying for a long time recursive domain name service providers and the role they're playing with botnets, and actually have concluded that if ICANN were more active in shutting down some of these recursive providers, that it would significantly curb cybercrime. So if I'm not putting you too much on the spot from your previous role, could you comment on that, please?

TWOMEY: Yeah, well, I think it's a good question. Sometimes I think the community doesn't fully understand what the contracts between ICANN and the registrars say and what powers the contracts actually have. I think there has been more discussion and understanding about that in recent times. But again, the need to keep refining the contracts I think is the key point around the registrars.

Understand that ICANN does not have -- if you're a DNS service provider who is linking up through an ICANN registrar to have a domain name, you know, ICANN can only enforce against the registrar, or kind of look through the registrar to the end user. I think there's been much more work done on that recently. But I think you put your finger on an aspect of this that -- again, turn the negative into a positive. Those mechanisms -- contract law, I think, is an incredibly powerful instrument for us to use in this process. And what we need then is engagement of people about the constant reforming of contract law.

Now, the ICANN model has an interesting aspect in it, in that it talks about this idea of accepted policy, or the policy process that's developed. It actually has a clause in every contract where it says: Here's the contract, except for this clause which says this may vary according to what becomes the policy.

All right, now, it's a piece of legal genius. Some people have argued it makes the contract void for uncertainty. It's a piece of legal genius, and it works that way, but sometimes it's too slow. So I think we've got a challenge about how we do that.

BAKER: Right. Let me -- let me ask you about botnets. It strikes me -- they're surely not the worst cybercrime problem we face, but they also seem like a much more eminently fixable problem. There's a variety of ICANN approaches, and then there's simply identifying the machines, which are eminently identifiable, and going to the people who own them and either getting them fixed or getting them taken offline because they've been compromised.

Why do you think even that problem hasn't been solved, even in the United States?

TWOMEY: I don't know. Well, I mean, you talk about even in the United States -- there is an Australian example at the moment of a code of conduct amongst the ISPs. They've all agreed to actually inform their customers when they think their machines are infected. And there has been talk about then moving to put them into some form of isolation. And that's an industry code. It came from the ISPs responding to government pressure on something, and they came up with this. I think, at least in terms of that, there should be more along those lines.

In terms of the botnets, I would just be a little concerned about a sense that they're not going to keep evolving, and particularly that the command and control mechanisms are not going to keep evolving in ways that I think make it very difficult. I mean, some of us were involved in responding to the Conficker botnet, and part of what always concerned us about that was that all you were doing was driving the bad guys into other parts of the network where we'd have less visibility of what they were doing. So.

BAKER: Yes.

QUESTIONER: My name is Gene Maranz (sp). Question: How does WikiLeaks fit into your cybergovernance framework?

TWOMEY: I think that's an interesting question. I mean, how does it fit into the American Constitution? You've got an issue of a freedom-of-expression argument, right? I mean, I think with the WikiLeaks thing, first and foremost one needs to separate out where did that -- I assume when you say WikiLeaks, what you're really talking about is the U.S. cables, and not WikiLeaks in general. Because if you actually look at WikiLeaks's history, in the first part of what it did -- some of the stuff that got released, I think there were actually official U.S. government near-positive statements made about that.

If you're talking about the stuff about the cables, I think you need to differentiate quite clearly between the criminal action that may have taken place -- where did the data come from, the leaking thing -- separating it out from the actual sites themselves. When you get to the sites themselves and actually putting that public data up, that starts getting very interesting. And I think this is where -- at the application layer -- I think we're inconsistent.

If that wasn't diplomatic cables but was some record company's intellectual property, the French position is pretty strong -- right? -- that (they're going to intervene ?). So if you change the content of that stuff at the application layer and say, OK, now it's intellectual property -- say, of the music industry -- well, all the other governments in the West are really strong on that now, and the French in particular and others have sort of intervened on that. If you suddenly said that data is political thought, then they change around.

So I think the governance issues around that sort of stuff at the application layer are where we need to parse it -- what the content is -- and then apply the law of the land as it presently applies. But if you're going to say, I want to stop that by going down into the protocol layers, whatever, and saying I want to block these IP addresses or I want to block this -- you know, then I think you're in a very different space, really.

BAKER: I like that as a solution. The State Department should say, send no cables without including an MP3 of Ke$ha's latest song. (Laughter.)

Yes.

QUESTIONER: Hi. I'm Rebecca MacKinnon with the New America Foundation. I was intrigued by your suggestion about applying a multi-stakeholder model, potentially, to a cybercrime regime and possibly some other governance regimes. And you talked about governments working more with the private sector and companies. I'm wondering, what about civil society, and what is the role for civil society? And would that help to solve the problem, I think, of growing mistrust amongst a number of constituencies around the world that, you know, governments and companies are sort of getting together and surveilling; and that there is concern that in the pursuit of greater security, of course, dissent and whistleblowing will become much less possible?

So how do we make sure that there's enough public trust and buy-in into these regimes so that we don't see an increasing amount of sort of, you know, guerrilla warfare a la "Anonymous," which is not productive, and people setting up darknets and just sort of, you know, creating something completely new because they have so little trust that those who hold power and wield power in cyberspace are going to protect their rights and interests? And how do we make sure that there's enough public buy-in and sort of social contract and consent that this whole thing will ultimately work? Because if you don't have the public along with you and civil society along with you, I think ultimately the problems are just going to get worse, and "Anonymous" is just the beginning.

TWOMEY: Well, I agree. And please take note that this is a thought-in-process, literally, up here in a seat, right? So I'm thinking aloud on this idea.

But first and foremost, as I have lots of scars to show, if you think about these new institutions or these new regimes, and particularly if you want a multi-stakeholder one, the key to success is a very narrow statement of objective function. So with anything that even looks like it's getting broader, you'll have all sorts of problems.

So let's take the cybercrime one, just as a thought experiment. Something that was quite focused on theft and fraud -- something that everybody could get close to having a sense that we've all got a common set of objectives around -- and that we'll have lots of disputes. These things produce all sorts of controversies, and I think we all know that. But if you were simply to say it's crime, I think you'd have a problem. If you said, no, it's some sets of crimes that I think we can all agree on -- it's theft, it's stealing, it's intellectual property theft, maybe -- (inaudible) -- but, you know, just if you narrowed it down, you might have a chance. And in that circumstance, of course, you'd have to have the civil society, but for the reasons you pointed out before, if you simply said it was crime, well, even in the so-called West we can't all agree what crimes are, let alone when we get broader.

So I think the civil society, of course, has to be involved in that, but again, narrowing these things down.

BAKER: In the back.

QUESTIONER: Dan Keel (sp), National Defense University. A quick comment and then a question. The comment: As an air power and modern military historian, I think the inhabitants of Warsaw would take exception to your observation that everybody followed the rules right from the start and --

MR. : (They didn't talk to ?) the German lawyers.

QUESTIONER: Didn't stop the Luftwaffe from fairly leveling Warsaw.

But my question is -- I'm in violent agreement with your observation about the unlikeliness of stand-alone cyberwar. In fact, some of my colleagues -- (inaudible) -- have published on this; in my opinion, our opinion, cyberwar will be an intrinsic part of real war with a kinetic piece of it, et cetera, et cetera.

But here's the question I want to drive at. One of the things that we've argued is that the cyber-dimension of this -- because of the extreme attractiveness of the first strike, because of the potential for destabilization, because of the difficulty of escalation control once you start this -- could be a far more dangerous path to go down than many of the cyber-zealots -- and I'm one of those -- would consider it to be.

Your observations.

TWOMEY: Yes. I agree. I mean, there is public reporting that says -- and I can't verify one way or the other -- that President Bush vetoed a proposal for a cyber-attack on Iraqi banks, and the public reporting says because he was not certain what the impact would be upon European and other banking systems. If any of that was true, I think it was prescient in thinking through the possible unintended-consequence problem. So I think the point you're making is absolutely valid.

And to come back to my point before about norms and the role of norms -- you know, we tend to be pessimistic at the moment because of this very large-scale problem of cyberespionage and what-have-you that's going on. We're very pessimistic about the age we're in. Potentially, we should think of it the other way around. We should say we're actually being given a blessing, in that we are now watching the capabilities being developed in real time, in ways that in the '20s and '30s, for instance, people didn't really get to see. And so for the reasons you're saying and the work you're doing, developing those sorts of norms, understanding what could happen and getting a common language around that is probably pretty valuable.

BAKER: So Paul, I sort of agree there's a -- there's an apparent advantage to first strike. And the -- and the difficulty of controlling it means you might as well just go all-in. So what technical and other barriers to a successful first strike should we be thinking about building into our global infrastructure?

TWOMEY: Well, it's an interesting question. I think for an economy like the United States, it is a question of actually hardening the economy, as opposed necessarily to just hardening the defense establishment. And when you come to the issue of hardening the economy, we come back to our conversation before, which is that this is where the responsibility, I think, sits much more in the boardroom than it does necessarily in the halls of the Congress. And, you know, DHS and others obviously have got quite big programs on critical infrastructure identification and relationships with critical infrastructure. I think that's important, but in doing that, it has to be driven at the corporate level from the very top.

And I think that you can get that -- I think, you know, if you said in the next six, 12, 18, 24 months, what's the real challenge, I would say it's transferring the understanding from the CISO and the CIO, what-have-you, up to chief risk officers, up to CFOs and CEOs and board members. I mean, I think that's one of the key aspects. And it's happening, but I think it's one of the key things to look at.

BAKER: Yeah, I -- but I've spent time in those boardrooms. And when you say, "And you need to be prepared to deal with a Russian or a Chinese sophisticated attack," they just shut down. They don't believe they can do it.

TWOMEY: Yeah, I think some are beginning to understand -- I think some are beginning to understand the true economic risk those sorts of things have for them. And for some companies it's not high, and for some companies it's quite high, and you'd be surprised what sectors. You know, one of the obvious ones at the moment is commodities. And so I think people are beginning to think that they really do have to respond to that.

BAKER: Right.

Yes.

QUESTIONER: Anna Brady-Estevez, the AES Corporation. Thank you. And just to tie into that last question -- I think you start to touch upon it with the commodities -- how do we prepare ourselves best? Or what have you seen as the best practices against threats on not just the virtual infrastructure, but also the physical infrastructure: power -- you know, we've seen what happens when that goes down; transportation, whether it's trains, bridges; air traffic; all that? Have you seen -- do you believe those initiatives are moving forward? And what are the most effective ways to protect?

TWOMEY: There's a lot of work done on that, and there are some quite fundamental differences between SCADA systems and other networks -- not least the question about logs. You know, you can do forensics in other networks; you could (hardly do ?) forensics in some SCADA networks. But there's a lot of attention been paid to that. I mean, most governments have now focused a lot on that.

I think what gets interesting, in the present environment we're in, is if you ask who has an incentive to go after major SCADA systems. There is some scenario in a general warfare scenario. There might be a scenario vis-a-vis someone who wishes to extort, or, you know, "hacktivism" or some sort of terrorist political response. However, SCADA systems are so relied upon and so important for the real economy that a lot of the players who I think have an incentive to go after industrial espionage don't necessarily have an incentive to actually take out production. They might want to understand it, but at the moment I don't think they've got the incentive. I may actually want to understand your intellectual property, but I want to still keep trading with you and doing business with you; therefore I don't necessarily want to take out your production capacity.

And so it's been interesting: there's been a lot of focus, in the initial public reactions and writing about this, on the fear that the electricity plant will go down, or some mine will go down, or some bank will go down. My sense is that's probably not, at the moment, the great fear. The real fear is: I'll get to understand what you are thinking about, your strategic information, much more. That's what I'm focusing much more on -- your strategic information -- than taking out your systems.

That's not to say, though -- and Stuxnet obviously is an illustration. If I've got a particular strategic objective to act now to stop your production, then I will look at ways to do that.

BAKER: We should wrap up, but I do want to pursue that just a second. If we were confronting Saddam Hussein today, surely, he would have an incentive to take out power systems and other SCADA systems in the runup to war, and to persuade the American people that it really wasn't worthwhile.

TWOMEY: So I think -- so I think, having given you the analysis I've just given you, one of the difficulties with that (analysis ?) -- (audio break) -- (if ?) I was sitting as the CEO of any major corporate -- (audio break) -- where this was relevant -- you know, power, transport, banking, commodity, you know, other logistics, anything -- the difficulty with the analysis I've just given you, which I think is an accurate analysis at the moment, is that I will say, OK, where do I allocate scarce resources for this quarterly reporting sort of thing? I'm going to allocate them to where I think the real risk is. I don't think necessarily I'm worried about the SCADA systems at the moment. I might be; I may not be.

The challenge that you've just pointed out -- and this is potentially, again, where there's a role -- very much a role for government -- I think is going to be saying and engaging -- saying: OK, we don't think there's a risk this month or next month, but this is a vulnerability that could get turned on at some stage in the future with the political pressure, if political things go wrong. And I think that's part of this.

BAKER: Right.

TWOMEY: I haven't enough insight into how well that dialogue is going, or how much they've prepared -- not so much the dialogue as: How much have we got the incentives in place (so ?) the people who need to spend the money are spending the money?

BAKER: Yeah. So as a reminder, this has been on the record.

We're going to have to stop now, but this has been a terrific presentation, Paul.

UNAUTHORIZED REPRODUCTION, REDISTRIBUTION OR RETRANSMISSION CONSTITUTES A MISAPPROPRIATION UNDER APPLICABLE UNFAIR COMPETITION LAW, AND FEDERAL NEWS SERVICE, INC. RESERVES THE RIGHT TO PURSUE ALL REMEDIES AVAILABLE TO IT IN RESPECT TO SUCH MISAPPROPRIATION.

FEDERAL NEWS SERVICE, INC. IS A PRIVATE FIRM AND IS NOT AFFILIATED WITH THE FEDERAL GOVERNMENT. NO COPYRIGHT IS CLAIMED AS TO ANY PART OF THE ORIGINAL WORK PREPARED BY A UNITED STATES GOVERNMENT OFFICER OR EMPLOYEE AS PART OF THAT PERSON'S OFFICIAL DUTIES.

FOR INFORMATION ON SUBSCRIBING TO FNS, PLEASE CALL 202-347-1400 OR E-MAIL INFO@FEDNEWS.COM.

THIS IS A RUSH TRANSCRIPT.


PAUL TWOMEY: Scars. Good. Well, thanks much for that.

"Cyber Instability and Governance" is not a simple topic, and so I'm going to try to use -- in the first 15 minutes, at least, just give you a couple of frameworks I tend to use to think about it and then talk a little bit about what regimes are in place and what regimes of governance I think are likely or not likely. And this is definitely -- not necessarily an optimistic view of the field.

So first of all, you know, what sort of cyberthreats are there? What does actually contribute to cyberinstability? And I think it's worth dividing that up.

Secondly, what do we mean by the Internet? I think that's an important discussion, because there are assumptions about what the Internet is and what it is not.

Thirdly, I want to talk a little bit about international regime formation, for the political scientists in the room, because I think that the political science around international regime formation is quite useful as a framework to think about what norms or regimes may emerge around cyberinstability or cyberstability.

And then finally I'll make a few personal views about what's likely to happen or not, I think.

So let's be clear about what are cyberthreats. They are, first of all, just inadvertent loss or theft. Secondly, they're cybercrime. Thirdly, I would say, there's cybereconomic espionage or cybereconomic crime.

Fourthly, I'd say, what I would call cyberpolitical and -military espionage, as opposed to industrial espionage; cyberpolitical activism, which for some people is not a source of instability but for other people is; and then cyberwarfare or, perhaps more accurately, cyber in warfare, because, I have to say personally, I'm not necessarily the greatest fan of this phrase "cyberwarfare," as if it will stand somehow alone from anything else.

So that -- I'll just be clear about that, because there's a range of cyberthreats, and if one was to read simply the Northeast United States media, you would think it was all cyberwarfare. I think we should be clear about that.

Secondly, let's be clear what the Internet is. The Internet has, you know, multiple layers, but let's say that it's got three.

The first layer is the transit layer. It is the layer by which electrons get sent around the network. And it's often referred to as, you know, the ISPs, and people are aware of that.

Sitting on top of the network are the applications, which in my view are really what people worry about the most. It's my email. It's my websites. It's my databases. It's all the things that I use and that are actually the targets for all those nasty things I mentioned before.

What sits between the two, and what actually makes it an Internet, is the mathematical protocols that make all of that work as one single interoperable Internet: the domain name system, the IP addressing system, some 2,000-plus Internet protocols and Internet parameters, et cetera, the RFCs -- that is what forms how the system works.

And I think the political science of how we describe that -- the language we use to describe it -- is important.

I think the way in which the transit stuff works is, to a degree, a little bit like a guild, a medieval guild. It is government-granted licenses. (It's good that I'm fair ?) and anybody who runs an ISP in the room will now criticize me for the rest of the day. But there are aspects about regulation and the role between the prince and the person getting the benefit that I think have got similarities to the guild.

The application layer is very much like a market. It's now very firmly a market. But there is still very much in the middle the commons. It is the operation of the stuff in the middle that actually gives us an Internet. And I want to make that point, very importantly, because even those of us in the discussion this morning would have heard a lot of discussion about -- from governmental types saying we have a problem with Internet security and immediately going to the old parameters of the interception model for telecoms and saying we need to get it fixed by the private sector, which is, by the way, (only talking about ?) the transit layer.

And so we need to be careful in all this discussion, I think, not to lose sight of the fact that there is in between those a commons that gives great benefit to everyone and let's not damage the commons in the discussion between the transit layer and those who worry about the applications.

The third area, I think, that's important to talk about a little bit is the idea of how you set up a regime, how you set up governance regimes generally. And I'm going to summarize about 30 or 40 years of political science into four principles. They are, first, that you tend to have an existing epistemic community, often academics, researchers, scientists; secondly, that they tend to implicitly or explicitly agree on a set of values and broad objectives -- and if you want to think about things, think about the Antarctic treaty; think about potentially even the formation of NATO, in some respects; think about what happened with nuclear doctrine against this test, if you like -- thirdly, that these communities translate their values and goals into a set of normative steps, things that they believe should happen; and finally, and only finally, there's an institution or some institutional framework established to create that.

So one can think of the Bretton Woods process in the same way. A bunch of economists had worked together and came to a common view; Bretton Woods, lots of discussion about what the values should be; we end up with the IMF and the World Bank.

So if you think about that in terms of this four-step process, that, I think, is really interesting as a framework, because that framework has certainly been in place in the technical aspects of the Internet. So from the early 1960s, really the mid-1960s, onwards, there has been a range of technical people involved setting up a set of values they think are important. Those values have tended to become norms. They're actually the way in which the Internet works. They're reflected in things like the RFCs. They're reflected in things like the IP addressing system. And then they get their own institutional expression, things like ICANN -- it's a corporation, actually. It's the --

BAKER: Yeah, it's --

TWOMEY: And it's really only because Jon wanted to say "I can" when it was formed.

Now, if you come from the technical community, these things really matter. If you don't, you've probably never heard of them. But they are an existing set of institutional frameworks behind which there is a whole regime. And I think that's important, because again, in the discussion of Internet instability, one wants to be careful about, you know, where regimes are feasible and what regimes are going to take place.

So when I use that framework, then I ask the question about the things that we read about in the papers -- the attacks, the espionage, et cetera. And I say, where do I think there might be opportunities for governance and further governance -- regime governance -- for the Internet? And the areas where I think we're likely to see it are -- do I think we can in the area of inadvertent loss? I don't think it's really an area for a regime. Basically, you can't do very much; it's up to individuals and companies to not lose things. When you think about cybercrime, is it an opportunity for a regime? I think yes. There you can see there is a common epistemic community. They've got a set of values, and they would think about institutionalizing it.

Having said that, I think there's a potential for agreement on laws, except I think it'll be culturally or regionally driven. We can see -- we have a European convention. The Commonwealth Heads of Government Meeting in Perth later in October is actually going to look at a proposal for the Commonwealth countries to move on a set of laws. They share a common common-law background, so you can see that would happen. But I think we won't end up with a global convention on crime, because of legal differences. Some may disagree with me here, but I think we'll end up with some regional processes. And I think you can see that coming in institutions.

I think enforcement becomes patchy, partly because you get the problem of people signing up to things and not enforcing them, but secondly because of where the power is. And if I can share just an instance: I've heard many complaints in this town about people in other countries -- some people in Russia, for instance -- why don't they simply prosecute the people we know we've found who do bad things? You know, we think there are people, and we've got a list we've shared with the Russians or with anybody else. Why don't they do stuff?

If you look at the problem from outside the United States, the common complaint is, why can't we get the American institutions to give us the information we need so we can actually prosecute the bad guys? And that's not just official interaction with the American law enforcement community. If you want to actually get your own subpoena -- actions so that you can check the Gmail account of some criminal -- if you're an Australian or a Brit, and they're, you know, with the Canadians and others, the close allies, that can take six months, if you're lucky. If you're a poor enforcer in St. Petersburg doing your job, it can be almost impossible.

So when we talk about international cooperation and when we talk about the issues of regime, I think one of the things that's particularly important in the United States' experience is that so much of the private sector infrastructure that actually is important for legal enforcement sits here or is distributed and is not easy for the rest of the world to deal with. I'll put that on the table. I think regime change in economic espionage -- I think it's possible for countries themselves that might say their own people will stop doing things. I don't think it's likely in proxies or private sector, frankly. I think that's going to continue to happen. I think any regime around political activism is very unlikely in a "your (opponent's/opponents ?) my friend" problem.

I think cyberwarfare -- it's certainly possible. But in that situation, I suspect we're in an environment where it's going to be more around the norms than actually getting to institutions, and I think in those discussions around cyberwarfare in particular, the difficulties we're going to have is that there's asymmetry in what people want, to use an example we've discussed.

I think the 1930s debate about air power was an interesting one. The Germans and the British in particular drew different lessons from the summer of 1918. The Germans looked at the Monash doctrine for the Battle of Hamel and the end part of that assault through Amiens and drew lessons about close air support for armored attack, which led to blitzkrieg. The British looked at the Gotha bombers and drew a completely different lesson about planes being able to fly across the channel and talked about strategic warfare, strategic bombing. So by the time you get to 1939, 1940, the force structures are completely different depending upon what your tactical need was -- one was an island and one was a land power -- and secondly, what were the lessons you learned about the technology.

One could make the argument that I think the United States, Russia, China and other great powers have different perspectives on this driven by their different tactical views about where they are and what they're -- and what they can -- lessons they can learn from the -- from the capacity. And so I think the -- working together on norms is going to be slow. I doubt whether we're going to get to a common view about any sort of regime or treaty or anything like that because at the moment I think the tactical views of the benefits and how to use this new capacity differ according to people's circumstances.

So I think that was -- that was -- that was my only point. That was sort of opening points and (international ?) comments.

BAKER: I --

TWOMEY: Can I just --

BAKER: Yeah.

TWOMEY: -- just before we finish, one point I wanted to make, which we have not discussed so far in this conference. I've talked there most about the state. Most of the infrastructure we've talked about is in the private sector hands. But I think this is a very interesting observation. If you think about those three parts of the Internet -- application layer, protocol layer, transit layer -- even in places with a fully run national transit layer -- some parts of North Africa or, you know, parts of East Asia or my home country is in the business -- building one at the moment as well -- they don't control the application layer, and they don't control the protocol layer. Companies, running their own networks, do. And I actually think that one of the greatest opportunities we actually have for governance around cybersecurity in the modern world is actually at the corporate level rather than necessarily at the state level.

Corporates face the same problems. They are -- like countries, they've got -- they use the technology for different purposes, they've got all sorts of internal structures, they have all sorts of -- you know, diverse or not diverse, integrated or not integrated -- all sorts of discussions. But I think if pressure was -- you know, attention was paid to this area, one of the areas we potentially could move faster on in terms of some sort of binding governance regime that could have a real impact is actually at the corporate level, where they do control the three layers.

BAKER: This is great. Then I think what I'd like to do is sort of dig down first into some of the cyberwar norm ideas, then to cybercrime, and if we get to it, to what I think is a really intriguing idea of using corporate networks as models for building a governance structure.

So the most under-reported air war issue -- historical issue that I think is directly relevant to most of what we discussed this morning is what happened on September 1st, 1939, in Washington. Up to that point, people believed that bombing from the air was shocking, uncivilized and devastating and would happen in a war if we didn't take extraordinary measures to outlaw it, very similar to what we think about cyberwar today.

FDR got up, got the news that the war had begun in Poland, and he sent a cable to Hitler, Mussolini, the heads of state in Poland, Britain and France, and said: Surely we all agree that bombing cities is a violation of international law, should be abjured like poison gas; won't you join me in promising that the war that is -- has just broken out will be fought without bombing, from the air, of cities? And remarkably, the Nazis signed on, the Fascists signed on, the Poles, the Brits, the French, all signed on, and for months, everybody lived up to that rule. There's a bunch of Luftwaffe orders saying, you know, you bomb a civilian target, you will answer to me and the Fuhrer, and you won't like it.

It is far more of a norm than we ever will have in cyberwar, is my guess. And of course, as everybody in this room knows, it ultimately failed miserably as a method of preventing Europe's cities, and Japan's, for that matter, from being devastated in things -- in attacks that everybody who was watching those attacks had agreed four years earlier were violations, and profound violations, of the law of war.

So I guess that brings me to my question: How can we possibly expect to come up with cyberwar norms that will actually matter when cyberwar breaks out as opposed to when we're talking about it in a nice, comfortable lunch?

TWOMEY: Yeah. (Laughter.) I mean, I think -- I think these things get pursued for self-interest. In actual -- and let's be clear about this. I mean, I think it's important to think about cyber in warfare, because the idea of a stand-alone cyber war --

BAKER: Yeah, it's unlikely.

TWOMEY: -- I find -- it's science-fictionish. That's not to say that long-standing economic espionage doesn't start having aspects of hostile state behavior, but it's not the same thing, necessarily, as formally defined as -- (inaudible) -- as war. Poison gas in the Second World War wasn't used because both sides decided it just was -- they didn't need to, and if they did, it was going to get out of control.

BAKER: And they were moving too fast.

TWOMEY: And anthrax was a bit the same. Right. I think the value of norms in that part of it, frankly, will be an understanding of language and concepts that people agree to, that in their own heads they say we're not going to pass that threshold, we're not going to use that building block.

BAKER: Right.

TWOMEY: And if both parties know or the various parties know what that building block means, what the definition of it means, and you want to have a conversation in Switzerland somewhere which basically says if you don't, we won't, I think that's probably -- and perhaps I'm being too pessimistic, but I do think one of the values of discussion around norms in a practical sense, if you look at the history of conflict, tends to be more about can we quickly agree what we mean together and what are we deciding we're not going to do.

BAKER: Right. And the difficulty -- I mean, what brought this down was the fact that it's actually very hard to know whether you're bombing a civilian target when you've got fighter planes diving at you when you're flying at 20 feet above the ground. And even if you were trying to abide by the rules, it may not look that way to the guys you're bombing, which also, unfortunately, is how cyberweapons --

TWOMEY: Yeah, well, in some respects, maybe there's an advantage that we actually have the tensions we have now.

BAKER: Yeah.

TWOMEY: The difficulty with gas was that in 19 -- was it '15 -- the Germans first used it, the other side have to do a quick catch-up, the French got it first -- right? -- and you don't have a chance to build any doctrine or share any norms with anybody, because you're in the middle of a war and you've got to start using it.

One of the advantages now is a lot of the tools are being used and we actually have a better sense what they could do. I think we're in a space now where we can have a dialogue, at least, about, again, what are the components, what can the norms look like. We understand better -- that would help if there's actually a time of conflict. But am I persuaded that they will therefore result in people not using them "in extremis"? No.

BAKER: So let's try something a little easier, which is cybercrime. And you, I think, accurately identified the frustrations that a lot of U.S. allies have about the way our MLAT process works. The U.S. view on that is, one, it's the law, we're stuck with it, and two, there are a lot of things that are crimes and subject to criminal investigation in foreign countries that we wouldn't necessarily agree to as crimes or to investigate, and we want to have an ability to throttle that, or at least not cooperate in violations of what we consider human rights. And that's what the MLAT process is designed to protect, that plus the -- you know, the brutal reciprocity of law enforcement relationships, which everyone shares.

You could make that faster, probably, but you're not likely to eliminate it. And if the shoe were on the other foot, I am quite confident that there are 27, or at least 25, European governments that would say, well, wait a minute, before we give information to the United States, we want to make sure that it isn't going to be used in a death penalty case, it isn't going to assist the USA Patriot Act, that no Republicans will benefit, whatever the local hobgoblin is.

How do you deal with that problem? Because that is at the core of getting better cooperation in identifying cybercriminals.

TWOMEY: Yeah, I -- I would say this, wouldn't I, because I've spent the last 20 years of my life involved in the formation of multi-stakeholder organizations, but I do think they have some value in the 21st century in -- as mechanisms to bring various (interests and/interested ?) parties together.

And one of the things I've been thinking about this criminal area is, even just for information sharing at least, perhaps that's something we should look more at. You know, if you made it less state to state, if you're able to bring some of the key private sector players into that dialogue, even if they then decide to follow best practices without having to be told there was a law that they were having to enforce, if you could bring the technical people in to -- people to understand the downsides and upsides of certain questions that got asked, I wonder whether we could improve it. It's never going to solve that problem, but I do think things like -- I mean -- and then, to a degree, you potentially could do that in such a way that at least there was a common set of values in such an entity that people had to sign up to, which I think comes back to this whole regime formation, the important role of values and norms that the community agrees to.

I wonder whether or not we should be thinking more about that, because in the previous role I dealt with, you know, literally every country-code operator in the world, with many other governments and police forces. And I found the objectives of many of those people to be very similar. And so, you know, I think there is -- that's another thing about the Internet world, is that you can actually divide the world geographically, but if you can actually divide it according to certain interests and positions, people have a lot of similarity, and I think that might be a mechanism we should think more about.

BAKER: So one of the things -- then one of the biggest problems in all of these areas, one of the -- you know, the point about cyberwar and arms that we haven't discussed is the absolute inability to attribute this, which makes any agreement on norms unenforceable. But a lot of enforcement and a lot of attribution actually could be furthered by the people who control the transit layers. They actually have customers; they know who those customers are. In the U.S., ISPs have a lot of restrictions on providing that information in the absence of a court order and the MLAT process, et cetera. A lot of that ties, it seems to me, to this notion of state responsibility, that is to say, the first state responsibility one would imagine is to identify the person who is engaged in an attack abroad, whatever that is. And so one of the questions I have is, to what extent can you drive attribution down at least to, you know, who's using this IP address to make it more automatic and to facilitate states taking more responsibility for at least identifying the people in their territory who are engaged in activities abroad that they shouldn't be?

TWOMEY: Well, how about we break that into two parts? I mean, I think -- I think the concept of international -- or of holding -- of holding you accountable, to some degree, to the actions of your own citizens helps in that sort of -- (inaudible) -- you want to prove attribution if you're being accused that your citizens are doing something.

But I also think one of the aspects of public policy that's inevitable is that public policy gets driven by the extreme case. I mean, I used to be -- play a role in Social Security policy in Australia, and it's amazing how much money got spent because of the extreme case. And what tends not to happen at a business level, because of the -- for its efficiency, is that people tend then to say, well, what's -- I'm not going to say lowest common denominator, but what's the low-hanging fruit and what can we do quickly? If you were to have -- and I was just -- and I'm really thinking aloud here -- but if you were to have this sort of a multistakeholder approach to some of the aspects of crime that we can all agree on --

BAKER: Right.

TWOMEY: -- right? -- we've often agreed on what is -- we can all agree on what's theft of intellectual property, we can all agree on those things, then I think you'd probably find people would move on ways of attributing, ways of blocking and ways of (stopping it ?), because we can all agree on that and to agree that -- let's just get that stuff off the table.

The other part which is difficult about attribution, the other stuff about, you know, state-sponsored espionage and -- well, OK. But let's -- when we talk about cyberinstability, there's a whole bunch of stuff we could do, I think, in a new way, and then there's other stuff here that, you know, you get more narrow on, and that's when you get more clearly the role of the state, then I think you can get more on the legal -- traditional legal aspects of, you know -- (always finding ?) concepts that you, Ms. Queen, are responsible for what happens with your subjects, therefore, you know, you should be accountable, tell us what's happening.

BAKER: So let me -- I -- we're going to turn and start letting the audience ask questions in just a second. There was one question I wanted to get to, which is this notion that individual companies, when they run their own network, control all the layers. And we're all on networks of that sort. We don't expect anonymity on those networks; we expect that there are logs of our activities and, unless you're the State Department or the Defense Department, unusual behavior on those activities ought to draw attention. A -- and so you can imagine spreading that kind of security regime beyond the corporation for -- by establishing relationships between corporate networks, including government networks, in which all of the companies that are part of those networks agree on some basic rules about logging and tracking and identifying certain kinds of behavior on their networks, even if it comes from a partner network.

Is that a -- that's -- in some ways that's what I think General Hayden has been talking about when he talks about dot-secure as an approach. What are the opportunities there and what are the risks in trying that approach to security?

TWOMEY: Well, I think there's a strict common law proposition around liability -- for negligence, that I think directors throughout the world, and particularly for larger corporates and others just have now. And I think many of them don't fully grasp that yet. I think that sort of understanding -- I think the understanding of risk managers -- not just of my own risk but my supply chain, right? -- you just married those two things together, and they're both very powerful forces to get people to start looking at what happens on their networks and what happens to their ecosystem. They're in a cyberecosystem; they're dealing with suppliers and customers. What sort of things are going on in supply and customer networks (or how are ?) we interacting? You get customers doing that; watch the ISPs follow.

BAKER: Yes.

TWOMEY: And I -- you know, the market is an incredibly -- you know, many of the values in the way the Internet has grown are the same values that have driven, you know, liberal -- indeed, libertarian -- views about the market and social and economic interaction. And I think if you, instead of trying to push against those sets of values, you actually take them into hand and find other ways of reinforcing them, it's an incredibly powerful (legal ?) lever.

So, I mean, I am aware of some companies where that process is starting to take place now. And it's incredible, you know, you -- when they actually start looking at their networks and they're really asking these questions of the C-suite. Then they go: How did we get here? And what's this, and why and how? And, you know, get this fixed and -- you know, there's a whole set of discussions that I think we should -- that's what we should be generating. Rather than necessarily, I think, telling people, do this with your suppliers, or telling ISPs to use linkages, I just think --

BAKER: Creating a --

TWOMEY: -- (inspire and make fear ?) at the C-suite and the board, and you'll make a big difference.

BAKER: So, you know, what that -- that suggests that, yeah, there's a -- that's a doable task for government: scare business -- (laughter) -- and maybe create an opportunity -- create a legal framework in which that kind of cooperation can exist and prosper.

TWOMEY: Oh, I think -- I think a lot of the legal framework is there. It -- you know, it just comes down to just the simple rules existing now around liability. And I've had -- I know in some countries they simply ask the question of the directors: What's your present structure for financial auditing and financial controls? And then they say: You got the same thing for your (network ?) information? And the answer, of course, is no. And they say: Why not?

So I think -- you know, and there's going to be some class-action lawyer in Alabama at some stage who's eventually going to say: I can go after somebody for this --

BAKER: East Texas. (Laughs, laughter.)

TWOMEY: So, I think -- you know, I just think that there's not necessarily a role -- I don't see a role necessarily in places like, you know, Washington, London, Canberra, Ottawa or wherever, to say, you know, we need to pass laws to force companies to do that. We just need to say that the law is there, at least in the common law countries, and accountabilities are there and people should be aware of them. There may be then enablement.

BAKER: Yeah.

TWOMEY: Because there are certainly problems about: How do I share data and how do I -- there's, you know, antitrust laws and lots of things emerge in that. That I think is a -- that if we're going to have the discussion, use what we have now and push the values in one direction, and then narrow down where the enablement problems are.

BAKER: Yeah. So let's let the rest of the audience join in. Raise your hand. Wait for the mic, please. When you get the mic, state your name and your affiliation, and then your question.

Do we have some questions? Yes.

QUESTIONER: Hi. Jason Healey, from the Atlantic Council. Paul, one of the big things that's happened this year when it -- when it comes to Internet and cyberspace, has been the Arab Spring. And we saw on all three levels of the Internet that you talked about how that got blocked by states. I'd be very interested in your perspective, particularly your ICANN perspective, in how that played out and what you think that means for the future of Internet and governance.

TWOMEY: I think what was interesting -- certainly my understanding of some of the stuff that happened in Egypt, for instance, is that while there were attempts to -- and I'll be quite clear: it was my understanding it was not done by the communications ministry; it came from the security parts of the Egyptian regime. There were attempts to close the -- close the transit layer, and even then potentially play with DNS routing, et cetera -- that the -- that the second layer had so much stuff going out of the country, they didn't understand it. I mean, they still saw it in terms of the telephone system, right?

And one of our -- one of our colleagues this morning was talking about how, you know, in the private sector now you're running these big, information-based online businesses. You have no idea where the data -- where each bit of the data comes from any one stage. These are completely globalized structures, because Internet protocol does not understand geography; it doesn't have geography in the protocol.

And so I think one of the surprises, at least my understanding of what happened certainly when the pressures (were brought up ?) in Egypt, was so much other stuff stopped -- you know, real financial stuff that really mattered stopped -- because people didn't realize how much -- how much the traffic was actually going outside the geographic boundaries.

So again, I think, you know, some people -- some parts of the world have their -- have their networks configured and controlled in different ways. And even those parts that people point the finger at and say that these are really controlled, my experience is they're not as controlled as people think they are.

BAKER: Yes.

QUESTIONER: Jody Westby, Global Cyber Risk. Paul, some members of the cybersecurity research community have been studying for a long time recursive domain name service providers and the role they're playing with botnets, and actually have concluded that if ICANN were more active in shutting down some of these recursive providers, that it would significantly curb cybercrime. So if I'm not putting you too much on the spot from your previous role, could you comment on that, please?

TWOMEY: Yeah, well, I think it's a -- it's a good question. Sometimes I think the community doesn't fully understand what the contracts between ICANN and the registrars say and the ability to actually -- you know, what powers the contracts have. I think there is more -- there has been more discussion and understanding about that as of recent times. But again, the need to -- the need to keep refining the contracts I think is the -- is the key point around the registrars.

Understand that ICANN does not have -- if you're a DNS service provider who is linking up through an ICANN registrar to have a domain name, you know, we can only -- ICANN can only enforce against the registrar, or kind of look through the registrar to the -- to the end user. The -- I think there's much more work been done on that recently. But I think you put your finger on an aspect about this that I think -- again, turn the negative into a positive. Those mechanisms -- contract law, I think, is an incredibly powerful instrument for us to use in this process. And what we need then is engagement of people about the constant reforming of contract law.

Now, the ICANN model has an interesting aspect in it, in that it talks about this idea of accepted policy, or the policy process that's developed. It actually has a clause in every contract where it says: Here's the contract, except for this clause which says this may vary according to what becomes the policy.

All right, now, it's an -- it's a -- it's a piece of legal genius. Some people have argued it makes the contract void for uncertainty. It's a piece of legal genius, and it works that way, but it's too -- sometimes it's too slow. So I think we've got a -- we've got a challenge about how do we do that.

BAKER: Right. Let me -- let me ask you about botnets. It strikes me -- they're surely not the worst cybercrime problem we face, but they also seem like a much more eminently fixable problem. There's a variety of ICANN approaches, and then there's simply identifying the machines, which are eminently identifiable, and going to the people who own them and either getting them fixed or getting them taken offline because they've been compromised.

Why do you think even that problem hasn't been solved, even in the United States?

TWOMEY: I don't know. Well, I mean, you talk about even in the United States -- there is an Australian example at the moment of a code of conduct amongst the ISPs. They've all agreed to actually inform their customers that they think their machines are infected. And there has been talk about actually moving then to put them -- putting them into some form of -- some form of isolation. And that's an industry code. It came from the ISPs responding to government pressure on something, and they came up with this. I mean, I think -- I think there are -- there's, you know, at least in terms of that, there should be more along those lines.

In terms of the botnets, I would just be a little concerned about a sense that they're not going to keep evolving, and particularly the command and control mechanisms are not going to keep evolving in ways that I think makes it very difficult. I mean, some of us who've been involved in responding to the Conficker botnet -- part of what we were always concerned about was that all you're doing was driving the bad guys into other parts of the network where we'd have less visibility of what they were doing. So.

BAKER: Yes.

QUESTIONER: My name is Gene Maranz (sp). Question: How does WikiLeaks fit into your cybergovernance framework?

TWOMEY: I think that's an interesting question. I mean, how does it fit into the American Constitution? You've got an issue of a freedom-of-expression argument, right? I mean I think the WikiLeaks thing, first and foremost one needs to separate out where did that -- I assume when you say WikiLeaks, what you're really talking about is the U.S. cables, and not -- you're talking about WikiLeaks in general. Because if you actually look at WikiLeaks's history, I mean in its first part of what it did -- some of the stuff that got released, I think there was actually official U.S. government near-positive statements made about that.

If you're talking about the most recent stuff with the cables -- not the most recent, but the stuff about the cables, I think you need to differentiate quite clearly between the criminal action that may have taken place and where the data came from, the leaking thing, separating it out from the actual sites themselves. When you get to the sites themselves and actually putting that public data up, that starts getting very interesting. And I think this is where -- at the application layer -- I think we're inconsistent.

If that wasn't diplomatic cables but was some record company's intellectual property, the French position is pretty strong -- right? -- that (they're going to intervene ?). So if you change the content of that stuff on the application and say, OK, now it's intellectual property instead of the music industry, well, all other governments in the West are really strong on that now, and the French in particular and others have sort of intervened on that -- if you suddenly said that data is suddenly political thought, then they change around.

So I think the governance issues around that sort of stuff at the application layer are where I think we need to parse it -- what the content is and then apply the law of the land as it presently applies. But if you're going to say, I want to stop there by going down into the protocol layers, whatever, and saying I want to block these IP addresses or I want to block this -- you know, then I think you're in a very different space, really.

BAKER: I like that as a solution. The State Department should say, send no cables without including an MP3 of Ke$ha's latest song. (Laughter.)

Yes.

QUESTIONER: Hi. I'm Rebecca MacKinnon with the New America Foundation. I was intrigued by your suggestion about applying a multi-stakeholder model, potentially, to a cybercrime regime and possibly some other governance regimes. And you talked about governments working more with the private sector and companies. I'm wondering what about civil society and what is the role for civil society, and if we were -- would that help to kind of solve the problem, I think, of growing mistrust amongst, I think, a number of constituencies around the world that, you know, governments and companies are sort of getting together and surveilling; and that there is concern that in the pursuit of greater security, of course, dissent and whistleblowing will become much less possible?

So how do we make sure that there's enough public trust and buy-in into these regimes so that we don't see an increasing amount of sort of, you know, guerrilla warfare a la "Anonymous," which is not productive, and people setting up darknets and just sort of, you know, creating something completely new because they have so little trust that those who hold power and wield power in cyberspace are going to protect their rights and interests? And how do we make sure that there's enough public buy-in and sort of social contract and consent that this whole thing will ultimately work? Because if you don't have the public along with you and civil society along with you, I think ultimately the problems are just going to get worse, and "Anonymous" is just the beginning.

TWOMEY: Well, I agree. And please take note that this is a thought-in-process, literally, up here in a seat, right? So I'm thinking aloud on this idea.

But first and foremost, as I have lots of scars to show, if you think about these new institutions or these new regimes, and particularly if you want a multi-stakeholder one, a key to success is a very narrow statement of objective function. So anything that even looks like it's getting more broad, you'll have all sorts of problems.

So let's take the cybercrime one, just to have a thought idea. Something that was quite focused on, theft and fraud, or, you know, something -- that everybody could get close to having a sense that we've all got a common set of objectives, and that we'll have lots of disputes. These things produce all sorts of controversies, and I think we all know that. But if you were simply to say it's crime, I think you'd have a problem. If you said, no, it's some sets of crimes that I think we can all agree on -- it's theft, it's stealing, it's intellectual property theft, maybe -- (inaudible) -- but, you know, just if you narrowed it down, you might have a chance. And in that circumstance, of course, you'd have to have the civil society, but for the reasons you pointed out before, if you simply said it was crime, well, even in the so-called West we can't all agree what they are, let alone when we get broader.

So I think the civil society, of course, has to be involved in that, but again, narrowing these things down.

BAKER: In the back.

QUESTIONER: Dan Keel (sp), National Defense University. A quick comment and then a question. The comment: As an air power and modern military historian, I think the inhabitants of Warsaw would take exception to your observation that everybody followed the rules right from the start and --

MR. : (They didn't talk to ?) the German lawyers.

QUESTIONER: Didn't stop the Luftwaffe from fairly leveling Warsaw.

But my question is -- I'm in violent agreement with your observation about the unlikeliness of stand-alone cyberwar. In fact, some of my colleagues -- (inaudible) -- have published on this, that in my opinion, our opinion, that cyberwar will be an intrinsic part of real war with a kinetic piece of it, et cetera, et cetera.

But here's the question I want to drive at. One of the things that we've argued is that the cyber-dimension of this, because of the extreme attractiveness for the first strike, because of the potential for destabilization, for the difficulty of escalation control once you start this, it could be a far more dangerous path to go down than many of the cyber-zealots -- and I'm one of those -- would consider it to be.

Your observations.

TWOMEY: Yes. I agree. I mean, there is public reporting that says -- and I can't verify one way or the other -- that President Bush vetoed a proposal for a cyber-attack on Iraqi banks, and the public reporting says because he was not certain what the impact would be upon European and other banking systems. If any of that was true, I think it was prescient in thinking through the possible unintended consequence problem. So I think the point you're making is absolutely valid.

And to come back to my point before about norms and the role of norms, I think one of the -- you know, we tend to be pessimistic at the moment because of this very large-scale problem of, you know, pretty large-scale cyberespionage and what-have-you that's going on. We're very -- we're very pessimistic about the age we're in. Potentially, we should think of it this other way around. We should say we're actually being given a blessing, in that we are actually now seeing and we're -- we're watching the capabilities being developed in real time in ways that in the '20s and '30s, for instance, people didn't really get to see. And so for the reasons you're saying and the work you're doing, development of that sort of norms, developing of what -- understanding what could happen and getting a common language around that is probably pretty valuable.

BAKER: So Paul, I sort of agree there's a -- there's an apparent advantage to first strike. And the -- and the difficulty of controlling it means you might as well just go all-in. So what technical and other barriers to a successful first strike should we be thinking about building into our global infrastructure?

TWOMEY: Well, it's an interesting question. I mean, I think for an economy like the United States, it is a question of actually hardening the economy, as opposed necessarily to just sort of hardening the defense establishment. And I think when you come to the issue of hardening the economy, we come back to our conversation before, which is this is where the responsibility, I think, sits much more back in the boardroom than it does necessarily in the halls of the Congress. And maybe that should be more, again -- you know, I know this -- and DHS and others obviously have got quite big programs on critical infrastructure and identification and relationships with critical infrastructure. And I think that's important, but I think in doing that, that has to be driven at the corporate levels from the very top.

And I think that you can get that -- I think that, you know, in the next stage -- if you -- if you said -- if you said in the next six, 12, 18 months, what stage -- you know, 24 months -- what's that real -- what's the real challenge, I would say it's transferring the understanding from the CISO and the CIO, what-have-you, up to -- chief risk officers -- up to CFOs and CEOs and board members. I mean, I think that's one of the key aspects. And I'm -- it's happening, but I think it's one of the key things to look at.

BAKER: Yeah, I -- but I've spent time in those boardrooms. And when you say, "And you need to be prepared to deal with a Russian or a Chinese sophisticated attack," they just shut down. They don't believe they can do it.

TWOMEY: Yeah, I think some are beginning to understand -- some -- I think some are beginning to understand the true economic risk that is at -- those sort of things have for them. And for some companies it's not high, and for some companies it's quite high, and you'd be surprised what sectors. And, you know, one of the obvious ones at the moment is commodities. And so I think people are beginning to think about that they really do have to respond to that.

BAKER: Right.

Yes.

QUESTIONER: Anna Brady-Estevez, the AES Corporation. Thank you. And just to tie into that last question -- I think you start to touch upon it with the commodities -- how do we prepare ourselves best? Or what have you seen as the best practices against threats on not just the virtual infrastructure, but also the physical infrastructure: power -- you know, we've seen what happens when that goes down; transportation, whether it's trains, bridges; air traffic; all that? Have you seen -- do you believe those initiatives are moving forward? And what are the most effective ways to protect?

TWOMEY: There's a -- there's a lot of work -- there's a lot of work done on that, and there's -- you know, there are some quite fundamental differences between SCADA systems and other networks -- not least the question about logs. You know, you can do forensics in other networks; you could (hardly do ?) forensics in some SCADA networks, and -- but there's a lot of attention been paid to that. I mean, there's whole parts of the -- most governments have now -- have focused on -- a lot on that.

I think what gets interesting when it comes to -- at the moment, the present environment we're in, if you say who has an incentive to go after major SCADA systems, there is some scenario in a general warfare scenario. There might be a scenario vis-a-vis someone who wishes to extort or to -- you know, "hacktivism" or some sort of terrorist political thing -- respond. However, SCADA systems are so relied upon and so important for the real economy, a lot of the players who I think have an incentive to go after industrial espionage don't necessarily have an incentive to want to actually take out production. They might want to understand it, but at the moment I don't think they've got the incentive. I may actually want to understand your intellectual property, but I want to still keep trading with you and doing business with you; therefore I don't necessarily want to take out your production capacity.

And so I think we tend to get -- it's been interesting, there's been a lot of focus, you know, in the initial public reactions and writing about this, on the fear that the electricity plant will go down, or some mine will go down, or some bank will go down. My sense is that's probably not the great fear at the moment. The real fear is: I'll get to understand what you are thinking about, your strategic information, much more; that's what I'm focusing much more on, your strategic information, rather than taking out your systems.

That's not to say, though -- and Stuxnet obviously is an illustration. If I've got a particular strategic objective to act now to stop your production, then I will look at ways to do that.

BAKER: We should wrap up, but I do want to pursue that just a second. If we were confronting Saddam Hussein today, surely, he would have an incentive to take out power systems and other SCADA systems in the runup to war, and to persuade the American people that it really wasn't worthwhile.

TWOMEY: So I think -- so I think, having given you the analysis I've just given you, one of the difficulties with that (analysis ?) -- (audio break) -- (if ?) I was sitting as the CEO of any major corporate -- (audio break) -- where this was relevant -- you know, power, transport, banking, commodity, you know, other logistics, anything -- the difficulty with the analysis I've just given you, which I think is an accurate analysis at the moment, is that I will say, OK, where do I allocate scarce resources for this quarterly reporting sort of thing? I'm going to allocate them to where I think the real risk is. I don't think necessarily I'm worried about the SCADA systems at the moment. I might be; I may not be.

The challenge that you've just pointed out -- and this is potentially, again, where there's a role -- very much a role for government -- I think is going to be saying and engaging -- saying: OK, we don't think there's a risk this month or next month, but this is a vulnerability that could get turned on at some stage in the future with the political pressure, if political things go wrong. And I think that's part of this.

BAKER: Right.

TWOMEY: I'm not -- I haven't enough insights into how much -- how well that dialogue is going, or how much they've prepared or -- not so much the dialogue as: How much -- how much have I got the incentives in place (so ?) the people who need to spend the money are spending the money.

BAKER: Yeah. So as a reminder, this has been on the record.

We're going to have to stop now, but this has been a terrific presentation, Paul.

