Wednesday, December 9, 2015

So one thing you might know about spooks is that they can "Talk Around" almost any subject. Essentially, by using a complex, dynamically generated shared key, they can sit at a table in a crowded restaurant and converse about secret things in plaintext.

Commerce Dept officials have made an art of doing something similar but without the shared key.

Kevin Wolf (Assistant Secretary of Commerce) started off the meeting with a clear indicator that "intrusion software" was not going to get regulated any time soon. Here's how he did it: "As you know, we're coming up to a limit because of the election as to which regulations we can implement. I think we probably have three more slots left. Obviously the Rocket Engines one is almost done, and after that I think maybe we'll work on Night Vision Sensors and Lasers, always important. Lots of good work to do in that area before we're finished. Eventually I think Vehicles, Ships, and Armor." (Count them - that's three, and "intrusion software" is not on that list.)

He also related a story about how in early 2014 they became anxious about the proposed scope of the cyber regulations, which is why it came out as a "proposed rule" and not a finalized rule, something they've not done before. He expected a response, but not the "Rather Aggressive Response about the negative unintended consequences" he got. And of course, he mentioned that while various reports (leaked from State) have pointed towards some sort of resolution of this process by tweaking the implementation, the next step won't be a final rule. In fact, he hinted at an opening for no rule at all by saying "I'm not sure what the next step is, because we're still talking with the US Gov Agencies, going through comments, getting industry input. When there is some sort of consensus, then we'll know what the next step is, but as the person who signs the rule, I can assure you of one position: the next step won't be a final rule."

Then he left. Randy Wheeler (of Commerce) said she was glad so many people from industry showed up to talk about cyber regulations, and that they realized it was because of continued "high interest" in the proposed regulation. The next hour was devoted to the proposed Wassenaar "Intrusion Software" rule.

"NAM is National Association of Manufacturers"

Dr. Sergey Bratus did an excellent job of looking at how there is NO WAY TO DEFINE THE STANDARD EXECUTION PATH OF A PROGRAM. This is key to the language of Wassenaar. It points to a need to go back, renegotiate, and remove the whole thing. He was very clear and understandable even to a non-technical audience. The one telling question he got was "Is this something we can work around in our implementation?" (which I assume came from a State Dept representative). And also, of course, "What about the other clauses that relate to avoiding monitoring, exfilling data - do those help?" ("No," said Sergey)
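Sergey's point can be sketched in a few lines of Python (a toy example of my own, not from his talk): even a trivial function has no single "standard" execution path, because the path taken is chosen entirely by the input. The Wassenaar language assumes a canonical path exists that "intrusion software" deviates from - but there's nothing canonical to deviate from.

```python
# A minimal sketch of why "the standard execution path of a program" is
# ill-defined. The function and inputs are invented for illustration.

def handle(packet: bytes) -> str:
    # Three distinct execution paths through four lines of code.
    if not packet:
        return "drop"       # path A: empty input
    if packet[0] == 0xFF:
        return "control"    # path B: first byte flags a control message
    return "data"           # path C: everything else

# Each input selects a different path; none of them is "the" standard one.
for p in [b"", b"\xff\x01", b"\x00hello"]:
    print(p, "->", handle(p))
```

Real programs have astronomically many such paths (branches multiply them), which is why "modifying the standard execution path" fails as a regulatory definition.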

Afterwards the National Association of Manufacturers pointed out a couple of key facts:

- Every single one of their members, no matter how small, is international
- And hires security researchers to find 0day in their equipment
- And thinks this issue is important enough to show up for, and is not ... in favor of crazy regulations

Just having them there at the table was a sign that this process of getting industry input could go on forever. FS-ISAC spoke briefly over the phone at the last meeting, and there are many more ISACs left to go!

Then Tom Millar (DHS) and Allen Friedman (NTIA, another branch of Commerce) had their say. They're not allowed to say anything about the regulation in the open. Instead they said "We feel like there may be some <pregnant pause> detrimental impact on the sharing of information in this space with the proposed regulation."

Afterwards FireEye and IONIC presented some information about how informing their customers about intrusions would be hamstrung.

"Dear State: Your idea is bad and you should feel bad."

Then it was question and answer time, and DHS and NTIA pointed out that not only would the costs of getting a license be passed directly to them and their programs, but also that there would be a "chilling effect" on beneficial information sharing, and that the President has made "information sharing" a clear priority.

One more question that keeps coming up (from State) is "Why are we having all these problems when other countries in the Agreement have implemented the rule and don't seem to be having any issues?" It's a major sticking point for them.

The answer is four-fold:

1. There are no like-to-like comparisons between other countries' industrial bases and the US industrial base.

2. Other countries don't enforce export control in the extremely rigorous way the US does. They have a "by default we assume you are good unless you are clearly trying to be bad" policy. The US investigates any possible violation as a super-felony with massive liability.

3. When export control becomes inconvenient, other countries just issue blanket exceptions to local companies (e.g., Hacking Team).

4. We have a large level of interest from "security researchers", and our industry is not just protecting its own interests but looking out for broader principles of freedom.

That last point is the most important, and speaks to the long history of non-lawyers who are heavily involved in the resistance effort. In the US, researchers have been absorbed into Government and Industry and are in positions to make this kind of regulation difficult - but of course, more hard work needs to be done to finally kill it forever.

Keep in mind, right now State's argument is not about how beneficial the rule is. It's about how much of a pain in the ass it would be for them to go back and renegotiate.

Thursday, December 3, 2015

So the NTIA meeting was livestreamed yesterday (and livetweeted) and also you could physically attend! I had good luck with a combination of the conference-call and video system they had, but I know other people did not.

But some of you in our community don't want to sit through the entire day of action packed livestream, or read my twitter feed. So I'm going to provide some perspective below.

First of all, there were a few "moments" that caught my eye during the day.

Wendy Nather (who works in the retail industry space) started the day off by stating that her group was interested in a way to tell extortionists, of whom they see a lot, apart from normal vulnerability researchers. I'm not sure this is as hard a problem as it sounds, since extortion is already illegal. I also don't know the scope or scale of this problem in the real world. In the financial industry, Immunity has in the past gotten requests to validate a potential issue reported by an outsider, or to find it if not enough information was given. In some cases the original reporter was looking for a gig of some type (i.e., leading to money), but to be honest, these cases are not "extortion", and a "bug bounty" would likely have handled them better than any other solution, for nominal cost.

Keep in mind, it is literally impossible to prevent full disclosure on the Internet. There's a genuine Multiverse of vulnerability disclosure possibilities, one of which is telling a vendor about something and then forgetting about it forever; the rest are going to make every vendor in the room uncomfortable in some way.

Nevertheless, she said one thing I thought was interesting: "Is there anyone in the room who thinks there are no situations where you should sue vulnerability reporters?" And nobody raised their hands. I would have raised my hand, and not because I'm naturally contrarian, but because it seems obvious: even in the past, when someone has hacked into an Immunity server and then told me how, my instinct was to thank them, not sue them (and then, of course, we removed that server from the Internet forever).

Some interesting statements were made towards the end of the day by Juniper's representative, who is much closer to the position of many big software houses than a lot of the other speakers (Oracle, for example). In particular, he stated that he would "personally rail against any document" that was a proponent of bug bounties or of an open vulnerability marketplace.

Toyota, at the end, asked for a less publicly transparent forum, under the Chatham House Rule (which prohibits the attribution of statements made during a meeting). NTIA's process is fully transparent to the public, as they pointed out (to their credit), but DHS offered to host such a meeting.

And of course KatieM of Hacker0x1 (who is obviously a proponent of bug bounties) pointed out that although a lot of thought goes into "how long should be acceptable before a fix is made available", there are often cases where no fix will ever be made available, for various reasons. This was a theme of the day, as pointed out by KatieM and Art from US-CERT: whenever the details of any particular issue were discussed, reasonable people disagreed widely.

There's always a drumbeat theme from various parts of industry of "If only we could have Commerce sign off on what is GOOD and what is BAD behavior on the part of vulnerability disclosure" - and this, of course, is the clear and present danger I am hoping never happens.

Chances of it actually happening are low, despite a process in place to guide the "community" toward that point if it is at all possible. Most of the people are from West Coast software companies or the Government. There's the thought that "SOMETHING MUST BE DONE" which guides their actions, but that said, even after two grueling meetings, there is still nothing even close to consensus on what that thing might be.

Keep in mind that these people are all extremely well intentioned. But literally today I was proofreading a deliverable for a critical-infrastructure company, and I know that for all the noise about the Internet of Things, the way loss of life happens in the critical infrastructure space is probably SQL injection, just like in any other space. Stuxnet is the perfect example of this, if you look closely enough.
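To be concrete about the bug class I mean, here is a toy sketch (invented table and function names, nothing from the actual deliverable) of classic SQL injection via string concatenation, next to the boring parameterized fix:

```python
import sqlite3

# Hypothetical illustration: a "sensors" table standing in for whatever
# a critical-infrastructure app might query with attacker-reachable input.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sensors (name TEXT, reading REAL)")
db.execute("INSERT INTO sensors VALUES ('pump_1', 42.0)")

def vulnerable_lookup(name: str):
    # Attacker-controlled 'name' is pasted straight into the SQL string.
    return db.execute(
        "SELECT reading FROM sensors WHERE name = '%s'" % name
    ).fetchall()

def safe_lookup(name: str):
    # Parameterized query: the driver keeps data out of the SQL grammar.
    return db.execute(
        "SELECT reading FROM sensors WHERE name = ?", (name,)
    ).fetchall()

# The classic tautology payload dumps every row from the vulnerable version.
payload = "' OR '1'='1"
print(vulnerable_lookup(payload))  # all rows returned
print(safe_lookup(payload))        # payload treated as literal data
```

Nothing exotic, no 0day, no implants - which is exactly the point: this is the mundane flaw that actually reaches the things that can kill people.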

In other words, the thought that the "safety" space is somehow magically different or more important or more sensitive than say, the financial space, is more marketing than science. I say this as someone who has hacked all those things. Although, to be fair, this again is one of those issues with wide disagreement.

In summary, there may be massive ancillary benefits to having all these companies all together in one place. The companies clearly would prefer that place to be in Silicon Valley. But it is still highly unlikely a "statement of principles" will be the final result.

Here you can see that Greg disagrees with me about the "Safety" industry being different.

Dino was basically the one "researcher" voice in the room, although obviously others have experience finding bugs.

The FDA is of course doing their thing. But this is pretty far out of range for their capability set (and in fact, maybe all roads lead to the NSA?)

There is still palpable anger at Charlie Miller and Chris for their car-hack-on-a-highway shenanigans.

The consequence most companies are worried about is "their bottom line".

It's important to note that FIRST was very active during this meeting and is, of course, doing a lot of work on multi-party vulnerability disclosure issues - as are many other people and groups. This is something that keeps coming up: every group on Earth is working on a methodology for disclosure.

Thursday, October 29, 2015

So I went to the Department of Commerce to speak at one of their export control working group meetings. It was fascinating. Not my talk, which is here, but the meeting itself.

For example, one of the items on the list was the machine used to make wrapping paper. Apparently modern wrapping paper is very similar to stealth coatings and other important things. But obviously, restricting a machine used to make wrapping paper is a useless task.

The Commerce Department is not crazy about doing dumb things that hurt the US economy. WHICH IS GOOD NEWS FOR THE SECURITY COMMUNITY, because meetings like this one clearly show that they are listening on the "Intrusion items" export control issue, and are unlikely to bow to State Dept or NSA/DoD pressure to eviscerate our whole industry.

Some bulleted thoughts:

- State Dept pointed out that while many people think of this as a human rights issue, it is not. This agreement and process is solely about national security, so we can table all discussion of human rights issues with regard to "intrusion software"

- Nobody in the room (or anywhere else) is willing to support the export control language as written, which means one way or another it has to change. This includes the human rights community, but also State and NSA (who were in the room)

- There was a lot of clarification of the role and value of penetration testing as a process, and how it would be adversely affected. We handed out a sample deliverable, for example. Also, I invited everyone to INFILTRATE.

- FS-ISAC weighed in as an "End User" and said "We are regulated by certain laws, and this proposed wording violates those laws."

- The idea that you can separate intrusion software ("not regulated") from software used to generate and control intrusion software ("Super Regulated") was shut down pretty heavily, both by the Coalition for Responsible Cybersecurity and by other speakers

- Microsoft is one of the strongest voices against this regulation, along with the Coalition

- Commerce does not like regulations with so many carve-outs that nothing is actually BEING regulated. They like nice clean regulations the way we like nice clean programs. This is not that kind of thing.

- Other topics that came up: real-time information sharing conflicts with regulation, the fact that any regulation needs to be "cloud friendly", export control not being the place for this sort of thing, and many others which point out that this regulation cannot go forward

State Dept has another meeting today, which I'm not going to, but hopefully it will not be a backwards step.