Comments for "A Socio-Technical Approach to Internet Security" (Schneier on Security, a blog covering security and security technology)

Comment from Lisa on 2005-08-26:
Ralph - The "my widget is better" research is not just paid for by industry; much of the funding for cybersecurity in the US goes to technical work of the "my algorithm is more secure than yours" variety. In March, a presidential commission report (Cybersecurity: A Crisis of Prioritization) noted several problems with cybersecurity research in the US, including:
(1) a shift from more basic security research to more applied research;
(2) a shift from public domain research to classified research;
(3) an acceptance rate for cybersecurity grant proposals at the National Science Foundation of 8% (compared to about 20% for the NSF as a whole);
(4) not enough researchers.

Yet their report missed the chance to emphasize the need for behavioral issues to be included in the list of priorities. For example, PKI is a great widget, but PARC researchers have shown that even experienced users find it too difficult to implement. A "better widget" needs to be a "better-used widget".

Roy -- one of the appealing parts of Leveson's proposal is that she will be studying security as a SOCIO-technical system, not just a technical one. Her work on safety makes me a bit more confident that this engineer will actually include the people part in her work.

Indeed, the "people" part of improved security is a tough nut to crack, and a study at Michigan State has been looking at how to improve end users' online behavior: http://www.msu.edu/~isafety/.

Some of the best research on how users really "do" and think about security has been coming out of University College London, not the US...time to include how people use the widgets in our research.

Comment from Roy Owens on 2005-08-25:
If they come up with smart ways to tighten security, there is still the problem of institutional resistance in the upper echelons to accepting that rules apply to themselves.

I know of a high-profile institute where all visitors have to get temporary badges to get inside, and have to be escorted everywhere, and where all employees have to have their photo-badges in clear view above the waist at all times in all places, no exceptions. Guards are visibly armed. Employees are encouraged to challenge anybody who seems suspicious.

Of course some bigshots refuse to comply (including a top level security professional) and can get through layered security on the strength of a pricy suit and an uppity attitude. It's a visible sign of power to blow right through every stage of perimeter security, especially with an awestruck entourage in tow.

If a security guard, or other employee, challenges the wrong bigshot, he can expect to get fired. Everyone knows the horror stories.

Infiltrators could easily impersonate bigshots. It wouldn't take a Frank Abagnale in person or a Kevin Mitnick on the phone: terrorists could recruit out-of-work actors. The recruits could practice the attitude at every turn in ordinary places, refining their skills before the actual penetration that counts.

Ironically, the security guards the infiltrators finesse will be letting them through for fear of losing their jobs when it is precisely their job to keep those same people out.

Comment from Ralph on 2005-08-25:
I agree with Lisa: vendor-funded research designed to prove a company sales line ("wheelbarrow research") is of limited use and brings nothing new to the table. When too much of this kind of thing is shovelled at end users, they can become numb and confused.

If other useful perspectives or tools for thinking about and analysing our problems can be introduced, it can't help but be of some value.

At this point I feel we are losing the battle for the internet on many fronts, and we should welcome new approaches.

Comment from Lisa on 2005-08-25:
Dr. Leveson is pretty well respected in the systems engineering community (e.g., where software is critical to the safety of the entire system).

Systems security concepts have helped reduce failure rates in many areas (e.g., aviation). I am happy to see some theory applied to this problem, since most of the research I've seen is along the lines of "my widget is better than your widget" -- which is nice, but ultimately limiting.

Comment from Bruce Schneier (http://www.schneier.com/blog) on 2005-08-25:
"I hope this is not just another taxpayer-funded scientific boondoggle to prove with a bunch of statistics and fancy jargon what common sense and vast experience have never left in doubt, e.g., that Murphy's Law works."

Agreed. In general, though, I think research is good. Even if you don't learn what you set out to learn, you sometimes learn something surprising. Funding researchers and research institutions is good.

Comment from Andrew (http://anomalyuk.blogspot.com/) on 2005-08-25:
The researchers' page is at http://sunnyday.mit.edu/safety.html
Comment from Jim Duncan on 2005-08-25:
I hope this is not just another taxpayer-funded scientific boondoggle to prove with a bunch of statistics and fancy jargon what common sense and vast experience have never left in doubt, e.g., that Murphy's Law works.