Faulty statistical methods and other common errors that can trip up your program

September 26, 2012
CSO
Executives know they face risks, but they often don't know which risks are real, or what that exposure means to their business. The aim of security risk management is to remove the guesswork and help the business make smarter decisions. As Jay Jacobs, vice president of the Society of Information Risk Analysts (SIRA), says, "Security risk management is simply a decision-support system for the business. It should exist to inform the decisions of the business."

Unfortunately, many experts believe that most companies aren't quite there yet and that their efforts, while well-meaning, fall short and may even incorporate bad habits that can increase an organization's risk.

Jeff Lowder, president of SIRA, says, "There is a mistaken perception that expertise in security equals expertise in risk management. In fact, we see many experts in security who also claim to be experts in risk management. They're often not. These are two separate disciplines and, ideally, someone is knowledgeable about both if they're performing security risk management."

To get a better understanding of where many enterprises go wrong, CSO asked a handful of experts what they commonly see enterprises do wrong in security risk management. "In many organizations, based on what we've seen, it could actually be better if the organization chose to make decisions based on coin flips rather than their internal security risk management frameworks. At least when you flip a coin, you have a 50 percent chance of getting it right," says Jacobs.
Here are the most common mistakes and misconceptions made in well-intentioned risk management efforts:

1 Starting from scratch.

Many security professionals attempt to reinvent the discipline of security risk management. Fortunately, there are well-established methods for risk-analysis tasks, such as how to solicit an expert opinion and how to represent uncertainty in risk models. However, as Jacobs and Lowder explain, most people remain unaware of the research on how to do this correctly, and end up re-creating not only the same models but also the same shortcomings those basic approaches suffer from.

"The most prominent model is to pick some 'risk' factors that seem important, assign some ordinal score, and then perform basic arithmetic on these or place them on a matrix, an approach that has been shown to perform poorly," says Jacobs. The only saving grace for organizations that rely on these homegrown frameworks, Jacobs says, is that experienced decision makers often distrust the results such basic approaches produce.
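To see why arithmetic on ordinal scores can mislead, consider a minimal sketch with entirely hypothetical numbers: two risks that land on the same cell of a likelihood-times-impact matrix can represent very different real-world exposures, because ordinal labels discard the underlying magnitudes.

```python
# Hypothetical illustration: multiplying ordinal scores hides real magnitudes.
# Scales, dollar figures, and probabilities below are invented for the example.

# Ordinal ratings on a 1-5 scale (labels, not measurements)
risk_a = {"likelihood": 2, "impact": 4}   # rare event, large loss
risk_b = {"likelihood": 4, "impact": 2}   # frequent event, small loss

score_a = risk_a["likelihood"] * risk_a["impact"]
score_b = risk_b["likelihood"] * risk_b["impact"]
print(score_a, score_b)  # both 8: the matrix ranks the two risks as equal

# Plugging in (hypothetical) underlying quantities tells a different story.
annual_loss_a = 0.05 * 10_000_000   # 5% chance/yr of a $10M loss = $500,000
annual_loss_b = 0.50 * 50_000       # 50% chance/yr of a $50K loss = $25,000
print(annual_loss_a / annual_loss_b)  # a 20x difference the equal scores conceal
```

The point is not that the dollar figures above are right, but that once likelihood and impact are reduced to ordinal buckets, multiplying them cannot recover the order-of-magnitude differences that actually drive the decision.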

2 Replicating the audit department.

One way security risk management programs set themselves up for failure, says Alex Hutton, director of operations risk and governance at a large financial services firm and faculty member at IANS, is to copy the functions of the audit department. "While there are similarities between the two, the roles are dramatically different," says Hutton. The audit team should be concerned about where failures can occur through breakdowns in security controls, whereas risk management should be concerned with the potential frequency and impact of IT risks. And where audit's role is to help the company understand how to implement controls, risk management's role is to determine how to get the most out of investments in security controls and related processes.

"Most organizations whose risk management programs end up failing do so because they end up merely enforcing policy rather than consulting to the organization about what controls do and don't make sense," says Hutton.

"Audit doesn't necessarily concern itself with threat and audit doesn't necessarily care about reporting an aggregate picture of risk, based on the entire outlook of threats, assets, controls and impact," says Hutton. "Security risk management does."