
Akamai's latest report shows significant changes in the scale and type of attacks being executed, as recorded by one of the largest internet infrastructure companies (supplemented by additional data sources). Akamai published its quarterly report today (January 23, 2013) and I am nearly through it. A few striking details shift how I will recommend clients identify, consider, and mitigate risks. The top two items that are significant (one obvious) and important:

China held its spot as the #1 source of observed attack traffic at 33%, with the United States at #2 at 13% (not a huge surprise, but an affirmation for many).

The attack traffic seen during the activist (Operation Ababil) DDoS attacks was roughly 60x larger than the greatest volume Akamai had previously observed for similar activist-related attacks. (The volume, intensity, and strategy of these attacks matter, as most organizations do not consider a SIXTY TIMES factor in their risk mitigation calculations.)
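To make that factor concrete, here is a minimal capacity-planning sketch. All the numbers are hypothetical (the function name, baseline, and provisioned figures are my own for illustration); the only input taken from the report is the ~60x surge multiplier.

```python
# Sketch with made-up numbers: stress-test DDoS mitigation capacity against
# the ~60x surge factor reported, rather than a modest safety margin.

def required_capacity_gbps(historical_peak_gbps: float, surge_factor: float = 60.0) -> float:
    """Capacity needed if the next activist campaign scales ~60x past the prior peak."""
    return historical_peak_gbps * surge_factor

historical_peak = 2.0   # Gbps: largest activist-related attack seen to date (hypothetical)
provisioned = 40.0      # Gbps: current mitigation capacity (hypothetical)

needed = required_capacity_gbps(historical_peak)
shortfall = max(0.0, needed - provisioned)
print(f"needed={needed:.0f} Gbps, provisioned={provisioned:.0f} Gbps, shortfall={shortfall:.0f} Gbps")
```

A 20x safety margin that feels generous on paper still leaves a large gap once a 60x multiplier is applied to the historical peak.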

About the Akamai State of the Internet report
Each quarter, Akamai publishes a “State of the Internet” report. This report includes data gathered from across the Akamai Intelligent Platform about attack traffic, broadband adoption, mobile connectivity and other relevant topics concerning the Internet and its usage, as well as trends seen in this data over time. Please visit www.akamai.com/stateoftheinternet

Senior leadership (board of directors, audit committee members, CIO, COO) must ensure these realities are absorbed into the organization’s business processes. The leadership and strategy shifts required to tackle these evolutions remain an executive responsibility.

The fine folks on Verizon's incident response team put out their annual report last week covering evidence and trends from their forensic casework. This year the statistics focus on approximately 90 cases they worked over the calendar year. The report is quite good and, at a short 52 pages, a worthwhile read cover to cover. Readers of this site know I won’t reprint the report here, but I will highlight the areas that jumped out.

These tidbits, and the discussion in the report, should be used to apply real numbers to your security calculations; as a wake-up call for a ‘Back to the Basics’ call to arms; and as a magnifying glass showing that what is often published (data breach reports, for example) does not always tell the whole tale.

“Target of Choice or Target of Opportunity. If the former, expect and prepare for determined and sophisticated attacks. If the latter, minimize the opportunities presented so as to become less of a beacon for attack. At the very least, make sure your beacon shines less brightly than everyone else’s.”

An interesting point from the report's authors: minimize the perceived value, and increase the perceived difficulty, of compromising your enterprise.

Among the numerous thoughtful recommendations in the report, the authors call out an area that is of extreme importance to me and deserves emphasis:

“Control data with transaction zones: Based on data discovery and classification processes, organizations should separate different areas of risk into transaction zones. These zones allow for more comprehensive control implementations to include but not be limited to stronger access control, logging, monitoring, and alerting.”
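The report describes transaction zones as a design concept; the toy sketch below is my own invention to make the idea concrete. The classification labels, zone names, and control mappings are all hypothetical — the point is simply that zone assignment should be driven by data classification, and controls by zone.

```python
# Illustrative sketch only: map data classifications (from a discovery
# process) to transaction zones, and zones to control requirements.
# All labels and rules here are hypothetical.

ZONE_CONTROLS = {
    "restricted": {"access": "mfa + least privilege", "logging": "full", "alerting": "real-time"},
    "internal":   {"access": "least privilege",       "logging": "standard", "alerting": "daily review"},
    "public":     {"access": "open",                  "logging": "minimal", "alerting": "none"},
}

def zone_for(classification: str) -> str:
    """Assign a transaction zone from a data classification (toy rules)."""
    if classification in ("cardholder", "PII", "PHI"):
        return "restricted"
    if classification == "confidential":
        return "internal"
    return "public"

print(ZONE_CONTROLS[zone_for("cardholder")])  # strongest controls land on payment data
```

The value of the pattern is that stronger access control, logging, and alerting follow the data automatically, rather than being negotiated system by system.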

“81 percent of organizations subject to PCI DSS had not been found compliant prior to the breach.”

– a surprising figure, but one that has been pushed around in the media. What this quote means is that these businesses had NOT been found compliant prior to the breach; that is, they were not compliant with the standard. That leaves 19% of organizations that had proved compliant and were breached anyway. They were either mistakenly approved; approved and had since fallen off the wagon; compliant, but attacked through a vector outside the scope of the PCI controls; or compliant, and the attackers were good enough to defeat those controls.

Vulnerability patching vs. configuration setup/management: “2008 continued a downward trend in attacks that exploit patchable vulnerabilities versus those that exploit configuration weaknesses or functionality. Only six confirmed breaches resulted from an attack exploiting a patchable vulnerability.” A “…focus on coverage and consistency…” is more important, more effective, and more statistically relevant.

“…four of 10 hacking-related breaches, an attacker gained unauthorized access to the victim via one of the many types of remote access and management software”

Continued support for protecting the systems designed to provide remote access, as they tend to offer a simple avenue into the business.

“During 2008, malware was involved in over one-third of the cases investigated and contributed to nine out of 10 of all records breached. In years past, malware was generally delivered in the form of self-replicating email viruses and network worms. “

Custom code is increasing, and the old standbys of anti-virus and signature-based solutions are falling further behind. Additional controls are necessary in both the preventive and detective areas of a business risk program.

“Most application vendors do not encrypt data in memory and for years have considered RAM to be safe. With the advent of malware capable of parsing a system’s RAM for sensitive information in real-time, however, this has become a soft-spot in the data security armor. “

This was an interesting highlight that brings forward an emerging area of assault, and one becoming more of a concern. As other aspects of business functions and digital processing become more sophisticated and better secured, data in motion through any channel – including system memory – is now at risk of capture. An additional reality: as complexity increases within business technology environments, attackers will seek out the areas that tend to be more stable and consistent, since those allow automated and subtle means of intrusion, capture, and release of targeted information.

“85 percent of the 285 million records breached in the year were harvested by custom-created malware”

The relative difficulty of attacks leading to data compromise:

“Category: None. No special skills or resources required. The average user could have done it.”

Combined, NO-skill and LOW-skill attacks make up 52% of all breaches; however, “…few highly difficult attacks compromised 95 percent of the 285 million records…” The end result: no organization is immune to these tools, and it is financially and technologically feasible to attack even single-IP-address businesses across the globe.
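The skew is easy to miss when reading breach counts alone, so here is a toy dataset (every figure below is invented for illustration; only the shape of the distribution echoes the report) showing how low-skill attacks can dominate the count of breaches while a handful of difficult attacks account for nearly all records lost.

```python
# Hypothetical breach list: (attacker skill, records compromised).
# The numbers are made up; the skew mirrors the report's finding that
# most breaches are easy, but most RECORDS come from a few hard attacks.

breaches = [
    ("none", 5_000), ("low", 12_000), ("low", 8_000), ("none", 3_000),
    ("low", 20_000), ("moderate", 500_000), ("high", 120_000_000),
    ("high", 95_000_000),
]

easy_records = [r for skill, r in breaches if skill in ("none", "low")]
total_records = sum(r for _, r in breaches)

count_share = len(easy_records) / len(breaches)    # share of breach COUNT
record_share = sum(easy_records) / total_records   # share of records LOST

print(f"easy attacks: {count_share:.0%} of breaches, {record_share:.2%} of records")
```

Measuring risk by breach frequency alone would point at the easy attacks; measuring by records lost points at the sophisticated few. Both views belong in the calculation.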

“Targeted attacks (The victim was first chosen as the target and then the attacker(s) determined a way to exploit them.) are at a five-year high and accounted for 90 percent of the total records compromised. “

As always, review the data, draw your own conclusions, and plan accordingly. The costs and impacts of these attacks are substantial: $14.6 billion to businesses for new account fraud, $1.8 billion to U.S. citizens, and, for the world at large, a total amount of fraud in 2008 equal to the aggregate sum of 79 industrialized nations.

I spoke at RSA 2009 this year on PCI and Beyond – great crowd and a post-CON post will follow soon.
Best regards,