Current Feed Content:

Forty-eight percent of organizations had a security incident that caused moderate to severe business impact. Delays in identifying threats, and the lack of information that prolongs incident investigations, cause real business problems.

Join leading IT analyst firm Enterprise Management Associates (EMA) and Gigamon to get insights into “identifying advanced threats” and “enhancing incident investigations” use cases based on EMA’s new “Top 3 Decision-Makers’ Guide to Security Analytics” report.

Ninety-eight percent of organizations use the cloud to perform ITOps and/or SecOps functions. Cloud is a fantastic resource for IT and business: compute, storage, and even application delivery capabilities are activated on demand to meet the business's needs. While cloud delivers on most of its promises, the one aspect that is regularly in question is data security. This is generally not due to a technical failing on the part of the cloud service providers so much as a failing in the education of their customers, creating a misunderstanding of whose responsibility it is to manage data security in the cloud.

Fifty-three percent of cloud customers mistakenly believe that responsibility for their cloud data security falls wholly or mostly on the shoulders of their cloud provider. Twenty-one percent incorrectly believe the burden is equally shared, while only 15 percent understand that the owning organization is responsible for data security. Some reports indicate that as many as seven percent of AWS S3 buckets are exposed to the Internet. The companies experiencing these problems are not just inexperienced startups or security amateurs; they include Accenture, Booz Allen, Dow Jones & Co., Time Warner Cable, and Verizon Wireless, among many other notable names that have experienced cloud data leaks.

To overcome these shortfalls, cloud data owners need to better investigate and understand their role in protecting their cloud by addressing four key factors:

1. Who owns cloud security?

2. How can security teams address the lack of visibility into cloud?

3. How can data sharing be managed to reduce the associated data leakage?

4. Not all on-premises security tools fit the unique needs of cloud, so how can IT and security choose the best tools?
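The exposed-bucket statistic above usually traces back to a simple access-control misconfiguration. As a minimal, illustrative sketch (plain Python rather than the AWS SDK; a real audit would also check bucket policies and Block Public Access settings), a grant to the global AllUsers group in a bucket's ACL marks it as publicly readable:

```python
# Minimal sketch: flag bucket ACL grants that expose data to everyone.
# The dict shape loosely mirrors an S3 GetBucketAcl response; this is an
# illustration, not a substitute for a real cloud security audit.

ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def is_publicly_readable(acl):
    """Return True if any grant gives READ (or FULL_CONTROL) to AllUsers."""
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("URI") == ALL_USERS and \
           grant.get("Permission") in ("READ", "FULL_CONTROL"):
            return True
    return False

# Example ACL with an accidental public-read grant
acl = {"Grants": [
    {"Grantee": {"Type": "CanonicalUser", "ID": "owner"}, "Permission": "FULL_CONTROL"},
    {"Grantee": {"Type": "Group", "URI": ALL_USERS}, "Permission": "READ"},
]}
print(is_publicly_readable(acl))  # True: this bucket is exposed
```

The grantee URI shown is the identifier S3 actually uses for "everyone"; the rest of the structure is simplified for illustration.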

Join leading IT analyst firm Enterprise Management Associates (EMA) for a webinar that will reveal the current state of enterprise security readiness within the context of security management tools, issues, and practices.

This report is a time-saving guide. It is designed to help decision-makers who have identified problematic security use cases select the analytics tools that best address those use cases, narrowing the field for proof-of-concept testing or further interviews.

If the security team has invested in the proper tools, broken down its data silos, and addressed the political silos that impede information flow and cooperation, yet still cannot mount a solid defense, this report can aid in choosing a vendor to take the security practice to the next level.

How to use this document:

It is important to recognize that every organization is different, with a unique set of IT and business requirements. As such, EMA strongly recommends that when using this guide to create a shortlist, each organization conduct its own evaluation to confirm that the solutions best match its business needs and that they also meet other requirements, such as business workflows and full reporting needs. This guide assists with the process by providing information on key use cases common to many prospective buyers, along with an associated shortlist of vendors with solutions that meet them.

For each use case, EMA provides the following sections offering insights for use in the platform selection process:

• Quick Take - This is an overview of the use case, why it is important, and how the solutions address it.

• Buyer's Note - Key considerations prospective buyers should be aware of, and questions they should ask during the evaluation process.

• Top 3 Solution Providers - By identifying and recognizing the most innovative vendor solutions that address the greatest business priorities for secure access enablement, the table in this section provides a brief overview of each platform and the respective capabilities. Within the Top 3, the solutions are listed alphabetically by vendor, so the order in which they appear is not an indication of EMA's preference. It is highly recommended that organizations seeking to adopt solutions addressing a particular priority investigate each of the corresponding Top 3 vendors to determine which best meet their full and unique requirements.

Threats abound, but people are out there trying to deal with them. This report delves into several areas of concern today including cloud security issues, SecOps frustrations and tools, the Internet of Things, data sharing and leakage, DDoS, endpoint security, and artificial intelligence. The report identifies challenges and perceptions that enterprises, midmarket companies, and SMBs face across seven industry verticals including manufacturing, financial, and healthcare. The goal is to help readers to understand the common issues and where they are doing a better or worse job than others. Ultimately, the report will help readers understand how to handle threats better, no matter where they stand now.

Threat intelligence has been around in one form or another for many years. Only in the last few years did the information really become digestible for any but the largest organizations. Its most recent form evolved into platforms that collect and analyze information through various automated and manual means. The information is focused on delivering indicators of a valid threat against the company. Rather than terabytes of superfluous data, organizations that invest in the toolset can specify what types of information they are most interested in and begin collection.

The speed of detection and mitigation is the true issue today. How fast is as fast as possible? Over the last few years, research like the Verizon Data Breach Investigations Report demonstrated that "as fast as possible" has not been nearly fast enough. Compromises can happen in hours, but identifying an attack may not take place for months or years.

It is this issue that focused innovators on how to identify and respond to security incidents faster. The first challenge is being able to wade through the incessant and overwhelming noise of alerts, and reduce them to a workable volume of real problems that can be clearly defined and addressed quickly.

Over the past several years, numerous startup companies were established to address the gap in analytics and visibility of real issues in the sea of alerts. Security analytics solutions were initially designed to perform one or more of three primary types of security-focused analytics: User and Entity Behavior Analytics (UEBA), Anomaly Detection, and Predictive Analytics. Since their inception, much of this functionality has merged, leaving only a thin line between combined UEBA/Anomaly Detection and Predictive Analytics.
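The anomaly-detection side of these platforms can be illustrated with a toy baseline check. Production UEBA models are far richer, but the core idea of flagging deviation from an entity's own history looks like this sketch:

```python
# Toy sketch of baseline anomaly detection: flag an entity whose activity
# deviates from its historical mean by more than k standard deviations.
from statistics import mean, stdev

def is_anomalous(history, observed, k=3.0):
    """Flag `observed` if it falls outside mean +/- k*stdev of `history`."""
    mu, sigma = mean(history), stdev(history)
    return abs(observed - mu) > k * sigma

logins_per_day = [12, 9, 11, 10, 13, 8, 12]   # a user's normal baseline
print(is_anomalous(logins_per_day, 11))        # False: a normal day
print(is_anomalous(logins_per_day, 200))       # True: possible compromise
```

Real solutions layer many such signals (time of day, peer-group comparison, sequence of actions) before raising an alert, which is what keeps the false-positive rate workable.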

This report is the second of a two-part series. Part one, released earlier this year, delved into the platforms, solutions, and products supplying log-based security analytics to security practitioners for the express purpose of providing them with fewer, more actionable alerts without the tuning side effects that can filter out alerts on actual threat activity. This second report focuses on vendors that use network information, such as flow records, deep packet inspection, and forensic packet analysis, to gather telemetry.

This report evaluates vendors across five major categories supported by over 130 KPIs. EMA evaluated and scored each vendor under the same documented criteria. Each participating vendor has a profile that outlines its solution, the solution's strengths and weaknesses, and its performance ratings compared to the other vendors evaluated. The report also documents key decision-making factors important to the buying process and ultimately depicts the vendors' relationships to each other based on value vs. functionality.

For the past 20 years, the security information and event management (SIEM) market has been a staple for many security operations teams. However, in the last five to seven years, many security organizations have grown increasingly dissatisfied with these tools' capabilities and performance, giving way to a new class of technology known as security analytics.

Join David Monahan, Managing Research Director of Security and Risk Management at EMA, to learn more about these solutions.

With all the news about cyberattacks, it’s easy to feel like there aren’t enough people to cover all of the security bases. This means proper identification and management of threats and vulnerabilities is an absolute necessity to keep risk at its lowest levels.

Join David Monahan, managing research director at leading IT analyst firm Enterprise Management Associates (EMA), and John Dasher of RiskSense, to learn why a threat and vulnerability management solution is a must have for your security portfolio.

With all the news about cyberattacks, it’s easy to feel like there aren’t enough people to cover all of the security bases. This means proper identification and management of threats and vulnerabilities is an absolute necessity to keep risk at its lowest levels.

These slides, based on the webinar featuring David Monahan, managing research director at leading IT analyst firm Enterprise Management Associates (EMA), and John Dasher of RiskSense, reveal why a threat and vulnerability management solution is a critical component in any security portfolio.

Gigamon has been known for years as a market leader in the network packet broker space, conquering the challenge of getting the right data to the right network and security tools. Over the last few years, Gigamon extended into the security space, which paved the way for their acquisition of ICEBRG. With ICEBRG, Gigamon has expanded its capabilities to help organizations use data better by delivering next-generation security analytics with all of the captured data. Now, in addition to giving network tools the visibility to increase network and application performance and giving security tools access to the traffic they need to process and act upon, Gigamon delivers further insights to combat threats.

With the ICEBRG acquisition and integration, Gigamon can capture network traffic metadata into a central platform that offers an API, a user-friendly query language, and built-in advanced security applications, transforming itself from a leading-edge network traffic company into a leading-edge cybersecurity company. Gigamon's vision is to transform not only the way it handles data, but the way cybersecurity teams operate by providing a rich, open-access network traffic analytics platform upon which multiple security applications can be deployed, whether such apps are developed by Gigamon, a customer, or a third-party partner.

Attacks on and from wireless networks continue to rise. IoT has exacerbated the problem by multiplying wireless connections into networks from devices that are often installed with little or no security control and, even more often, are unknown to administrators and security personnel.

Cy-oT has introduced a technology for identifying, monitoring, and controlling Wi-Fi- and Bluetooth-enabled devices within the enterprise airspace to decrease risk from unmanaged, rogue, or compromised devices, whether inside the perimeter or not. Unlike a wireless IPS, Cy-oT can provide control without interfering with device transmissions, interference that can lead to violations of federal law.

Cy-oT integrates with other control technologies such as NAC to create both visibility and defense from IoT and other wireless devices.

EMA "Vendors to Watch" are companies that deliver unique customer value by solving problems that had previously gone unaddressed or provide value in innovative ways. The designation rewards vendors that dare to go off the beaten path and have defined their own market niches.

This report queried companies that use Network Security Policy Management (NSPM) tools and companies that do not in order to compare and contrast their security change management processes, timeliness, and efficacy. The evaluation considered whether there were any differences in their inherent risk profiles and if or how NSPM created improvements in security performance.

Cybersecurity as a discipline is a fast-paced, dynamic area. New and innovative attack methods are combined with old ones to make nearly infinite avenues of attack. Whether an attack is a single packet compromise or a low-and-slow attack drawn out over many days, the defenders are responsible for identifying and stopping the attacks as soon as possible. It’s the last phrase that is the issue. How fast is as fast as possible? It seems that over the last few years, "as fast as possible" has not been nearly fast enough. Compromises can happen in hours, but identification may not take place for months to years.

It is this issue that drew innovators to try to figure out how to identify and respond to security incidents faster. The first challenge is being able to wade through the incessant and overwhelming noise of alerts and reduce them to a small trickle of real problems that can be clearly defined and addressed quickly.

Over the past several years, numerous startup companies were established to address this gap in analytics and visibility of real issues in the sea of alerts. Security analytics solutions were initially designed to perform one or more of three primary types of security-focused analytics: User and Entity Behavior Analytics (UEBA), Anomaly Detection, and Predictive Analytics. Since their inception, much of this functionality has merged, leaving only a thin line between combined UEBA/Anomaly Detection and Predictive Analytics.

This report, which is part one of a two-part series, delves into the platforms, solutions, and products supplying log-based security analytics to security practitioners for the express purpose of providing them with fewer actionable alerts without the tuning side effects that can filter out alerts on actual threat activity. The report evaluates vendors across five major categories supported by over 100 KPIs. EMA evaluated, scored, and ranked each vendor under the same documented criteria. Each participating vendor has a profile that outlines the solution, including its strengths and weaknesses, in comparison to the other vendors evaluated. It also documents key decision-making factors important to the buying process and ultimately depicts the vendors’ relationship to each other based on value vs. functionality.

Part two will follow the same methodology, but will focus on security analytics solutions that primarily rely on network-based data for analysis.

Is the time right for your organization to purchase a threat intelligence platform?

These slides, based on the webinar from leading IT analyst firm EMA and IntSights, provide research-based insights into DTIM to help you determine if it is right for your organization. You will also get key insights into new research, including the methodology behind platform evaluation and an overview of key players in the market.

Is the time right for your organization to purchase a threat intelligence platform?

Join leading IT analyst firm EMA and IntSights for a research-based webinar to find out if DTIM is right for your organization. You will also get key insights into new research, including the methodology behind platform evaluation and an overview of key players in the market.

As the internet has expanded and criminals have found more ways of creating revenue from stolen information, the need for digital threat intelligence management (DTIM) has increased. Without a means of early identification, companies that are being targeted have no way of knowing that their customers' or employees' security is threatened or that their brand is being stolen, resulting in an erosion of reputation. DTIM is the early warning system that aids those organizations in identifying infringements and thefts before severe damage is done.

Join leading IT analyst firm Enterprise Management Associates (EMA) and RiskIQ to discover why DTIM is a growing necessity for mid- and large-sized organizations.

Security policy orchestration and automation (SPOA) is the area of technology that facilitates managing and standardizing network firewall security policies. At a minimum, SPOA tools aid in analyzing, deploying, and updating firewall policies across on-premises and cloud, in homogeneous or heterogeneous firewall vendor environments. When fully utilized, they can improve areas such as change and risk management, disaster recovery, and on-premises-to-cloud application migration.
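One concrete analysis task such tools perform is shadowed-rule detection: a rule that an earlier rule fully covers can never fire. A toy sketch, with rule fields simplified to opaque source/destination labels and a wildcard (real tools model addresses, ports, and protocols in full):

```python
# Minimal sketch of shadowed-rule detection in an ordered firewall policy.
# A rule is shadowed if an earlier rule matches a superset of its traffic,
# so the later rule can never take effect. Rules are simplified here to
# (source, destination, action) tuples; "*" is a wildcard.

def covers(earlier, later):
    """True if `earlier` matches every packet that `later` matches."""
    return all(e == "*" or e == l for e, l in zip(earlier[:2], later[:2]))

def shadowed_rules(policy):
    """Return indices of rules fully covered by some earlier rule."""
    result = []
    for i, rule in enumerate(policy):
        if any(covers(prev, rule) for prev in policy[:i]):
            result.append(i)
    return result

policy = [
    ("10.0.0.0/8", "*", "allow"),        # rule 0
    ("10.0.0.0/8", "dmz-web", "deny"),   # rule 1: shadowed by rule 0
    ("*", "dmz-db", "deny"),             # rule 2: reachable
]
print(shadowed_rules(policy))  # [1]
```

Production SPOA analysis must also handle subnet containment and port ranges rather than the exact-match comparison used here, but the ordering logic is the same.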

Join David Monahan, managing research director at leading IT analyst firm Enterprise Management Associates (EMA), and discover the difference between organizations using an SPOA solution to manage their firewall environments versus those not using one of these solutions.

You will also get insights into new research findings, including:

• Over 50% of organizations use manual policy inspection to determine security and compliance. Find out how using SPOA changes this process.

• 71% of organizations identify problems with establishing and maintaining security policy baselines and standardization. SPOA users, however, have a vastly different story.

• 69% of respondents indicated it is moderately difficult to virtually impossible to maintain standardized and synchronized policies across their firewalls. This is not the case for SPOA users.

Threat intelligence has been around in one form or another for many years. Only in the last few years did the information really become digestible for any but the largest organizations. Its most recent form evolved into platforms that collect and analyze information through various automated and manual means. The information is focused on delivering indicators of a valid threat against the company. Rather than terabytes of superfluous data, organizations that invest in the toolset can specify what types of information they are most interested in and begin collection.

Some vendors focus on specific types of infringement, like brand infringement or domain abuse. Others focus on data sources like social media or the dark web. A third group has broader coverage, looking at many different sources, analyzing and correlating them, and delivering directed information on the infringement. As expected, greater or premium coverage often demands a premium price, so those looking into these solutions should evaluate the scope they really need instead of just what they want.

EMA is seeing a surge of M&A activity, as well as significant infusions of capital into established companies. Only some of the companies in the space are profitable at this time, though the analysis points to a number of them crossing into profitability later in 2018.

This report queried companies that use Network Security Policy Management (NSPM) tools and companies that do not in order to compare and contrast their security change management processes, timeliness, and efficacy. The evaluation considered whether there were any differences in their inherent risk profiles and if or how NSPM created improvements in security performance.

In fact, organizations leveraging NSPM demonstrated significant advantages in both IT operations (ITOps) and security operations (SecOps). Advantages included more consistent security policies, which led to fewer attack surfaces, shorter change approval and implementation processes, fewer change-related outages, more successful business continuity and disaster recovery testing, and more.

Participants coming from environments where NSPM was not used felt they had strong IT and security visibility, but had more significant issues with poorly implemented security policies, non-standardized policies, and failed cloud migrations for critical business applications. The NSPM group had a more realistic outlook.

David Monahan, managing research director of security and risk management at leading IT analyst firm Enterprise Management Associates (EMA), recently released the most comprehensive and detailed piece of research in the endpoint space produced to date. This updated report identifies the market size, direction, trends, and all of the major players in this space, 50 in total. The report contains a profile page for each of the 32 participating vendors.

Join David for this webinar to see the identified players in the space and who participated in the research. Get highlights from this new research, as well as answers to the following questions:

What is Next Generation Endpoint Security, and how is it different from antivirus?

This is the second iteration of this report (the first was distributed in 2015). The vendor-related research focuses on solution providers that are supplying proactive next-generation endpoint security services covering prevention, detection, and response. EMA provided all identified participants the opportunity to participate in a vendor-answered questionnaire and interviews. EMA then combined that information with research efforts external to the providers to create company profiles and assess each vendor on their applicability to the space, as well as their market share by revenue and license volumes. Most of the vendors competing in this space emerged or refocused in the last few years, with only a few having competed in the market for more than five years.

As with any study, this one is only as good as its data inputs. The research identified 42 solution-provider candidates as participants. Of those companies, eleven vendors elected not to respond or share data. Of the remaining 31, some were unable to provide complete data due to company policies limiting some analysis, but efforts were made to fill in as many blanks as possible using publicly available information.

The NGES market is highly competitive. With a 2014/2015 annual growth rate over 100 percent and 2016-2017 shaping up similarly, NGES is pushing a five-year average annual growth rate of over 50 percent.

IBM has been delivering performance with its Z-series mainframes for years. Each release built on the previous, delivering enhanced performance and greater features. The July 2017 release goes even further. Not only did IBM increase performance, but it introduced pervasive encryption that protects data on the IBM Z and other platforms for which the server manages or provides data, creating an ecosystem of integrated data protection.

On July 17, 2017, IBM introduced the fourteenth release of its Z mainframe platform. Despite all the hype around various x86 platforms, including virtual host systems, none can hold a candle to the mainframe as a data center workhorse. The performance of the new system comes from a ground-up redesign of the main microprocessor, the system board, and the data communication bus. The focus of these changes was not just increasing the total volume of processes per second, but enhancing cryptographic processing and key management to create the first system strong enough to provide native encryption across its own storage and to fulfill the data protection role. Previously, specialized hardware security modules (HSMs) filled the data protection space, protecting information for other off-system applications, including cloud. IBM is calling this breakthrough "pervasive encryption."

This Enterprise Management Associates (EMA) research report, "State of File Collaboration Security," offers insights into file data leakage risks and incidents, security process and control maturity, perceived cloud-based file platform threats, and anticipated investments that attempt to preempt file access and usage exposures. Sponsored by FinalCode, the research serves to provide IT professionals an understanding of where their organization sits on the file security spectrum, how policy and controls staff may need to consider existing and new file collaboration risks, and how organizations should move to close file data leakage gaps.

The data indicates that alerting systems are not operating efficiently. Many incidents are automatically misclassified as critical alerts. By itself this problem is unacceptable, but added to the fact that a large number of alerted incidents are actually false positives that should not have been generated in the first place, it becomes easier to see why security teams feel stressed and overwhelmed. Because of the time needed to manually investigate each alert to determine whether it is really critical or a false positive, teams are falling behind on alerts, creating a huge backlog of unworked tickets. This is a strong reason why dwell time for breaches is over six months. Many organizations turn to "tuning" systems to reduce generated alerts, leading to the scenario where real alerts are never generated due to improper tuning.

While larger teams could solve the problem, trained personnel are not available and this particular solution does not scale. It also does not address the root of the problem. Ultimately, this is a tools issue. The systems are not given enough context at alert creation to properly classify the incoming alerts and identify vulnerabilities.
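The missing context described above, such as asset criticality and whether the target is actually vulnerable, can be folded into a simple triage score. This is an illustrative sketch, not any vendor's algorithm:

```python
# Illustrative triage sketch: weight alerts with context so critical-on-paper
# alerts against low-value, unaffected assets sink below real threats.

SEVERITY = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def triage_score(alert, asset_criticality, is_vulnerable):
    """Combine raw severity with asset value and vulnerability context."""
    score = SEVERITY[alert["severity"]] * asset_criticality
    if is_vulnerable:          # target is actually exposed to this attack
        score *= 2
    return score

alerts = [
    {"id": "A1", "severity": "critical"},  # fires against a patched lab host
    {"id": "A2", "severity": "medium"},    # fires against a vulnerable DB server
]
scores = {
    "A1": triage_score(alerts[0], asset_criticality=1, is_vulnerable=False),
    "A2": triage_score(alerts[1], asset_criticality=5, is_vulnerable=True),
}
print(sorted(scores, key=scores.get, reverse=True))  # ['A2', 'A1']
```

Note how the context inverts the raw-severity ordering: the "medium" alert on a vulnerable, business-critical asset outranks the "critical" alert on a patched lab machine, which is exactly the classification the passage says today's systems lack.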

May 2017: Corvil announced its Virtual Security Analyst (VSA) feature to detect and monitor threat actors in high-speed, high-volume, low-latency trading networks. The Virtual Security Analyst will leverage Corvil's existing appliances and an enhanced set of machine-learning algorithms to identify threat actors that managed to access these highly secure and highly sensitive trading environments. The current release of VSA automated security analytics will use otherwise idle appliance capacity by running after-hours batched analytics jobs. This will decrease the opportunity cost of the appliances, increase security within the trading environments, and decrease the hours analysts spend looking for threats.

The "Data-Driven Security Unleashed" report is the fourth iteration in the Data-Driven Security series. Each report uses core questions for trending demographics and technology uses, but each also has its own unique focus. "Data-Driven Security," the first report (released in 2012), asked about the tools and data collection preferences for improving security. EMA released "The Evolution of Data-Driven Security" in 2014 as the second iteration. Beginning in 2015, the report began asking more questions about key drivers and value. "Data-Driven Security Unleashed" also adapted to the marketspace by replacing some of the previous technologies with emerging or impactful solutions.

EMA used the collected data to paint a picture of the tools perceived as most valuable and their use cases, as well as key drivers of adoption and the frustrations users experience. During the course of the research, EMA substantiated that insufficient staff is a problem that is only getting worse. This is not due to budget pressures as often as it is the lack of skilled or qualified personnel available in the market. In 2015, 68 percent of respondents indicated their organizations were experiencing impacts from staffing shortages. That number rose to 76 percent in 2016. While 35 percent of organizations are hiring less skilled/qualified personnel and training them to meet their needs (up five points from the previous report), 21 percent say they just cannot find personnel at all (up seven points from 2015). Though staffing issues were the primary frustration within IT security, respondents reinforced the idea that meeting compliance requirements detracts from making real security improvements, along with the recognition that organizations' lack of repeatable, scalable processes is also a major hindrance. Lastly, though organizations did not complain about false positives to the same level as they did in 2015 and early 2016, they indicated that it is difficult to prioritize remediation of threats and exposures. These changes in focus help vendors understand where improvements were made and where they need to continue.

Some of the top use cases for security technology were enhancing breach/compromise prevention; detection and/or response; malware prevention, detection, and/or removal; identifying malicious threat actor activities and data exfiltration; and providing highly actionable intelligence/context for incident prioritization. It is clear from the research that teams are looking for improvements at the beginning of the cyber kill chain. Reducing dwell time for attackers has become paramount.

The years 2014 and 2015 battered confidence in security teams' ability to detect incidents before they caused significant impact. The share of respondents who were even "somewhat confident" of detecting a security issue before it made a significant impact dropped from 31 to 21 percent, with the remainder sliding toward "highly doubtful." However, advances in technology reversed the previous four-year trend in these areas. As of 2016, 48 percent of respondents were confident that they could detect an incident prior to it becoming a significant impact. This is at least partly because more companies are creating security baselines for their environments. As a result, companies feel more confident about their ability to monitor and prioritize threats to their high-priority assets and detect breaches before they create a significant impact.

The largest change in the report for 2016 was the inclusion of network security policy management as a technology. Though vendors in this space have existed for as many as 12 years, the growing complexity and span of networks (combined with the need for centralized security policy implementation and visibility) made this category a notable addition to the report. Fifty-three percent of organizations said they were using a tool of this type, making it the most widely used solution in the report. It also placed second for value based on total cost of ownership.

The "Data-Driven Security Unleashed" report is a guide to market perceptions about the strengths of various tools and weaknesses from security and IT personnel, as well as individual contributors through the management ranks.

In May 2017, Palo Alto Networks announced the latest edition of its standalone endpoint protection solution. Traps 4.0 builds on and expands the capabilities of previous Traps releases, extending standalone abilities that provide new malware and exploit prevention capabilities across more operating systems. Traps 4.0 also deepens its integration with the Palo Alto Networks Next-Generation Security Platform to deliver on the promise of a greater-value ecosystem.

Midmarket companies are aware of the increasing threat: investments in strengthening cyber security posture rank among their top priorities in 2017. Yet there is dissonance in this segment on exactly what makes a security program "strong." Even those with the resources to hire dedicated security staff face intense competition to find, afford, and retain individuals with specialized skills. In 2016, EMA research identified that 76% of organizations are impacted by security staffing shortages, an increase of eight percentage points over 2015. That is why forward-thinking midmarket companies, cognizant of their budgetary constraints, security vulnerabilities, and the difficulties of finding and retaining people to fix them, are maximizing ROI and effectiveness from security spend by turning to managed security services providers (MSSPs). In 2016, 54% of organizations EMA surveyed identified that they were using an MSSP for more than 50% of their security operations, and 55% indicated they would increase spending in this area.

One MSSP service that is gaining popularity is the managed security operations center (SOC), such as those provided by Arctic Wolf Networks (AWN). The managed SOC is a premium service and more than managed security information and event management (SIEM). A managed SOC ensures that organizations facing a lack of qualified staff have 24/7 monitoring, investigation, and remediation of threats, freeing the company to focus on its core competencies.

EMA "Vendors to Watch" are companies that deliver unique customer value by solving problems that had previously gone unaddressed or provide value in innovative ways. The designation rewards vendors that dare to go off the beaten path and have defined their own market niches.

This document contains the 2017 EMA Vendors to Watch in the Security & Risk Management field, including CyberGRX.

Additional vendors will be added to this document throughout the year.

Gain actionable insights on how to defend your websites and APIs against the coming year’s onslaught of automated threats.

These slides - based on the webinar featuring leading IT analyst firm Enterprise Management Associates (EMA) and Distil Networks - dive into data from the latest Distil Networks Bad Bot Report (the IT security industry's most in-depth analysis of the sources, types, and sophistication levels of bot attacks) to reveal six high-risk threats every IT security pro must protect against.

CA Technologies announced its intent to acquire Veracode, a SaaS platform and application security testing (AST) vendor, for $614M USD. Veracode is a leader in the AST space with capabilities in web, mobile, and third-party security testing. What really brings value to CA in the transaction is that Veracode is not only a leader in the testing space, but has also been a driving force in creating cloud services. Prior to Veracode's emergence in 2006, the AST world was on-premises based. Since then, Veracode's paradigm shift pushed more established on-premises vendors into the cloud space to continue competing. Veracode showed true vision and ability to execute in the space, as demonstrated by its growth, especially over the last five years.

The transaction is expected to close in 1QFY18 pending customary approvals by the CA shareholders and federal regulators.

The endpoint security market is an extremely vendor-dense and highly competitive area. Vendors in the space are generally divided into two camps: those promoting prevention and those promoting detection. Though the vendors who provide prevention also alert on the attacks they detect, thus providing detection, the primary differentiator in the two approaches is pre-compromise versus post-compromise. Prevention vendors alert on attacks they identified and stopped while the detection vendors alert on what happened to the system that looked like an attack.

While the vendors in this space agree that a (strictly) signature-based approach is not the way to go for ongoing success due to inability to scale and false positive/negative alerts, they disagree on which general approach is better. The most common argument between supporters of these approaches goes something like this: detection supporters say that prevention cannot stop everything and thus is not reliable. Prevention supporters say that once the endpoint is compromised, it cannot be trusted and therefore must be cleaned/remediated, taking valuable time from the personnel affected by the clean-up.

The two camps are further joined by the "Big 5" traditional antivirus vendors, who dominated the antivirus market for more than a dozen years but in recent years began losing revenue to the startups. Though the startups are currently not much more than a revenue nuisance to these incumbents, that impact will increase as they continue to grow. It is ultimately the failure of the Big 5 to adapt their technologies and defense strategies to accommodate advancing attacks that opened the way for the creation of the NGES startups.

This research quantifies these perceptions. The approach was to identify as many vendors in the endpoint protection space as possible, letting the research participants indicate whether they had heard of or used each vendor and whether they thought it met the provided definition of an NGES solution. Once the baselines were established, qualified respondents were asked about their perceptions of the vendors and what drove those perceptions. The outcome reveals the market penetration of the vendors and how well they are perceived in the marketplace. Marketing teams can use the data to redirect their efforts in the most useful manner, bolstering positive perception and addressing negative perception.

Speaking in front of an audience of 300 security specialists at a conference last year, the head of the Department of Homeland Security's Industrial Control Systems Cyber Emergency Response Team (ICS-CERT) made a dire pronouncement: the department saw an increase in attacks on industrial control systems in 2015, and he was "dismayed" by the accessibility of some of the networks. To say that many people in that audience (and those who would follow the coverage) were equally dismayed at how to handle the paradox that gave birth to this problem would be an understatement.

The convergence of information technology and industrial technology supplies analytic applications with massive amounts of industrial data, resulting in streamlined operations, improved safety, predictive maintenance, and optimized processes. However, this convergence is occurring in an environment that was never designed to be accessible from the outside world. Security strategy for operational technology (OT) was developed decades ago, under the assumption that restricting physical access to industrial control systems and networks was enough to protect them. The simple truth is that OT grew independent of IT, with no intention to integrate the two. Over time, the reliance on physical protection approaches proved to be short-sighted and insufficient due to the integration of business and industrial systems, and the expansion of remote management. For manufacturing, transportation, and critical infrastructure, this reliance is creating enormous risks with potentially catastrophic outcomes.

When hackers disabled systems and dumped unreleased films, unfinished movie scripts, private emails, salary figures, and tens of thousands of employee social security numbers onto public file sharing sites in December 2014, Sony Pictures became an embodiment of a situation that seemed straight out of one of its movies. While other major corporations faced high-profile attacks, the depth and breadth of the Sony hack, commentators said, would finally force corporations to take a hard look at their own security protocols and layers of protection (or lack thereof).

Still, daily, there’s another story about attackers stealing credentials and gaining access to everything from sensitive customer data to corporate documents, even to the email accounts of public figures, as we saw with the 2016 U.S. presidential campaign.

What these attacks unequivocally demonstrated is just how exposed our networked society is. They are forcing organizations to confront the reality that we are now in a state of continuous compromise when it comes to cyberattacks.

Every piece of equipment in the enterprise seems to be vulnerable to cyberattack. The commercialization of the Dark Web provides hackers with a diverse toolkit and evolving tactics, allowing them to gain access to everything from critical systems to BYOD devices and IoT-enabled office equipment. To address this, security architectures can no longer rely on predefined rules and parameters; they must be redesigned to read and react to contextual signals. Robust endpoint security should aim not only to physically protect devices, but also to act as a digital sleuth, lending clear, actionable data for detecting intrusions and insight for preventing similar types of attacks. Organizations that take an adaptive approach to designing security architectures will be equipped to deal with evolving cyber threats.

View these slides - based on the webinar featuring security experts David Monahan, research director at leading IT analyst firm Enterprise Management Associates (EMA) and Tom Obremski, software product manager of QRadar Security Intelligence at IBM - to learn how to get faster visibility and richer context into breaches as they occur.

It can be argued that the largest impact to workforce productivity in the last 20 years was the introduction of Wi-Fi networks. In today's workforce, moving from meeting to meeting or place to place is a job requirement; being tethered to a LAN port seems almost unfathomable. There are many factors driving the growth of Wi-Fi, including new standards with increased bandwidth, expanding use cases for mobile and Internet of Things devices, maturation of cloud-based management infrastructure, and the rise of virtualization and open source hardware for Wi-Fi. Given its importance and impact on productivity, choosing a solution to provide that connectivity is a critical business choice. Throughput, scalability, reliability and resiliency, simplicity, and ultimately cost all play a part in the vendor selection process and achieving a desired level of return on investment (ROI). This EMA paper discusses these factors in relation to total cost of ownership (TCO).

Information security problems abound. Though they may vary to some degree, it seems some problems are nearly universal and all get worse as organizations grow. Security policy standardization and unified enforcement, change management automation, baselining security posture, and monitoring high-value assets are all issues that increase with the speed of business and the growth associated with it. However, these problems can not only be addressed, but resolved, reducing attack surface and the associated risks while simultaneously accelerating IT and security functions. This EMA case study discusses how a Fortune 500 and Forbes Global 200 information technology, consulting and business process outsourcing company addressed these problems.

Security professionals are overwhelmed by data. There is an overload of information presented to people with limited time and bandwidth. Without context, this information can also overwhelm tools, making security analysts' lives more difficult rather than hastening a path to incident response and resolution. When early detection and discovery fails, advanced persistent threats remain.

With dozens of disparate and often overlapping data sources, early warnings are frequently missed. Security monitoring solutions trigger alerts without adequate context for identification and categorization. In order to be most effective, security intelligence must be coupled with automation and orchestration tools designed specifically for enhanced protection, using information across multiple silos.

This is where Ixia Security Fabric comes in. It facilitates security operations and analysis, acting as an intelligent traffic manager with the ability to direct packets based on numerous administrator-defined criteria and augmenting that traffic with additional data and metadata through network and application protocol analysis. All of this is done to deliver greater context around incidents in a shorter period of time so analysts can achieve results faster. This allows them higher efficiency in operations and gives them time to address other operational issues in a proactive manner.

While this concept originated in network operations, its benefits have much further-reaching effects in the security world. The top value drivers for Security Fabric include:

Visibility - Illuminating and overseeing network traffic originating from various input sources, including contextual intelligence using both data and metadata

Context - Understanding the information provided by various sources through situational, protocol, behavioral, and time awareness

Resilience - Ensuring continuous function when a link or tool within the chain is overwhelmed or broken

Many point solutions have attempted to solve the context and automation problem using log file aggregation and flow analysis. Unfortunately, these disparate parts do not overcome the most critical shortcomings, notably intelligent traffic management and contextual analysis coupled with a smart console.

The Ixia Security Fabric offers simplicity and performance -- a recipe for success!
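To make the "administrator-defined criteria" idea above concrete, here is a minimal, purely hypothetical sketch of first-match traffic direction. The rule shapes, tool names, and packet fields are invented for illustration and do not reflect the actual Ixia Security Fabric API.

```python
# Hypothetical sketch: directing packets to downstream security tools
# based on administrator-defined match criteria (first match wins).
from dataclasses import dataclass


@dataclass
class Packet:
    src_ip: str
    dst_port: int
    protocol: str


# Each rule pairs a match predicate with the tool that should receive
# the matching traffic. Rules are evaluated in order.
RULES = [
    (lambda p: p.dst_port == 443, "tls-decrypt-then-ids"),
    (lambda p: p.protocol == "dns", "dns-analytics"),
    (lambda p: p.src_ip.startswith("10.0.9."), "forensic-capture"),
]


def direct(packet: Packet, default: str = "ids") -> str:
    """Return the destination tool for a packet; fall back to a default."""
    for predicate, tool in RULES:
        if predicate(packet):
            return tool
    return default


print(direct(Packet("192.0.2.7", 443, "tcp")))  # tls-decrypt-then-ids
print(direct(Packet("10.0.9.5", 22, "tcp")))    # forensic-capture
```

A real fabric would match on far richer criteria (VLANs, applications, metadata) and would also enrich the forwarded traffic, but the ordered rules-to-tools mapping is the core of the concept.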

Growing in number and becoming increasingly more malicious, security threats and attacks pose a severe threat to the survival of a business. Security operations teams need to leverage every available tool to respond more quickly and effectively to these incidents. While network packet capture and forensic analysis has traditionally been used by network operations, such a tool can also help security teams augment existing defenses and get on top of these threats. Given the hostile IT security environment, close collaboration between these groups is crucial to overall IT organizational success.

This white paper explains the importance of packet capture and forensic analysis to security operations, examines the dynamics of this growing collaboration between security and network teams, and explores a leading platform in this market from Viavi Solutions.

Enterprise Management Associates recently conducted a research project that investigated multiple aspects of cloud service providers and the adoption of their services. The research looked to identify major motivators and inhibitors of adopting cloud services for both compliance and non-compliance constrained workloads. Security emerged as the top adoption concern.

The research showed that though they have differing responsibilities, both IT and security agreed that implementing proper security in their cloud environments is better than accelerating deployment speed at the cost of a less secure environment. IT respondents admitted they would rather suffer a delay in a new cloud application deployment than rapidly deploy an application into a potentially insecure environment. Better still, when IT helped business stakeholders understand the issues, the business side agreed by nearly a 3-to-1 margin. IT also sided with security, preferring to delay a product launch over suffering a significant security breach, by a margin greater than 2 to 1.

Cloud adoption is a long-term partnership between the business and the provider. It provides a new toolset to address both traditional and new IT service delivery challenges. Within that toolset, security, elasticity, and agility are all available to accelerate business service delivery, but the organization must evaluate not only its current requirements but also its expected future needs, and determine how each provider under consideration can or will meet them.

It can be argued that the largest impact to workforce productivity in the last 20 years was the introduction of Wi-Fi networks. In today's workforce, moving from meeting to meeting or place to place is a job requirement; being tethered to a LAN port seems almost unfathomable. There are many factors driving the growth of Wi-Fi, including new standards with increased bandwidth, expanding use cases for mobile and Internet of Things devices, maturation of cloud-based management infrastructure, and the rise of virtualization and open source hardware for Wi-Fi. Given its importance and impact on productivity, choosing a solution to provide that connectivity is a critical business choice. Throughput, scalability, reliability and resiliency, simplicity, and ultimately cost all play a part in the vendor selection process and achieving a desired level of return on investment (ROI). This EMA paper discusses these factors in relation to total cost of ownership (TCO).

As criminals discovered the profitability of attacks against information systems, fraud increased significantly with no end in sight. Innovations now make it easier to steal from a wider range of victims, spurring the commercialization of multiple forms of crimeware and an entire Internet subculture known as the "Dark Web" or the "Dark Net," where both software and services can be rented or purchased. Dark Net services gave rise to specialization, competitive pressures, and other factors that illustrate how fraud, abetted by cybercrime, grew from the unrelated activities of a few into an industry in its own right.

This industry produced a level of automation and sophistication in fraud techniques rivaling those of the legitimate business world. The commercial-grade packaging of complex threats makes it possible to readily convert personal systems into pawns that facilitate fraud, often unbeknownst to their rightful owners. Large-scale systems management capitalizes on the ability to harness entire networks of compromised hosts whose masters often avoid detection and eradication through highly nimble evasive tactics. The net result: an industrialized threat that costs businesses billions to trillions of dollars worldwide.

In this paper, Enterprise Management Associates (EMA) explores the response organizations must marshal to stand up to the threat of industrialized cybercrime. If attackers are well organized and well informed, take advantage of the latest innovations in the shadow market of crimeware and automation, and capitalize on intelligence to maintain their advantage, organizations must respond accordingly.

Coordinated strategies embracing multiple tactics to limit exposure and improve effectiveness are now mandated by guidance such as that of the PCI Council, the US Federal Financial Institutions Examinations Council (FFIEC), and other regulations worldwide, affecting businesses targeted by fraud. The RSA Fraud and Risk Intelligence services portfolio offers an example of just such a coordinated approach. With its early leadership in technologies and services that integrate intelligence with anti-fraud tactics in real-time, the RSA Fraud and Risk Intelligence portfolio provides organizations with the tools to enable strategies for confronting an industrialized threat with an industry-wide response.

This EMA white paper lays the foundation for what bots are, why they are a threat, how they are adapting to bypass traditional defense technology, and how to overcome those shortcomings. Most current solutions were designed to combat programming deficiencies and errors, such as those addressed in the Open Web Application Security Project (OWASP) Top 10 and SANS/MITRE Top 20, but they are not sufficient to protect applications from exploits of business logic.

Organizations hoping to gain the best defense from these newly-defined OWASP Automated Threats to Web Applications must understand and then adopt an adaptive defense that uses both older and new defense capabilities.

Carbon Black is the leading next-generation endpoint security company, formed when endpoint prevention and application control company Bit9 acquired endpoint detection and response company Carbon Black in 2014. The combined company operated under the compound name Bit9-Carbon Black until early 2016, when the current name "Carbon Black" was adopted. The most recent acquisition, of endpoint security company Confer, augments Carbon Black's prevention, detection, and response capabilities with a lightweight, next-gen AV offering for customers looking for a simple yet powerful way to replace legacy AV, as well as numerous additional features addressed herein. If integrated properly with the existing portfolio, the enhancements to malware-less prevention and detection and the additional response automation will significantly increase the competitive standing of a company that is already the market leader in several ways.

EMA "Vendors to Watch" are companies that deliver unique customer value by solving problems that had previously gone unaddressed or provide value in innovative ways. The designation rewards vendors that dare to go off the beaten path and have defined their own market niches.

This document contains the 2016 EMA Vendors to Watch in the Security & Risk Management field, including Tender Armor and Verodin.

Additional vendors will be added to this document throughout the year.

While document and file sharing is increasing rapidly, 97% of organizations cite file sharing as a high risk for information loss, and 75% believe their organization is currently at risk for data loss. Join us to obtain guidance on how organizations can protect themselves from document sharing and e-signature risk while maintaining a high level of collaboration and productivity.

These slides - based on the webinar featuring David Monahan, research director at EMA, and Kevin Froese, VP of global sales at Cirius - examine how to resolve risks in document and e-signature security.

Breach detection company SS8 has introduced its innovative "BreachDetect" solution, which is based on its experience delivering communications analytics for intelligence and law enforcement agencies. Though it has a robust feature set, BreachDetect's forte is locating previously unidentified suspects. BreachDetect is designed to reduce breach dwell time from the current average of months down to minutes. SS8 calls the platform "the time machine for breach detection" because it analyzes network communications and application behavior in real time, scaling up to 160 Gbps networks, to create enriched high-definition records (HDRs) that BreachDetect can store for years, retrieving them as needed to verify, corroborate, or supplement future findings.

Employees are a critical part of an organization's defense against many IT security threats. Just as having the correct technology solutions is important, training personnel to recognize security threats is a critical part of any security strategy. As part of that strategy, organizations must consider both the content and the training methods. Training that does not engage employees or provide for continuous learning and reinforcement is not sufficient to truly make employees more security aware.

The volume and sophistication of IT threats is putting unprecedented pressure on security teams. With limited time and manpower, security teams need the right combination of visibility, intelligence and context to find threats in real-time before damage is done.

These slides - based on the webinar featuring David Monahan, research director at leading IT analyst firm Enterprise Management Associates (EMA), and Wade Williamson, director of threat analytics at Vectra Networks - explain how to implement high-fidelity security to manage today’s most challenging threats, efficiently and effectively.

PUBLISHED: Wed, 04 May 2016 00:00:00 +0000
AUTHOR: David Monahan

These slides - based on the webinar featuring David Monahan, research director of security and risk management at leading IT analyst firm Enterprise Management Associates (EMA) - provide an overview of the most comprehensive and detailed piece of research in the endpoint space produced to date. These slides identify all of the major players in this space.

The "High-Fidelity" research project was created to understand more about organizations' data collection and use habits with regard to security. Specifically, the research focused on the collection and use of network and endpoint data and examined how these data types are used individually and in tandem to create an information stream that provides high-value telemetry to its users about their environments.

Both network and endpoint data can be highly valuable in identifying threats from breaches (incursions into a secured environment) and compromises (the extraction of private/sensitive data or information) as well as malicious and negligent insider activities. However, these two data sources have their own strengths and weaknesses. Endpoints may suffer from inoperative/inoperable agents or a lack of deployment. Network segments may not have monitoring systems turned on or even installed, or their log detail and collection settings may not be high enough to provide sufficient detail.

Even in the cases where both network and endpoint data sources are active and operating as designed, neither data type is perfect for all use cases. But together network and endpoint data create a greater visibility than either can individually. Here the whole is truly greater than the sum of its parts.
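As a toy illustration of that "whole greater than the sum" point, the sketch below joins endpoint process telemetry to network flow records on a shared host/process key, turning two individually ambiguous signals into one high-fidelity finding. The event shapes, field names, and threshold are invented for the example.

```python
# Hypothetical example: an oversized outbound flow is ambiguous on its
# own, and a scripting process is common on its own; correlated on
# (host, pid), the combination stands out as likely exfiltration.

endpoint_events = [
    {"host": "wks-12", "pid": 4410, "process": "powershell.exe"},
    {"host": "wks-12", "pid": 5120, "process": "outlook.exe"},
]

network_flows = [
    {"host": "wks-12", "pid": 4410, "dst": "203.0.113.9", "bytes_out": 52_000_000},
    {"host": "wks-12", "pid": 5120, "dst": "198.51.100.4", "bytes_out": 40_000},
]


def correlate(endpoints, flows, threshold=10_000_000):
    """Attach the responsible process to each large outbound flow."""
    procs = {(e["host"], e["pid"]): e["process"] for e in endpoints}
    return [
        {"process": procs.get((f["host"], f["pid"])), **f}
        for f in flows
        if f["bytes_out"] > threshold
    ]


for finding in correlate(endpoint_events, network_flows):
    print(finding["process"], "->", finding["dst"])
```

Only the large flow survives the filter, and the endpoint data names the process behind it; either data source alone would have left the analyst guessing.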

This report outlines issues with data collection and use and shows how these gaps can impact an organization's ability to maintain high-fidelity security.

Today's cloud market is often a fragmented place where applications and data may each exist on multiple platforms anywhere in the world and where security is often perceived as a work in progress. IT teams are frequently supporting a mix of on-premises and cloud applications along with cloud integration services. Petabytes of confidential information are moving among these cloud applications and the security practices and safeguards for data are often suspect. Without the proper security programs in place by both the data owner and the cloud service provider, it is possible for data to be leaked or compromised at multiple points in the data lifecycle. Vulnerabilities can occur at any point as data moves from users and their endpoints to the servers in a hosted data center, across communication lines out to the cloud, within the cloud environment, or in return transmissions to users.

As organizations adopt and internally implement an ever-increasing number of cloud applications, Informatica has stepped up to provide cohesive security and protection required by CSOs to protect business-critical data. Informatica Cloud brings cloud, on-premises, relational, and big data together for better value and easier management. With its web-based application, Informatica provides thorough cloud security throughout the data lifecycle. Possible security challenges found throughout the infrastructure include data transmission, data standards and connectivity, data governance, and audit compliance. To address these challenges, Informatica has created a layered, holistic security structure that is resistant to attack and resilient against failure.

This vendor-related research focuses on solution providers that are providing proactive next-generation endpoint security services covering prevention, detection, and response. EMA provided all identified participants the opportunity to participate in both a vendor-answered questionnaire and interviews. EMA then combined that information with research efforts external to the providers to create company profiles and assess each on their applicability to the space as well as their market share by revenue and license volumes. Most of the vendors competing in this space have emerged or refocused in the last few years, with only a few having competed in the market for more than five years.

As with any study, this study is only as good as its data inputs. This research identified 34 solution-provider candidates as participants. Of those companies, nine vendors elected not to respond or share data. Within the remaining 25, some were unable to provide complete data due to company policy.

The Next-Generation Endpoint Security (NGES) market is most similar to the Endpoint Threat Detection and Response (EDR) market identified by Gartner, but also overlaps the Specialized Threat Analysis and Protection (STAP) market identified by IDC. It is contained within the broader endpoint software security market, which includes traditional antivirus, also identified by IDC, and the even larger endpoint security market identified by "MarketsandMarkets", which includes all of the previous functionalities plus firewall, endpoint device control, and more. There is not a more comprehensive report in the market today discussing endpoint security.

The NGES market is highly competitive. With a 2013-2014 annual growth rate over 100% and 2014-2015 shaping up similarly, NGES is pushing a five-year compounded annual growth rate (CAGR) of over 50%.

The term "hi-fidelity" was first coined in the entertainment industry in the 1950s to indicate advances in audio technology that provided the listener with a richer "just like being there" experience. In the security context, "high fidelity" communicates the ability to provide a richer experience to the security analyst to deliver better security outcomes.

High-fidelity security systems provide more comprehensive and timelier information from multiple sources, both internal and external, in the appropriate volume and with the appropriate types of data to provide the best context and priority for decision making and to drive appropriate detection and incident response activities.

This paper discusses the benefits of using both network and endpoint data with a strong analysis toolset to create high-fidelity security.

The "High-Fidelity" research project was created to understand more about organizations' data collection and use habits with regard to security. Specifically, the research focused on the collection and use of network and endpoint data and examined how these data types are used individually and in tandem to create an information stream that provides high-value telemetry to its users about their environments.

Both network and endpoint data can be highly valuable in identifying threats from breaches (incursions into a secured environment) and compromises (the extraction of private/sensitive data or information) as well as malicious and negligent insider activities. However, these two data sources have their own strengths and weaknesses. Endpoints may suffer from inoperative/inoperable agents or a lack of deployment. Network segments may not have monitoring systems turned on or even installed, or their log detail and collection settings may not be high enough to provide sufficient detail.

Even in the cases where both network and endpoint data sources are active and operating as designed, neither data type is perfect for all use cases. But together network and endpoint data create a greater visibility than either can individually. Here the whole is truly greater than the sum of its parts.
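As a toy illustration of why the two feeds are complementary, consider joining a network flow alert with endpoint process telemetry by host. The record formats and values here are hypothetical, invented purely for the sketch:

```python
# Hypothetical records: the network side sees a large outbound transfer,
# the endpoint side knows which process on that host is responsible.
network_alerts = [
    {"host": "10.0.0.5", "dst": "203.0.113.9", "bytes_out": 48_000_000},
]
endpoint_telemetry = {
    "10.0.0.5": {"process": "svch0st.exe", "signed": False},
}

for alert in network_alerts:
    ep = endpoint_telemetry.get(alert["host"], {})
    # Merging the two views yields context neither source has alone:
    # the transfer volume AND the unsigned process behind it.
    enriched = {**alert, **ep}
    print(enriched)
```

Neither record is conclusive by itself; the enriched record, combining transfer volume with an unsigned process name, is the kind of higher-fidelity signal the report describes.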

This report outlines issues with data collection and use and shows how these gaps can impact an organization's ability to maintain high-fidelity security.

These slides - based on the webinar featuring David Monahan, research director at leading IT analyst firm Enterprise Management Associates (EMA), and Wade Williamson, director of product marketing at Vectra Networks - provide insight on how algorithms can improve incident response, reduce risk, and improve ROI.

Advanced persistent threat (APT) detection and analysis are instrumental in identifying cyber attacks that evade perimeter security and spread inside networks. But the Big Data collected often lacks actionable, real-time information about attack behaviors, infected hosts, and the impact an active threat may have on critical business assets. These slides help explain how threat detection algorithms can replace your Big Data with better data.

SecureKey Concierge brings its authentication services to U.S. markets with the aim of streamlining identity management and increasing consumer security and satisfaction. SecureKey Concierge is a centralized platform that combines established, secure user credentials with the proliferation of consumer social, cloud, and business logins to eliminate an overload in password management requirements, increase security, and simultaneously reduce user friction and uncertainty for the service providers.

Resilient Systems is gaining ground with its full-service incident response platform for enterprises, which acts as a central hub for IT security teams to manage all aspects of incident response activities. Resilient's platform focuses on integration and offers a module-based approach to bring together the people, processes, and tools needed to transform, orchestrate, and empower an organization's incident response (IR).

Resilient Systems, which bills itself as the first incident response platform (IRP) in the security market, recently released version 24 of its solution. At the time of publication, Resilient boasted over 100 customers operating on its platform and 500% year-over-year growth in 2015, and in the past year it has expanded its channel partnerships and gained accolades in the market.

It's no wonder: the incident response market is expanding rapidly. According to research, over 43% of companies worldwide experienced a breach in 2014. A large number of these organizations, especially in the midmarket and SMB spaces, found that they didn't have the processes or tools in place to adequately respond to that breach. Beyond that, even organizations with compliance requirements to have incident response procedures in place have found themselves short of a functional IR program. These organizations generally suffer from the common problem of "shelfware" IR procedures, so named because they only sit on a shelf, neither maintained for accuracy nor tested for relevancy or efficiency. When an event actually occurs, the process breaks down, leading to a failure to respond appropriately and increasing frustration and business impact during a highly critical time.

Information Security has always been a large producer and consumer of data. More sophisticated best practices and expanding compliance and regulatory requirements have almost exponentially accelerated the production and consumption of data. Event and activity logs have grown to Big Data proportions and the diversity of data being consumed has become significantly more varied. As the need for continuous security intelligence and accelerated incident response increases, traditional log and event management tools and monitoring practices are becoming increasingly insufficient.

IT and Security are deluged with thousands of alerts daily, a majority of which appear to be critical, making response an insurmountable task with affordable staff levels and traditional tools. With so many critical alerts, they have moved from the analogy of finding the needle in the haystack to identifying and prioritizing THE needle in the stack of needles.

The era of Big Data has begun demonstrating to information security that more can, and must, be done to identify threats, reduce risk, address fraud, and improve compliance monitoring by bringing better context to data, turning raw data into information for actionable intelligence.

This research studies how both management and operations level IT and information security practitioners perceive the change in the volume and types of data available and the tools needed to provide analysis to generate actionable threat intelligence.

Advanced security analytics apply new adaptive machine-learning algorithms and Big Data analysis techniques to identify abstract data relationships, anomalies, trends, and fraudulent and other behavioral patterns, creating information where only data existed. The era of Big Data is driving the next technology evolution.

Security analytics, though a relatively new field of technology, is the next step in detection and response technology, with possible impacts on prevention as well. Machine-learning algorithms and analysis techniques have advanced far beyond the capabilities of what was available in the commercial markets only 2-3 years ago. They also address the issue dubbed "We don’t know what we don't know." Security analytics' core function is to monitor and collect vast amounts of information from the environment to identify threats that indicate elevated risk and ultimately prevent lateral spread of those threats and data exfiltration. To succeed in this endeavor, the analytics platform performs the identification of threats and prioritization of threats without the requirement for the administrators and analysts to create policies or rules.
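The rule-free detection described above can be sketched, very reductively, as statistical anomaly scoring: the system learns a baseline from observed data and flags outliers without an administrator writing a policy for each case. The traffic figures below are hypothetical:

```python
from statistics import mean, stdev

def anomaly_scores(baseline, observations):
    """Score each observation by how many standard deviations it sits
    from the baseline mean -- no hand-written per-host rules required."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [(x, abs(x - mu) / sigma) for x in observations]

# Hypothetical daily outbound-megabyte counts for one workstation.
baseline = [120, 135, 110, 128, 142, 125, 131]
today = [130, 2400]  # the second value resembles an exfiltration spike

for value, score in anomaly_scores(baseline, today):
    flag = "INVESTIGATE" if score > 3 else "ok"
    print(f"{value:>6} MB -> z={score:.1f} {flag}")
```

Commercial platforms use far richer models than a z-score, but the principle is the same: the baseline is learned from the data, so the analyst never has to predefine what "too much" means.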

Deep Instinct provides an on-device, anti-APT/zero-day malware solution. Deep Instinct is currently unique in the marketplace. It is the only commercially available cyber defense solution that uses a cutting-edge technology called "deep learning." Deep learning is the next generation of artificial intelligence that uses the concept of brain synapses in its design and operation to mimic the function of the human brain for learning and decision-making. Founded in 2014, Deep Instinct Israel is headquartered in Tel Aviv. Deep Instinct USA is headquartered in San Francisco, CA.

2014 was dubbed "the year of the breach" as over a billion consumer records across nearly every industry vertical worldwide were exposed, costing billions of dollars in recovery costs and lost revenue for the affected organizations. Though this was a tough wake-up call, many organizations have seen that technology, though a necessary part of a security strategy, is not able to fully prevent breaches. They see that people are now most often the weakest link in security defense. At the same time, the old strategy of locking down everything so people cannot possibly cause a problem increases worker and business friction to a point that is unacceptable to both, putting security programs, and security personnel, at risk. To achieve both security and usability, security teams must change the mentality, and even the business culture, by making personnel more aware of and vigilant against the various attacks they face on a near-daily basis.

For the 2015 Security Awareness Training: Are We Getting Any Better at Organizational and Internet Security? report, EMA surveyed nearly 600 people in North America across the small-to-medium businesses (SMB), midmarket, and enterprise spaces. Respondents represented line of business, IT, and security/fraud/risk across major verticals including education, finance/banking/insurance, government/nonprofit, health care/medical/pharma, retail, and utilities/infrastructure.

The research revealed that a tremendous shift in awareness training programs has taken place, especially across the previously underserved SMB space. While in 2014 56% of individuals reported they had not received any training from their organizations, in 2015, 59% indicated they had now received some level of training. Many positive trends continued in the research showing the following.

** Training content is becoming more accessible to organizations of all sizes from both a delivery and cost perspective.

** Programs are becoming more effective and have better measurement and management capabilities.

** Due to training, employees are better at recognizing various forms of social engineering.

** Trained personnel recognize that they make better security choices at home as well as at work, further increasing the value of training.

Through awareness, as a collective corporate and Internet populace we are becoming more diligent in detecting and avoiding compromise by social engineering methods, especially phishing attacks. However, attackers are constantly honing their skills and adapting their attack methods. Only through continued diligence and expansion can we be successful in the long run. Program content and delivery must change to include new attack methods, and programs must continue to expand to train the remaining 41% who have not yet received training.

Security professionals often wrestle with the unknown, struggling with a daunting array of exposures and threats. They are not trying to identify the needle in the haystack but the needle in the stack of needles. Each activity within a network, system, or application may be "the one" that indicates an Advanced Persistent Threat (APT) has taken hold or an insider has gone rogue and requires a response. But how do they know which one is the one?

In most cases, the key to success is not just more data but better data: data that provides context to improve incident analysis and, therefore, appropriate and timely response. Better data can also help security teams be more proactive. Accurate and timely information in the volumes collected today, however, is not manageable by human hands and eyes alone. Security professionals need tools that allow them to identify how and where attacks succeed in overcoming defenses.

In this update to the 2014 Evolution of Data-Driven Security study, Enterprise Management Associates (EMA) returns with research encompassing the insight of over 200 IT and security practitioners and managers worldwide, ranging from the SMB to the enterprise markets across key industry verticals including financial, retail, federal government & aerospace, local government, technology, manufacturing, and utilities & infrastructure. EMA explores how data-driven security continues to evolve security tactics, looking at 18 different categories of security tools to understand security management and strategy and the data sources fueling those efforts.

Some of the questions this new research will answer include:

How is the data explosion affecting security prevention, detection, and response?

What are the best tools for gaining context for security alerts?

Which is more important for gaining context, endpoint data or network data?

Which types of data and tools are most useful for improving prevention?

Which types of data and tools are most useful for improving detection?

Huntsman Security, founded in Australia ten years ago, entered the U.S. security information and event management (SIEM) market in 2015. Its mature security platform counts defense, intelligence, and national infrastructure deployments among its customers, and several of its solutions address organizational information and operational silos, as well as disparate security intelligence tools and communications, in pursuit of its ultimate goal: more accurate and faster incident response.

The acquisition provides Splunk with machine-learning capabilities not yet available in its solution, moving Splunk into an advantageous position in the IT operations and security markets. More information on this acquisition is set to be released in the coming months.

Fidelis Cybersecurity has acquired Resolution1 Security with plans to integrate the technology acquisition with Fidelis' XPS platform. The goal is to provide a combined endpoint and network protection solution for advanced threat defense. By combining endpoint and network technologies into a single solution, Fidelis provides organizations a more comprehensive view of threats in their environments than using only a network-based technology or an endpoint technology with the advantage of single-pane-of-glass visibility.

In April of 2015, Red Canary announced $2.5 million in seed funding from tech incubator Kyrus. The company, established at Kyrus just a few months earlier, entered the burgeoning cloud-based security service market as an endpoint threat detection and response (ETDR) as-a-service provider. Red Canary uses a combination of proprietary analysis algorithms, technology partners, and human intelligence to simplify endpoint threat detection for its customers, eliminating the growing problem of false-positive alerts.

In early 2015, threat intelligence management solution provider Lookingglass received $21 million of Series B funding, fueling the March 2015 acquisition of CloudShield for its deep packet processing and protection capabilities. This extends Lookingglass' ability to ferret out potential threats from many sources with data path visibility and real-time processing of the threat landscape, pulling from more than 1,000 threat indicators to create purpose-built data feeds and reports to customers. With the CloudShield buy, Lookingglass increases its visibility inside the network layer to deliver further on its promise of high-fidelity threat detection for actionable customer use.

Information security has always been a large producer and consumer of data. More sophisticated best practices and expanding compliance and regulatory requirements have almost exponentially accelerated the production and consumption of data. Event and activity logs have grown to big data proportions and the diversity of data being consumed has become significantly more varied. As a result, traditional log and event management tools and monitoring practices are becoming increasingly insufficient.

To add to this, the success record of maintaining security for an environment is at an all-time low. Executives are being dismissed or forced to resign post-breach, whether they knew about security issues prior to the breach or not. Threats seem to come from every angle. Not only are attackers consistently probing, but the attacks themselves are more persistent; once a foothold is achieved, detection and removal are also more difficult.

This research summary discusses how "the death of antivirus" has not meant the end of protecting the endpoint. Both management- and operations-level IT and information security practitioners are re-embracing the idea that, despite the onslaught of malware and other persistent threats to endpoints, prevention is possible with endpoint threat detection and response (ETDR) technologies.

Over the last year, ETDR solutions have seen a significant surge in adoption, jumping the technology chasm from an emerging technology into a growth technology (see Figure 1 in the Analysis Summary). Through a best-of-breed approach, administrators and security personnel responsible for protecting information are getting higher-fidelity data to provide better context for preventing incidents in a world where traditional prevention methods have often failed. ETDR tools provide practitioners with a means to thwart attacks and verify success via bidirectional information exchange with other systems. Bit9 + Carbon Black and Enterprise Management Associates have partnered to provide this research, which identified that nearly 80% of respondents believed that "consistent prevention of stealthy threats, advanced persistent threats, or advanced target attacks are possible with technology solutions existing today."

The emergence of the Internet of Things (IoT) and increased scrutiny on security procedures and tactics means that the stage is set for the entrance of innovative technologies that protect end users as well as multiple devices. Identiv seeks to allay IoT security fears with custom sensor technology that protects organizations of every size and industry, making it easier for these organizations to confidently move forward into this world of a "thing" for every need.

On April 22, 2015 at the RSA Conference in San Francisco, Identiv announced its new sensor technology. Created by Identiv Labs, a division of Identiv focused on the Internet of Things (IoT), the technology utilizes RFID, NFC, and Bluetooth tags as part of its uTrust Sense devices to remotely monitor assets and their interaction with the physical world. The announcement marks a bid by Identiv to combine its identity capabilities with extended real-world IoT applications to maintain information security in the age of IoT.

Information security has always been a large producer and consumer of data. More sophisticated best practices and expanding compliance and regulatory requirements have almost exponentially accelerated the production and consumption of data. Event and activity logs have grown to big data proportions and the diversity of data being consumed has become significantly more varied. As a result, traditional log and event management tools and monitoring practices are becoming increasingly insufficient.

To add to this, the success record of maintaining security for an environment is at an all-time low. Executives are being dismissed or forced to resign post-breach, whether they knew about security issues prior to the breach or not. Threats seem to come from every angle. Not only are attackers consistently probing, but the attacks themselves are more persistent and difficult to block; once a foothold is achieved, detection and removal are also more difficult.

This research summary discusses how both management- and operations-level IT and information security practitioners are impacted by staffing shortages and a lack of visibility into their environments, and how they are getting higher-fidelity data to provide better context for detection and response to incidents in a world where prevention has often failed. Security analytics tools provide practitioners with a way to meet their actionable threat intelligence needs for an appropriately prioritized, timely response to attacks. Damballa and Enterprise Management Associates have partnered to provide this research, which identified that, across the board, 79% of respondents were only "somewhat confident" to "highly doubtful" of their ability to detect an important security issue before it had significant impact. In contrast, 95% of the participants using security analytics were "highly confident" to "somewhat confident" of their ability to detect similar issues, demonstrating that the information security discipline needs next-generation analytics capabilities to be successful in the age of advanced and persistent threats.

Information security has always been a large producer and consumer of data. More sophisticated best practices and expanding compliance and regulatory requirements have almost exponentially accelerated the production and consumption of data. Event and activity logs have grown to big data proportions and the diversity of data being consumed has become significantly more varied. As a result, traditional log and event management tools and monitoring practices are becoming increasingly insufficient.

To add to this, the success record of maintaining security for an environment is at an all-time low. Executives are being dismissed or forced to resign post-breach, whether they knew about security issues prior to the breach or not. Threats seem to come from every angle. Not only are attackers consistently probing, but the attacks themselves are more persistent and difficult to block; once a foothold is achieved, detection and removal are also more difficult.

This research summary discusses how both management- and operations-level IT and information security practitioners are impacted by staffing shortages and a lack of visibility into their environments, and how they are getting higher-fidelity data to provide better context for detection and response to incidents in a world where prevention has often failed. Security analytics tools provide practitioners with a way to meet their actionable threat intelligence needs for an appropriately prioritized, timely response to attacks. Prelert and Enterprise Management Associates have partnered to provide this research, which identified that, across the board, 79% of respondents were only "somewhat confident" to "highly doubtful" of their ability to detect an important security issue before it had significant impact. In contrast, 95% of the participants using security analytics were "highly confident" to "somewhat confident" of their ability to detect similar issues, demonstrating that the information security discipline needs next-generation analytics capabilities to be successful in the age of advanced and persistent threats.

The major breaches of 2014 follow a consistent blueprint: the attacker gains privileged access within the network, moves laterally to extend the compromise, and then steals or destroys key assets. Vectra Networks' core value is detecting these active network attacks as an attacker attempts to spy, spread, and steal. To increase visibility, Vectra has introduced its S-series sensor hardware appliance, with a virtual image slated for June 2015, to deliver a plug-and-play system that can provide visibility at any location within the business network. This makes monitoring remote sites like small offices, retail sites, health clinics, bank branches, and critical internal network segments containing key assets much more cost effective. Adding Vectra's Detection Triage allows operations and incident response teams to filter irrelevant events, creating better clarity and context by delivering only the high-fidelity information.

Java is one of the most pervasive application platforms in the world, running on billions of devices from servers, desktops, and smartphones, to embedded devices for business and entertainment applications. It is also plagued with security issues.

SQL injection (SQLi) is one of the most common web-based attacks across the Internet, identified as the top threat by both the OWASP Top 10 and the SANS Top 25 Most Dangerous Programming Errors.
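A minimal illustration of the attack and its standard remedy, using Python's built-in sqlite3 module (the table and values are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: user input concatenated directly into the SQL string.
# The injected OR clause makes the WHERE condition match every row.
vulnerable = conn.execute(
    "SELECT secret FROM users WHERE name = '" + attacker_input + "'"
).fetchall()
print(vulnerable)

# Safe: a parameterized query treats the input as a literal value,
# so the engine looks for a user literally named "nobody' OR '1'='1".
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()
print(safe)
```

The vulnerable query leaks every row; the parameterized one returns nothing, because the attacker's string never becomes SQL syntax.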

Successful application attacks against any vector lead to distrust of the applications, data and monetary losses, and brand degradation. Current network-based solutions used to protect web applications, such as web application firewalls (WAF) and process approaches, are merely insufficient Band-Aids to the problem. Addressing legacy application problems through recoding is an arduous and daunting task. Enter Waratek.

One of the most significant divides between security groups and operations is systems and application security management. Traditionally, the security team performs its vulnerability assessments in something of a vacuum, then passes the results over to the operations team for remediation. This handoff often resembles more of an over-the-wall dump of one or more tools' output with little correlation, coordination, or consideration of what the operations team already has on its plate to keep things running and what it has in play from the last security update. BMC Software (BMC) and Qualys have identified this as a significant issue and have teamed up to integrate BMC BladeLogic and Qualys Vulnerability Management (VM) technologies to provide correlated data, coordinated work flows, and context for remediation activity prioritization.

Illumio was co-founded in mid-2012 by CEO Andrew Rubin and CTO PJ Kirner. Each having over 20 years in the tech industry and significant leadership time invested in previous start-ups, their strategy was to staff a minimal core leadership team and focus more resources on the engineering team. That strategy has paid off. By January 2013 they were able to close $8M USD in Series A funding, followed by another $34.5M USD in Series B funding in August 2013. Nearly a year later, in August 2014, they shipped their first production release and subsequently worked until October 2014 to bring the company out of stealth mode and into the limelight.

Illumio's Adaptive Security Platform (ASP) was created to address three fundamental problems with security: first, the need for the flexibility to run an application anywhere and scale at will; second, the need for delivery at the speed of business to eliminate the security bottleneck; and third, the need for granular context about all of the resources involved in delivering the application, to clearly define, isolate, and protect those resources from infringement while allowing the application to operate freely.

For endpoint protection, traditional signature-based antivirus is insufficient, so organizations must find its successor. However, in many industries organizations continue to rely upon traditional antivirus as a first line of defense or to meet out-of-date language in regulatory compliance requirements. Depending upon the progressiveness or open-mindedness of the organization's auditor and his or her desire to "check the box," organizations subject to compliance regulations can be handcuffed to solutions that meet compliance requirements but do not provide sufficient security.

Bit9 + Carbon Black has partnered with Microsoft to provide enhanced prevention, detection, and response capabilities to organizations that are using traditional antivirus but recognize it as insufficient for complete endpoint threat detection and response (ETDR). This paper discusses the key benefits of the Bit9 + Carbon Black partnership with Microsoft for providing comprehensive endpoint defense and enhanced alerting capabilities. In the course of the discussion, the paper covers both the financial and operational value of using Microsoft System Center Endpoint Protection (SCEP) and the Enhanced Mitigation Experience Toolkit (EMET) in conjunction with Bit9 + Carbon Black to meet regulatory, compliance, or internal requirements while simultaneously improving organizational security posture and business return on investment (ROI), and in some cases reducing IT spend.

Vectra Networks' X-series automated breach detection platform invokes user, group, and community behavior analysis to direct security teams to insider and advanced threats within their environment. The Vectra X-series graphically displays changes in behaviors in the environment and their relative risk, in addition to the threat potential, proximity to, and potential impact on the key assets within the environment. This provides security teams with contextual, real-time visibility into the environment, facilitating better-informed and timely decision-making for response activities.

EMA "Vendors to Watch" are companies that deliver unique customer value by solving problems that had previously gone unaddressed or provide value in innovative ways. The designation rewards vendors that dare to go off the beaten path and have defined their own market niches.

This document contains the 2015 EMA Vendors to Watch in the Security & Risk Management field, including Illumio, PFP Cybersecurity, and Twistlock.

Additional vendors will be added to this document throughout the year.

Founded in 2013, SentinelOne entered the endpoint protection market in late 2014, applying its resources and talent to bring to market a solution that defends against emerging, zero-day, and persistent attacks. On December 3, 2014, SentinelOne released a solution upgrade that combined its predictive execution inspection engine with other established and respected defense mechanisms, including cloud scanning, application whitelisting, and real-time forensics, to provide some of the broadest endpoint protection capabilities available from a single solution, as well as a distinguished vision for delivering what it terms a "Continuous Cycle of Protection" against advanced malware.

Founded in 2010 by Executive Chairman Steven Chen, President Dr. Jeffrey Reed, and CTO Dr. Carlos R. Aguayo, PFP Cybersecurity, also known as Power Fingerprinting, Inc., provides a uniquely different approach to detection of malware within a myriad of systems.

Value Proposition

PFP uses a physics-based approach to detecting malware as well as counterfeit and previously used microchips. It is especially adept at detecting hardware Trojans, such as the NSA Tailored Access Operations (TAO) hardware implants, and malware that alters firmware and microprocessor kernels. This innovative approach can be applied to embedded systems, electrical power generation and transfer facilities, nuclear facilities, satellites and their control centers, banking and financial transfer data centers, control technologies such as SCADA and nuclear facility management platforms, and other highly regulated/controlled critical infrastructure environments that are high-risk targets for malware attacks.

Though the marriage of Belden and Tripwire may not seem likely on the surface, there is a significant market opportunity for both companies post acquisition. Their target markets have little to no overlap; however, everyone connected to the Internet needs security, so Belden receives the opportunity to improve their products with Tripwire’s industry-recognized R&D and Tripwire inherits the opportunity to move into new markets.

Anyone who has been on the Internet in the last few years understands the burden of needing passwords for all of their accounts. These people should also understand the insecurity of normal passwords. Even after eliminating the end-user threat of writing them down, accounts are still regularly compromised by shoulder surfing, persistent phishing, malicious web sites, and trojan programs injecting malware onto systems to harvest credentials. Major service providers such as Google Gmail and Yahoo Mail now offer support for multifactor authentication (MFA), further endorsing that normal passwords are not enough, even for email. This makes it apparent that organizations should evaluate an MFA solution for reducing account compromise risks. SyferLock has developed a new approach to MFA delivery, GridGuard, which is highly flexible and has some very useful innovations so that the user PIN and the MFA code are never revealed.
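
SyferLock's grid-based scheme is proprietary and is not reproduced here. As a generic illustration of how an MFA one-time code can be derived on both ends without ever transmitting the shared secret, here is a minimal HOTP sketch (RFC 4226, the counter-based cousin of TOTP) using only the Python standard library:

```python
import hmac, hashlib, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    mac = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                          # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 Appendix D test secret: the ASCII string "12345678901234567890"
secret = b"12345678901234567890"
print(hotp(secret, 0))  # 755224, matching the RFC's published test vector
```

Because client and server each compute the code from the shared secret and a moving counter, the code itself is useless after one use, which is the property MFA schemes rely on.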

Organizations need improved automation to bring more efficient capabilities to their IT risk, compliance, security, and vendor management programs. In the current environment, IT professionals charged with creating and managing these programs, on top of their other operational responsibilities, struggle to meet workloads and often have insufficient data to be truly proactive and programmatic about what was once nice to have but is now required. In today's world of monthly breach announcements, maintaining the triad of evaluating risk, maintaining compliance, and improving security across their own company and business partners is paramount to success, but it must be maintained in a methodical manner. All too often we see failures in risk programs resulting in noncompliance, increased risk, and security breaches. This is often due to a lack of visibility into risks created by business needs such as faulty or unchecked processes; a lack of, or unmanageable, policies; and/or unmanaged or undermanaged software and systems. This lack of visibility creates an inability to perceive the big-picture impacts to the business from threats against individual assets or asset groups in the context of the business. Without a means to collect all of the relevant individual data points into a single repository and automate the maintenance of that data, IT, compliance, and security professionals will remain behind the curve in reactive firefighting mode, unable to appropriately prioritize current and potential risk and get ahead of problems.

Though every organization wants to believe that all of its employees, interns, contractors, consultants, vendors, and partners are above reproach and all threats are external, the truth is somewhere in the middle. Research shows an upward trend in suspicious activities that can be used as breach indicators, including unauthorized accesses that lead to a data breach. Sixty-nine percent of reported security incidents involved an insider. Of the insiders that were part of a data breach, administrators accounted for only 16%. In cases of administrator-caused data breaches, 62% were determined to be human error. Most interestingly, 10% of the threat actors were unknown. In these cases, the infrastructure logging was so deficient that it was impossible for forensic investigators to use it to determine the attack origin, regardless of time or money spent. This begs the question: what else is there? The answer is user-based monitoring.

In medium to large enterprises, the number of firewalls and the complexity of the security infrastructure are moving beyond what can reasonably be managed by hand. Manually updating policies on a single firewall with hundreds of rules is a daunting and problematic process, often leaving old rules in the policy that are never used but add processing overhead and create difficulty when reviewing the policy for compliance and other security review purposes. Multiply that problem by 50 or 100 firewalls in larger enterprises and the issue grows worse still. Before a change can reach the implementation phase, it has to be reviewed for both security and operational impacts. This can be an arduous process in any environment, but without the proper tooling it is guaranteed to be fraught with delays and an increased probability of implementation errors.
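
One small slice of what such tooling automates is finding the never-used rules the paragraph mentions. The sketch below assumes a hypothetical export of rules with per-rule hit counters (field names are invented for illustration):

```python
from datetime import date

# Hypothetical firewall policy export with per-rule hit counters.
rules = [
    {"id": 10, "action": "allow", "dst_port": 443,  "hits": 918332, "last_hit": date(2015, 3, 1)},
    {"id": 20, "action": "allow", "dst_port": 23,   "hits": 0,      "last_hit": None},
    {"id": 30, "action": "deny",  "dst_port": 445,  "hits": 4102,   "last_hit": date(2015, 2, 14)},
    {"id": 40, "action": "allow", "dst_port": 8080, "hits": 0,      "last_hit": None},
]

def stale_rules(rules):
    """Rules that have never matched traffic are candidates for review and removal."""
    return [r["id"] for r in rules if r["hits"] == 0]

print(stale_rules(rules))  # [20, 40]
```

Commercial policy-management products do this across dozens of devices at once and layer change-review workflow on top, but the core audit question is this simple.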

Compliance reporting for firewalls and other security-based changes is laborious enough on a small scale. Enterprises are often in the unenviable position of having to segment and grow their infrastructure to meet demand without enough revenue to adequately increase staff, if they can find staff at all. As the infrastructure grows, compliance tasks become unmanageable for the smaller staff. Without increased automation with reporting built in, compliance is a recurring burden that significantly impacts staff productivity.

Security analytics (SA), also often referred to as advanced security analytics or security intelligence, is quickly gaining momentum among security organizations. Products/solutions in this field offer a highly advanced set of analysis capabilities across a wide variety of data sources and produce results on both live data and historical events that have been captured and processed.

Depending on an organization’s current security monitoring and management infrastructure, SA solutions can either gather data directly from the source software, systems, networks, and [big] data warehouses/lakes, or sit on top of existing log management or SIEM systems to both mine and complement that data and push alerts back into it for centralized response and remediation prioritization. Some of the SA solutions can replace a traditional SIEM system altogether. The implementation flexibility of SA solutions provides organizations with a number of implementation options to meet their current and future needs.

IBM understands that "the cloud" is here to stay and that as its adoption expands, so does the need for security. As part of that security, identity and access management (IAM) can no longer be contained as an internal control or process. With the expanding adoption of cloud services, policy and control enforcement can no longer be confined within the walls of the organization. A process requiring IAM may start, transfer, or complete any part of its operations within the cloud and rely on a partner, supplier, or customer for any part of the information being supplied or used. IBM is acquiring Lighthouse Security's Gateway to provide a cloud-based IAM management system, offering these services to a broader customer base at reduced infrastructure cost and with faster implementation times.

This acquisition will be combined with the July 31, 2014 acquisition of CrossIdeas. The CrossIdeas solution focuses on identity and access governance (IAG) which enables clients to address risk and compliance requirements with a business driven approach across both enterprise and cloud deployments. CrossIdeas' solutions bridge the gap between compliance, business, and IT infrastructure to help reduce the risk of fraud, conflicts of duties, and human error in business processes.

Security vendors know that there are (at least) three phases of information security management: prevent, detect, and respond. It is a given that, regardless of their vigilance, security groups will not be able to prevent all incursions that could translate into breaches. This means that companies must apply resources to detection and response. There are many tools already available to provide better detection, but few that offer better response capabilities. Organizations are looking for actionable intelligence, but even when they get it, they often don't have the capacity to remediate all high-priority incidents as quickly as desired. Even if organizations have the budget to hire the necessary staff to muscle through everything, there are not enough qualified personnel to fill the positions, so it is a no-win scenario without additional help from response capabilities.

Hexadite provides an automated Cyber Security Incident Response solution. They believe there are ample detection technologies available to provide insight into activities in the organization, but recognize the gap in ability to execute against the detected events. Hexadite has introduced the “Automated Incident Response Solution” (AIRS) to integrate with detection technology already in place such as SIEM, IDS, DLP, etc. The data from the detection systems provide the necessary information for AIRS to determine the source of the problem and begin remediation in as automated a fashion as the administrator allows.
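
The policy-gated pattern described above can be sketched generically. The playbook, policy names, and actions below are hypothetical illustrations of the pattern, not Hexadite's actual AIRS API: detection alerts flow in, and per-alert-type policy decides whether remediation runs automatically or waits for analyst approval.

```python
# Hypothetical alert-type -> remediation mapping (illustrative names only).
PLAYBOOK = {
    "malware_detected": "quarantine_host",
    "credential_theft": "force_password_reset",
    "data_exfiltration": "block_egress_ip",
}

# Per-alert-type automation policy; anything unlisted defaults to "prompt".
POLICY = {"malware_detected": "auto"}

def triage(alert_type):
    """Route a detection alert to an action, gated by the automation policy."""
    action = PLAYBOOK.get(alert_type, "open_ticket")
    mode = POLICY.get(alert_type, "prompt")
    if mode == "auto":
        return ("executed", action)   # remediation runs immediately
    return ("queued", action)         # analyst must approve first

print(triage("malware_detected"))   # ('executed', 'quarantine_host')
print(triage("credential_theft"))   # ('queued', 'force_password_reset')
```

The design point is that "as automated a fashion as the administrator allows" is just configuration: the same playbook serves fully automated and analyst-approved response.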

As the risks of data theft and the compromise of personal information have risen, the payment card industry has responded with one of the most prescriptive regulatory mandates to impact IT: the Payment Card Industry Data Security Standard (PCI DSS). In November 2013, the PCI Standards Council published the updated standard as PCI DSS 3.0. Enterprise Management Associates (EMA) evaluates how Dell's suite of products and solutions, including Dell KACE, SonicWall, SecureWorks, and Data Protection Encryption, if implemented properly, can help organizations address many requirements in the updated standard.

Clearly, systems management is front and center when it comes to assuring many of the requirements for building, deploying, configuring, and maintaining secure systems that access and handle cardholder information. These requirements are not limited to the data center; the PCI DSS is as specific about requirements affecting user endpoints that handle cardholder data as it is about servers and networks at the heart of payment card information systems. These diverse requirements call for systems management solutions that embrace breadth as well as depth in compliance-enabling capability.

In this paper, EMA evaluates the capabilities of many of Dell's solutions against specific PCI DSS requirements, identifying which PCI 3.0 requirements those solutions address in both leading and key supporting roles. The Dell KACE family of systems management appliances and features is highlighted as an example that enables organizations of all sizes to manage their PCI compliance requirements without the cost and complexity often associated with more traditional approaches, supplemented by Dell SonicWall, Data Protection Encryption, and SecureWorks security services that help organizations implement PCI compliance with proven expertise.

In the age of information, everyone is connected to the Internet. It is the backbone of consumer and business communications and e-commerce. The United States Bureau of the Census estimated that $4.1 trillion worth of retail and wholesale transactions took place over the Internet in 2010 alone.

The anonymity of the Internet combined with mobile payments and Internet monetization have brought out the "bad-guys" in droves. E-commerce fraud losses peaked in 2008 at $4 billion, then dramatically dropped to nearly half of that in 2009 due to improvements in Web Application Firewalls (WAF) and other defense and anti-fraud technology. Unfortunately, the fraudsters have rebounded, and fraud rates climbed up to $3.5 billion in 2013.

For companies that use multiple operating systems and platforms at both ends of the quadrillions of web application interactions they oversee, identifying threats and shutting down fraudulent transactions before merchandise or money changes hands is a paramount concern, as these companies live and die by these transactions. RSA's Web Threat Detection 5.0 (WTD5) solution is the latest arrow in the fraud-fighting quiver from RSA, providing significant enhancements for security and fraud departments to address web- and mobile-based transactional fraud across the gamut of prevention, detection, and response.

The ability to capture, consume and correlate multifaceted data from all over the enterprise is a growing need. No single data source or type can provide sufficient forensic capabilities to solve all of today’s security problems. End user research conducted by Enterprise Management Associates (EMA) demonstrates that the data needs of security organizations are growing at breakneck speeds reaching volumes associated with Big Data. Log information from network and server infrastructure is no longer sufficient to provide a full picture. Security needs to process a broader and richer data set including network and Big Data repositories. Additionally, the security technology has to be able to correlate commonalities within those variant data streams to produce meaningful data trails and do it in as near to real time as possible. A 2013 study by Ponemon Institute identified that if a security incident can be resolved in less than 60 seconds, the remediation costs could be reduced by as much as 40%.

Traditional log management tools do not provide the range of data or the data mining and analysis capabilities needed to deliver true security analytics and forensics. Security Information and Event Management (SIEM) tools provide more capabilities but are also insufficient for full forensic analysis. Fifty-three percent of EMA research respondents understood that security analytics and forensics tools augmented their SIEM tools, and 46% understood that security analytics and forensics tools were a natural evolution of the traditional SIEM. A good rule to follow is that a SIEM should provide correlation, normalization, and alerts on key events, plus the ability to query the data to answer complex questions about the specific environment. A security analytics solution is able to adapt to the activities and behaviors within its monitored environment, providing improved visibility into activities and why they should be investigated. It can ingest non-standard log data types at Big Data proportions to provide visibility into abstract data relationships, bringing attention to problems that operators and administrators hadn't even thought of.
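
The correlation role a SIEM plays can be illustrated with a toy rule. The event stream and threshold below are invented for illustration; real SIEM rules work the same way at vastly larger scale: count related events per actor, then flag the combination that matters (here, repeated login failures followed by a success, a classic brute-force indicator):

```python
from collections import Counter

# Toy event stream: (source_ip, event_type) pairs, hypothetical data.
events = [
    ("10.0.0.5", "login_failure"), ("10.0.0.5", "login_failure"),
    ("10.0.0.5", "login_failure"), ("10.0.0.5", "login_success"),
    ("192.168.1.9", "login_failure"),
]

def correlate(events, threshold=3):
    """Flag sources with >= threshold failures that then log in successfully."""
    failures = Counter(ip for ip, kind in events if kind == "login_failure")
    successes = {ip for ip, kind in events if kind == "login_success"}
    return sorted(ip for ip, n in failures.items()
                  if n >= threshold and ip in successes)

print(correlate(events))  # ['10.0.0.5']
```

Security analytics tools go a step further than fixed rules like this one, learning the baseline behavior of the environment so that anomalies surface without an analyst having to anticipate each pattern.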

The introduction of a forensics solution will provide the increased capabilities to reduce false positives and time spent per case, thereby increasing the incident response team’s ability to process the key highest risk incidents first and faster, and create a proper case file to manage all of the required data.

Having the capability to double the number of incidents the response team can resolve in minutes makes choosing the right solution imperative. This report evaluates security forensics tools from an operations standpoint and identifies IBM Security QRadar as a leader among those evaluated. The investigation discusses the evaluation criteria for six tools widely recognized for their support of forensics data gathering and processing, and provides evaluation input on several other tools.

Information Security has always been a large producer and consumer of data. More sophisticated best practices and expanding compliance and regulatory requirements have almost exponentially accelerated the production and consumption of data. Event and activity logs have grown to Big Data proportions and the diversity of data being consumed has become significantly more varied. As a result, traditional log and event management tools and monitoring practices are becoming increasingly insufficient.

This research studies how both management and operations level IT and information security practitioners perceive the change in the volume and types of data available and the tools needed to provide analysis to generate actionable threat intelligence. Security analytics tools provide practitioners a means to meet their needs for actionable threat intelligence and timely response to attacks to prevent attacks from becoming breaches.

EMA surveyed nearly 300 personnel, comparing and contrasting many of the responses by industry vertical, organizational revenue, and personnel size. A number of key findings and supporting details were brought to light in the areas of SIEM, security analytics, and APT/ATA defense technologies. The data reveals many other useful points that will aid IT/security practitioners and management in advancing their security toolsets and practices, along with the impacts of key program factors.

Information exchanged will include:

1) Rankings of 13 different tool categories with their respective deployment and satisfaction rates within the respondent groups.

2) The value of SIEM versus security analytics.

3) The effectiveness and value of security analytics tools as perceived by the business.

Security is a key aspect of business in today's world. Announcements are made daily about new data thefts, breaches, and other related security issues, many of which originate as attacks against the workforce. Accordingly, the importance of the human component of security has become increasingly obvious.

The "Security Awareness Training: It's Not Just for Compliance" study conducted by Enterprise Management Associates is a groundbreaking research study examining the implementation and efficacy of security awareness and policy training programs across organizations ranging in size from small to medium businesses to large enterprises, including government entities. The full report analyzes details by organizational size in terms of personnel and revenue as well as by role within the business (IT, Security, and Line of Business), as well as evaluating responses by age groups, 30 years old and younger, thirty-one to forty-five and greater than forty-five.

This research comes at a time when organizations are seeing data breaches announced weekly. The study arms security and IT decision makers with insight on how to champion security awareness and policy programs where there are none and how to improve their existing programs.

The research study included over 600 respondents representing organizations ranging from 10,000 or more personnel, down to small businesses having fewer than 100 employees. Organizations also included public and private companies, government and non-profit groups. Respondents were evaluated collectively, by age group and organization size.

Palo Alto Networks, based in Santa Clara, CA, is one of the leading Next Generation Security Platform vendors, providing integrated appliance and virtual machine-based protection solutions for the network, cloud, and mobile. In an agreement announced on March 24, 2014, Palo Alto Networks proposed to acquire Cyvera, Ltd. (Cyvera) for $200 million to extend its protection portfolio to the endpoint. Cyvera is a Tel Aviv-based cyber security solutions and services provider founded in 2011, with offices in San Francisco, CA. Cyvera was co-founded by Netanel (Nati) Davidi (co-CEO and Chief Product Officer), Uri Alter (co-CEO), and Moshe Ben Abu (Chief Master Hacker). Cyvera provides cutting-edge endpoint protection solutions that interrupt malware execution on the endpoint, thus rendering malware incapable of compromising the target.

As information is the currency of the digital age, it is imperative that organizations know where theirs resides so they can evaluate the risks of exposure and loss. There has been an explosion of enterprise data proliferation. The intersection of mobile devices in the enterprise, especially BYOD, and the expansion of cloud services engaged by end users can be a scary place to be. The scarier part is that many enterprises are at that intersection and don't even know it. A recent report by 2ndWatch revealed that 61% of business units bypass IT to get access to cloud services. Given this, the old adage "Ask forgiveness, not permission" is still alive and well, so IT and security professionals should be highly concerned about where their data is going, especially if they are not actively tracking mobile devices AND the data that is on them.

In recent years, researchers have noted sharp increases in the number of successful attacks against small- to medium-sized businesses (SMBs). These attacks are increasing and succeeding due primarily to two factors. First, SMBs have valuable assets to be had. Second, they do not have the budgets to support the security talent and in-house technology they desperately need to adequately defend those assets. EMA analysts see this as a part of the SMB lifecycle, the consequence of long-term security neglect. Longstanding resource insufficiency and the limited availability of security expertise have left many SMBs vulnerable to an expanding epidemic of attacks that have the efficiency and impact of fully industrialized scale. Today, however, SMBs are recognizing that they can leverage cloud-based security technologies and services to make significant advances in reversing the consequences of long-term neglect, giving them the opportunity to rebalance the scales in their favor. As a long-term player and an established leader in cloud security, Qualys is among those championing this trend, with solutions such as QualysGuard Express Lite, which is tailored to the needs of the SMB.

Cloud adoption for services of all kinds is not only here to stay but, by some estimates, is expected to exceed $130 billion in 2014. The most common cloud service deployments include Infrastructure as a Service (IaaS), Software as a Service (SaaS), and Platform as a Service (PaaS), with multiple other classes trailing behind them. These are each broken out into multiple specific services such as file sharing, business productivity, CRM/marketing, and social business/collaboration, to name a few. According to research, adoption of cloud services within enterprises, whether by the enterprise itself or by employees operating outside of sanctioned IT applications, can exceed 500 distinct service providers. CIOs and CISOs are realizing that, with as much as they have vested in the cloud, they need tools to manage and monitor cloud applications, their uses, and the data they consume and produce, just as they have traditionally demanded within their own data centers. Skyfence offers such management capabilities.

We are in the age of continuous connectivity with 7x24x365 e-commerce. Going off-line due to a Distributed Denial of Service (DDoS) attack targeted at a data center or services infrastructure can mean millions of dollars in lost revenue and time with severe damage to brand reputation. It is crucial to prepare for this eventuality by knowing the options for mitigating the DDoS threats that are all too real and becoming more commonplace.

DDoS mitigation service provider Black Lotus has been serving customers since 1999. The company began as a hosting service provider, but shortly after the first major DoS attack struck, founder Jeffrey Lyon realized the threat potential and began working on a solution to better support Black Lotus clients. Since that time, Black Lotus has continued to support hosting along with premier network-based DDoS mitigation services focused on the service provider market space. Though it currently supports more than 560 customers with over 675,800 users, this back-office focus has made it a "best kept secret" to most consumers.

Transacting parties have to trust each other millions of times daily to make ecommerce happen. In normal ecommerce transactions, we use digital certificates to protect the consumer from fraudulent sites pretending to be trusted vendors; we also have the Better Business Bureau and other consumer advocacy groups to help keep consumers informed.

Most of the problems we have with fraud stem from the over-extension of trust by vendors trusting the consumer. Consumers are not required to have digital certificates from trusted third parties (TTPs), and there is currently no standard for creating a digital identity lifecycle that can be used with ecommerce. Given that, we have little way of either protecting businesses from fraudulent consumers or protecting consumers from being defrauded by other consumers.

Numerous organizations are trying to implement and distribute their own versions of a secure, manageable identity for ecommerce. The problem for vendors is deciding which identity program to invest in going forward. Which standard is going to be "The Standard?"

The FIDO (Fast IDentity Online) Alliance has positioned itself as the key group to provide standardized strong authentication protocols to improve transaction trust and reduce fraud.
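
FIDO's protocols center on a challenge-response flow: the server issues a fresh challenge and the user's device proves possession of an enrolled credential. The sketch below illustrates only that flow. Real FIDO uses asymmetric device keys and attestation; an HMAC over a shared secret stands in here purely to keep the example dependency-free, and all function names are hypothetical:

```python
import hashlib
import hmac
import secrets

def register(server_db, username):
    """Enrollment: the 'authenticator' creates a credential key.
    (Real FIDO keeps the private key on the device and registers
    only the public key with the server.)"""
    key = secrets.token_bytes(32)
    server_db[username] = key
    return key

def issue_challenge():
    return secrets.token_bytes(16)  # a fresh nonce defeats replay attacks

def authenticator_sign(key, challenge):
    # Stand-in for the device signing the challenge with its private key.
    return hmac.new(key, challenge, hashlib.sha256).digest()

def server_verify(server_db, username, challenge, response):
    expected = hmac.new(server_db[username], challenge,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

db = {}
device_key = register(db, "alice")
chal = issue_challenge()
ok = server_verify(db, "alice", chal, authenticator_sign(device_key, chal))
```

Because the proof is bound to a one-time challenge, a captured response cannot be replayed; this is the property that lets strong authentication reduce transaction fraud.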

Over the last three years, there has been an 81% increase in security attacks; there are currently over 400 million unique malware variants being leveraged to compromise systems. In a recent study, over 700 security practitioners said they needed actionable intelligence in less than five minutes to prevent a cyber-attack from becoming a compromise. Twenty-three percent (23%) of survey respondents said it would currently take them more than a day to identify a compromise, while 49% said it could take as much as a month. In today's business environment, where minutes can mean the difference between the success and failure of a defense strategy and the loss of millions of dollars in revenue, brand equity, and restoration costs, the timeliness of context-sensitive data is not only a security imperative but a business imperative. ThreatOptics delivers actionable intelligence that reduces the time needed to respond to incidents.

Hexis Cyber Solutions has developed an incident response and threat management system that provides event identification, security intelligence, and an active defense methodology. The system uses big data approaches, heuristics, active defense countermeasures, and continuous capability delivery to detect, engage, and remove incursions into the network. Each scenario it encounters is incorporated into its active intelligence model so it learns to better identify and respond in the future. The system supports a range of responses, from fully automated to analyst-prompted, based on the policy created. Its ability to detect the spectrum of cyber threats inside a network allows it to establish a progressive mitigation strategy, execute that strategy through automated countermeasures, and continuously deliver new detection algorithms, threat data, and countermeasures across a community.
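
The policy-driven range of responses described above, from fully automated to analyst-prompted, can be pictured as a small dispatch table. The severities, modes, and actions below are hypothetical illustrations of the general pattern, not Hexis's actual policy model:

```python
# Hypothetical policy: map detection severity to a response mode.
POLICY = {
    "critical": "automated",   # contain immediately, no human in the loop
    "high":     "prompted",    # stage the countermeasure, ask an analyst
    "low":      "log",         # record for later analysis
}

def respond(event):
    """Dispatch a detection event according to the configured policy."""
    mode = POLICY.get(event["severity"], "log")
    if mode == "automated":
        return f"quarantined host {event['host']}"
    if mode == "prompted":
        return f"awaiting analyst approval for {event['host']}"
    return f"logged event on {event['host']}"

print(respond({"severity": "critical", "host": "10.0.0.7"}))
```

The key design choice is that the policy table, not the detection code, decides how much autonomy the system gets, so an organization can tighten or loosen automation without changing the response logic.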

Cloud use is growing at an astounding rate, exceeding many early predictions. A June 2013 study from Northbridge Venture Partners, the largest of its kind to date, shows that cloud adoption within the business sectors has passed 75%. Cloud adoption is similar in its consequences to the outsourcing wave of the late 1990s and early 2000s. Though it reduces service deployment costs, expedites initial rollout timeframes, and lowers initial investment, cloud also removes certain forms of control and visibility from IT. With the combination of off-premises cloud implementations, the mobile perimeter, and the proliferation of BYOD, companies now have a tough battle to wage in maintaining visibility and control over intellectual property and other data resources. One of the heretofore lacking abilities is digital forensics.

Cloud use is growing by leaps and bounds. As part of that explosive adoption, organizations must be prepared for personnel adopting services faster than the organization is ready to accept them. The old adage "Better to beg forgiveness than to ask permission" seems to be an early adopter mantra. The average company has users engaged in hundreds of cloud services to support business and personal endeavors, both sanctioned and unsanctioned. IT is placed in the awkward position of either trying to close off access to these services or turning a blind eye and allowing them. The former is difficult or impossible to enforce and may negatively impact the business, while the latter may conflict with established or desired information security objectives. This paper discusses how IT can manage cloud services usage at a granular level while maintaining both usability and security.

This EMA Vendor to Watch paper discusses the unique characteristics of SQL injection protection vendor DB Networks. DB Networks has positioned itself to address SQLi attacks against corporate databases, thereby protecting customers and enterprises alike. The way its CORE IDS technology analyzes transactions is fundamentally different from signature-based and traditional behavioral-based technologies, so CORE IDS has a much higher capacity to identify anomalies, with significantly fewer false alerts.
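
Behavioral analysis of database transactions generally compares the structure of a statement against a learned baseline rather than matching attack signatures. The toy sketch below, which is not DB Networks' actual CORE IDS method, reduces SQL statements to templates by stripping literals, then flags statements whose shape never appeared in legitimate traffic:

```python
import re

def template(sql):
    """Reduce a statement to its structural skeleton by stripping
    literals, so structure is compared rather than exact text."""
    sql = re.sub(r"'[^']*'", "?", sql)   # replace string literals
    sql = re.sub(r"\b\d+\b", "?", sql)   # replace numeric literals
    return re.sub(r"\s+", " ", sql).strip().lower()

# Learning phase: templates observed from legitimate application traffic.
baseline = {template(q) for q in [
    "SELECT * FROM users WHERE id = 42",
    "SELECT * FROM users WHERE name = 'bob'",
]}

def is_anomalous(sql):
    """A statement is suspect if its skeleton was never seen in training."""
    return template(sql) not in baseline

# The injected OR clause changes the statement's structure, so it is
# flagged even though no signature for this exact payload exists.
assert not is_anomalous("SELECT * FROM users WHERE id = 99")
assert is_anomalous("SELECT * FROM users WHERE id = 99 OR 1=1 --")
```

Because detection keys on structural deviation rather than known payloads, this style of analysis can catch novel injection variants that signature lists miss.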

In many secure communications protocols where encryption is utilized, key management is a critical part of ensuring that only the privileged parties can participate. When properly implemented, modern mainstream encryption algorithms such as AES-256 and the RSA™ public key algorithm are currently breakable only if the keys are mismanaged: for example, if they are insufficiently strong, overused, or left accessible to unauthorized parties.
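
To make those failure modes concrete, the sketch below generates a full-strength 256-bit key from a cryptographically secure source and retires it after a usage budget, so no single key is overused. The wrapper class and rotation threshold are hypothetical illustrations of key hygiene, not a production key manager:

```python
import secrets

class ManagedKey:
    """Illustrative key-hygiene wrapper: strong generation plus
    rotation after a usage budget."""
    MAX_USES = 1_000_000  # illustrative rotation threshold

    def __init__(self):
        self.key = secrets.token_bytes(32)  # 256 bits from a CSPRNG
        self.uses = 0

    def use(self):
        if self.uses >= self.MAX_USES:
            self.rotate()  # retire an overused key before handing it out
        self.uses += 1
        return self.key

    def rotate(self):
        self.key = secrets.token_bytes(32)
        self.uses = 0

k = ManagedKey()
assert len(k.use()) == 32  # AES-256 expects exactly 32 key bytes
```

The third failure mode, leaving keys accessible to unauthorized parties, is addressed operationally (access controls, hardware security modules) rather than in code.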

Secure communications pioneer SSH Communications Security, known for the Secure Shell (SSH) and SSH File Transfer Protocol (SFTP) protocols used by enterprises and governments worldwide, delivers an evolutionary step in the development of SSH as an enterprise-class product with the release of its Universal SSH Key Manager™ (UKM) solution.
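
One routine task an enterprise key manager automates is discovering which keys grant access to which accounts. As a hypothetical illustration, the sketch below inventories entries in an OpenSSH authorized_keys file so stale or unknown keys can be reviewed; the parsing is deliberately simplified and the sample data is invented:

```python
def parse_authorized_keys(text):
    """Collect (key type, comment) pairs from authorized_keys content.
    Comments often identify the key's owner, which is what an audit
    needs first."""
    keys = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        parts = line.split()
        # Options may precede the key type; find the key-type token.
        for i, tok in enumerate(parts):
            if tok.startswith(("ssh-", "ecdsa-")):
                keys.append({"type": tok,
                             "comment": " ".join(parts[i + 2:]) or None})
                break
    return keys

sample = """# deploy keys
ssh-ed25519 AAAAC3NzaExampleKeyData deploy@ci
ssh-rsa AAAAB3NzaExampleKeyData old-admin@legacy
"""
inventory = parse_authorized_keys(sample)
```

A real key manager does far more (tracking key pairs across hosts, enforcing rotation, removing orphaned keys), but an inventory like this is the starting point for every one of those controls.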

EMA "Vendors to Watch" are companies that deliver unique customer value by solving problems that had previously gone unaddressed or by providing value in innovative ways. The designation rewards vendors that dare to go off the beaten path and have defined their own market niches.