Wading Through Tool Overload and Redundancy?

After walking the RSA Conference’s show floor this year, you may be asking, “What is the right set of technologies to protect our enterprises?” While most of us know effective cyber security requires a well-coordinated set of people, processes and technology, the conference at its core is focused on cyber security technology. With new vendors sprouting by the dozen, the challenge of finding the right technology grows every year.

The starting point for any home renovation is sketching out the existing layout, from which you can start to nip, tuck and enhance for the greatest effect. Finding the right cyber security technology is no different. The starting point for any conversation needs to be a comprehensive understanding of the existing environment and architecture. Most large enterprises are running countless tools, often overlapping, either purchased organically or inherited via mergers and acquisitions. For one reason or another, legacy tools are often only implemented partially or are configured less than optimally. Buying more tools, without understanding the existing landscape, and how the new tools fit or how they are to be implemented, only adds to the problem.

The first step is to inventory your cyber security tools and information assets, documenting technical details such as software versions and underlying platforms, security functional areas, coverage and the business value of assets. By understanding tool coverage and the business value of assets, you can see the actual value each tool is providing and drive a risk-based approach to filling the gaps. In other words, prioritize protection of your most critical assets: those that, if compromised, would impact the business the most.
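To make the idea concrete, the inventory described above can be sketched as a simple data model with a risk-weighted gap check. This is an illustrative sketch only; the field names, the 1-to-5 business-value scale, and the functional-area labels are hypothetical, and a real inventory would live in a GRC platform or CMDB rather than a script.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    business_value: int  # hypothetical scale: 1 (low) to 5 (critical)

@dataclass
class Tool:
    name: str
    functional_area: str           # e.g. "endpoint", "dlp", "siem"
    version: str                   # software version, per the inventory step
    covered_assets: set = field(default_factory=set)  # names of assets covered

def coverage_gaps(assets, tools):
    """List (business_value, asset, functional_area) triples where an asset
    lacks coverage in an area some tool addresses, highest value first."""
    areas = {t.functional_area for t in tools}
    gaps = []
    for asset in assets:
        for area in areas:
            covered = any(asset.name in t.covered_assets
                          for t in tools if t.functional_area == area)
            if not covered:
                gaps.append((asset.business_value, asset.name, area))
    # Sorting descending puts the crown jewels' gaps at the top of the list.
    return sorted(gaps, reverse=True)
```

Ordering the output by business value is what turns a raw inventory into a risk-based work queue: the most critical unprotected assets surface first.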

The results of inventories are often surprising. One of the important outcomes of implementing user and entity behavior analytics (UEBA) is the visibility it provides into the coverage and effectiveness of the tools that feed it data. The process of identifying, integrating and reporting on all of that security and business data in a consistent and comprehensive way reveals where the skeletons lie and where the organization should focus to reduce risk the most.

Based on my experience working with some of the largest global enterprises, typical findings include tools implemented in only part of the organization, multiple vendors in the same category with each only partially deployed, configurations detuned to reduce the number of alerts, and worst of all, crown jewels exposed to the elements.

With an inventory in hand, the next step is identifying gaps and redundancies in coverage and/or functionality. The result of rationalizing security tools and their coverage will be a clear list of actions and needs, often with neutral or positive budget impact. It is not uncommon for companies to pay for enterprise licenses for tools that only get deployed across half the organization. That money can be recovered or utilized by completing the deployment with no additional license cost. Reducing vendor and tool redundancy can reduce support costs and drive better deals with the winning vendor.
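The redundancy half of that rationalization can also be sketched simply: group tools by functional area and flag areas where multiple vendors each cover only part of the estate. Again, this is a hypothetical illustration; the dictionary fields (`name`, `area`, `covered`) and the notion of a fixed asset count are assumptions for the sketch, not a prescribed schema.

```python
from collections import defaultdict

def find_redundancies(tools, total_assets):
    """Group tools by functional area and, for any area served by more than
    one tool, report each tool's deployment as a fraction of all assets."""
    by_area = defaultdict(list)
    for t in tools:
        by_area[t["area"]].append(t)
    findings = {}
    for area, overlapping in by_area.items():
        if len(overlapping) > 1:
            # Multiple vendors in one category: candidates for consolidation.
            pct = [(t["name"], len(t["covered"]) / total_assets)
                   for t in overlapping]
            findings[area] = sorted(pct, key=lambda p: -p[1])
    return findings
```

An area with two tools at 50% and 25% deployment is exactly the pattern the article describes: consolidating onto one vendor and completing its rollout can close the gap without new license spend.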

Companies that are not new to cyber security typically have some semblance of traditional tools, including firewalls, data loss prevention, endpoint protection, vulnerability and configuration scanners, SIEMs, cloud access security brokers and the like. Notwithstanding any shiny new toys that struck your fancy, these tools are still a great foundation on which to build your architecture. Don’t automatically discount your existing platforms before you understand how well they have been deployed and the value they are providing. Finally, prioritizing your actions and acquisitions based on the value of the assets being protected will speed the burn-down of your cyber risk exposure.

With an inventory and action plan in hand, you can start to execute and understand how you may benefit from any new tools or technology. Execution should be risk driven, starting with protecting your highest value assets. New tools should be evaluated on how they fill a gap or take your data protection to the next level. If the prospective tool is not part of an entirely new category, it is important to assess where it fits and where it overlaps, with the goal of minimizing functional and budgetary redundancy. Having your architectural ducks in a row will also give you the freedom to be an early adopter, piloting and assessing new technologies that may not yet be ready for enterprise deployment but promise future benefit.

Documenting and rationalizing an enterprise’s security architecture is not a trivial process. It requires a dedicated effort to complete. While the process is focused on tools and technologies, it cannot be completed without looking at policy, people and process as well. Tools don’t operate in a vacuum; they operate in living, breathing organizations that need to be able to conduct business while also protecting their operations. However, without an understanding of where you stand today, new tools will result in more redundancy and gaps in protection. A false sense of security is worse than a real sense of vulnerability.

Steven Grossman is VP of Strategy and Enablement at Bay Dynamics, where he is responsible for ensuring the company’s clients are successful in achieving their security and risk management goals. Prior to Bay Dynamics, he held senior positions at consultancies such as PricewaterhouseCoopers and EMC, where he architected and managed programs focused on security, risk, business intelligence, big data analytics, enterprise program management offices, corporate legal operations, data privacy, cloud architecture and business continuity planning for global clients in the financial services and health care industries. Steven holds a BA in Economics and Computer Science from Queens College and has achieved his CISSP certification.