Security Tools Rationalization – An AHEAD Perspective

By: Jason Foss, Senior Technical Consultant

What is the Problem?

Over time, organizations often acquire and deploy a multitude of security tools and solutions from various vendors to address different aspects of their security needs. This can result in a complex and fragmented security landscape with overlapping functionalities, inefficient resource utilization, increased maintenance costs, and potential security gaps.

The primary goal of a security tools rationalization exercise is to streamline and optimize the organization’s security toolset. It involves a comprehensive evaluation of the existing security tools, their functionalities, effectiveness, and value in addressing the organization’s security requirements.

How Did We Get Here?

Most modern organizations are acutely aware of the need to ‘pay attention to security.’ Reports such as the Verizon Data Breach Investigations Report (DBIR) highlight the threats that actors, both internal and external, constantly pose to modern business. For many organizations, however, ‘paying attention to security’ is a vague directive that raises more questions than it answers. Does it mean more personnel? More managed services? More vendors? More tools?

Without a clear-cut answer, the easy approach for many organizations has been to invest in more tools. The rationale is that the more information you have about activity in your infrastructure, the more intelligent and proactive your security decisions and actions can be. Reductive interpretations of some security frameworks, such as Zero Trust Architecture (ZTA), support this approach, oversimplifying it to something like “buy all the security tools and use them.” This interpretation not only ignores the nuances such a model espouses, but can also be actively harmful when applied without proper planning for, and integration between, those tools.

Taken in the historical context of the security industry’s ‘Defense in Depth’ messaging, which recommends multiple redundant layers of security, some organizations see no problem with those layers covering the same security functions: multiple layers of endpoint security, multiple layers of vulnerability scanning and management, multiple layers of monitoring and alerting, and so on.

In some cases, when properly planned and integrated as suggested above, overlapping tools can complement each other and fill capability and visibility gaps. In most cases, however, the result is a nightmare of excessive noise-to-signal ratio, where organizations struggle to distinguish ‘real’ issues from what can safely be ignored. Overlapping tools with disparate data can complicate even simple security investigations, making it virtually impossible to consolidate the relevant details into a single case for management.

What is the Solution?

Perform a security tools rationalization activity, be it an external assessment, an internal audit, or some other type of formalized review of security tooling. Most importantly, tie the review to a standardized security framework.

The justification behind using a standardized security framework is multifold. Some of the reasoning includes:

  • It will generally align with overall security industry best practices.
  • Industry domain experts and (in some cases) vendors are the parties that generally suggest, develop, review, and approve the guidance included in frameworks.
  • It offers a generally broad range of security subdomains that cover most common and important IT functions for an organization, such as network security, application security, etc.
  • It will often include a mix of technical controls and process-oriented/non-technical controls to ensure a well-rounded security program.

With these reasons in mind, the approach to a rationalization exercise should follow these general steps:

1. Identify the framework the organization uses (or will use) for the tool rationalization.

2. Identify the domains, control areas, security functions, or safeguards (naming varies based on framework) used to define the scope of the tool rationalization.

3. Map the tools that have security functions to the controls used for measurement (note that some tools are not considered ‘security tools’ but do have functions that satisfy security controls).

4. Evaluate each tool and how well or how fully it satisfies the mapped controls. Consider using a scale such as:

  • Does not satisfy – the tool does not have capabilities and is not configured to satisfy any of the technical or procedural requirements for the control. This kind of rating is for a tool that is not related to a control’s purpose. For example, a firewall does not typically have endpoint security capabilities and therefore would not satisfy a control that requires antivirus software on workstation and server systems.
  • Partially satisfies – the tool has some capabilities and is configured such that it only partially meets the technical or procedural requirements for the control. This kind of rating is for a tool that has a partial feature set or is not fully configured. For example, a system management tool that has asset inventory capabilities, but is only able to scan Apple macOS systems, would only partially satisfy an asset inventory control.
  • Supports – the tool has capabilities or is configured such that it supports a technical or procedural requirement without directly satisfying the control itself. For example, a GRC platform that stores policies does not itself satisfy controls requiring the organization to create those policies, but it supports the need to do so.
  • Fully Satisfies – the tool is both capable and configured such that it meets the requirements of the control.
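As a rough illustration, the tool-to-control mapping from step 3 and the rating scale from step 4 can be captured in a simple data structure. This is a sketch only; the tool names, control IDs, and numeric ordering below are invented for this example:

```python
from enum import IntEnum

# Hypothetical ordering of the four-level scale described above.
class Rating(IntEnum):
    DOES_NOT_SATISFY = 0
    SUPPORTS = 1            # aids a control without directly satisfying it
    PARTIALLY_SATISFIES = 2
    FULLY_SATISFIES = 3

# Illustrative mapping of tools to the controls they touch.
# Tool names and control IDs are invented.
tool_ratings = {
    "Example EDR": {
        "CTRL-AV": Rating.FULLY_SATISFIES,    # endpoint antivirus
        "CTRL-ASSET-INV": Rating.SUPPORTS,    # feeds asset data, doesn't own it
    },
    "Example SysMgmt": {
        "CTRL-ASSET-INV": Rating.PARTIALLY_SATISFIES,  # scans macOS only
    },
}

def ratings_for_control(control: str) -> dict:
    """Return each tool's rating for a given control."""
    return {
        tool: ratings[control]
        for tool, ratings in tool_ratings.items()
        if control in ratings
    }
```

Structuring the evaluation this way makes the later gap and overlap analysis a simple query over the mapping rather than a manual spreadsheet exercise.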

Normally, AHEAD uses the ISO 33004 scale when performing assessments. From a product analysis perspective, this scale is very documentation- and process-oriented in its ratings and does not necessarily provide a direct mapping of control coverage to product implementation. It is, however, useful for analyzing capability gaps and overlaps, and it can supplement a custom scale like the one above to provide a holistic assessment of products and processes:

0: Control not implemented.

1: Control implemented and produces an artifact, a change in state, or meets a constraint.

2: Control has been documented and reporting measures are in place to validate that the process has been run and produces a measurable result.

3: Control has been documented, results are tracked, and improvements to the process have been run and produced a measurable result.

4: Optimized. Automation is used to execute the process and report on result trends.
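As a hedged sketch, the two scales can be combined to flag controls that are weak on either the product side or the process side. The control IDs, scores, and threshold below are invented for illustration:

```python
# Hypothetical per-control scores: process maturity on the 0-4 scale above,
# product coverage from the custom scale. All values are invented examples.
process_maturity = {"CTRL-AV": 3, "CTRL-ASSET-INV": 1}
product_coverage = {"CTRL-AV": "fully", "CTRL-ASSET-INV": "partial"}

def needs_attention(control: str) -> bool:
    """Flag a control that is weak on either the process or product side.
    The maturity threshold of 2 (documented and measurable) is an assumption."""
    return (process_maturity.get(control, 0) < 2
            or product_coverage.get(control) != "fully")

flagged = [c for c in process_maturity if needs_attention(c)]
```

Combining the scores this way keeps a well-automated process from masking a product gap, and vice versa.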

The objective should be to identify:

  • True defense-in-depth and complementary functionality. This may be an instance where multiple tools have specialized functions that satisfy the same control. For example, a control may require intrusion detection. This would be an instance where multiple products have specialized functions, such as a host-based IDS (HIDS) in one product and a network-based IDS (NIDS) in another product.
  • Gaps in security function coverage. This could be an instance where the organization does not have a product that satisfies the control or has a product that is not configured correctly to satisfy the control. For example, a control may require encrypting sensitive data, but a DLP tool with data protection and selective encryption capabilities may be enabled only in audit mode rather than enforcement mode.
  • Overlaps and redundancy in security function coverage. This would likely be the case where multiple products provide coverage of the same security need and control. For example, a control may require endpoint antivirus. Windows Defender comes standard as a part of the Windows OS, but an organization that also has CrowdStrike and Trellix on top of Windows Defender may be paying for duplicate functionality across these products.
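A simple pass over the tool-to-control mapping can surface candidates in each of these categories; distinguishing true defense-in-depth from wasteful redundancy still requires human review. All tool names and control IDs below are hypothetical:

```python
# Hypothetical coverage map: control -> {tool: coverage level}.
# Names and IDs are invented for illustration.
FULLY, PARTIAL = "fully", "partial"

coverage = {
    "CTRL-IDS": {"Example HIDS": FULLY, "Example NIDS": FULLY},      # complementary layers
    "CTRL-AV": {"Defender": FULLY, "EDR-A": FULLY, "EDR-B": FULLY},  # likely redundancy
    "CTRL-ENCRYPT": {"Example DLP": PARTIAL},                        # audit-only mode
    "CTRL-LOGGING": {},                                              # no tool mapped
}

# A gap: no tool fully satisfies the control.
gaps = [c for c, tools in coverage.items()
        if not any(r == FULLY for r in tools.values())]

# A candidate overlap: more than one tool fully satisfies the same control.
# Whether it is defense-in-depth or redundancy is a judgment call.
overlaps = [c for c, tools in coverage.items()
            if sum(r == FULLY for r in tools.values()) > 1]
```

In this invented example, the analysis would flag the audit-only DLP and the unmapped logging control as gaps, and the IDS and antivirus controls as overlap candidates to review.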

Once the exercise is complete, these data points, correlated analysis, and internal discussion should yield decisions on whether the organization’s suite of tools is excessive, just right, or insufficient. If the analysis includes a review of processes and procedures, it may also identify opportunities to improve organizational security policies and procedures to better address controls.

To learn more, get in touch with AHEAD today.
