Measuring Security Tool Utilization

When I first assumed the role of CISO at my current organization, I knew that a variety of commercial security tools had been purchased, along with numerous open source or otherwise free tools that were being used for various functions. This raised a number of questions for me, including:
- What specific functions does each tool offer?
- Many tools are multi-functional. Which tool is the right tool for the job?
- How effectively had each tool been implemented?
- What value was each tool bringing to the security program?
With the assistance of my security architect, we set about conducting a study to measure and report on security tool utilization. The results of the study helped me gain a high-level understanding of what our toolbox looked like. The summary report provided insight both into which tools were candidates for consolidation or elimination, and into where effort could be invested to better leverage tools we already owned but were not using to their fullest potential.
Primarily, my goal was to identify three things:
- Gaps in visibility, whether from functionality we lacked entirely or from tools we had not fully implemented.
- Overlaps, where multiple products offered the same or similar functionality.
- Steps that could be taken to better utilize existing tools.
Cost May Go Well Beyond the Price Tag
Each tool, whether a commercial product with substantial licensing and maintenance fees or a free and open source one, represents a “cost” in the time and effort required to implement, maintain, and administer it. This cost can vary widely depending on the nature of the tool, whether an enterprise SIEM or a client-based utility. Other factors, such as the level of paid support or the maturity of the open source project’s community, also determine how much effort a particular tool “costs” the organization. Understanding both the current utilization of each tool and its additional potential utilization can help justify budget and personnel head count.
Methodology
First, identify the capabilities that each tool offers. This was an easy step for us, as we have an Atlassian Confluence documentation repository, similar to a wiki, with pages for each security tool. Even so, we discovered that we did not have documentation on everything. If you don’t have something like this, start by visiting each vendor’s website. Weed through the marketing language to pull out a description of basic functionality. Then continue by documenting each tool’s network requirements as well as environment-specific configurations.
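If you want something more structured than free-form wiki pages, a lightweight record per tool works too. The sketch below is a hypothetical Python representation of the kinds of fields described above; the field names and the example entry are illustrative assumptions, not our actual inventory:

```python
from dataclasses import dataclass

@dataclass
class ToolRecord:
    """A minimal inventory entry for one security tool (illustrative fields)."""
    name: str
    functions: list[str]             # capabilities, stripped of marketing language
    network_requirements: list[str]  # ports, connectivity, agent traffic, etc.
    env_config: dict[str, str]       # environment-specific configuration notes

# Hypothetical example entry -- not a real inventory item
siem = ToolRecord(
    name="SIEM",
    functions=["log aggregation", "correlation rules", "alerting"],
    network_requirements=["syslog/514 from network devices", "HTTPS API access"],
    env_config={"retention": "90 days", "log_sources": "firewalls, AD, endpoints"},
)
```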
Next, gather feedback from your team. We did this by distributing a spreadsheet survey, with columns like this:

| Tool | Utilization | What We Do Well | What We Can Improve | Comments |
|------|-------------|-----------------|---------------------|----------|
| SIEM | 75% |  |  |  |
Each team member was asked to document how well they perceived each tool was utilized (as a percentage), what we do well with the tool, and what we could improve. The responses were then tabulated and averaged to begin building a summary report.
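To illustrate the tabulation step, here is a minimal Python sketch that averages each tool’s perceived utilization across respondents. The response data and tool names are made-up examples, not our actual survey results:

```python
from statistics import mean

# Hypothetical survey responses: one dict per team member,
# mapping tool name -> perceived utilization percentage.
responses = [
    {"SIEM": 75, "EDR": 90, "Vulnerability Scanner": 40},
    {"SIEM": 60, "EDR": 85, "Vulnerability Scanner": 55},
    {"SIEM": 70, "EDR": 95, "Vulnerability Scanner": 35},
]

# Average each tool's utilization across all respondents.
tools = responses[0].keys()
averages = {tool: mean(r[tool] for r in responses) for tool in tools}

for tool, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{tool}: {avg:.0f}% average perceived utilization")
```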
Reporting
Once the survey spreadsheets were received from all team members, the results were averaged and placed in an executive-style report. The report categorized each tool into the following sections, each with an associated color code (a sketch of this categorization logic follows the list):
- Well-Utilized Tools (Green)
- Under-Utilized Tools (Yellow)
- Tools That Require Implementation (Red)
- Tools to Be Decommissioned (Gray)
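The cut-offs between these categories were judgment calls rather than a fixed formula, but the sketch below shows the general shape of the logic. The 75% threshold and the status flags are illustrative assumptions:

```python
def categorize(avg_utilization: float, implemented: bool, still_needed: bool) -> str:
    """Map a tool's survey average and status to a report category (example thresholds)."""
    if not still_needed:
        return "Tools to Be Decommissioned (Gray)"
    if not implemented:
        return "Tools That Require Implementation (Red)"
    if avg_utilization >= 75:  # assumed cut-off for "well-utilized"
        return "Well-Utilized Tools (Green)"
    return "Under-Utilized Tools (Yellow)"

print(categorize(68.3, implemented=True, still_needed=True))
# -> Under-Utilized Tools (Yellow)
```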
Details on each tool contained the following:
| Tool Name | Utilization | Progress / Next Steps |
|-----------|-------------|-----------------------|
| SIEM | 65% | Completed professional services engagement. |
We made sure that the output of this report was fully actionable. While some tools had notes on many needed actions, tools that were already fully utilized contained few action items. In some cases we determined that a tool was no longer needed, so the action notes included decommissioning steps. Tasks were then assigned to team members based on the Next Steps outlined in the report.
Outcomes
This review and reporting exercise produced more value than I expected. In addition to the obvious result of inventorying our tool set, I was able to quickly assess the status and perceived value of each tool. This uncovered opportunities for cost savings where tools could be consolidated, freeing budget to replace aging, unsupported tools or to invest in professional services.
We decided to conduct this exercise on a quarterly basis. Each quarter, it is interesting to watch the utilization metrics move and the associated action items change as progress is made.