Co-Authors: Keith Rhea and Alex Nanthavong
It was the best of times, it was the worst of times, it was the age of technological advancements, it was the age of attack, it was the epoch of cybercrime, it was the epoch of opportunity, it was the season of Remediation, it was the season of Exploitation, it was the spring of Security, it was the winter of Vulnerability. We had targets and queries before us, with the data all going direct to SecurityCenter, while the queries were all staying in Splunk — in short, the race between attackers’ access to exploits and defenders’ ability to assess, remediate, and mitigate them remained a never-ending cycle. The identification and remediation of new vulnerabilities could no longer rely on either tool operating independently of the other. When Splunk met SecurityCenter, alerts for outstanding vulnerabilities arrived in time to remediate before compromise, helping defenders stay ahead of exploitation.[1]
With the integration of security tools, vulnerability management programs can improve the security posture of cloud environments. Tenable Research published a study that measured the difference in time between when an exploit for a vulnerability becomes publicly available (Time to Exploit Availability, or TtEA) and when a vulnerability is first assessed (Time to Assess, or TtA).[3] The delta, negative or positive, indicates the window of opportunity (or lack thereof) for an attacker to exploit a vulnerability the defender has not yet assessed. The sample set for this analysis was based on the 50 most prevalent vulnerabilities drawn from nearly 200,000 unique vulnerability assessment scans over a three-month period in late 2017. The findings indicate that attackers have a significant advantage over defenders.
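As a simple illustration of the delta (the numbers below are hypothetical and are not taken from the Tenable study), it can be computed directly from the two measurements:

# Hypothetical example of the TtEA vs. TtA delta, with both measured in days from disclosure
tteA = 5    # Time to Exploit Availability: a working exploit is public 5 days after disclosure
ttA = 12    # Time to Assess: the vulnerability is first assessed 12 days after disclosure
delta = tteA - ttA
print(delta)  # -7: a negative delta gives attackers a 7-day head start on defenders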
As migration to the cloud and adoption of cloud business models increase, assets in those environments are constantly being introduced and retired. Traditional forms of asset tracking are woefully inefficient in highly dynamic cloud environments, and this extends to traditional vulnerability management systems and techniques as well. Implementing continuous vulnerability assessments can improve the TtA. However, that alone is not enough to fully mitigate the nightmare of performing effective vulnerability management in these rapidly changing environments. Analysis of vulnerability scanning behavior indicates that just over 25 percent of organizations conduct vulnerability assessments at a frequency of two days or fewer. Contrary to popular belief, a successful vulnerability management program includes more than just a snapshot-in-time scan of an environment. While point-in-time scanning is an achievable first step for most organizations and will reduce the head start attackers have for most vulnerabilities, it still leaves a negative delta and an exposure gap for many of them. The impact of this exposure gap can be significant depending on the vulnerabilities in question. Shortening the window between scans and moving toward continuous or near real-time vulnerability scanning will have the most positive impact on the TtEA vs. TtA time delta.
Not only should regular scanning occur, but the identified vulnerabilities also need careful analysis to determine their associated risk, taking into account any compensating controls available in the environment. This analysis forms the basis for determining remediation timeframes. Everyone agrees that vulnerability management is a necessary function of an effective security practice; in our experience, however, that alone is not enough to combat the speed at which attackers move. We advocate for organizations to shorten the vulnerability scan cycle time as much as possible, while also improving upon traditional, static asset tracking by gathering data dynamically from sources like cloud infrastructure APIs and CMDBs. As Dickens says, as if he were a Security Officer, “Nothing that we do, is done in vain. I believe, with all my soul, that we shall see the triumph.”[2]
Achieving Better TtA via Integration of Splunk and SecurityCenter
MindPoint Group security engineers were able to enhance all phases of their vulnerability management program by integrating Splunk and Tenable SecurityCenter. This integration allows the team to gather asset data via the cloud infrastructure API and correlate that data with near real-time vulnerability data, so the team can adapt and react more quickly to the rapidly evolving threat landscape in highly dynamic operating environments. Correlation and analysis of vulnerabilities within a highly dynamic cloud environment are made possible by using SecurityCenter to scan, consolidate, and evaluate vulnerability scans across the organization, and Splunk to aggregate vulnerability data, asset data, and other events and log data from the various components of a large cloud environment. With all of these data sources ingested into Splunk in real time, reports and alerts can be generated to provide in-depth, on-demand vulnerability data to address potential threats as they are discovered.
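As a rough sketch of the asset-gathering side of this integration (the tag keys and the aws-instances.csv lookup file mirror the query shown later in this post; your tagging scheme and field names will differ), the EC2 API can be polled on a schedule to keep a Splunk lookup of current instances:

# Sketch: refresh an aws-instances.csv Splunk lookup from the EC2 API (assumes boto3 and AWS credentials)
import csv
import boto3

def dump_instances(path="aws-instances.csv"):
    ec2 = boto3.client("ec2")
    rows = []
    # describe_instances is paginated; walk every reservation in the account/region
    for page in ec2.get_paginator("describe_instances").paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
                rows.append({
                    "instance_id": instance["InstanceId"],
                    "private_ip_address": instance.get("PrivateIpAddress", ""),
                    "tags.ApplicationID": tags.get("ApplicationID", ""),
                    "tags.op_env": tags.get("op_env", ""),
                })
    with open(path, "w", newline="") as csv_file:
        writer = csv.DictWriter(csv_file, fieldnames=["instance_id", "private_ip_address", "tags.ApplicationID", "tags.op_env"])
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    dump_instances()

The resulting CSV can then be uploaded to Splunk, or pushed via its REST API, as the asset lookup referenced in the query below.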
So How Does it Work?
Security tooling is important, and having tools configured and operating correctly is an important first step for a security team. The effectiveness of individual security tools is greatly reduced when they operate independently of each other, and many security teams greatly increase their effectiveness by integrating existing tools, processes, and data sources instead of buying yet another tool. The diagram above illustrates the vulnerability management process and the components needed to integrate SecurityCenter and Splunk. This integration matters because it allows security teams to move beyond the old standards and methods of periodic vulnerability scanning. Integrating these two tools gives security teams an enhanced view of their data, with improved aggregation, searching, and reporting capabilities. An enhanced vulnerability management approach based on an agile, API-driven DevSecOps model is necessary to decrease the TtA for vulnerabilities and ultimately shorten the time delta for defenders. Each tool plays a crucial role in the overall integration and enables security teams to have more actionable information to ensure timely remediation.
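As one possible sketch of the ingestion path (pyTenable, the hostnames, credentials, and the HTTP Event Collector token below are assumptions for illustration; a Tenable add-on for Splunk or the SecurityCenter REST API could feed the same data), vulnerability results can be pulled from SecurityCenter and forwarded to Splunk:

# Sketch: pull vulnerability findings from SecurityCenter and forward them to a Splunk HTTP Event Collector.
# Assumes the pyTenable library, a SecurityCenter service account, and an HEC token; adjust names to your environment.
import json
import requests
from tenable.sc import TenableSC

SPLUNK_HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # hypothetical HEC endpoint
SPLUNK_HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                    # hypothetical token

sc = TenableSC("securitycenter.example.com")   # hypothetical SecurityCenter host
sc.login("api_user", "api_password")

# Stream current findings; in SecurityCenter, severities 3 and 4 correspond to High and Critical
for vuln in sc.analysis.vulns(("severity", "=", "3,4")):
    event = {"index": "tenable", "sourcetype": "tenable:sc:vuln", "event": vuln}
    requests.post(
        SPLUNK_HEC_URL,
        headers={"Authorization": f"Splunk {SPLUNK_HEC_TOKEN}"},
        data=json.dumps(event),
        verify=False,  # prefer a proper CA bundle in production
    )

sc.logout()

The index and sourcetype chosen here line up with the index=tenable search used in the query that follows; whatever names you choose, they need to match what your searches expect.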
Once scan data, cloud asset data, and other data sources have been fed into Splunk, we are able to use a query like the following. The subsearch selects the IDs of up to 15 recent, uniquely named scan results of interest, the outer search restricts vulnerability events to those scans, the lookup enriches each event with AWS asset data from aws-instances.csv, and the final clauses filter to tagged instances and count the results:
index=tenable severity.name=* (
[ search index=tenable scan_result_info.name!=*DEAD* scan_result_info.name!=*Security* (scan_result_info.name=GC* OR scan_result_info.name=COMM*)
| rename scan_result_info.name as ScanName
| convert num(scan_result_info.finishTime) as time
| eval finish=strftime(time, "%Y-%m-%d %H:%M:%S")
| dedup ScanName
| table ScanName finish scan_result_info.id
| return 15 scan_result_info.id])
| lookup aws-instances.csv private_ip_address as ip
| search tags.ApplicationID=* accountName=* tags.op_env=*
| stats count
From within Splunk, we are then able to produce reports, alerts, and dashboards that give development, operations, and security teams in-depth, on-demand vulnerability data for addressing potential threats as they are discovered. Alerts can be customized so that they fire according to the remediation and prioritization criteria mandated by the organization.
Once security teams are continuously alerted and armed with vulnerability data, they are better able to align operational processes to support rapid response and ad hoc remediation and mitigation requests outside of regular maintenance and patch windows. Targeted remediation and prioritization efforts can then be focused on vulnerabilities with publicly available exploits and on those actively being targeted by malware, exploit kits, and ransomware.
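For example (again only a sketch, reusing the hypothetical hosts and credentials from above; the exploitAvailable filter and the result field names are assumptions to verify against your SecurityCenter version), the same API query can be narrowed to findings with publicly available exploits so that only those drive out-of-cycle remediation:

# Sketch: pull only findings with publicly available exploits for prioritized, out-of-cycle remediation
from tenable.sc import TenableSC

sc = TenableSC("securitycenter.example.com")   # hypothetical host
sc.login("api_user", "api_password")

exploitable = sc.analysis.vulns(
    ("exploitAvailable", "=", "true"),   # assumed SecurityCenter analysis filter name
    ("severity", "=", "3,4"),
)

# Group affected hosts by plugin so each alert or ticket covers a single vulnerability
hosts_by_plugin = {}
for vuln in exploitable:
    hosts_by_plugin.setdefault(vuln.get("pluginName", "unknown"), set()).add(vuln.get("ip"))

for plugin, hosts in sorted(hosts_by_plugin.items()):
    print(f"{plugin}: {len(hosts)} host(s) flagged for out-of-cycle remediation")

sc.logout()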
This enables up-to-date situational awareness and threat context to evaluate true risk and exposure as well as to inform and guide decision making. By leveraging the integration of Tenable, Splunk, and AWS, vulnerability, configuration, and asset data can be used to conduct deep security analysis, and achieve the awareness, perspective and information needed to make effective security decisions.
References:
[1] Dickens, C. (1867). A Tale of Two Cities, and Great Expectations (Diamond ed.). Ticknor and Fields. Book 1, Chapter 1: The Period.
[2] Dickens, C. (1867). A Tale of Two Cities, and Great Expectations (Diamond ed.). Ticknor and Fields. Book 2, Chapter 16: Still Knitting.
[3] Quantifying the Attacker's First-Mover Advantage. (2018, May 24). Retrieved June 1, 2018, from https://www.tenable.com/blog/quantifying-the-attacker-s-first-mover-advantage