Regression-Based Attack Chain Analysis and Staffing Optimization for Cyber Threat Detection
Cybersecurity incidents and data breaches continue to be costly events with global prevalence. Incidents are compromise events that impact business operations or information security; when sensitive data is accessed, the incident becomes a breach. The more time an adversary has to operate on a network, the costlier an incident becomes to resolve and the greater its potential impact. Cybersecurity studies confirm this relationship: improving the detection time of a breach reduces the overall cost of the incident. In response, cybersecurity operations have shifted focus away from prevention to place increased emphasis on timely detection.

Engineering managers, however, are challenged to meet this demand. Cyber-attacks are complicated and ever evolving, making detection increasingly difficult. Additionally, Security Operations Centers (SOCs) are often encouraged, or pressured, to take on more data to improve visibility of the network without understanding how such changes, including the additional analysis burden, will impact detection time. This lack of understanding results in inadequate methods and measures of detection, driving an increase in global average detection times to 196 days, and incident costs that rose 6.4% from 2017 to 2018, to an average of $3.86 million per incident.

This research approaches the problem from two perspectives. To understand the significance and interactions of the features of cyber-attacks, the variables of real-life incidents are analyzed, including the hacking techniques employed, the features implemented in malware, and their impacts on the milestones of the incident response timeline (e.g., detection, containment, recovery). From the management perspective, a quantitative model is developed to show how factors such as human data analysis capacity and resourcing affect the time to detect an incident.
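The feature analysis described above can be sketched with a small regression example. Everything here is illustrative: the feature columns, effect sizes, and incident counts are invented placeholders for demonstration, not data or findings from the praxis.

```python
import numpy as np

rng = np.random.default_rng(42)
n_incidents = 500

# Hypothetical binary feature matrix: each column marks whether a given
# hacking/malware feature (e.g. phishing, backdoor, C2 channel) was observed.
X = rng.integers(0, 2, size=(n_incidents, 3)).astype(float)

# Synthetic days-to-detect: a baseline plus an invented per-feature effect
# plus noise. The effect sizes are placeholders, not values from the praxis.
true_effects = np.array([30.0, 55.0, 10.0])
y = 90.0 + X @ true_effects + rng.normal(0.0, 5.0, size=n_incidents)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n_incidents), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
intercept, effects = coef[0], coef[1:]

# The fitted coefficients estimate how many detection days each feature
# adds, flagging the features most impactful to the response timeline.
print("baseline days:", round(intercept, 1))
print("added days per feature:", np.round(effects, 1))
```

Features whose coefficients are large and statistically significant would be candidates for the detection focus areas the research identifies.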
This praxis uses various methods of regression analysis to measure the correlative relationships and impacts of the features of cyber-attacks, seeking to determine whether individual features or patterns within cyber-attacks contribute to increased detection time. Results are provided in tables of regression metrics and analysis showing the correlation of occurrence between the individual hacking and malware features observed and incident timeline milestones. This regression analysis produces chains of attack activities likely to occur within an incident and identifies focus areas for engineering managers based on the features most impactful to the incident response timeline.

Additionally, this research examines the effects of understaffing, with respect to the volume of data collected, on the ability to rapidly detect an incident. A model is presented to calculate expected detection time, a measure that varies with the skill level of the analyst performing the event review. Monte Carlo simulation is used to account for the variance in analysis time across tiers of analyst skill. The model illustrates how detection time can slip linearly when a SOC is not properly staffed for the level of data requiring review, and enables sensitivity analysis to determine which factors most significantly impact detection time. The output of the model is also used to calculate the optimal staffing level to close the gap between the amount of analysis required and the analyst resources available. A cost analysis compares optimized staffing costs against the cost of historical breaches.
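The staffing model can be sketched in the same spirit as a Monte Carlo simulation. All parameters below (event volume, tier shares, review-time distributions) are hypothetical placeholders chosen for illustration, not figures from the praxis.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical SOC parameters -- illustrative assumptions, not praxis values.
EVENTS_PER_DAY = 20_000          # alerts requiring human review each day
SHIFT_MIN = 8 * 60               # analyst minutes available per shift
# Analyst tiers: (share of staff, mean minutes per event, std dev).
TIERS = [(0.5, 4.0, 1.0),        # tier 1: junior
         (0.3, 2.5, 0.5),        # tier 2
         (0.2, 1.5, 0.3)]        # tier 3: senior

def mean_daily_capacity(n_analysts, trials=500):
    """Monte Carlo estimate of events reviewed per day, drawing each
    analyst's average review time from their tier's distribution."""
    totals = np.zeros(trials)
    for share, mu, sd in TIERS:
        k = max(1, round(n_analysts * share))
        rt = np.clip(rng.normal(mu, sd, size=(trials, k)), 0.5, None)
        totals += (SHIFT_MIN / rt).sum(axis=1)
    return totals.mean()

def detection_slip_days(n_analysts, horizon_days=30):
    """Review backlog accumulated over the horizon, in days of work: when
    daily events exceed capacity, unreviewed events -- and hence the
    time-to-detect -- grow linearly with each understaffed day."""
    deficit = max(0.0, EVENTS_PER_DAY - mean_daily_capacity(n_analysts))
    return deficit * horizon_days / EVENTS_PER_DAY

def optimal_staff():
    """Smallest headcount whose expected capacity covers the event volume."""
    n = 1
    while mean_daily_capacity(n) < EVENTS_PER_DAY:
        n += 1
    return n

print(f"slip at 80 analysts:  {detection_slip_days(80):.1f} days of backlog")
print(f"slip at 120 analysts: {detection_slip_days(120):.1f} days of backlog")
print(f"optimal staffing:     {optimal_staff()} analysts")
```

The linear growth of the backlog when capacity falls short mirrors the model's linear detection-time slip, and varying one parameter at a time (event volume, tier mix, review speed) gives a simple form of the sensitivity analysis the research describes.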