- In the wake of a security incident, different users have different goals in remediating the issue. But unintentional action bias could hurt the overall response to a cyberattack, said Josiah Dykstra, technical fellow at the National Security Agency's Cybersecurity Collaboration Center, during Black Hat last week.
- When cybersecurity defenders are hit with a ransomware attack, for example, their job is to determine what went wrong and how to prevent it in the future. Leaders will want to know what response will cause the least friction for the overall business, according to Dykstra.
- However, in the event of a ransomware attack, non-technical or non-security users may feel more immediately inclined to pay a ransom, because losing their information feels like the ultimate loss, outweighing any other risk. "Users feel the loss of their information very, very intensely," and they often don't see cybersecurity's long-term benefits, Dykstra said.
Security leaders often assume non-technical users will make rational choices "at all times," which results in conservative policy-making, said Dykstra. Conservative policies, in theory, help create a "never again" standard for security. But those in security know "never again" is an unrealistic goal.
The "never again" standard could indirectly push users to exhaust all remediation efforts immediately after uncovering an incident. That is a waste of resources in pursuit of an imagined goal: stopping attackers from ever targeting the company again.
Action bias exists because it's human nature to take action in times of urgency, said Douglas Hough, senior associate at the Johns Hopkins University Bloomberg School of Public Health, during Black Hat. "It's to prevent second guessing."
In cybersecurity, each group — users, defenders, and leaders — has the natural instinct to take action, which also means taking some sort of control of the response. In a phishing or ransomware attack, none of the three groups wants to "passively stand by and gather more information or to build on a plan they had made early in advance," said Dykstra. "Phishing actually plays on this fear that you need to take some urgent action right away."
Bad actors take advantage of the instinct to respond rapidly, even among seasoned security professionals. When security professionals and leaders are aware of this urge to act quickly, that awareness is the first step toward counteracting action bias across the groups' differing goals.
When cybersecurity incidents are familiar, incident response will be more or less routine. However, given how quickly cybercriminals are becoming more sophisticated and updating their techniques, "we might end up jumping to the wrong or even a contradictory decision in the end," which complicates the bias of action or inaction, according to Dykstra.
Risk management takes some of the pressure off the action-or-inaction decision. If a company can effectively evaluate its most valuable assets, it can designate where and what kind of security should be applied. "We're not living in a world of 'never again,' but we're lowering risk," without spending all of the security organization's budget, said Dykstra.
Action bias in cybersecurity is ultimately a company culture issue. Non-technical and non-security users and leaders need a realistic expectation of security — both of themselves and of the security organization. "They need to know that we're not always going to jump to the side of the goal during the penalty kicks," said Dykstra. "Sometimes we're going to stand still, deliberately waiting."