Which are you more afraid of: sharks or cows? Most people would immediately answer "sharks." This is an example of availability bias: because we regularly hear about shark attacks, we assume sharks pose the greater risk. In reality, sharks kill about 5 people a year, while cows are responsible for roughly 22 human deaths annually. There are other factors at play, of course. Do you live near an ocean? Do you work on a farm? Variables like these change the risk levels. The point is that when we evaluate risk through a preconceived notion, we assume sharks are the greater threat.

Understanding availability bias can help us better evaluate risk and analyze incidents. For example, a client's board member watched a 60 Minutes documentary on ransomware, and the next day we were on a conference call with the board explaining why the company was actually at low risk for ransomware. The risk level hadn't changed overnight; the availability of information about ransomware had. Without knowledge of the other factors and controls specific to their business, the board's perception was that they must be at high risk.

Security information and event management (SIEM) tools often feed this bias. If your SIEM shows an alert you have seen many times before, you may be inclined to dismiss it without investigation. That could be a major mistake if this is the one time the alert is real.
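The danger of auto-dismissing recurring alerts can be made concrete with a simple expected-cost calculation. The numbers below are purely illustrative assumptions, not figures from the text: suppose 99% of occurrences of a familiar alert have historically been false positives, triaging one occurrence costs an analyst about an hour, and missing the one genuine incident costs thousands of hours in response and recovery.

```python
# Hypothetical, illustrative numbers only.
P_REAL = 0.01             # assumed chance any given occurrence is genuine
COST_INVESTIGATE = 1      # assumed analyst-hours to triage one alert
COST_MISSED = 10_000      # assumed analyst-hours if a real incident is dismissed

# Expected cost per alert occurrence under each policy.
cost_always_investigate = COST_INVESTIGATE            # always pay the triage hour
cost_always_dismiss = P_REAL * COST_MISSED            # gamble on the base rate

print(cost_always_investigate)  # 1
print(cost_always_dismiss)      # 100.0
```

Under these assumptions, "always dismiss" is two orders of magnitude more expensive per alert than "always investigate," even though it is right 99% of the time. The exact numbers will differ for every environment; the point is that a familiar-looking alert is not evidence of a low-cost outcome.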

Incident responders fall prey to availability bias all the time. There was a famous breach attributed to a foreign nation because of the tactics used. Researchers from other companies later proved the attack emanated from a completely different country. The flawed reasoning was classic availability bias: Country A had always used malware X to exfiltrate data, so since malware X was used, Country A must be responsible.

How do we avoid availability bias in our cybersecurity and cyber risk decisions? It requires both technology and humans. Technology provides accurate data. Humans must resist jumping to conclusions, analyze the information about each risk with a clean slate, and treat each instance as a unique event.

Third parties who understand the probabilities in the information security threat landscape, as well as an organization's data, its data flows, and the controls in place to protect them, can more accurately represent the state of the threat world as it applies to that specific business. This helps avoid decisions driven by news cycles that inflate or misrepresent the probability of certain types of threats.

Simply by being aware of availability bias, we can all better identify, protect against, detect, respond to, and recover from real risks rather than perceived ones.