It is well documented that human error is a major weakness in information and cyber security: around 90% of UK data breaches are attributed to human error. As a result, organisations are increasingly turning to behavioural science to understand their vulnerabilities and improve their defences. Cyber security company CybSafe raised $7.9 million in a recent funding round for its behavioural security platform, and it already has an impressive list of clients, including HSBC and NHS trusts. Clearly, startups like CybSafe are filling a real gap in the market, but what is the science behind such companies, and how are these techniques being used to enhance security?
Generally, security awareness training is a task that employees dread. It takes time away from their core responsibilities and can be repetitive and, well… dull. Many organisations are now turning to gamification, a technique that uses game-like features such as points, trophies and level progression to motivate employees. There is growing evidence that gamification really works: a large analysis of research in this area found that gamification significantly increased engagement in online programmes.
Gamification works because it is both rewarding and immersive. Positive reinforcement is a key driver of behavioural change, so creating enjoyable experiences in which employees can progress through a game, be rewarded for that progress and interact with their colleagues helps to make gamification a success. In addition, making the game interesting and challenging enough means employees remain engaged with the process without having to exert high levels of effort.
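The mechanics described above are simple to prototype. The sketch below is purely illustrative (the class, badge names and thresholds are invented for this example, not taken from any vendor's platform): employees earn points for completing training modules, unlock badges at point thresholds, and advance through levels.

```python
# Illustrative toy model of gamified security training: points,
# badges unlocked at thresholds, and level progression.
class TrainingProfile:
    LEVEL_SIZE = 100  # points needed per level (arbitrary choice)
    BADGES = {50: "Phishing Spotter", 150: "Password Pro"}  # hypothetical badges

    def __init__(self, name):
        self.name = name
        self.points = 0
        self.badges = []

    def complete_module(self, points):
        """Award points and unlock any badge whose threshold is now met."""
        self.points += points
        for threshold, badge in self.BADGES.items():
            if self.points >= threshold and badge not in self.badges:
                self.badges.append(badge)

    @property
    def level(self):
        return self.points // self.LEVEL_SIZE + 1


profile = TrainingProfile("alice")
profile.complete_module(60)   # crosses 50 points: unlocks "Phishing Spotter"
profile.complete_module(100)  # now 160 points: unlocks "Password Pro", level 2
```

Even a skeleton like this captures the two reinforcement loops the research points to: frequent small rewards (points) and longer-term progression (badges and levels).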
PwC’s Game of Threats™ is a great example of gamification done well. Players must either defend or attack a fictional organisation, making real-time decisions that affect the outcome. It has proved an excellent way to raise awareness of cyber security and deepen understanding of real-world hacker behaviour.
Nudging is the Nobel Prize-winning technique, popularised by Richard Thaler, that encourages a change in behaviour through positive, indirect suggestion. For example, healthy eating could be encouraged by filling the snack cupboard with fruit and hiding the chocolate in a difficult-to-reach spot. This does not ban the unhealthy alternative (which could cause resentment and backlash) but gently encourages the desired behaviour by working with our human biases rather than against them.
A well-known example in cyber security is password strength feedback (i.e., encouraging a stronger password by having a rating indicator turn green). Other creative nudges include showing users examples of the data they will share when they accept privacy permissions (e.g., an image the application will have access to), or how many times their location has been shared with a certain organisation. However, the best nudges are often simple: it could be as basic as a printed sign reminding employees how to store or dispose of sensitive documents after use.
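A password-strength nudge of this kind is easy to prototype. The heuristics and thresholds below are illustrative only; production meters (e.g., zxcvbn-style estimators) use far more sophisticated models. The point is the nudge pattern itself: weak passwords are not blocked, just visibly rated.

```python
import re

def strength_colour(password: str) -> str:
    """Score a password on simple, illustrative heuristics and map the
    score to a traffic-light colour. Does not reject any password: it
    nudges rather than forbids."""
    score = 0
    if len(password) >= 12:
        score += 2
    elif len(password) >= 8:
        score += 1
    if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):
        score += 1  # mixed case
    if re.search(r"\d", password):
        score += 1  # contains a digit
    if re.search(r"[^a-zA-Z0-9]", password):
        score += 1  # contains a symbol
    if score >= 4:
        return "green"
    if score >= 2:
        return "amber"
    return "red"

print(strength_colour("password"))           # prints "red"
print(strength_colour("Tr1cky-Passphrase"))  # prints "green"
```

In a real interface the returned colour would drive the rating bar the user sees; the feedback loop, not the scoring rule, is what makes it a nudge.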
It is easy to make mistakes when there are multiple demands on our attention. In behavioural science, this is referred to as cognitive load: the strain placed on our mental processing, which degrades performance. You might have seen the famous ‘invisible gorilla’ video, in which viewers asked to count basketball passes fail to notice a person in a gorilla costume casually walking through the game. This is called inattentional blindness, and it is one of the potentially alarming consequences of high cognitive load.
This psychological principle has important implications for the design of cyber security measures: namely, that cognitive load should be kept as low as possible. Research in this area found that security analysts were more likely to report missing a key piece of information when faced with a flood of output from automated security tools.
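One practical way to lower an analyst's cognitive load is to triage automated alerts rather than surfacing them all at once. The sketch below (field names and the severity scale are invented for illustration) shows only the few most severe alerts and collapses the remainder into a per-source summary.

```python
from collections import Counter

def triage(alerts, top_n=3):
    """Return the top_n most severe alerts, plus a count of the
    suppressed remainder grouped by source, so the analyst sees a
    short ranked list instead of a wall of raw alerts."""
    ranked = sorted(alerts, key=lambda a: a["severity"], reverse=True)
    shown, suppressed = ranked[:top_n], ranked[top_n:]
    summary = Counter(a["source"] for a in suppressed)
    return shown, summary

# Hypothetical alert feed on a 1-10 severity scale.
alerts = [
    {"source": "IDS",  "severity": 9, "msg": "possible exfiltration"},
    {"source": "AV",   "severity": 3, "msg": "quarantined adware"},
    {"source": "IDS",  "severity": 7, "msg": "port scan"},
    {"source": "AV",   "severity": 2, "msg": "definition update failed"},
    {"source": "SIEM", "severity": 8, "msg": "admin login anomaly"},
]

shown, summary = triage(alerts)
# shown holds the three highest-severity alerts (9, 8, 7);
# summary records the two suppressed AV alerts as {"AV": 2}.
```

The design choice is deliberate: suppressed alerts are summarised rather than discarded, so nothing is silently lost while the analyst's attention is reserved for what matters most.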
The application of behavioural science to security is still relatively new, so there is a rich literature of existing psychological principles left to explore: for example, the bystander effect (our tendency to assume someone else will deal with a problem) and optimism bias (our belief that we are the exception to the rule, which causes us to underestimate risk). As the tech industry takes a greater interest in behavioural science, these terms may become increasingly familiar, and may help turn the tide in the fight against rising cyber security threats.