Cyber Security Lessons From Behavioral Economics

In this series, I am exploring the intersection of information security and behavioral economics.  As a longtime information security practitioner who recently began studying behavioral economics, I've come to realize that most traditional information security programs are built on standard economic models.

For example, the Simple Model of Rational Crime (SMORC) has implicitly influenced the creation of security policies and conduct guidelines, as well as much of criminal law.  Simply put, SMORC applies the traditional economic view of utility-maximizing decisions to crime: people perform a cost-benefit calculation and decide whether or not to commit the crime.
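The SMORC calculation can be sketched in a few lines. This is only an illustration of the model's logic, not anything from Ariely's work; the function name and the dollar figures are invented for the example.

```python
# Illustrative sketch of the SMORC cost-benefit view of crime.
# The model simply says: offend when the expected benefit exceeds
# the expected cost of being caught.

def smorc_decision(benefit, detection_probability, penalty):
    """Return True if the 'rational offender' would commit the act."""
    expected_cost = detection_probability * penalty
    return benefit > expected_cost

# An employee weighing a $300 opportunistic theft against a 5% chance
# of being caught and fired (call the lost job a $60,000 penalty):
print(smorc_decision(300, 0.05, 60_000))  # → False: SMORC says don't do it
```

Under this model, raising either the detection probability or the penalty should deter the act, which is exactly why the first three levers below feel sufficient. The research discussed later in this post suggests real people often don't run this calculation at all.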

We have four levers to pull when managing the threat posed by employees:

  1. The explicitness of the requirements to ensure employees understand their obligations and can’t hide behind ignorance
  2. The severity of punishment, such as getting fired or even sued by the company
  3. Controls that increase the likelihood that misdeeds are detected
  4. Controls that prevent misdeeds from occurring

Many corporate security programs rely heavily on the first three levers, assuming that if people clearly understand what is expected of them, clearly understand the consequences, and have some expectation that they'll be caught, they will make economically rational choices after weighing the cost-benefit of whatever opportunistic misdeed lies in front of them.  It's hard to imagine that a sane person would choose to risk a well-paying job for a few hundred dollars, or to cut a corner that saves a few minutes.  Anyone who would do such a thing must, by definition, not be of sound mind and therefore isn't really a good fit for the company.  Right?

But this scenario happens all the time.  Our policies and expectations are built on the assumption that people are indeed rational and make rational cost-benefit assessments before taking an action.  A growing body of research shows that people are influenced by a great many things, from their mood to project deadlines to how tired they are.  We don't like lever number four, because it's expensive, inconvenient, and we shouldn't have to use it anyway given the first three.  But we should reconsider.

Dan Ariely's book "The Honest Truth About Dishonesty" details many experiments illustrating how the SMORC model doesn't describe how people actually behave, and it is well worth a read for anyone responsible for designing security programs or security awareness training.

The takeaway for this post is that relying on employees "to do the right thing" as an integral part of a security program doesn't make sense given what we know about the human mind.  As mentioned in the previous post, reminders about honesty can help in some cases, but not in all.  The integrity of key processes should not rely solely on policy and employment agreements; those processes should be designed to prevent, or at least quickly detect, employee misdeeds.  Such controls clearly won't work for all organizations or in all circumstances due to cost constraints, politics, technological limitations and so on, but we need to be clear about what to expect when those controls are absent.  Too many organizations are surprised when an employee violates policy, despite that policy being explicit about expectations, explicit about the ramifications of violations, and backed by an elaborate security awareness campaign.


Cyber Security and Behavioral Science

I recently read a post about improving security awareness using lessons from behavioral science.  The field of behavioral economics and its intersection with information security has been a growing interest of mine, and that post inspired me to start a series, beginning with this one, on the myriad opportunities to apply the lessons of behavioral economics to improving information security programs.

Behavioral economics describes a set of nuances, biases and irrationalities in the way people, on average, think.  This does not mean that every single person will be influenced by these techniques.  Also, to be clear, these are my hypotheses and I do not mean to represent them as fact.  This is intended to be an exploration of the linkage between behavioral economics and information security, to drive discussion and to refine my thinking on the matter.

Insider Threats – The Ten Commandments

According to Dan Ariely's research described in his book "Predictably Irrational", people who are asked to recite the Ten Commandments, regardless of whether or not they remember all ten, just before performing a task designed to tempt them to cheat do not cheat.  Likewise, people do not cheat after signing a form in which they promise to abide by an honor code, even when that honor code doesn't actually exist.

Ariely's research found that people who are not asked to recite the commandments or sign an honor code generally cheat when given the opportunity, but they do not cheat to the full extent they could.  If people begin thinking about honesty just before the point of temptation, however, they stop cheating completely.  These effects don't last long, and people must be reminded.

How can we apply this finding to information security?

1. If we put people in a position where cheating or stealing is possible, some number of them will do it.  It's apparently human nature.  The threat of getting caught and losing one's livelihood often doesn't enter into the equation.  Where possible, implement controls that affirmatively prevent cheating.

2. Remind people about being honest at the points where they have the opportunity to cheat or steal.  A once-a-year conduct reminder isn't sufficient.  For example, show an on-screen reminder about honesty when an employee completes an expense report form.  Be careful, though: some research points out that people become blind to on-screen warning messages over time.  Perhaps try something more subtle in the background, stating that employees of the company are known for their honesty.
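A point-of-temptation reminder like the one described above might look like the sketch below. This is a hypothetical illustration inspired by Ariely's honor-code finding, not a real product or API; the prompt text, function names, and flow are all invented, and in a real system the attestation would be a UI element such as a checkbox shown before the form.

```python
# Hypothetical sketch: require an honesty attestation at the point of
# temptation (submitting an expense report), rather than relying on a
# once-a-year policy acknowledgment.

HONESTY_PROMPT = (
    "I affirm that every item in this expense report is accurate "
    "and was incurred for legitimate business purposes."
)

def submit_expense_report(items, attest):
    """Accept the report only after an explicit honesty attestation.

    `attest` is a callable that presents HONESTY_PROMPT to the employee
    and returns True once they affirm it (a checkbox in a real UI).
    """
    if not attest(HONESTY_PROMPT):
        raise PermissionError("Report not submitted: attestation declined.")
    total = sum(amount for _, amount in items)
    return {"items": items, "total": total, "attested": True}

# Simulated UI callback that auto-affirms, for demonstration:
report = submit_expense_report(
    [("taxi", 42.50), ("lunch", 18.00)],
    attest=lambda prompt: True,
)
print(report["total"])  # → 60.5
```

The design choice worth noting is that the affirmation comes before the report is accepted, mirroring Ariely's observation that signing an honesty statement at the top of a form, before the opportunity to cheat, is what changes behavior.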