The Value of Saving Data (from theft)

I am currently reading Richard Thaler's new book "Misbehaving: The Making of Behavioral Economics".  I trust I don't need to explain what the book is about.  Early in the book, Thaler describes the work leading up to his thesis, "The Value of Saving a Life", and points out something most of us can relate to: we value a specific person more than we value the nebulous thought of many unnamed people.

Let me give an example: a girl is very sick and needs an expensive treatment that costs $5 million, which her family cannot afford and which insurance will not cover. We have seen similar cases, where the family receives a flood of donations to pay for the treatment. Now consider a different situation: the hospital in the same city as the girl needs $5 million to make improvements that will save an average of two lives per year by reducing the risk of certain infections that are common in hospitals. There is no outpouring of support to provide $5 million to the hospital. The person in the first case is specific, an identified life; in the second case, we have no idea who the two people saved each year would be. They are statistical lives. Identified lives vs. statistical lives. If we were "rational" in the economic sense of the word, we would be far more willing to contribute money to the hospital's improvement program, since over time it will save many more people than the lone sick girl. But we are not rational.

There seems to be a powerful implication for information security in this thought: we have trouble valuing things that are abstract, like the theft of some unknown amount of data belonging to people who may not even be our customers yet. After a breach, we care very deeply about the data and the victims, not just because we are in the news and may face lawsuits and other penalties, but because the victims are now "real". We only move from "statistical" data subjects to "identified" data subjects after a breach. Post-breach, we generally care more about security and invest more in it to avoid a repeat, because the impacts are much more real to us.

One of the fundamental tenets of behavioral economics is that we humans often do not act in an economically rational way; this gave rise to the name "econs" for the species of people who do act according to standard economic theory. It occurs to me that, in the realm of IT security, we would do well to try to behave more like econs. Of course, it helps to understand the ways in which econs and humans think differently.