Day 1: The Importance Of Workstation Integrity

We are pretty well aware of the malware risks that our users and family members face from spear phishing, watering holes, exploit kits, tainted downloads and so on.

As IT and security people, most of us like to think of ourselves as immune to these threats – we can spot a phish from a mile away. We would never download anything that would get us compromised. But the reality is that it does happen. To us. We don’t even realize that copy of WinRAR was trojaned. And now we are off doing our jobs. With uninvited visitors watching. It happens. I’ve been there to clean up the mess afterward and it’s not pretty.

The computers that we use to manage IT systems and applications are some of the most sensitive in the average business.  We ought to treat them appropriately.

Here are my recommendations:

  • Perform administrative functions on a PC that is dedicated to the task, not used to browse the Internet, check email or edit documents.
  • Isolate computers used for these administrative functions onto separate networks that have the minimum inbound and outbound access needed.
  • Monitor these computers closely for signs of command and control activity (a rough monitoring sketch follows this list).
  • Consider how to implement similar controls for performing such work from home.
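To make the monitoring recommendation a bit more concrete, here is a minimal sketch of the idea, assuming a dedicated admin workstation whose expected outbound destinations are few enough to enumerate. The allowlist, host addresses and ports below are invented for illustration; this is not a substitute for real network monitoring, just a way to show how small the expected traffic profile of a dedicated admin box can be.

```python
# Illustrative sketch: flag outbound connections from an admin workstation
# that are not on an explicit allowlist. The hosts and ports are made up;
# a real deployment would feed findings into whatever alerting you already use.
import psutil

# Hypothetical allowlist of (remote IP, remote port) pairs this workstation
# is expected to talk to (e.g. the jump host and the internal patch server).
ALLOWED = {
    ("10.20.0.5", 22),     # jump host over SSH
    ("10.20.0.10", 8530),  # internal patch/update server
}

def unexpected_connections():
    """Return established connections whose remote endpoint is not allowlisted."""
    findings = []
    for conn in psutil.net_connections(kind="inet"):
        if conn.status != psutil.CONN_ESTABLISHED or not conn.raddr:
            continue
        remote = (conn.raddr.ip, conn.raddr.port)
        if remote not in ALLOWED:
            findings.append((conn.pid, conn.laddr.port, remote))
    return findings

if __name__ == "__main__":
    for pid, lport, (rip, rport) in unexpected_connections():
        print(f"unexpected outbound connection: pid={pid} local_port={lport} -> {rip}:{rport}")
```

On a machine that exists only to administer other systems, anything that shows up outside a short allowlist is worth a look – which is exactly the point of keeping the box dedicated.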

What do you do to protect your IT users?

Human Nature and Cyber Security

This has been a particularly active year for large scale, public breaches in the news. Next year’s Data Breach Investigations Report from Verizon should provide some context on whether we are experiencing a “shark attack” phenomenon of continued media coverage of each new breach, or if this is really an exceptional year.

Regardless of whether we are trending above average or not, it’s pretty clear that a lot of companies are experiencing data breaches.

Information security is a series of trade-offs: investment vs. security, ease of use vs. security, operational costs vs. security and so on.  This isn’t a new or revolutionary concept.  Groups like SIRA focus on higher order efforts to quantify information risk to inform security strategy, justify investment in security programs and so on.

At a lower level, making intelligent decisions on the trade-offs involved in IT systems projects requires a well-informed assessment of the risks involved.  However, experiments in cognitive psychology and behavioral economics consistently demonstrate that humans have a raft of cognitive biases which impact decision making.  For instance, we are generally overconfident in our knowledge and abilities, and we tend to think about likelihood in the context of what we have personally experienced.  Uncertainty, inexperience or ignorance of exactly how IT system security can fail may lead to an improper assessment of risk.  If risks are not clearly understood, decisions made using these assessments will not be as accurate as expected.

Douglas Hubbard writes extensively on the topic of “expert calibration” in his book “How To Measure Anything”.  In this book, calibration involves training experts to more clearly understand and articulate their level of uncertainty when making assessments of likelihoods or impacts of events.  While it doesn’t eliminate error from subjective assessments, Mr. Hubbard claims that it demonstrably improves estimates provided by calibrated experts.  This calibration process likely makes these “experts” more aware of their cognitive biases.  Regardless of the exact mechanism, measurably improving estimates used in decision making is a good thing.
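To get a feel for what “calibrated” means in this setting, consider the 90% confidence interval exercise Hubbard describes: an expert gives a range they are 90% sure contains the true value, and calibration is judged by how often the true value actually lands inside. The numbers below are invented purely to illustrate the scoring; the point is that overconfidence shows up as a hit rate well below the confidence the expert claimed.

```python
# Illustrative sketch of calibration scoring: for a batch of 90% confidence
# interval estimates, how often did the true value actually fall inside?
# The sample data is invented; real calibration exercises use many more questions.

# Each tuple: (lower bound, upper bound, actual value)
estimates = [
    (10, 50, 42),
    (100, 300, 350),   # miss: actual above the upper bound
    (1, 5, 3),
    (20, 60, 25),
    (500, 900, 450),   # miss: actual below the lower bound
]

hits = sum(1 for low, high, actual in estimates if low <= actual <= high)
hit_rate = hits / len(estimates)

print(f"claimed confidence: 90%, observed hit rate: {hit_rate:.0%}")
# A well-calibrated estimator's hit rate converges toward the stated 90%;
# a chronically overconfident one lands well below it.
```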

Information security could benefit from a similar calibration concept.  Understanding the mechanisms through which IT systems can be breached underpins our ability to make reasonable assessments about the risks and likelihood of a breach in a given environment.

To pick on Target for a minute:

Would a clear understanding of the mechanisms by which the external vendor application could be compromised have changed the decision to have the server authenticate against the company’s Active Directory system?  An application to coordinate the activities of the myriad vendors a company the size of Target has is almost certainly a necessity, but would a better understanding of the ways that a vendor management server could be exploited have made the case for isolating the application from the rest of the Target network, with the tradeoff of higher operational costs?  Clearly, that question can only be answered by those present when the decision was made.

Daniel Kahneman, in his book “Thinking, Fast and Slow”, describes a cognitive bias he calls the availability heuristic. Essentially, the idea is that people judge concepts and likelihoods by how easily they can recall examples from memory; if something can’t be recalled, it is treated as unimportant. Similarly, Thomas Schelling, a Nobel Prize-winning economist, wrote:

There is a tendency in our planning to confuse the unfamiliar with the improbable. The contingency we have not considered seriously looks strange; what looks strange is thought improbable; what is improbable need not be considered seriously.

Nate Silver’s book “The Signal and the Noise” has an excellent chapter on this concept (Chapter 13).

To become calibrated experts who can clearly assess security risks arising from systems, the IT industry would seemingly benefit from a broader understanding of the methods used to penetrate systems and networks.  Certainly this will not “solve” the problem of breaches, but it should help us make better-informed decisions regarding IT security tradeoffs.

Nor does this mean that organizations will or should always choose the least risky or most secure path.  Businesses have to deal with risk all the time and often have to accept risk in order to move forward.  The point here is that, due to human biases, conflicts and ignorance, organizations often seem not to be fully cognizant of the risks they accept when making IT decisions.

A popular blog post by Wendy Nather recently pushed back on the offensive security effort, arguing that things will not get better through endlessly pointing out what is wrong; rather, the way forward is to start fixing things.  My view is that both the offensive and defensive sides are important to the security ecosystem.  Certainly things will NOT get better until we start fixing them.  However, “we” is a limited population.  To tackle the fundamental problems with security, we need to engage the IT industry – not just those people with “security” in their titles.  And we need those who do have “security” in their titles to be more consistently aware of threats.  Focusing solely on defense, as the post urges, will yield some short-term improvements in some organizations.  However, building consistent awareness of IT security risks, particularly in the people responsible for assessing such risks, should help all organizations not be surprised when Brian Krebs calls them up with unfortunate news.