Why Putting Tape Over Your Webcam Might Make Sense

I will admit that I roll my eyes, even if it is only on the inside sometimes, when I see people with tape or some other device covering the webcam on their laptop.  My self-righteous logic goes like this: most people I interact with are using their computers for business purposes, and those computers likely aren’t perched on a nightstand or bathroom counter in the evenings or early mornings.  If the webcam on my laptop were hijacked, the perpetrator would be exposed to hours upon hours of me making faces in reaction to emails and instant messages from co-workers.  Audio is a much, much larger threat to confidentiality, and I have yet to see anyone take action against the built-in microphone on their laptop.  Maybe as humans, we feel that someone secretly watching us is more of an invasion of privacy, but it doesn’t take a lot of thought to conclude that an attacker would obtain far more value from listening than from looking.  That is, unless the attacker is doing it for blackmail or out of some twisted, possibly perverted obsession with spying on people.

A few days back, The Verge posted the following video on Twitter:

A casual listen to the video left me laughing it off: haha – tape on the webcam won’t really do a lot, but it may make you feel better.  I listened to it again, though, and caught something I missed the first time.  The narrator interviewed Tod Beardsley from Rapid7.  Kudos to Tod for giving what I thought was an amazingly insightful reason for covering a webcam.  In the video, I believe Tod called it “superstitious”; however, the point he was making is very important and accurate.  If we believe something about ourselves, we generally act accordingly.  Tod’s point is that if I put a piece of tape on my webcam, that tape serves as a constant reminder throughout my work day that I am a security-minded person.  One of the really interesting findings in behavioral psychology is that our mindset is often based on a perception we have of ourselves that is shaped by what we have previously done.  I must be security minded – I put tape on my webcam, after all.  And that constant reminder will permeate the decisions I make that have security consequences, such as picking a better password than I otherwise would, or thinking twice before clicking on a link.  As technologists, that idea probably doesn’t sit well because we expect that it wouldn’t work on “us”.  However, in the world of psychology, unlike the world of computers, things are not deterministic and are more about averages.  So yes, this phenomenon will not work every time, for everyone, or to the same extent, but on average it likely does have some beneficial effect, and therefore I am going to stop rolling my eyes when I see tape over webcams.

As I learn more about behavioral psychology, it’s clear that there is a lot of opportunity to explore its potential for improving security.  If you are interested in learning more, I recommend reading books by Dan Ariely, Daniel Kahneman, Richard Thaler, and Tom Gilovich.

*note: some of my twitter friends pointed out that they tape their webcams to ensure they are not caught by surprise when joining webex-style meetings. That makes sense.

Applying Science To Cyber Security

How do we know something works?  The debate about security awareness training continues to drag on, with proponents citing remarkable reductions in losses when training is applied and detractors pointing out that training doesn’t actually stop employees from falling victim to common security problems.  Why is it so hard to tell if security awareness training “works”?  Why do we continue to have this discussion?

My view, as I’ve written previously, is that cyber security is an art, not a science.  We collectively “do stuff” because we think it’s the right thing to do.  One night last week, over dinner, I was talking to my friend Bob, who works for a large company.  His employer recently performed a phishing test of all employees after each had received training on identifying and avoiding phishing emails.  Just over 20% of all employees fell for the test after being trained.  I asked Bob how effective his company found the training was at reducing the failure rate.  He didn’t know, since no test was performed prior to the training.  That’s a significant lost opportunity to gain insight into the value of the training.

Bob’s company spent a considerable amount of money on the training and the test, but they don’t know if the training made a difference, and if so, by how much.  Would 60% of employees have fallen for the phishes prior to training?  If so, that would likely indicate the training was worthwhile.  Or would only 21% have fallen for them, in which case the money spent on the training would have been much better spent on some other program to address the risks associated with phishing?  Should Bob’s employer run the training again this year?  If they do, at least they will be able to compare the test results to last year’s results and hopefully derive some insight into the effectiveness of the program.
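To make the point concrete: with a pre-training baseline, the comparison above becomes a simple two-proportion test.  The sketch below uses the hypothetical numbers from this post (a 60% vs. 21% pre-training rate, against the roughly 21% post-training rate, with an assumed sample of 1,000 employees); it is an illustration of the statistics, not anything Bob’s company actually ran.

```python
import math

def two_proportion_z(fail_a, n_a, fail_b, n_b):
    """z-statistic for the difference between two failure proportions
    (pooled standard error, normal approximation)."""
    p_a, p_b = fail_a / n_a, fail_b / n_b
    p_pool = (fail_a + fail_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical: 60% failed before training vs. 21% after -- a clear effect
z_big = two_proportion_z(600, 1000, 210, 1000)    # roughly z = 17.8

# Hypothetical: 21% before vs. 20% after -- indistinguishable from noise
z_small = two_proportion_z(210, 1000, 200, 1000)  # roughly z = 0.6

print(z_big > 1.96, z_small > 1.96)  # significant at 95%? True False
```

The exact arithmetic matters less than the structure: without the "before" measurement, there is nothing to plug into the first two arguments, and the question of whether training worked cannot be answered at all.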

But that is not the end of the story.  We do not have only two options available to us: to train or not to train.  There are many, many variations on the content of the training, the delivery mechanism, the frequency, and the duration, to name a few.  Security awareness training seems to be a great candidate for randomized controlled trials.  Do employees who are trained cause fewer security-related problems than those who are not trained?  Are some kinds of training more effective than others?  Do some kinds of employees benefit from training, or from specific types of training, more than other kinds of employees?  Is the training effective against some kinds of attacks and not others, indicating that the testing approach should be more comprehensive?
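The mechanics of such a trial are not complicated.  As a minimal sketch (the arm names and employee list are made up for illustration), random assignment into a control group and two training variants looks like this:

```python
import random

def assign_groups(employees, arms, seed=0):
    """Randomly assign employees to experiment arms in round-robin order
    after shuffling, so each arm ends up roughly the same size."""
    rng = random.Random(seed)          # fixed seed so the split is reproducible
    shuffled = employees[:]
    rng.shuffle(shuffled)
    groups = {arm: [] for arm in arms}
    for i, emp in enumerate(shuffled):
        groups[arms[i % len(arms)]].append(emp)
    return groups

# Hypothetical roster and training variants
employees = [f"emp{i}" for i in range(999)]
groups = assign_groups(employees, ["control", "video_training", "live_training"])
print({arm: len(members) for arm, members in groups.items()})
# {'control': 333, 'video_training': 333, 'live_training': 333}
```

Run the same phishing test against every arm afterward, and the differences in failure rates between arms are attributable to the training variant rather than to which department or seniority level happened to get trained.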

I don’t know, because we either don’t do this kind of science, or we don’t talk about it if we are doing it.  Instead, we impute benefits from tangentially related reports and surveys interpreted by vendors who are trying to impart the importance of having a training regimen, or by vendors who are trying to impart the importance of a technical security solution.

My own view by the way, which is fraught with biases but based on experience, is that security awareness training is good for reducing the frequency of, but not eliminating, employee-induced security incidents.  Keeping this in mind serves two important purposes:

  1. We understand that there is significant risk which must be addressed despite even the best security training.
  2. When an employee is the victim of some attack, we don’t fall into the trap of assuming the training was effective and the employee simply wasn’t paying attention or chose to disregard the training delivered.

We wring our hands about so many aspects of security: how effective is anti-virus, and is it even worth the cost, given its poor track record?  Does removing local administrator rights really reduce the number of security incidents?  How fast do we need to patch our systems?

These are all answerable questions.  And yes, the answers often depend, at least in part, on specific attributes of the environment in question.  But we have to know to ask the questions.

Human Nature And Selling Passwords

A new report by SailPoint indicating that one in seven employees would sell company passwords for $150 has garnered a lot of news coverage in the past few days.  The report also finds that 20% of employees share passwords with coworkers.  The report is based on a survey of 1,000 employees from organizations with over 3,000 employees.  It isn’t clear whether the survey was conducted using statistically valid methods, so we must keep in mind the possibility of significant error when evaluating the results.
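For a sense of scale on the sampling error alone, the standard 95% margin of error for a proportion from a sample of 1,000 is easy to compute.  This sketch assumes simple random sampling and the usual normal approximation, which is exactly what we can’t verify about the survey; any bias in how respondents were recruited would swamp this number.

```python
import math

def moe_95(p, n):
    """95% margin of error for a sample proportion (normal approximation,
    assuming simple random sampling)."""
    return 1.96 * math.sqrt(p * (1 - p) / n)

p = 1 / 7     # ~14.3% said they would sell a password
n = 1000      # reported sample size
print(f"{p:.1%} \u00b1 {moe_95(p, n):.1%}")  # prints: 14.3% ± 2.2%
```

In other words, even under ideal sampling the true figure could plausibly sit anywhere from roughly 12% to 16% — and if the methodology was weak, the uncertainty is larger still.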

While one in seven seems like an alarming number, what isn’t stated in the report is how many would sell a password for $500 or $1,000, not to mention $10,000,000.  The issue here is one of human nature.  Effectively, the report finds that one in seven employees is willing to trade $150 for a spin of a roulette wheel where some spaces result in termination of employment or the end of a career.

Way back in 2004, an unscientific survey found that 70% of those surveyed would trade passwords for a chocolate bar, so this is by no means a new development.

As security practitioners, this is the control environment we work in.  The problem here is not one of improper training, but rather the limitations of human judgement.

Incentives matter greatly.  Unfortunately for us, the potential negative consequences of violating security policy, risking company information, and even being fired are offset by more immediate gratification: $150, or helping a coworker by sharing a password.  We shouldn’t be surprised by this: humans sacrifice long-term well-being for short-term gain all the time, whether by smoking, drinking, eating poorly, not exercising, and so on.  Humans know the long-term consequences of these actions, but generally act against their own long-term best interest for short-term gain.

We, in the information security world, need to be aware of the limitations of human judgement.  Our goal should not be to give employees “enough rope to hang themselves”, but rather to develop control schemes that accommodate the limitations of human judgement.  For this reason, I encourage those in the information security field to become familiar with the emerging studies under the banner of cognitive psychology and behavioral economics.  By better understanding the “irrationalities” in human judgement, we can design better incentive systems and security control schemes.

Day 2: Awareness of Common Attack Patterns When Designing IT Systems

One of the most common traits underlying the worst breaches I’ve seen, and indeed many that are publicly disclosed, is external attackers compromising a server joined to the organization’s Active Directory domain.

It seems that many an IT architect or Windows administrator is blind to the threat this poses. An application vulnerability, misconfiguration, and so on can provide a foothold for an attacker to essentially take over the entire network.

This is just an example, but it’s a commonly exploited tactic. Staff members performing architecture-type roles really need to have some awareness and understanding of common attacker tactics in order to intelligently weigh design points in an IT system or network.

Cyber Security Awareness Month

Tomorrow starts National Cyber Security Awareness Month. Many different organizations will be posting security awareness information to help your employees not get cryptolockered and to help your friends and family keep their private selfies private.

I’m going to take a different path with this site for the month of October. I’m going to talk about security awareness for US – IT and infosec people.

Crazy, right?

I have been working in this field for a long time. I see stunningly bad decisions by IT behind the worst incidents I’ve been involved in. These decisions weren’t malicious; rather, they demonstrate a lack of awareness about how spectacularly IT infrastructures can fail when they are not designed well, when we misunderstand the limitations of technology, and when we’re simply careless while exercising our administrative authority.