
Tuesday, April 30, 2013

Security Awareness Training is a Waste of Time

Schneier on Security: Security Awareness Training:

Schneier focuses his essay primarily on IT security training, but the lessons apply equally to IP security issues such as confidentiality and document-management training.  Let's look at the parallels.

As Schneier points out, training succeeds or fails for a limited set of reasons:

One basic reason is psychological: we just aren't very good at trading off immediate gratification for long-term benefit. A healthier you is an abstract eventuality; sitting in front of the television all afternoon with a McDonald's Super Monster Meal sounds really good right now.
This is applicable to IP security as well.  Training everyone in your organization to stamp "Confidential" on every piece of paper covered by a confidentiality agreement is, in the abstract, a very good way to protect proprietary information - especially if you keep the information as a trade secret.  However, a customer-facing employee tasked with getting critical information to a customer is likely to place the tangible, immediate benefit of an improved customer relationship ahead of the intangible benefit of possibly preventing a leak through lax security procedures.
Another reason health training works poorly is that it's hard to link behaviors with benefits. We can train anyone -- even laboratory rats -- with a simple reward mechanism: push the button, get a food pellet. But with health, the connection is more abstract. If you're unhealthy, what caused it? It might have been something you did or didn't do years ago, it might have been one of the dozen things you have been doing and not doing for months, or it might have been the genes you were born with. Computer security is a lot like this, too.
IP security and management of proprietary information is also a lot like this.  IP is, after all, intangible property.  You can't see somebody walking away with it and it is oftentimes very difficult to pinpoint the source of a leak, especially if it was inadvertent.  As a result, employees make judgments at a personal level regarding the value of the information that they disclose, the risk of making the disclosure and the likelihood that the risk will be realized.  Often, these risks appear minimal when compared to a more immediate, tangible benefit.

One area of health that is a training success is HIV prevention. HIV may be very complicated, but the rules for preventing it are pretty simple. This is important: most lay medical expertise stems from folk models of health. Maybe they're right and maybe they're wrong, but they're how people organize their thinking.  
Another area where training works is driving. We trained, either through formal courses or one-on-one tutoring, and passed a government test, to be allowed to drive a car. One reason that works is because driving is a near-term, really cool, obtainable goal. Another reason is even though the technology of driving has changed dramatically over the past century, that complexity has been largely hidden behind a fairly static interface. You might have learned to drive thirty years ago, but that knowledge is still relevant today. On the other hand, password advice from ten years ago isn't relevant today.

Schneier's examples of training success are similarly instructive.  Each involves either a minimal effort that offsets an enormous, understandable and personal risk (HIV prevention) or a short-term, attainable, "cool" goal (the privilege of being permitted to drive a car) whose benefit persists once achieved.  Just as important as what successful training programs have in common is the method used to achieve that success.  As Schneier alludes to above, it means creating a limited decision-making framework composed of simple rules, and not trying to improve the "average" performance of the organization until we have removed systems and processes that are vulnerable to the behavior of the weakest link:

Even if we could invent an effective computer security training program, there's one last problem. HIV prevention training works because affecting what the average person does is valuable. Even if only half the population practices safe sex, those actions dramatically reduce the spread of HIV. But computer security is often only as strong as the weakest link. If four-fifths of company employees learn to choose better passwords, or not to click on dodgy links, one-fifth still get it wrong and the bad guys still get in. As long as we build systems that are vulnerable to the worst case, raising the average case won't make them more secure. 
The whole concept of security awareness training demonstrates how the computer industry has failed. We should stop trying to teach expertise, and pick a few simple metaphors of security and train people to make decisions using those metaphors. We should be designing systems that won't let users choose lousy passwords and don't care what links a user clicks on. We should be designing systems that conform to their folk beliefs of security, rather than forcing them to learn new ones. 
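Schneier's point about systems that simply refuse bad choices, rather than training users to avoid them, can be illustrated with a minimal sketch.  The specific rules below (length threshold, common-password list) are hypothetical examples of such a policy, not drawn from the essay:

```python
import re

# Hypothetical minimal policy: instead of teaching users what a good
# password looks like, the system rejects weak ones at the point of entry.
MIN_LENGTH = 12
COMMON_PASSWORDS = {"password", "letmein", "qwertyuiop"}

def reject_weak_password(candidate: str) -> list[str]:
    """Return a list of policy violations; an empty list means acceptable."""
    problems = []
    if len(candidate) < MIN_LENGTH:
        problems.append(f"shorter than {MIN_LENGTH} characters")
    if candidate.lower() in COMMON_PASSWORDS:
        problems.append("appears on a common-password list")
    if not re.search(r"[A-Za-z]", candidate) or not re.search(r"\d", candidate):
        problems.append("must mix letters and digits")
    return problems
```

The design choice is the one Schneier describes: the user's judgment is taken out of the loop entirely, so the "weakest link" cannot opt out of the policy.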
IP security should look to similar models.  IP managers must assume that employees with access to proprietary information are not experts in handling that proprietary information and that they nonetheless have a need to share information with potential and existing customers.  An overall organizational strategy based on limiting the number of decisions employees are allowed to make with respect to proprietary information will require long-range planning and cross-functional support (IT, legal, commercial), but it will be well worth it.  In the next post, I will look at ways that this can be accomplished.
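A decision-limiting strategy of the kind described above might be sketched as follows: documents carry a classification label assigned by their owner, and the sharing system enforces a fixed rule table, so the sender never personally weighs the risk of a disclosure.  The labels and rules here are hypothetical illustrations, not any particular standard:

```python
# Hypothetical rule table: which recipient types may receive documents
# bearing each classification label.  "anyone" means no restriction.
ALLOWED_RECIPIENTS = {
    "public":       {"anyone"},
    "internal":     {"employee"},
    "confidential": {"employee", "nda_partner"},
    "trade_secret": {"employee"},
}

def may_share(label: str, recipient_type: str) -> bool:
    """Policy lookup; unknown labels are treated as the most restrictive."""
    allowed = ALLOWED_RECIPIENTS.get(label, set())
    return "anyone" in allowed or recipient_type in allowed
```

Under this sketch, a customer under NDA can receive "confidential" material but never "trade_secret" material, and the customer-facing employee makes no judgment call at all - exactly the narrowing of decisions the strategy calls for.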

