Usability: The Forgotten Key to Security
Employees and security officers often see each other as enemies. On one hand, security administrators are frequently frustrated by what they see as outright negligence: users write down passwords and leave them on their desks, download forbidden software onto their machines, or simply bypass the company security policy, often without recognising the consequences. On the other hand, users are generally frustrated by the ever-growing burden placed on them by IT security personnel.
I recently counted how many passwords I have to remember. This is the list I came up with: one system password at the office, one system password at the university where I also work, one password for my smartcard, one password for my machine at home, two passwords for Internet banking (one for each of my two accounts), two PINs for my two ATM cards (one for each account), one PIN for the front door of my building, and one PIN for my car's immobilizer. This is before I start counting my Amazon and numerous other Internet accounts, some of which are work related as well. Leaving these last accounts out, I have ten different passwords and PINs. At each workplace, and at one of my banks, I am forced to change my password at least every three months. Furthermore, the password policies require that each password be at least eight characters long and contain upper-case letters, lower-case letters, digits and other characters. I understand and believe in security, but that is frustrating!
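The policy just described (at least eight characters, mixed case, digits and other characters) is easy to state as a mechanical check. Here is a minimal sketch in Python; the function name `meets_policy` is my own invention, not any particular employer's or bank's implementation:

```python
import re

def meets_policy(password: str) -> bool:
    """Check a password against the policy described in the text:
    at least 8 characters, containing upper-case and lower-case
    letters, digits, and at least one other (special) character."""
    return (
        len(password) >= 8
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[0-9]", password) is not None
        and re.search(r"[^A-Za-z0-9]", password) is not None
    )
```

Note the irony: a trivially predictable password such as `Andy001#` satisfies every clause of this policy, which is exactly the kind of coping strategy frustrated users fall back on.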
So, what happens when users are frustrated by unrealistic security demands? The answer is simple. They find solutions that they can live with, but often end up bypassing the security measures in the process. For example, they write down passwords on paper, record them in their Palm or cell phone, use a predictable series of passwords (e.g. Andy001#, Andy002#, etc.), reuse the same password across different accounts, and so on. For this reason, IT security administrators may view users as ‘the enemy’. To be fair, users don’t really have a choice, given the ever-growing number of different user authentication demands placed on them.
There is one story that I witnessed firsthand that is extremely telling. In an effort to increase security, the entrance to a company building was guarded by a fingerprint access system. The instruction to all employees (posted on the door) was that no tailgating (following one another through the door without separate verification) was allowed; every employee had to pass the fingerprint check in order to enter. The entrance was continually filmed in order to enforce this policy (for financial reasons, the door was not manned by a human security guard). The building in question contained very valuable equipment, so the fingerprint reader was tuned for a very low false-accept rate. The inevitable flip side of that setting is a high false-reject rate: employees entering the building in the morning often had to try six or seven times before their print was accepted. This quickly resulted in long queues in the morning and intense user frustration. At that point, tailgating became the norm, with one user presenting his or her print and an entire group entering. (Enforcing the ‘no tailgating’ rule was also impossible, because almost all the employees ignored it at some point.) Needless to say, the strong security measure that was deployed ended up being much less secure than a simple badge system.
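The trade-off at work in this story is the standard one in biometrics: genuine and impostor match scores overlap, so tightening the accept threshold to keep impostors out inevitably rejects more legitimate users. The following sketch illustrates this with synthetic, made-up score distributions (the means, spreads and thresholds are arbitrary assumptions, not data from any real fingerprint system):

```python
import random

random.seed(0)

# Hypothetical match scores: genuine users score higher on average
# than impostors, but the two distributions overlap.
genuine  = [random.gauss(0.70, 0.10) for _ in range(10_000)]
impostor = [random.gauss(0.40, 0.10) for _ in range(10_000)]

def rates(threshold):
    """False-accept rate (impostors let in) and false-reject rate
    (legitimate users turned away) at a given accept threshold."""
    far = sum(s >= threshold for s in impostor) / len(impostor)
    frr = sum(s < threshold for s in genuine) / len(genuine)
    return far, frr

for t in (0.5, 0.6, 0.7):
    far, frr = rates(t)
    print(f"threshold={t:.2f}  FAR={far:.4f}  FRR={frr:.4f}")
```

With numbers like these, pushing the false-accept rate toward zero leaves legitimate users failing on a large fraction of attempts, which is exactly the ‘six or seven tries’ experience that drove employees to tailgate.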
The examples above illustrate that usability is not merely a measure to keep users happy. Rather, I argue that usability is a key to security. When the demands on users are unrealistic and security is overly burdensome, users react negatively. As we have seen, employees’ negative reaction affects not only their mood, but more importantly, the intended security of the system. In contrast, when the demands on employees are reasonable, and the security measures show obvious merit, they are far more likely to be co-operative. This leads me to the conclusion that if you want to improve your security, you must choose a system that your users can live with, as both the password and fingerprint examples illustrate.
IT executives are constantly told that security measures must be chosen according to a well-executed risk analysis that balances risk and cost. However, usability is a key factor in the security of any solution, and one that is often forgotten in this analysis. I contend that this is a huge and frequent mistake, one that can leave security measures close to useless. Ignoring usability can completely change the risk-cost trade-off, and it is therefore a factor that we can no longer afford to ignore.
I conclude with a remark on the ever-growing importance of education. Users are often negligent with respect to security measures because they are simply not aware of the threat. Most families would never leave home without locking the door, yet many of the same people are happy to leave their workstations unlocked. This is partly because they do not see computer crime as an imminent threat, and partly because they do not view themselves or their organization as a target. This is especially true of people who do not hold managerial positions. Their mistake, of course, is that any entry point into a system can be enough to compromise the entire system. This lack of awareness can and must be addressed through education. Periodic meetings where computer crime statistics and methods are presented can make users far more responsive to security measures. Having said this, I stress that such cooperation is unlikely to be achieved by education alone. If the demands on users are frustrating, as they often are, even the most security-aware users are likely to slip back into old patterns that leave them and their organization highly vulnerable.