Beware the Perils of Audit-Driven Design


When you work in IT, security audits are par for the course. Like dental check-ups, they’re generally a good idea, but can still be painful (and expensive). They help uncover issues that need fixing and raise visibility with senior executives.

There is, however, a dark underbelly to security audits: they can drive counterproductive behaviours that lead to unintended and undesirable outcomes.

Wouldn’t it be ironic if remediating a security audit item made your organisation less secure…?

Problems arise when audit items are approached with a ‘do the minimum to close’ attitude. Paired with a lack of systems thinking, this can lead to design decisions that have negative externalities – costs borne by parties other than those who commissioned or carried out the audit. In many cases the cost falls on the users and the business as reduced functionality or usability. Beyond these externalities, audits can also have unintended consequences.

A common example of unintended consequences from a security change, one that also carries negative externalities, is enforcing a regular password change policy. When Ross Anderson, Professor of Security Engineering at the University of Cambridge, critiqued password change policies, he observed that they lead users to cycle passwords with easy-to-guess incrementing numbers, and drive increased calls to the password reset helpdesk. The policy does not increase security, since an attacker can easily guess the next increment of the password, yet it imposes a cost on both the users and the helpdesk. Worse still, it encourages users to write their passwords down on paper, making them easy for colleagues or intruders to steal. As Professor Anderson puts it: “…when our university’s auditors write in their annual report every year that we should have a policy of monthly enforced password change, my response is to ask the chair of our Audit Committee when we’ll get a new lot of auditors” [1].
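To see just how cheap this is for an attacker, here’s a minimal Python sketch – entirely illustrative, the function name and password format are my own and not from Anderson’s text – that enumerates the likely successors of a leaked, expired password:

```python
import re

def next_guesses(old_password: str, depth: int = 3) -> list[str]:
    """Guess likely successors of an expired password.

    Forced rotation trains users to increment a trailing number
    ("Summer07" -> "Summer08"), so one leaked old password yields
    a very short guess list for the current one.
    """
    match = re.search(r"(\d+)$", old_password)
    if not match:
        return []
    stem, digits = old_password[: match.start()], match.group(1)
    return [f"{stem}{int(digits) + i:0{len(digits)}d}" for i in range(1, depth + 1)]

print(next_guesses("Summer07"))  # ['Summer08', 'Summer09', 'Summer10']
```

Three or four guesses like these sit comfortably below any account lockout threshold, which is often all it takes once a policy has trained users to increment a suffix.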

Please bear in mind that this is only one example, and a simple one at that, so don’t dwell on the specifics.

I call this type of change ‘audit-driven design’, and it generally has negative impacts that are not borne by the team mandating the change.

So, what can we do about this?

If you’re an IT security auditor, pay attention to the wording of audit items to ensure they capture the right outcomes, and work with your clients to find the best way to close them. Think about the wider implications of any change on all parties, and ensure these are factored into decision making.

If you’re responsible for closing audit items, canvass all parties who might be impacted by the changes, consider their feedback, and have a good comms strategy to tell users what is changing and why. Push back on auditors and execs when audit items don’t make sense, and measure the impact of the change.

We also need to ensure execs don’t have misaligned incentives. Don’t expect good outcomes if their primary incentive is to close high-rated audit items as quickly and cheaply as possible. They need balanced incentives that also hold them accountable for the performance of the business and for the impact of security breaches.

Perhaps consider engaging a second auditor to review both the original audit and its remediation, producing a report on the effectiveness of the security outcomes and on the wider impact on the business. This would put an incentive back on the first auditor to ensure the items and their closure actions make sense for the business as a whole, rather than minimising their own costs with a one-size-fits-all template.

Security audits shouldn’t be that bad trip to the dentist that leaves you with ongoing pain (and a hole in your wallet). Let’s instead use them to drive positive change: change that reduces risk, minimises other impacts on the business, and makes it easier for users to do the right thing.

Reach out in the comments if you’ve had an experience with audit issues driving design decisions.

[1] Anderson, R. J., Security Engineering: A Guide to Building Dependable Distributed Systems, 2nd ed., Wiley, 2008.
