In his new book, Misbehaving: The Making of Behavioral Economics, Richard Thaler explains a number of risk management behaviors that are currently frustrating the IT Risk world. It’s a great read, and I highly recommend it if IT risk is part of your job.
One of his discussions concerns a group of senior executives at a large corporate conglomerate. He offered them the following scenario:
> Suppose you were offered an investment opportunity for your division that will yield one of two payoffs. After the investment is made, there is a 50% chance that it will make a profit of $2 million, and a 50% chance that it will lose $1 million.
Only 23 percent of the executives would take on this project, although it’s obviously a good bet mathematically. Talking it through with the executives, it turned out they weren’t bad at math. They were just accounting for this on a personal level. A gain of $2 million might mean a small increase in their bonus. A loss would mean no bonus at all and possible termination. It’s a good bet for the corporation but a bad bet for the manager personally.
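Thaler’s bet is easy to check with a quick expected-value calculation. A minimal sketch, using only the dollar figures and probabilities from the scenario above:

```python
# Expected value of the hypothetical investment offered to the executives:
# 50% chance of a $2M profit, 50% chance of a $1M loss.
p_profit, profit = 0.5, 2_000_000
p_loss, loss = 0.5, 1_000_000

expected_value = p_profit * profit - p_loss * loss
print(f"Expected value: ${expected_value:,.0f}")  # → Expected value: $500,000
```

A positive expected value of half a million dollars per project makes this an easy “yes” for the firm, which is exactly what makes the 23 percent acceptance rate interesting.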
Personal accounting has a lot to do with many of our more frustrating user behaviors in IT risk. I’m going to give you an example. Please don’t get distracted by the actual technologies, just focus on the policy compliance issue. Let’s say you’re in charge of a development project. The project team says that they need an FTP server to exchange data with the offshore developers. You have three choices:
- Ignore - Tell your team to live without it.
- Rogue - Have your team put up a rogue FTP server.
- Ask - Go through the proper channels and request a secure FTP server.
From an IT Risk perspective, we’re hoping for “Ignore” or “Ask” here, but let’s do a little personal accounting. Let’s say you’re looking at a $10k bonus for delivering the project on time, and that the FTP server increases the chance of project success by 30 percentage points, from 40% to 70%. Let’s also say that a project failure gives you a 20% chance of being fired. Playing this out by the options:
- Ignore - 40% chance of $10k, or a $4k expected payout. 60% chance of failure; against the base 20% firing risk, that’s a 12% chance of being fired.
- Rogue - 70% chance of success, so a $7k expected payout. Nice. A 30% chance of failure times the 20% firing risk is a 6% chance of being fired. That’s half the firing risk of the Ignore option; looking good there too. If the only other thing on the scale here is a nasty email from the risk group, it’s a no-brainer. If there is a possibility of being fired for breaking the rules, it would have to add more than 6 percentage points of risk to make this the riskier option from a firing perspective.
- Ask - If my risk team just always says “no” to these requests, this is equivalent to the Ignore option. What if asking just introduces a delay? For argument’s sake, let’s say the delay drops the project to a 55% chance of success. That’s a $5.5k expected payout and a 9% chance of being fired.
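The personal accounting above can be sketched in a few lines. The success probabilities, bonus, and firing risk are the hypothetical numbers from this example, not data from any real organization:

```python
# Expected payout and firing probability for each option,
# using the hypothetical numbers from the example above.
BONUS = 10_000          # on-time delivery bonus
FIRE_IF_FAIL = 0.20     # chance of being fired after a failed project

options = {
    "Ignore": 0.40,  # success probability without the FTP server
    "Rogue":  0.70,  # success probability with a rogue FTP server
    "Ask":    0.55,  # success probability after the approval delay
}

for name, p_success in options.items():
    expected_payout = p_success * BONUS
    p_fired = (1 - p_success) * FIRE_IF_FAIL
    print(f"{name:6s} expected payout ${expected_payout:,.0f}, "
          f"{p_fired:.0%} chance of being fired")
```

Running it reproduces the numbers above: $4k/12% for Ignore, $7k/6% for Rogue, $5.5k/9% for Ask. On this accounting, the rogue server wins on both dimensions unless rule-breaking itself carries a real firing risk.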
What are you seeing in your organization? Does Personal Accounting make sense of this behavior?
In many organizations, we’re creating a game that incents cheaters. What can we learn here?
- Users aren’t dumb or chronically deviant when they ignore IT Risk policy; they’re doing personal accounting.
- Stern emails aren’t going to affect the math. In the real world, people get fired for not making their project goals. Unless you’re firing them for breaking IT Risk policy, the math is easy.
- Large amounts of variable compensation make employees more likely to do complex personal accounting. Employees who are paid straight salary will only worry about being fired.
- You can change the game by helping users find secure alternatives in a timely manner.
This isn’t a matter of organizational values. This isn’t a matter of morality. Employees are paid to work for their organization and it’s rational for them to judge their actions based on the organization’s feedback. Compensation and risk of being fired are important pieces of that feedback. Why on earth would we expect anyone to act differently?