A matter of life and death: firefighters and emergency doctors attend the scene of a fatal train accident near Bad Aibling, southern Germany, on February 9, 2016 — a crash blamed on human error © AFP

The 21st century has so far been littered with corporate crises, from accounting frauds to fatal explosions, security breaches and mis-selling scandals. Companies are increasingly aware that the way they manage people is often at the root of these crises. To what extent they are learning lessons from them is a more open question.

Wells Fargo, the US bank, was fined $185m in September after thousands of staff, under pressure to hit sales targets, were found to have set up bogus bank accounts, in some cases faking customer details and signatures. The resulting furore prompted chairman and chief executive John Stumpf to resign.

“People risk” issues stretch far beyond banking. Samsung Electronics’ exploding Galaxy Note 7 smartphone, for example, looks on the surface to have been caused by a technical fault — defective software or hardware. But critics accuse Samsung of not giving its engineers sufficient testing time or opportunities to provide feedback.

Other reminders of the human risk element include: Mitsubishi Motors’ admission that employees inflated fuel efficiency data; Yahoo’s two-year failure to spot that it had been subject to the biggest known data breach; and a fatal Bavarian rail crash in which the traffic controller was allegedly playing a game on his mobile phone.

People risk can range from sabotage or fraud to poor training, strategic miscalculations, lax safety rules or simple mistakes such as opening a virus-infected email. One of the biggest risks is a reckless boss who ignores warning voices: arguably, the decision by former UK prime minister David Cameron to call a referendum on EU membership because he was overconfident of winning fits that profile.

The problem multiplies as a business becomes bigger, more complex and more global. “When you have complexity, there is potential for things to spiral out of control in more varied and sometimes more dramatic ways,” says Anthony Fitzsimmons, chairman of consultancy Reputability.

Many industries are trying to tackle the problem. The finance sector, for example, has made changes to how it recruits, trains, pays and monitors people. But overall, says Mr Fitzsimmons, “I don’t think leaders realise that there are things they have to do and that only they can do in order to get people risk properly tackled.”

Research by Cranfield School of Management for Airmic, a UK association of corporate risk managers, has identified five principles needed to achieve resilience: the ability to anticipate problems; adequate resources to respond to changing conditions; a free flow of information up to board level; the capacity to respond quickly to an incident; and willingness to learn from experience.

The airline industry is often cited for good practice in this regard. Although accidents happen, the industry has achieved an enviable safety record by encouraging pilots and others to confess to mistakes and allowing these to be publicised and studied worldwide.

Keith Blacker, consultant and co-author of People Risk Management, says that addressing the issue starts at the top — by “making sure you have got the right person leading the organisation and setting the values”. He suggests companies could give staff a personalised code of conduct, against which they will be measured. “There’s no reason why you can’t train people about what the culture entails and what their role is and try to get everybody to maintain a sense of vulnerability . . . to assume that something might go wrong.”

Boards often focus on corporate culture, although that can be a slippery concept and takes years to change. Pat McConnell, honorary fellow at Macquarie University’s Applied Finance Centre in Australia, points to three levels identified by Edgar Schein, an expert on corporate culture: local, organisation-wide and industry-wide. “The payment protection insurance mis-selling scandal was the result of a culture that pervaded the retail banking environment in the UK,” Mr McConnell says. “If it pervades the industry, then individual managers can do very little about it.”

A survey by Australian insurer QBE found that only three in 10 businesses felt there was a shared understanding in their organisation of risk management’s importance. Deborah O’Riordan, practice leader for risk solutions in QBE’s European operations, says companies need to focus on staff attitudes, using face-to-face meetings to make sure everyone understands they have responsibility for risk management.

She cites instances of law firms that have been subject to fraud in property transactions, in which criminals intercept emails between the firm and clients and insert their own bank details. Occasionally, staff fail to follow the rules on verifying bank details by phone, and so money is siphoned off to a fraudster’s account. “The message isn’t getting through because it just isn’t taken seriously enough,” she says.

Companies have put a lot of effort into processes to protect themselves, including enterprise risk management, or ERM, systems, and there is a global standard for risk management, ISO 31000. These can give a false sense of security, however, because they do not take full account of the unpredictability of human behaviour. “The only way you are going to manage risk effectively is to use your eyes and ears and get round the organisation,” says Mr Blacker.

Sayed Sadjady, who heads human resources consulting for EY in the Americas, is more optimistic than many about the progress being made in tackling risk culture, even if it is still “embryonic”: “Certain organisations are beginning to think more about leadership attributes, behaviours and how they translate culture into its various constituents.” Risk management, he adds, must be carried out in a way that does not stifle innovation, “because innovation has become critical”.

Mr Fitzsimmons, co-author of Rethinking Reputational Risk, says leaders must understand the psychological factors that blind people to risks — such as bias towards optimism and overconfidence in their own actions — and recognise that these apply to themselves too. They should set up a system that captures, analyses and widely shares lessons from mistakes, including those made by the most senior leaders. They also need to ensure that bringers of bad news, and those who report mistakes including their own, are welcomed.

He quotes Richard Feynman, the physics Nobel laureate who examined the root causes of Nasa’s space shuttle Challenger disaster: “The first principle is that you must not fool yourself — and you are the easiest person to fool.”

Copyright The Financial Times Limited 2024. All rights reserved.