Banks adopt AI to manage sanctions and compliance risk
Accidentally lending money to the wrong Russian oligarch or company trading with Iran has never been a good look for a bank — but now it can be crippling.
Sanctions enforcers — especially in the US — have been flexing their muscles in recent years with huge fines and banks have been feeling the pain.
While no lender has come close to the record sanctions fine of $8.9bn that BNP Paribas agreed to pay in 2014, the pressure has not eased off since then. Standard Chartered paid $1.1bn last year to settle charges that it violated sanctions, while the previous year Société Générale agreed to pay US authorities more than $1.3bn to resolve an investigation. Italian bank UniCredit also agreed a $1.3bn settlement last year over allegations that it violated multiple US sanctions programmes.
It is therefore no surprise that, aside from the usual griping about being subject to too much regulation, bankers are now trying to automate more of their systems by turning to artificial intelligence. Sven Bates, a partner at law firm Baker McKenzie, says banks are increasingly using technology to help them manage sanctions and compliance risk.
Part of the problem comes when trying to identify whether a prospective client — an individual or company — is subject to sanctions restrictions. It is easy enough to look up the name of a person. But if a company is owned by a shell company in a tax haven that is ultimately controlled by a sanctioned individual, that may not be so easy to discover.
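The ownership problem described above is, in essence, a graph-traversal exercise: walk the chain of shareholders behind a prospective client and total up any stakes that lead back to a sanctioned party. The sketch below is purely illustrative — the entity names, ownership graph and 50 per cent blocking threshold are invented for the example, not drawn from any real screening product.

```python
# Hypothetical illustration: tracing indirect ownership back to a sanctioned
# party. All entities and stakes below are invented for the example.

OWNERSHIP = {
    # entity: list of (owner, stake as a fraction of the entity)
    "TradingCo": [("ShellCo", 1.0)],
    "ShellCo": [("Oligarch A", 0.6), ("Fund B", 0.4)],
    "Fund B": [],
    "Oligarch A": [],
}

SANCTIONED = {"Oligarch A"}


def sanctioned_exposure(entity, stake=1.0, seen=None):
    """Return the total indirect stake held by sanctioned parties in `entity`."""
    if seen is None:
        seen = set()
    if entity in seen:  # guard against circular ownership structures
        return 0.0
    seen.add(entity)
    total = 0.0
    for owner, fraction in OWNERSHIP.get(entity, []):
        effective = stake * fraction
        if owner in SANCTIONED:
            total += effective
        total += sanctioned_exposure(owner, effective, seen)
    return total


print(sanctioned_exposure("TradingCo"))  # 0.6 — above a 50% blocking threshold
```

A direct lookup on "TradingCo" against a sanctions list would find nothing; only the traversal reveals that a sanctioned individual holds a 60 per cent indirect stake.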
This process of checking whether a new client is acceptable to take on — what the banks call “onboarding” — can be speeded up using software that can identify whether customers have links to sanctioned parties or countries.
Such packages, Mr Bates says, are becoming more sophisticated at identifying whether a sanctioned shareholder or director has some financial interest in the company with which the bank is considering dealing. Bankers at one US bank's trade finance team say they have met with regulators to discuss how AI can be used to track the movement of goods — for example, by using technology to highlight red flags, such as an auto supplier selling food, that could suggest some form of fraud or money laundering.
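At its simplest, the name-screening step in onboarding compares a prospective client against list entries with some tolerance for spelling variation. The sketch below uses Python's standard-library `difflib` for fuzzy matching; the names and the 0.85 threshold are invented for illustration, and real screening systems use far richer techniques (aliases, transliteration, phonetic matching).

```python
# Illustrative onboarding screen, not any vendor's product. The sanctions
# list entries and the similarity threshold here are hypothetical.
from difflib import SequenceMatcher

SANCTIONS_LIST = ["Ivan Petrov", "Acme Trading FZE"]  # invented names


def screen(name, threshold=0.85):
    """Return list entries whose similarity to `name` meets the threshold."""
    hits = []
    for listed in SANCTIONS_LIST:
        score = SequenceMatcher(None, name.lower(), listed.lower()).ratio()
        if score >= threshold:
            hits.append((listed, round(score, 2)))
    return hits


print(screen("Ivan Petrov"))  # exact match scores 1.0
print(screen("I. Petrov"))    # abbreviated form may fall below the threshold
```

The threshold choice is exactly the trade-off bankers describe later in the piece: set it low and the system drowns compliance teams in false positives; set it high and a disguised name slips through.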
There are some thorny exceptions to this system in the world of trade, however. So-called dual-use goods — those that are in theory used for benign purposes, such as electronics or chemicals, but which can be adapted for use as weapons — still require manual analysis due to the complexity involved.
“Technology . . . can help with efficiency and compliance costs and all firms will continuously review their infrastructure and processes,” says Neil Whiley, director of sanctions at UK Finance, a trade body. “However, it cannot mitigate all risks and cost factors.”
Faced with the complexity of working out whether a client might be in breach of sanctions rules, many banks — like other companies — have simply stopped lending in certain countries. "The banks rightly or wrongly tend to look at this in quite a binary way, you have blanket blocks on particular countries or industry sectors," notes Mr Bates.
Banks take the view that companies or individuals in a country subject to sanctions could come with a whole host of other issues — such as fraud, human trafficking or terrorist financing — that could also play a role in determining whether or not the bank does business with them.
A grey area is where the borrower has some form of sales revenue in countries subject to sanctions: if a company has 90 per cent of its global sales in Iran, a bank is likely to refuse them; but sales of 1 per cent or even as high as 10 per cent might be allowed, trade lawyers say.
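The grey-area rule the lawyers describe amounts to a crude revenue-share cut-off. The sketch below is a hypothetical illustration of that logic — the 10 per cent cap is taken from the range quoted above, but each bank sets its own risk appetite.

```python
# Hypothetical risk-appetite check based on the revenue shares trade lawyers
# describe. The 10% cap is illustrative, not any bank's actual policy.


def exposure_decision(sanctioned_revenue_share, max_share=0.10):
    """Decline if sanctioned-country sales exceed the cap; otherwise review."""
    return "decline" if sanctioned_revenue_share > max_share else "review"


print(exposure_decision(0.90))  # decline — 90% of sales in a sanctioned country
print(exposure_decision(0.01))  # review  — 1% may be acceptable
```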
Banks are also structuring loans with clauses stating that if a company breaches sanctions rules, it is in breach of contract and must repay the loan. This would not fully exonerate the bank in the eyes of the regulator, but might be seen as mitigating its responsibility. Banks also like such clauses because they incentivise borrowers to invest more in compliance to ensure they do not breach sanctions laws.
Mr Bates says he hears bankers complain about sanctions regulations a lot. “In some ways they’re between a rock and a hard place: they’re under so much scrutiny, but at the same time they’re one step removed from the situation — they don’t have all the information at their fingertips like customers do,” he says.
One banker complains: “For some reason, banks have been put at the forefront of fighting financial crime. They’re not the police force, they are a commercial entity, but because they are seen to have the access to the data, when there’s a customer [sanctions] breach they are the ones that get fined for it, which seems a bit harsh — and these fines are quite significant.”
Another problem for banks is when the technology throws up false positives — one of the biggest flaws with AI at present, according to the US bank's trade finance team. That can lead to conundrums: if a bank freezes funds to a company or individual because it wrongly believes they are subject to sanctions, it could get sued. If it does not freeze the funds when it ought to have, the regulator will fine it.
Bankers and trade lawyers suggest that regulators could give them more of a steer in certain cases rather than staying neutral. UK Finance says it would like to see the British government introduce enhanced information sharing powers so that banks and other sectors can share information in a secure environment.
But regulators, by contrast, do not see this as part of their remit, suggesting the delicate and risky balancing act that banks face is here to stay.