There is a saying in technology: Garbage in, garbage out. If the data fed into AI and other systems for financial services contains biases about gender and other areas, then the solutions such technology produces could make certain problems worse.
As International Women’s Day is marked today, a report from fintech firm Finastra warns that some of the mathematically driven rules used by financial institutions to govern multi-trillion-dollar borrowing, credit and other areas could widen inequalities rather than narrow them.
Drawing on research it commissioned from consultancy and accountancy firm KPMG, Finastra said the findings prompted it to flag what it dubs “algorithmic bias”.
As private banks, wealth managers and other players use artificial intelligence and data tools to automate parts of the value chain - such as with the rise of “robo-advisors” - the danger arises that the results will only be as good as the data fed in.
To illustrate what’s at stake, Finastra noted that in 2020, consumer lending and transactions across financial products (credit card, other consumer product lending and mortgage/home lending) amounted to $6.11 trillion in the US; HK$1.27 trillion ($164 billion) in Hong Kong; £440 billion ($608.7 billion) in the UK; €280 billion ($333.6 billion) in France and S$110 billion ($81.9 billion) in Singapore. The provision of credit to consumers, and its cost, such as the interest rates charged, will in many cases be informed by the algorithms used, the report said.
“Without this being a priority in the financial industry, AI will become a flywheel that will accelerate the negative impact on human lives. Finastra doesn’t have all the answers but we believe that the industry must first acknowledge that there is a problem with algorithmic bias – only then can we work together to find a solution,” Simon Paris, CEO at Finastra, said.
In the past decade, the financial world has been “industrialised” and digitalised – to some extent – via AI, particularly forms of machine learning, as a way of making banking more efficient and cutting costs, often by shedding certain types of staff. The pandemic has accelerated such tech trends.
“The industry must check if the biases that exist in society are being repeated through the design and deployment of these technologies,” Finastra said of its report.
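One simple way such a check can work in practice, sketched below as an illustration (this is not a method described by Finastra or KPMG, and the group labels and loan decisions are made-up toy data), is to compare a model’s approval rates across a protected attribute such as gender. The gap between the highest and lowest group approval rates is sometimes called the demographic parity difference: a large gap flags the model for closer review.

```python
# Illustrative sketch only: measuring the "demographic parity difference"
# of a set of lending decisions, i.e. the gap between the highest and
# lowest approval rates across groups. All data here is hypothetical.

from collections import defaultdict

def demographic_parity_difference(decisions):
    """decisions: list of (group, approved) pairs, approved True/False.
    Returns (gap, per-group approval rates)."""
    approved = defaultdict(int)
    total = defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        approved[group] += int(ok)
    rates = {g: approved[g] / total[g] for g in total}
    return max(rates.values()) - min(rates.values()), rates

# Toy loan decisions: this hypothetical model approves 80% of male
# applicants but only 50% of female applicants.
toy = ([("M", True)] * 80 + [("M", False)] * 20 +
       [("F", True)] * 50 + [("F", False)] * 50)

gap, rates = demographic_parity_difference(toy)
print(rates)           # {'M': 0.8, 'F': 0.5}
print(round(gap, 3))   # 0.3
```

A real audit would go further, e.g. checking error rates and interest-rate offers by group rather than approvals alone, but even this crude disparity measure makes a societal bias in the decisions visible and measurable.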
“The findings in our report for Finastra make it clear that providers need to take care when designing, building and implementing these algorithms to ensure innovation can continue to advance in a safe and ethical way. The report brings together recent thinking on algorithmic bias, with specific applications to financial services and the potential for biased decision-making. Mitigating bias is vitally important in our digital and data-led world. Not doing so could run the risk of serious financial harm to the consumers who use these services,” Dr Leanne Allen, director at KPMG Financial Services Tech Consulting, said.
Among its ideas for improving how AI is used, Finastra said it has updated its developer terms and conditions for FusionFabric.cloud, its open platform and marketplace for developers. This means that developers and partners will be expected to account for algorithmic bias and Finastra has the right to inspect for this bias within any new application, it said.
It is also creating new proof-of-concept technologies, such as FinEqual, a digital tool designed to enable bias-free lending, giving users the technology to tackle algorithmic bias within their own businesses.
The firm said it wants to reach 50:50 male-to-female ratios across all its teams. This includes increasing the proportion of women among its top 200 leaders and engineers from 30 per cent to 40 per cent by 2025 and 50 per cent by 2030.