Data Amplified 2019: Explainable AI in Finance
Maria Mora, Principal Engineer on Artificial Intelligence and XBRL Expert at Fujitsu Laboratories of Europe, demonstrated the exciting work Fujitsu is doing to develop explainable AI for finance.
Financial services firms are demanding more advanced AI solutions. They want to apply AI to numerous situations that require high credibility, such as sales prediction, investment risk management, and detection of money laundering and illegal transactions.
Companies need to know they can trust AI results. They need transparency and credibility, and they need to know how to effectively use AI results to transform decision making.
With traditional AI, the process is essentially a black box. Customers are left asking why the results are what they are. For example, when using AI to predict the risk of an investment, a firm wants to know what the criteria behind the predictions are, and, if the risk is high, what could be altered to reduce the risk.
The explainable AI being developed by Fujitsu answers these questions. Explainable AI allows the user to understand the criteria behind a prediction, which factors increase risk, and the actions that can be taken. Ultimately this fosters more trust and credibility.
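To make this concrete, here is a minimal, purely illustrative sketch of the general idea behind such explanations: attributing a prediction to the individual factors that drive it. The feature names, weights, and linear model are hypothetical, chosen for the example; they are not Fujitsu's actual system.

```python
# Toy linear risk model with per-feature attributions, in the spirit of
# explainable AI. All names and weights below are made up for illustration.

FEATURES = ["leverage", "volatility", "liquidity"]
WEIGHTS = {"leverage": 0.5, "volatility": 0.3, "liquidity": -0.2}
BASELINE = 0.1  # base risk score when all features are zero

def risk_score(x):
    """Predict a risk score as a weighted sum of the input features."""
    return BASELINE + sum(WEIGHTS[f] * x[f] for f in FEATURES)

def explain(x):
    """Return each feature's contribution to the score, largest first.

    For a linear model the contribution of feature f is simply
    weight * value, so the contributions sum to (score - baseline)."""
    contributions = {f: WEIGHTS[f] * x[f] for f in FEATURES}
    return sorted(contributions.items(), key=lambda kv: -abs(kv[1]))

investment = {"leverage": 0.8, "volatility": 0.6, "liquidity": 0.4}
print(round(risk_score(investment), 3))        # overall risk: 0.6
for feature, contribution in explain(investment):
    print(f"{feature}: {contribution:+.2f}")   # per-feature drivers
```

Run on this hypothetical investment, the model does not just report a risk of 0.6; it shows that leverage (+0.40) dominates the score, which is exactly the kind of answer that tells a firm what could be altered to reduce the risk.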
The impact of this increased transparency is significant. Firms can save costs by better understanding potential vulnerabilities, reduce time spent by knowing where to prioritise and focus, and implement more effective strategies and resource allocation, because explainable AI shows how behaviour can be modified to change outcomes.
This is all very exciting research, but as Mora pointed out, AI solutions need a huge amount of good-quality data. She noted that currently a huge proportion of the work developing AI goes into simply accessing the necessary data – something XBRL can vastly speed up. With a wider proliferation of high-quality structured data, these kinds of innovative AI solutions will be able to fulfil their proper potential.
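As a small illustration of why structured data lowers that access cost, here is a sketch of pulling numeric facts out of a minimal XBRL-style instance document using only the Python standard library. The `acme` taxonomy, element names, and values are invented for the example; a real filing would use a published taxonomy and far richer contexts.

```python
# Illustrative sketch: extracting tagged facts from a minimal XBRL-like
# instance. The taxonomy namespace and fact values are hypothetical.
import xml.etree.ElementTree as ET

INSTANCE = """\
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:acme="http://example.com/taxonomy">
  <acme:Revenues contextRef="FY2019" unitRef="USD" decimals="0">1000000</acme:Revenues>
  <acme:NetIncome contextRef="FY2019" unitRef="USD" decimals="0">150000</acme:NetIncome>
</xbrl>
"""

def extract_facts(xml_text):
    """Map each tagged fact to a float, keyed by its local element name."""
    root = ET.fromstring(xml_text)
    facts = {}
    for element in root.iter():
        local_name = element.tag.rsplit("}", 1)[-1]  # strip the namespace
        if element.get("contextRef"):  # XBRL facts carry a contextRef
            facts[local_name] = float(element.text)
    return facts

facts = extract_facts(INSTANCE)
print(facts["Revenues"])  # 1000000.0
```

Because every fact is explicitly tagged with its concept, context, and unit, a model's training pipeline can consume figures like these directly, instead of scraping them out of PDFs or free text.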
The take-home: financial services are demanding AI solutions with high credibility. To offer this, explainable AI uses transparency, showing the reasons behind results to build trust. But for this to take off, structured data is crucial.