Central banks going in for big data in a big way
The BIS (Bank for International Settlements) has published a new report with the results of a 2020 survey on the use of big data by central banks. It finds that adoption of big data in policy decisions has been rapid and widespread over the last five years, with many central banks developing platforms for data storage and analysis.
But what exactly do banks mean by ‘big data’? It’s an evolving concept that certainly includes unstructured data, often characterised by high volume, velocity and variety, and processed using innovative technologies. But for two thirds of respondents, big data also covers large, structured data sets that often have an ‘organic’ element, being collected as a by-product of commercial, financial and administrative activities.
So it comes from diverse sources and takes diverse forms – and around 80% of the responding central banks now use big data regularly, up from only a third in 2015. More than 60% rated big data as “very important” at the senior policy level, compared with less than 10% five years ago. Big data input into ‘nowcasting’ and short-term forecasting has proved particularly useful in times of heightened uncertainty or economic upheaval, such as during the Covid-19 pandemic. The survey also, however, underscored the need for adequate IT infrastructure and human capital, as well as concerns around the accuracy, ethics, legality and transparency of big data.
For us, this survey highlights central banks’ need for all kinds of data, from dynamic snapshots to the structured data and deep insights provided by more traditional off-site surveillance. As digitisation and machine processing progress across the board, it will be increasingly important to analyse, standardise and integrate these different data streams, as well as to share and implement best practices.
Read much more in the full report here.