
RegTech & Regulatory Reporting: Part 3: Harmonising Challenges with Technology


The Financial Crisis of 2008 had far-reaching social and economic ramifications. Within the financial services industry, the collapse of Lehman Brothers and the mess that ensued left governments searching for ways to guarantee such a situation could never happen again. The answer they settled on was a vast amount of new regulation forcing compliance and transparency, which was deemed the best way of avoiding another crisis.


This response posed a significant challenge to banks used to operating in a more liberal regulatory environment. The extent of the policy shift is best expressed in numbers. In 2017, banks dealt with an average of 185 regulatory changes every day; compare this with 2004, when banks only needed to deal with 10 changes per day, and the scale of the challenge becomes clear. Research by Thomson Reuters found that more than a third of financial firms spend one day per week simply tracking these changes in regulation. Yet this is not even the most significant challenge for banks. The real challenge is one of data: aggregating and managing it while ensuring its accuracy and attributability.


Regulatory changes such as Singapore's MAS 610, which requires each bank to submit 63 separate reports, cause headaches for banks. The huge increase in reporting templates that the revised MAS 610 rules necessitate translates into a daunting 75-fold increase in the number of data cells the regulator requires, and an enormous increase in data granularity through new schedules and classifications. In general, data field and data cell requirements are growing all the time, and around 20% of required data fields change every year. The sheer volume of required data, combined with the rate at which those requirements change, makes data management ever more difficult.


In terms of spending on regulatory compliance, numerous factors determine the level of spend necessary to remain compliant, including, inter alia, size, the financial services offered, the number of operational jurisdictions and data architecture. However, the stringency of regulation and the number of rules that banks must adhere to do not change with size. The regulatory reporting burden is therefore arguably more of a challenge for smaller banks, which may not have the resources or know-how to deal with the increase in, and rate of change of, regulation.


The larger banks currently spend between 10% and 15% of their IT budget on finance, risk and regulatory reporting technology, which equates to a total annual industry spend of 70 billion USD. Regulatory reporting alone accounts for 5-10% of that total, yet this segment is set to grow significantly, by as much as 25% over the next three to five years, taking annual industry spend on regulatory reporting to 15 billion USD. If the scale of the regulatory reporting challenge wasn't already clear, put it another way: more money is expected to be spent overcoming regulatory reporting challenges than the total GDP of Jamaica.


Each aspect of regulatory reporting requires a high level of accuracy and consistency across reporting periods. From the preparation of financial statements to risk measurement and statistical reporting, many banks rely on manual processes to ensure regulatory requirements are being met.


Over the past decade, many banks invested huge sums in risk, finance and performance measurement systems to help ease the reporting burden. However, these systems were built on technologies that cannot keep up with the swathes of updates to reporting requirements and ever-evolving financial legislation. The result is that banks are struggling to produce timely, accurate and consistent reports, both for internal analysis and for external regulators.


It is undeniable that the vast proliferation of data driven by ever-evolving regulatory updates poses a significant challenge for financial institutions. However, new technologies are seeking to improve data precision and consistency while simplifying the resource-intensive regulatory reporting process. Banks that are ahead of the curve and have already invested in such technology are seeing substantial cost-efficiencies and enviable ROIs.


To help leverage technology, standard industry data models could be used, which not only offer consistency but also provide advanced analytic capabilities that are far more efficient than manual processes, particularly through semantic layers that enable data quality checks, reconciliation and analysis via machine learning. This also ensures no time is wasted, by identifying overlapping or redundant regulation. Embracing new technologies therefore helps financial institutions meet current obligations while preparing for the future.
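As a minimal sketch of the idea, and nothing more, the record type, field names and rules below are illustrative assumptions rather than any particular industry standard: a shared data model can carry its own data-quality rules, so every report drawn from it is validated in the same, repeatable way rather than checked by hand.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative shared data model: every downstream report is built from the
# same record type, so field meanings and validation are defined only once.
@dataclass
class ExposureRecord:
    counterparty_id: str
    jurisdiction: str       # ISO 3166-1 alpha-2 country code
    asset_class: str        # e.g. "LOAN", "BOND", "DERIVATIVE"
    notional: float         # reported in the bank's base currency
    reporting_date: date

# Data-quality rules attached to the model rather than to individual reports.
RULES = {
    "counterparty_id present": lambda r: bool(r.counterparty_id),
    "jurisdiction is 2-letter code": lambda r: len(r.jurisdiction) == 2,
    "asset class recognised": lambda r: r.asset_class in {"LOAN", "BOND", "DERIVATIVE"},
    "notional non-negative": lambda r: r.notional >= 0,
}

def validate(record: ExposureRecord) -> list[str]:
    """Return the names of any data-quality rules the record breaks."""
    return [name for name, check in RULES.items() if not check(record)]

if __name__ == "__main__":
    sample = ExposureRecord("CPTY-001", "SG", "LOAN", 2_500_000.0, date(2021, 3, 31))
    print(validate(sample) or "all checks passed")
```

Because the checks live alongside the model, every team and every report reuses the same definitions, which is the consistency benefit the standard data models aim for.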


For example, BIAN promotes a common architectural framework for enabling banking interoperability by creating a standard semantic banking services landscape, with consistent service definitions, levels of detail and boundaries. This allows financial institutions to reduce integration costs and leverage the benefits of a service-oriented architecture when implementing commercial off-the-shelf (COTS) software.
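A rough illustration of why consistent service boundaries lower integration cost: if every provider is wrapped behind the same abstract interface, in-house components and COTS packages become interchangeable. The interface and method names below are hypothetical, written loosely in the spirit of a standardised service landscape, and are not taken from the BIAN specification.

```python
from abc import ABC, abstractmethod

# Hypothetical service boundary: callers depend only on this interface,
# not on any particular vendor or internal system.
class PartyReferenceDataService(ABC):
    @abstractmethod
    def retrieve_party(self, party_id: str) -> dict:
        """Return core reference data for a counterparty."""

class InHousePartyService(PartyReferenceDataService):
    def retrieve_party(self, party_id: str) -> dict:
        return {"party_id": party_id, "source": "internal ledger"}

class VendorPartyServiceAdapter(PartyReferenceDataService):
    def retrieve_party(self, party_id: str) -> dict:
        # In practice this adapter would translate the vendor's API
        # into the agreed service definition.
        return {"party_id": party_id, "source": "COTS package"}

def build_report(service: PartyReferenceDataService, party_id: str) -> dict:
    # Reporting logic is written once, against the shared boundary.
    return {"counterparty": service.retrieve_party(party_id)}
```

Swapping the implementation passed to build_report requires no change to the reporting logic, which is the integration saving a common service landscape is meant to deliver.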


There is general consensus in the industry that AI and machine learning will also have a role to play in reducing the reporting challenge. AI, for example, can help financial institutions manage their data and improve its accuracy. Machine learning can also help reduce variance in the interpretation of regulation, ensuring all banks understand what is required of them as it is intended. There is also talk from regulators such as the FCA of requiring data to be machine readable, which would allow the whole regulatory process to be automated in the future. Automation would give regulators real-time access to accurate and consistent data, greatly increasing the transparency of the industry while lowering cost.
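To make the automation point concrete, here is a minimal sketch of what machine-readable reporting requirements could enable. The rule format, field names and thresholds are invented for illustration and are not drawn from any regulator's specification: once a requirement is expressed as data rather than prose, checking a submission against it needs no manual interpretation.

```python
# Hypothetical machine-readable rules: each requirement is data, not prose,
# so every firm (and the regulator) can evaluate it identically and automatically.
RULES = [
    {"field": "tier1_capital_ratio", "operator": ">=", "threshold": 0.06},
    {"field": "leverage_ratio", "operator": ">=", "threshold": 0.03},
]

OPERATORS = {
    ">=": lambda value, threshold: value >= threshold,
    "<=": lambda value, threshold: value <= threshold,
}

def check_submission(submission: dict, rules: list[dict]) -> list[str]:
    """Return a list of rule breaches found in a reported submission."""
    breaches = []
    for rule in rules:
        value = submission.get(rule["field"])
        if value is None or not OPERATORS[rule["operator"]](value, rule["threshold"]):
            breaches.append(f"{rule['field']} fails {rule['operator']} {rule['threshold']}")
    return breaches

if __name__ == "__main__":
    report = {"tier1_capital_ratio": 0.081, "leverage_ratio": 0.025}
    print(check_submission(report, RULES))  # -> ['leverage_ratio fails >= 0.03']
```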


The pace of regulatory change shows no sign of letting up, so it is crucial that firms are proactive in managing what regulators expect of them. The aggregation and management of data will be central, and firms cannot continue to rely on manual processes given the growing volume of required data and soaring costs. Firms must therefore adopt industry-wide practices to ensure consistency and apply intelligence in data management to ensure accuracy.


These topics will be explored in more depth in our webinar, Revolutionising Regulatory Reporting, at 5pm (SGT) on June 4th.


For more information and registration details, click here.
