Transform banking. Transform core.
Slowly but surely, a bank’s entire technology ecosystem – from the programming languages, operating systems, and hardware to the way its IT organization operates – will need modernization. The situation is worsened by the shrinking pool of talent needed to keep legacy systems alive.
The scale of the problem
If you compare banks to companies like Google, it is evident that banks are still at the nascent stage of the digital and data revolution.
Vik Atal
Independent Board Director, Goldman Sachs
Decades of M&A activity have further exacerbated the challenges facing banking technology infrastructure, as disparate, incompatible back-end systems collide. The term “spaghetti systems” is often used to describe the tangled mess of interconnected systems and applications within a bank’s technology stack. These complex systems can cost a bank billions of dollars simply to maintain: banks globally are projected to spend US$309 billion on IT in 2022, US$120 billion of it in North America.1
On top of ballooning operating costs, banks face the threat of upheaval: frequent outages due to system upgrades, limited digital banking offerings, slow response and problem resolution, and more. As the pandemic has compelled more bank customers to go digital, overall customer satisfaction has declined further: the J.D. Power 2021 U.S. Direct Banking Satisfaction Study recorded a drop of 12 points from the previous year’s study.
Banks also face increasing competition from technology companies and neobanks as these players begin offering banking products. The new competitors often act with agility and speed, capitalizing on their access to customer data, relationships with merchants, and vast modern technology resources. At the same time, many of them are also partnering with banks, a dynamic often described as “collaborative competition”.
The announced launch of Google Plex is a case in point: Google is partnering with 11 banks to launch a mobile-first bank account integrated into Google Pay. There is no doubt that the boundaries of banking are blurring, and banks’ customer bases will continue to erode if they do not take action.
1 Source: https://www.statista.com/statistics/554889/it-expenses-of-banks-by-region/.
Focusing on products alone only exacerbates the problem
Many global banks with deep pockets are racing to double down on their investment in technology innovation. The proliferation of fintech companies is providing these banks with a lifeline and fueling the advancement of banking digitalization worldwide, especially in payments. However, banks are quickly realizing that their hands are tied when implementing many new digital banking solutions.
Their efforts to plug in new products are hampered by legacy mainframe-based, monolithic architecture. While legacy core banking systems have largely focused on reliability and security, they were not designed to process transactions in real time, nor do they possess the interoperability to connect quickly to digital banking applications without opening the core and going through lengthy development cycles. In addition, the inability to access data held in disparate systems hinders banks’ AI and analytics programs.
Legacy core technology is simply not capable of delivering innovation in today’s digital society. To get to the root of the problem, banks need to get to the “core” of the banking system – the platform that underpins all of the banking applications.
The evolution of core banking systems
To understand why core banking is where it is today, it helps to look at where it began.
To address 24/7 requirements, banks changed their systems to run in “stand-in” mode, so that payments are buffered while the bank works through its end-of-day process. This approach added considerable complexity, as banks found themselves building a bank within a bank just to handle stand-in. To meet changing consumer and regulatory demands, some banks developed new features and products using modern programming languages, which paved the way for them to ditch their expensive mainframes. These systems, however, are still monolithic and batch-based.
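A minimal sketch of the stand-in pattern – assuming a simple in-memory queue and a hypothetical Ledger interface, not any specific vendor’s implementation – shows why it amounts to running a second, simplified bank alongside the core:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Payment:
    account_id: str
    amount: int  # minor units, e.g. cents

class Ledger:
    """Hypothetical stand-in for the core ledger; posting is unavailable mid-batch."""
    def post(self, payment: Payment) -> None:
        print(f"posted {payment.amount} to {payment.account_id}")

    def run_batch(self) -> None:
        print("running end-of-day batch...")

class StandInBuffer:
    """Accepts payments 24/7; defers posting while the batch run holds the ledger."""
    def __init__(self, ledger: Ledger):
        self.ledger = ledger
        self.batch_running = False
        self.buffer: deque[Payment] = deque()

    def submit(self, payment: Payment) -> None:
        if self.batch_running:
            self.buffer.append(payment)   # core is "offline": accept but defer
        else:
            self.ledger.post(payment)

    def end_of_day(self) -> None:
        self.batch_running = True
        self.ledger.run_batch()           # lengthy batch-processing window
        self.batch_running = False
        while self.buffer:                # replay everything deferred during the run
            self.ledger.post(self.buffer.popleft())
```

Every rule the real core enforces must be duplicated, at least approximately, in the stand-in path – which is precisely where the added complexity comes from.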
Banks that did not want to risk moving away from their mainframes began to adopt a “hollow out the core” strategy, which involves pulling the product engine, along with other key capabilities, out of the core. Banks can then rely on more modern products to solve some of the shortcomings of the legacy core; the downside, however, is increased operational complexity and integration challenges. The proliferation of these tactical systems also creates data silos, which pose further obstacles to satisfying changing consumer behaviors and regulatory requirements.
Introducing the fourth generation
All of these efforts to fix legacy systems share one limitation: because the systems are inherently monolithic, they can only be scaled vertically. This fundamentally limits banks’ agility and their ability to reduce cost. Banks need to move away from legacy core systems, which stifle competitiveness in a market driven by technological advancement.
The next generation of core banking systems must be built with a cloud-first approach. Elastic scalability, mechanisms to ensure the integrity of data, and hyper-flexible configurability must be baked in. These systems must be able to scale out to handle massive throughput, and scale in to run on a minimal footprint in periods of low activity. There must be zero data loss, real-time access to data, and zero planned downtime. And critically: never shut the core.
Moving to the cloud
Cloud technology enables banks to manage resources on demand and make customer data more accessible, while offering the agility needed to process data in real time. Capacity is effectively limitless in the cloud. Banks that want to take full advantage of cloud infrastructure need to adopt cloud-native principles – building a core that is written in the cloud, for the cloud.
Building cloud-native systems requires a microservices architecture, in which an application is split into autonomous chunks called microservices that communicate via APIs. Microservices are inherently more robust: if a problem arises in one, it is isolated and contained so the rest of the system can keep running.
Microservices also scale independently. Deployed in the cloud, each one expands and contracts according to demand, which eliminates wasted infrastructure and allows massive spikes in demand to be handled smoothly. Cloud-native software can be updated hundreds of times a day with no downtime, with changes tested and deployed automatically.
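To make the idea concrete, here is a minimal sketch of a single microservice; the balance-lookup service, its endpoint, and the FastAPI framework are illustrative assumptions, not a prescribed stack:

```python
# A minimal balance-lookup microservice (illustrative only).
# Run with: uvicorn balances:app --port 8001
from fastapi import FastAPI, HTTPException

app = FastAPI()

# In a real deployment each microservice owns its own datastore;
# an in-memory dict stands in for one here.
_balances = {"acc-001": 12_500, "acc-002": 310}

@app.get("/balances/{account_id}")
def get_balance(account_id: str) -> dict:
    """Return the balance for one account, in minor units."""
    if account_id not in _balances:
        raise HTTPException(status_code=404, detail="unknown account")
    return {"account_id": account_id, "balance_minor_units": _balances[account_id]}
```

Because the service stands alone behind its API, a failure or redeploy of balance lookups does not take down payments or onboarding, and an orchestrator can scale just this one service when balance queries spike.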
Real-time access to data
Data – often characterized by volume, velocity, and variety – is the new gold in today’s digital economy. Banks can always pour money into their legacy systems to handle large data volumes, but velocity and variety are another matter: banks struggle to offer customers hyper-personalized experiences and insights, or even real-time balance and transaction details. Bank relationship managers are often exasperated because they lack the data needed to develop targeted recommendations for their clients and to identify prospective small-business customers exhibiting switching behaviors.
Taking a page from the technology giants, banks are quickly adopting streaming architectures to achieve real-time access to data. Core systems built with streaming APIs give banks the ability to process data in real time using modern AI and analytics technologies, enabling them to respond to both customer and regulatory demands effectively and efficiently.
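As a sketch of the pattern – assuming Apache Kafka and its confluent-kafka Python client, with a hypothetical core.postings topic and event fields – a downstream service can consume the core’s transaction stream as it happens:

```python
# Consume the core's transaction stream in real time (illustrative sketch).
# pip install confluent-kafka
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "analytics",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["core.postings"])  # hypothetical topic of posted transactions

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Feed each posting into real-time analytics,
        # e.g. flag a possible switching signal on outbound transfers.
        if event.get("type") == "external_transfer_out":
            print(f"possible switching signal on {event['account_id']}")
finally:
    consumer.close()
```

The same stream can feed relationship-manager dashboards, fraud models, and regulatory reporting simultaneously, without another round trip to the core.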
Gaining control of the product roadmap
The product roadmap for third-generation core systems covers both enhancements to the technology platform and the buildout of the financial products themselves. When a bank wants to add a new, unique product, it has to rely on the third-generation fintech to build it (and then customize it using parameters).
With fourth-generation systems, banks gain full ownership of their product roadmaps by separating the financial product layer from the platform layer within the core. They can update existing products and add new ones without waiting for changes to be made by a fintech, gaining tremendous flexibility and agility as they respond to fast-evolving customer and regulatory demands. The more control banks have over their products, the more power they have to increase customer satisfaction while controlling costs. And with the reliance on a fintech’s product roadmap removed, a bank no longer has to worry that its product differentiators will land in the hands of a competitor.
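As a hedged illustration of what a bank-owned product layer can look like – the ProductDefinition schema and its parameters are hypothetical, not any vendor’s actual API – a financial product becomes code and configuration the bank itself maintains and can change at will:

```python
# A bank-defined savings product, expressed as code the bank owns (hypothetical schema).
from dataclasses import dataclass, field
from decimal import Decimal

@dataclass
class ProductDefinition:
    product_id: str
    interest_rate_pa: Decimal   # annual interest rate
    monthly_fee: Decimal
    # balance threshold -> bonus rate earned at or above that balance
    tiers: dict[Decimal, Decimal] = field(default_factory=dict)

    def rate_for_balance(self, balance: Decimal) -> Decimal:
        """Effective annual rate, including any tier bonus earned."""
        bonus = Decimal("0")
        for threshold, tier_bonus in sorted(self.tiers.items()):
            if balance >= threshold:
                bonus = tier_bonus
        return self.interest_rate_pa + bonus

# The bank can launch or amend this product without waiting on a vendor release.
youth_saver = ProductDefinition(
    product_id="youth_saver_v2",
    interest_rate_pa=Decimal("0.025"),
    monthly_fee=Decimal("0"),
    tiers={Decimal("5000"): Decimal("0.005")},
)
```

Because the product definition lives outside the platform layer, launching a variant is a configuration change and a deployment, not a core upgrade.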
In summary
Consider a future where next-generation core service providers offer an end-to-end digital banking experience to their partner banks. These future core providers will develop their own innovative solutions for their financial institution clients. But they will also allow institutions to develop their own technology or partner with fintechs – all while providing flexible access to the data on the core provider’s systems. These shared data and software interface standards will support a marketplace of innovative technology, providing creative freedom to banks and new products and services for consumers. We are not there yet.
Jelena McWilliams
FDIC Chair
Envision the bank of the future, where innovation is made easy. Business and product designers can make valuable changes to the customer experience on the fly, run A/B tests, and launch new products in weeks, not months. No longer does a bank risk being marooned on an out-of-date application, unable to upgrade because the gap has grown too large. Costs are determined by consumption, not by peak capacity.
The gains are astronomical. To take control of the future, banks need to start by rebuilding their core in the cloud.
Media enquiries:
press@thoughtmachine.net
Head office:
7 Herbrand Street
London WC1N 1EX
United Kingdom