ReportWire

Tag: Core Banking Transformation

  • Solving the puzzle: How interoperability eases banks’ core modernization dilemma | Accenture Banking Blog

    If it wasn’t already apparent, the sudden emergence of generative AI has certainly confirmed it: the ability to quickly respond to new challenges and opportunities is a critical attribute of any successful organization, including banks. The agility that allows an organization to respond effectively is influenced by several factors, such as skills, culture and corporate structure. But a critical enabler is a modern digital core. 

    Most banks know all about the constraints of an inflexible, monolithic core. Fortunately, new core system architectures are easing these constraints. These architectures allow a hybrid approach in which legacy and new core systems are integrated seamlessly. They coexist not only with each other but also with technologies from various vendors across different hosting models. This approach significantly reduces the risk that inevitably accompanies the challenging task of retiring the bank’s legacy core. 

    Several emerging concepts underpin this approach: ‘interoperability’, ‘composability’ and ‘coreless banking’. These extend beyond technology, enabling new ways to develop business models, products, services and experiences by leveraging the best capabilities available in-house, within the banking industry and in the broader technology ecosystem. The result is an organization whose structure and component parts allow it to evolve relatively effortlessly with market changes. Together, these three concepts make the age-old aspiration of a ‘bank of one’ attainable. 

     What do these terms mean? 

    A composable core architecture consists of smaller, self-contained components. These components can be built, tested, and deployed on their own, and they can be quickly and easily assembled in various configurations to meet different use cases. This avoids the cost and risk of changing the entire core system. 
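To make the idea concrete, here is a minimal Python sketch of composability. The component and product names are invented for illustration and are not drawn from any real banking platform; the point is only that independent, individually testable components can be assembled into different product configurations without touching a monolithic core.

```python
# Illustrative sketch only: small self-contained components assembled
# into different product configurations. All names are hypothetical.

class Component:
    """A self-contained capability with a single entry point."""
    def apply(self, account: dict) -> dict:
        raise NotImplementedError

class InterestAccrual(Component):
    def __init__(self, annual_rate: float):
        self.daily_rate = annual_rate / 365
    def apply(self, account: dict) -> dict:
        account["balance"] *= (1 + self.daily_rate)
        return account

class MonthlyFee(Component):
    def __init__(self, fee: float):
        self.fee = fee
    def apply(self, account: dict) -> dict:
        account["balance"] -= self.fee
        return account

def compose(*components: Component):
    """Assemble independent components into one product pipeline."""
    def product(account: dict) -> dict:
        for c in components:
            account = c.apply(account)
        return account
    return product

# Two different products assembled from the same catalogue of parts.
savings = compose(InterestAccrual(0.04))
checking = compose(MonthlyFee(5.0))
```

Each component can be built and tested on its own, and a new use case is served by a new `compose(...)` call rather than a change to a shared core.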

    Interoperability is the ease with which the various parts of the architecture can connect and communicate with each other. Examples include data exchange and collaboration. 

    A composable / interoperable architecture reduces technical debt and improves organizational agility. Instead of relying on a single monolithic system, it enables a decoupled, orchestrated configuration of best-of-breed solutions that can be sourced both internally and externally. It affects much more than just the bank’s IT systems, enhancing its operating model, products, experiences and talent management—including even its business model. 

    In 2021, Gartner surveyed more than 2,300 organizations for its paper Composable Applications Accelerate Digital Business. It found that firms characterized by high interoperability and composability reported significantly greater business performance and revenue, and lower risk and cost, than their peers. 

    For banks to embrace interoperability and composability and to adopt this hybrid approach, they must: 

    1. Revisit their organizations, the principles underpinning their talent and culture, and their operating models. 
    2. Define the North Star for their quest for interoperability and composability. They should think big and then start small, building the minimum foundational architecture for interoperability. 
    3. Create a pipeline of prioritized initiatives, aligned with their North Star, to minimize regrettable steps and ensure the timely delivery of targeted business outcomes. 

    Building blocks of an interoperable / composable IT architecture 

    To understand the interoperable/composable IT architecture, we must consider its various aspects. It starts with decoupling, which involves separating the different front-end, middleware and back-end engines through specific abstraction layers and patterns. This separation allows for easier movement and scaling of processes, domains, products and/or packages. 
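The decoupling described above can be sketched in a few lines of Python. This is a hypothetical illustration, not any vendor's API: the front end depends only on an abstraction, so a legacy engine and a modern engine can be swapped, moved or scaled independently behind it.

```python
# Sketch of decoupling through an abstraction layer. All class and
# account names are invented for illustration.

from abc import ABC, abstractmethod

class CoreLedger(ABC):
    """Abstraction layer separating front ends from back-end engines."""
    @abstractmethod
    def balance(self, account_id: str) -> float: ...

class LegacyMainframeLedger(CoreLedger):
    def __init__(self):
        self._records = {"A1": 250.0}   # stand-in for mainframe data
    def balance(self, account_id: str) -> float:
        return self._records[account_id]

class ModernCloudLedger(CoreLedger):
    def __init__(self):
        self._records = {"A1": 250.0}   # stand-in for a cloud-native store
    def balance(self, account_id: str) -> float:
        return self._records[account_id]

def mobile_app_balance(ledger: CoreLedger, account_id: str) -> str:
    # The front end never knows which engine it is talking to.
    return f"Balance: {ledger.balance(account_id):.2f}"
```

Because the front end is written against `CoreLedger` rather than either engine, retiring the mainframe implementation later requires no front-end change.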

    However, the approach extends beyond decoupling. It also includes technical integration, which means connecting diverse technologies that may come from a variety of sources, both in-house and third-party. These technologies may use different protocols, operate on different hosting models (such as on-premise and cloud), and function under different commercial models (including custom-built, licensed, subscription-based, and XaaS).  
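One common way to integrate technologies that speak different protocols is the adapter pattern. The sketch below is illustrative only; the vendor classes and payload formats are invented to stand in for, say, an XML-based legacy interface and a JSON-based SaaS one.

```python
# Hypothetical sketch: adapters normalize third-party systems with
# different protocols into one internal call shape.

import json

class VendorSoapPayments:
    """Pretend vendor speaking an XML-style protocol."""
    def submit(self, xml: str) -> str:
        return "<status>OK</status>"

class VendorRestPayments:
    """Pretend vendor speaking JSON over REST."""
    def post(self, body: str) -> str:
        return json.dumps({"status": "OK"})

class PaymentAdapter:
    """The single shape the rest of the bank's stack programs against."""
    def pay(self, amount: float) -> bool:
        raise NotImplementedError

class SoapAdapter(PaymentAdapter):
    def __init__(self, vendor):
        self.vendor = vendor
    def pay(self, amount: float) -> bool:
        return "<status>OK</status>" in self.vendor.submit(f"<amt>{amount}</amt>")

class RestAdapter(PaymentAdapter):
    def __init__(self, vendor):
        self.vendor = vendor
    def pay(self, amount: float) -> bool:
        reply = self.vendor.post(json.dumps({"amt": amount}))
        return json.loads(reply)["status"] == "OK"
```

Internal callers see only `PaymentAdapter.pay`, so the commercial model or hosting location of the vendor behind it can change without rippling through the architecture.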

    This integration is crucial, but the concept of an interoperable and composable architecture goes even further than this. Orchestration and choreography are other important factors. So too is the need to coordinate and sequence the journeys and IT processes in which many different components and technologies work together, under both synchronous and asynchronous patterns, to support each business service.  
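The distinction between orchestration and choreography can be shown with a toy Python sketch. Everything here is hypothetical (the step and event names are invented): in orchestration a central coordinator owns the sequence, while in choreography each component reacts to events and emits new ones.

```python
# Illustrative contrast between the two coordination styles.

# --- Orchestration: one coordinator owns the sequence of steps.
def open_account_orchestrated(log: list) -> list:
    for step in ("kyc_check", "create_ledger_entry", "issue_card"):
        log.append(step)          # each step could be a remote call
    return log

# --- Choreography: components subscribe to events and emit new ones.
subscribers: dict = {}

def on(event: str, handler) -> None:
    subscribers.setdefault(event, []).append(handler)

def emit(event: str, log: list) -> None:
    log.append(event)
    for handler in subscribers.get(event, []):
        handler(log)

# Each component only knows the events it listens to and emits.
on("kyc_passed", lambda log: emit("ledger_created", log))
on("ledger_created", lambda log: emit("card_issued", log))
```

In practice a journey often mixes both: a synchronous orchestrated spine with asynchronous, event-driven side effects.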

    But the main enabler of interoperability and composability is less obvious than all of these other elements. It is a structural coexistence architecture. This concept covers all the logical and technical cross-product components that should be abstracted from the specific domain / product engines. Examples include functional domains, capabilities and data such as the customer information file, product factory, pricing engine, statements and reports engine, and pooling and sweeping capabilities. These cross-product functions will be ingested as required and retro-fed by all product processing engines regardless of their nature or source, but they should not be hosted in the core of any of them. 
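A minimal sketch of this idea, using the customer information file as the cross-product example: the CIF is a shared component that every product engine reads from and retro-feeds, but that none of them hosts. The class and method names are hypothetical.

```python
# Illustrative sketch of a cross-product component abstracted out of
# the product engines.

class CustomerInformationFile:
    """Cross-product component, hosted by none of the product engines."""
    def __init__(self):
        self._customers: dict = {}

    def get(self, customer_id: str) -> dict:
        return self._customers.setdefault(customer_id, {"segments": set()})

    def retro_feed(self, customer_id: str, segment: str) -> None:
        # Product engines write back what they learn about the customer.
        self.get(customer_id)["segments"].add(segment)

class DepositsEngine:
    def __init__(self, cif: CustomerInformationFile):
        self.cif = cif
    def open_deposit(self, customer_id: str) -> None:
        self.cif.retro_feed(customer_id, "deposits")

class LendingEngine:
    def __init__(self, cif: CustomerInformationFile):
        self.cif = cif
    def open_loan(self, customer_id: str) -> None:
        self.cif.retro_feed(customer_id, "lending")
```

Because both engines share one CIF instance rather than each embedding its own copy, either engine can be replaced or re-hosted without losing the single customer view.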

    From the technical point of view, we also need cross-product capabilities such as single sign-on, session handling, identity management, logging, network management, observability, saga patterns, and error handling and catching. The devil is in the details, and performance and reliability depend greatly on having answers to all these technical challenges. While there is no single solution, the most important lesson we have learned is that flexibility and fit-for-purpose will always be the critical attributes. 
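Of the capabilities listed above, the saga pattern deserves a brief sketch, since it is how distributed architectures keep multi-step business transactions consistent without a single database transaction. This is a generic, hypothetical illustration: each local step is paired with a compensating action that undoes it if a later step fails.

```python
# Illustrative saga runner: steps are (action, compensation) pairs.
# If any action fails, completed steps are compensated in reverse order.

def run_saga(steps, log: list) -> bool:
    done = []
    try:
        for action, compensation in steps:
            action(log)
            done.append(compensation)
    except Exception:
        for compensation in reversed(done):   # undo in reverse order
            compensation(log)
        return False
    return True

# Helpers to build toy steps for demonstration.
def ok(name):
    return lambda log: log.append(name)

def fail(name):
    def _f(log):
        raise RuntimeError(name)
    return _f
```

In a real deployment the actions would be calls to separate services (debit one ledger, credit another), and the saga state would be persisted so compensation survives a crash.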

    A successful transformation journey does not require having all these architecture components in place as a starting point. However, it is essential that you have a clear high-level vision of your target IT model and architecture—that you think big.  

    It’s also advisable that you begin with the minimum foundational architecture components that will create the path to your North Star. Starting small in this way avoids regrettable moves and other waste. As you add other ongoing initiatives to this journey, you will be able to prioritize your migration roadmap and construct additional lanes in your interoperable / composable architecture. Together, these will ensure that at the same time as modernizing your core, you also enable the business, reduce your TCO and mitigate your IT risk. 

    The benefits of interoperability / composability for banks 

    Interoperability and composability enable banks to side-step the longstanding dilemma of either moving their core systems from their mainframe to a modern platform at significant risk and expense, or retaining their legacy infrastructure at the cost of flexibility, innovation and competitiveness. Because interoperability ensures new systems can work alongside on-premise mainframe systems, accessing vital data as and when required, there is less pressure to replace older systems. This coexistence allows the gradual evolution of the bank’s core systems, as they could be maintained either indefinitely or for a planned duration leading up to a pivot.  

    While the technology behind interoperability and composability may be difficult to grasp, the benefits are more obvious. Our own work with a large number of leading organizations has revealed improvements in the performance of their IT enterprise. We found that, on average, firms that embrace interoperability and composability can cut their time to market for new products by 15% and lower the effort and cost of integration by 20%. They can increase asset reusability by 10% by making their assets more observable, and they can increase component standardization by 20%*. 

    It’s no surprise that the adoption of interoperability is regarded by many—including rating agencies and many banking clients—as a turning point in business performance. 

    Real solutions for core banking modernization 

    In a sector that has seen more than its fair share of false dawns, there is bound to be some scepticism. What is it about this approach that should give banks real hope that core modernization can be achieved at an acceptable speed, risk and cost?  

    Three key trends have combined to unlock the promise of interoperable and composable technology solutions: the growing appeal of cloud, the proliferation of APIs and the maturity of open architectures. This includes hyper-scaled computing at both the core and the edge, and the roll-out of high-speed networks. It is characterized by mature, efficient and resilient architectures in which microservices and APIs play a key role. It embraces cloud, open source and market standards, ensuring versatility and coexistence with on-premise and SaaS components. And it offers a variety of modernization roadmaps to suit every bank’s requirements. 

    An outdated, under-performing digital core has long been a thorn in the side of most banks. This new approach to core modernization changes everything. If you would like to find out more about how your organization could become interoperable and composable, and how it might benefit, we would be happy to talk to you—simply get in touch with me here. 

    In the meantime, you may be interested in our recent report on banking transformation and the importance of a modern digital core in capitalizing on opportunities such as generative AI. In addition, our Top 10 Trends for Banking in 2024 explores this topic in the section The Key to the Core. 

     

    * Source: Accenture experience working with leading global clients. 


    This makes descriptive reference to trademarks that may be owned by others. The use of such trademarks herein is not an assertion of ownership of such trademarks by Accenture and is not intended to represent or imply the existence of an association between Accenture and the lawful owners of such trademarks.

    Disclaimer: This content is provided for general information purposes and is not intended to be used in place of consultation with our professional advisors. Copyright© 2024 Accenture. All rights reserved. Accenture and its logo are registered trademarks of Accenture.

    Alvaro Ruiz

  • Core banking modernization: Unlocking legacy code with generative AI | Accenture Banking Blog


     Early trials with generative AI show that the tangle of legacy code that hinders every aspect of the transformation of banks’ core products and services could be quickly and effectively resolved. By Alvaro Ruiz, Global Core Banking Lead, Accenture.

    At the heart of most traditional banks is a mainframe computer running the software that defines and drives the organization’s core processes. This is where many of the bank’s problems start. To meet customers’ and regulators’ growing demands, to improve cost efficiency, and to keep up with the rapid pace of change – which includes taking advantage of emerging technologies – banks need the agility to innovate quickly. This is not something the industry is famous for.

    The reason is that the core system behind every banking product comprises many thousands of lines of computer code. Most of it was written decades ago, in languages that can no longer meet the needs of modern banking. Over the years it has become multi-layered, convoluted and fragile. This is due to repeated changes, add-ons and connections with newer applications, and the general tendency to retain rather than decommission outdated applications. These issues, together with the fact that the code tends to be poorly documented, make it difficult to understand and upgrade.

    Banks have long wanted to re-engineer or replace this code – convert COBOL to a modern language like Java, for example – but this would be a mammoth task requiring hundreds of engineers and taking several years. Most of the programmers who wrote the original source code have long since retired, making it difficult and expensive to recruit people with these skills in sufficient numbers. This, in turn, adds to the risk of the exercise. And so it remains on the back burner, an important priority but one that is hard to justify in the current circumstances.

     

    Legacy tech: A costly roadblock to banking innovation

    At last, however, we are seeing light at the end of the tunnel. We believe that generative AI, combined with new composable, interoperable and coreless architectures, could offer the solution to this decades-old problem. We are working with a handful of banking clients to test this approach, and the early results are very promising. As my colleague Michael Abbott, Accenture’s global banking lead, told American Banker: “It’s early days, but we’re seeing 80% to 85% accuracy.”

    The process starts by using specialist generative AI models and a process called retrieval augmented generation (RAG) to reverse-engineer the code. This allows us to understand and document the requirements which the code is designed to meet.
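The retrieval half of RAG can be illustrated with a deliberately tiny Python sketch. Everything here is hypothetical: the COBOL chunk names are invented, and a real pipeline would use embedding-based similarity rather than keyword overlap. The point is only the shape of the step: index legacy code chunks, then pull the most relevant ones into the context given to the generative model.

```python
# Toy retrieval step of a RAG pipeline over legacy code chunks.
# Illustrative only: real systems use embeddings, not token overlap.

def build_index(chunks: dict) -> dict:
    """Map each chunk name to its lowercase token set."""
    return {name: set(text.lower().split()) for name, text in chunks.items()}

def retrieve(index: dict, query: str, k: int = 2) -> list:
    """Return the k chunk names with the largest token overlap."""
    q = set(query.lower().split())
    ranked = sorted(index, key=lambda name: len(index[name] & q), reverse=True)
    return ranked[:k]

# Invented stand-ins for fragments of a legacy COBOL codebase.
legacy_chunks = {
    "INTEREST-CALC": "COMPUTE INTEREST = BALANCE * RATE / 365",
    "FEE-POSTING":   "SUBTRACT MONTHLY-FEE FROM BALANCE",
    "STMT-PRINT":    "WRITE STATEMENT-LINE FROM BALANCE",
}
index = build_index(legacy_chunks)
```

The retrieved chunks, together with any surviving documentation, would then be placed in the model's context so that the documented requirements it produces are grounded in the bank's actual code rather than in unverified sources.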

    The next step is forward-engineering, for which we see two main paths. The first is the automatic recoding of the software into a modern and versatile language using an iso-functional (like-for-like) approach. The second uses specialized generative AI models to re-imagine the functionality required to meet the current objectives of the technology and the business. In either case, we then use generative AI to automatically test every part of the new code and its performance, and to facilitate the transition from a mainframe hardware and software stack to a modern set of frameworks and compute technology.

    Put simply, we replace the old code with new programs that are simpler and more flexible, and that support the bank’s modernization strategy.
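The automated testing step described above is, at its core, a parallel run: execute the legacy routine and its regenerated replacement on the same inputs and flag any divergence. The sketch below is a hypothetical harness; both functions are invented stand-ins, not real converted code.

```python
# Illustrative parallel-run harness for verifying regenerated code
# against legacy behavior. Both implementations are stand-ins.

def legacy_interest(balance: float, annual_rate: float) -> float:
    # Stand-in for the behavior of an old COBOL routine.
    return round(balance * annual_rate / 365, 2)

def regenerated_interest(balance: float, annual_rate: float) -> float:
    # Stand-in for the AI-generated replacement.
    return round(balance * annual_rate / 365, 2)

def parallel_run(cases) -> list:
    """Return the input cases where old and new outputs disagree."""
    return [case for case in cases
            if legacy_interest(*case) != regenerated_interest(*case)]
```

An empty result from `parallel_run` over a representative case set is the evidence that the replacement is iso-functional; any non-empty result pinpoints exactly which inputs still need human review.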

     

    “It’s early days, but we’re seeing 80% to 85% accuracy.”

     

    Having composable, interoperable and coreless architectures is key to most banks’ modernization, as it allows parts of their legacy applications to co-exist frictionlessly with modernized parts and even with third-party products from different sources. It also allows banks to employ different hosting models at the same time, which is essential to meeting new business requirements and generating timely business outcomes while legacy modernization is under way.

    The benefits of this approach are potentially game-changing. A core modernization project that in some reported cases cost more than US$700 million and caused years-long business bottlenecks during its execution could now be completed in a fraction of the time with no negative impact on the business. The cost and risk advantages are obvious. In recent work with a global bank we converted 25,000 lines of legacy code, cutting the reverse-engineering effort by 50% and boosting testing efficiency by 30%. This saved more than 50% of the original budget.

    More importantly, our analysis indicates that banks that modernize their core could potentially increase their return on equity by 8.3 percentage points by improving their manufacturing, distribution and servicing. They would also dramatically facilitate risk management and regulatory compliance.

    It is of course vital that we guard against the tendency of generative AI to plagiarize and, if it doesn’t know the answer, to fabricate. RAG helps in this regard by drawing on the bank’s knowledge base, including its existing code repository, rather than unverified external sources. However, until generative AI matures and these failings are remedied, the new code does need to be carefully checked and tested.

    From legacy to leading with a modern digital core

    Every bank knows that a modern digital core is critical to its ability to compete and meet customer needs. I believe we are on the cusp of putting this elusive goal within the reach of every financial services organization – quickly and affordably, while keeping a tight control on risk.

    To find out more about this topic we have a section titled The Key to the Core in our Top 10 Trends for Banking in 2024. We have also just published a new report, The Age of AI – Banking’s New Reality, which explores the potential role of generative AI throughout the bank.

    If you would like to find out how our code conversion trials are progressing, please contact me directly on LinkedIn.

    Disclaimer: This content is provided for general information purposes and is not intended to be used in place of consultation with our professional advisors. Copyright© 2024 Accenture. All rights reserved. Accenture and its logo are registered trademarks of Accenture.
