The payments industry is evolving rapidly, fueled by technological advancements, changing consumer behaviors, and a growing appetite for real-time transactions. As this transformation unfolds, new standards have been introduced to ensure the payments ecosystem's safety, security, and efficiency.
One of the pivotal standards in this domain is ISO 20022. Developed by the ISO Technical Committee TC68 Financial Services, this multi-part International Standard seeks to introduce a unified language and model for payments data. The objective? To elevate the quality of payments for everyone in the industry. While ISO 20022 is yet another standard for data interchange, its ultimate goal is universal convergence. With other standards like MDDL, FIX, FinXML, and many more still in play, the importance of ISO 20022's unifying implementation becomes even more pronounced.
By November 2025, adherence to ISO 20022 will be mandatory, necessitating an extensive overhaul of payments architectures and infrastructure. This transition will reshape the message formats, business processes, and data elements essential for electronic data exchange among financial institutions. It's a non-trivial task, and banks and financial services firms must face the challenge head-on. Non-compliance carries more than penalties; it brings reputational risk. On the flip side, the regulation also gives these institutions an opening to revamp their payments platforms and harness real-time applications like fraud monitoring, SLA monitoring, and security & risk monitoring.
With many legacy systems still operational and a plethora of data interchange standards still in use, the payments industry needs a bridge. Confluent's data streaming platform is that bridge, enabling seamless data translation between old and new formats without prematurely sidelining legacy systems.
As we edge closer to the 2025 deadline, this blog will delve into how Confluent facilitates a smooth transition to the ISO 20022 standard, ensuring legacy messages remain interoperable during this shift.
Confluent features have played an instrumental role in various payment architectures across the industry. The platform's ability to handle high volumes of data streams in real time has particularly benefited payment processors and financial institutions. Confluent's event-driven architecture and support for multiple data sources enable integration with existing payment systems. Its scalable and reliable distributed architecture ensures that payment data is processed efficiently and accurately while providing enhanced security and compliance features.
Confluent has become the central nervous system of many major global payment ecosystem participants. This includes networks, gateways, online payment platforms, and banks.
It has been particularly popular with FinTech firms launching new cloud-native platforms, as they depend on real-time access to data to support their microservices-based applications. That’s because Confluent:
Connects all the payments networks that banks have relationships with to the bank’s core infrastructure, using adaptors and APIs. This enables aggregations to support multiple views of different payment rails.
Delivers an aggregate view of the bank’s payments activities, regardless of how or where they originate.
Tracks and retains all the key payments data as the payments are received, processed, and settled. Payments can be kept in Confluent briefly or indefinitely; it’s up to the individual company.
Allows a broad range of relevant applications to leverage that data for contextual real-time applications. Payments are normally fed into the bank’s mainframe before processing, whereas data sent to Confluent can be available moments after creation.
Uses a platform approach that takes the load—and critically, the cost—away from the mainframe while making data that would otherwise be hard to obtain available to business-critical services like fraud detection and security operations.
Below is a typical real-time payments platform reference architecture built using Confluent Platform.
As the diagram highlights, Confluent is a data integration platform for any kind of data. For payments teams, this means you can build a consolidated view across payment systems (ACH, cards, BACS, Swift, and FedNow) while standardizing on ISO 20022 formats as real-time data streams. These data streams then let your teams build contextual, real-time applications on the platform, including fraud monitoring/AML, SLA monitoring, compliance, and security & risk monitoring, while modernizing the payments platform for the future.
The ISO 20022 standard aims to harmonize and streamline the communication of financial data between different systems and organizations, promoting interoperability and reducing complexity. Let’s consider the critical factors that will determine your payment platform’s readiness for compliance:
1. Business process models: These models define the flow of financial transactions and the associated business processes, such as payment initiation, clearing, and settlement. They provide a consistent and clear understanding of the various steps and roles in a financial transaction. These process models can be implemented by composing event-driven microservices that communicate and exchange event data via Kafka topics.
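To make the idea concrete, here is a minimal sketch of a payment lifecycle (initiation, clearing, settlement) modeled as events passed between stages via named topics. An in-memory dict stands in for Kafka, and the stage names, topic names, and event fields are illustrative assumptions, not prescribed by the standard.

```python
# Sketch: an ISO 20022-style payment flow as event-driven stages.
# The `topics` dict stands in for Kafka topics; each stage would be an
# independent consumer/producer service in a real deployment.
from collections import defaultdict

topics = defaultdict(list)  # topic name -> list of events

def initiate_payment(payment):
    """Payment initiation service: publishes an initiation event."""
    event = {**payment, "status": "initiated"}
    topics["payments.initiation"].append(event)
    return event

def clear_payment(event):
    """Clearing service: consumes initiation events, emits cleared events."""
    cleared = {**event, "status": "cleared"}
    topics["payments.clearing"].append(cleared)
    return cleared

def settle_payment(event):
    """Settlement service: consumes cleared events, emits settled events."""
    settled = {**event, "status": "settled"}
    topics["payments.settlement"].append(settled)
    return settled

# In production each stage reacts to its input topic asynchronously;
# here the stages are chained directly just to show the flow.
p = initiate_payment({"payment_id": "P-001", "amount": "100.00", "ccy": "USD"})
settled = settle_payment(clear_payment(p))
print(settled["status"])  # settled
```

Because each stage only reads from and writes to topics, stages can be scaled, replaced, or audited independently, which is the property the business process models rely on.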
2. Data dictionary: The ISO 20022 data dictionary is a comprehensive repository of standardized financial data elements and their definitions. It ensures consistency in representing and interpreting financial information across different message formats and systems. These data elements can be represented as schemas in Confluent’s Stream Governance product.
3. Message formats: The standard specifies message definitions for financial transactions across various business domains. These definitions share a standard schema and are designed to be extensible, accommodating the evolving needs of the financial services industry.
The message formats are established after a thorough review by the industry group standards body and the Registration Authority, ensuring consistency across domains.
Confluent’s Stream Governance defines message formats that evolve as the business needs change and ensures data compatibility across services.
One can transmit XML data (FedNow data in FedISO format and/or Swift data in SwiftISO format) to Confluent and create services that convert it to modern formats such as Avro, JSON, or Protobuf. Newer microservices and functionality are then developed against Avro, JSON, or Protobuf rather than XML, to take advantage of these formats' ecosystems and compatibility tooling. It is worth noting that Confluent does not include an XML serializer, but open-source options are available in the community.
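A minimal sketch of such a conversion service is shown below: it parses an ISO 20022-style XML payload and emits JSON that can be produced to a topic. The XML fragment is a simplified, illustrative pacs.008-like snippet, not a complete or validated ISO 20022 message.

```python
# Sketch: converting an ISO 20022-style XML payload into JSON before
# producing it to Kafka. The fragment below is illustrative only.
import json
import xml.etree.ElementTree as ET

SAMPLE_XML = """
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08">
  <FIToFICstmrCdtTrf>
    <GrpHdr>
      <MsgId>MSG-0001</MsgId>
      <NbOfTxs>1</NbOfTxs>
    </GrpHdr>
  </FIToFICstmrCdtTrf>
</Document>
"""

def strip_ns(tag):
    """Drop the '{namespace}' prefix ElementTree puts on tags."""
    return tag.split("}", 1)[-1]

def xml_to_dict(elem):
    """Recursively convert an element tree into nested dicts/strings.
    (One entry per tag; repeated sibling tags would need list handling.)"""
    children = list(elem)
    if not children:
        return (elem.text or "").strip()
    return {strip_ns(c.tag): xml_to_dict(c) for c in children}

root = ET.fromstring(SAMPLE_XML)
payload = {strip_ns(root.tag): xml_to_dict(root)}
print(json.dumps(payload, indent=2))
```

Once the payload is JSON (or mapped onto an Avro/Protobuf schema), downstream services can rely on Schema Registry rather than handling raw XML bytes.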
4. Syntax and validation rules: ISO 20022 defines a set of rules for constructing and validating financial messages, which are documented as standard schema definitions (XSD). These rules help ensure that messages are correctly structured and conform to the standard, promoting interoperability and reducing errors in message processing. It’s common to write stream processing applications that monitor for data quality and compatibility.
Confluent can help enforce schema compatibility at the broker after the messages are converted into a format supported by Confluent (Avro, JSON, or Protobuf), rejecting incompatible records and ensuring data quality.
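The following sketch reduces the kind of backward-compatibility rule Schema Registry enforces to plain dicts: a new schema is backward compatible if consumers using it can still read data written with the old one, which means any field it adds must carry a default. The field layout is a simplified stand-in for Avro record fields, not Schema Registry's actual implementation.

```python
# Sketch: a simplified backward-compatibility check, in the spirit of
# Schema Registry's BACKWARD mode. Real Avro resolution handles types,
# aliases, and unions; this only checks the added-field/default rule.
def is_backward_compatible(old_fields, new_fields):
    old_names = {f["name"] for f in old_fields}
    for f in new_fields:
        if f["name"] not in old_names and "default" not in f:
            return False  # new required field: old records can't be read
    return True

old = [{"name": "payment_id"}, {"name": "amount"}]
ok_new = old + [{"name": "currency", "default": "USD"}]  # added with default
bad_new = old + [{"name": "currency"}]                   # added, no default

print(is_backward_compatible(old, ok_new))   # True
print(is_backward_compatible(old, bad_new))  # False
```

With broker-side schema validation enabled, a producer attempting the incompatible change would be rejected before bad records ever reach the topic.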
5. Metadata repository: The ISO 20022 metadata repository is a central source of information about the standard, containing the data dictionary, message formats, and other related information. It enables users to access and understand the standard's components, making developing and maintaining compliant applications easier. Stream Governance provides a searchable repository for schemas, topics, connectors, and how they relate.
Building a next-generation real-time payments platform that supports ISO 20022 on Confluent is ideal for several reasons.
1. Scalability: Confluent Cloud and Confluent Platform, Confluent’s enterprise-grade and cloud-native distributions of Apache Kafka®, are designed to scale horizontally. This means your organization can handle large volumes of ISO 20022 financial messages and events with ease. As your financial data needs grow, you can easily add more capacity, ensuring that your implementation can keep up with increasing data throughput.
2. High availability and fault tolerance: Confluent provides built-in replication and partitioning features that ensure high availability and fault tolerance for your ISO 20022 data streams. This means that even in the event of hardware or network failures, your financial messaging system can continue to operate without significant disruption. Confluent Cloud has a 99.99% uptime SLA guarantee.
3. Real-time data processing: Confluent Platform is designed for real-time data processing and streaming, enabling you to process and analyze ISO 20022 financial messages as they are generated. Specifically, Confluent supports Kafka Streams, a stream-processing library, and ksqlDB, a stream-processing engine, which let you perform real-time analytics and transformations on your ISO 20022 data streams, derive valuable insights, and react quickly to changing business conditions.
4. Hybrid or multicloud architecture: Confluent is designed to run anywhere you need it to. Confluent Platform runs on-premises and can be seamlessly linked to Confluent Cloud using Cluster Linking, which keeps data in sync no matter where it originates.
5. Data Governance: Confluent Stream Governance is a centralized service for managing and storing your ISO 20022 message schemas. It enables you to handle multiple versions of message schemas and maintain compatibility between them, which is essential for evolving ISO 20022 standards.
6. Connectors and integrations: Confluent offers a wide range of pre-built connectors and integrations that enable seamless data exchange between your ISO 20022 implementation and various external systems, such as databases, APIs, and third-party services. This simplifies integrating your ISO 20022 messaging system with your existing infrastructure.
7. Security and compliance: Confluent provides robust security features, such as encryption, authentication, and role-based access controls, ensuring your ISO 20022 implementation complies with industry regulations and best practices for financial data security.
8. Support and services: Confluent offers expert support and professional services to help you design, develop, and manage your ISO 20022 implementation. With deep knowledge of Kafka and the financial industry, Confluent's team can provide valuable guidance and assistance throughout your project.
So now let’s take a closer look at this implementation. The sample reference architecture below illustrates how Confluent’s data streaming platform can ensure interoperability between the different payments messaging formats as companies standardize for the new ISO 20022 requirements.
ISO 20022 messages are in XML or ASN.1 format. Confluent can carry both, but only as binary objects. The reasons for treating them this way, along with other considerations, are explained below.
XML and ASN.1 formats are not natively supported in Confluent Schema Registry.
However, converting messages to and from these formats is possible using custom serializers and deserializers (open-source options are available in the community).
XML and ASN.1 serializers and deserializers are available in multiple programming languages supported by Confluent's APIs.
Similarly, automatic message validation, which uses Schema Registry at the producer client or broker level, requires an intermediate modern format like Avro, JSON, or Protobuf.
This conversion of formats, while not optimal, provides immense flexibility in terms of working not only with Confluent’s vast set of tools and products like Kafka Streams and ksqlDB, but also with modern technologies like Elastic and MongoDB, using Confluent’s connectors.
Confluent's publish/subscribe architecture, with support for schema evolution, allows the migration to proceed while current workloads continue to be supported.
Therefore, consider setting up the following:
Standard ISO 20022 serializers and deserializers in single-point-of-entry Java classes (POJOs), included in the classpath of every Confluent component.
Domain translation code to translate domain model objects (internal) to ISO 20022 standard objects (external).
A Single Message Transform (SMT) that can convert messages to and from a modern format like JSON or Avro.
This SMT can be deployed in the connectors that need to work with these messages, taking advantage of the custom serializers.
A custom Kafka source connector that can take an ISO 20022 message and post it to a Confluent topic.
A custom Kafka sink connector that takes a message from a Confluent topic and responds over HTTP or a similar protocol with its ISO 20022 equivalent.
A Schema Registry sync-up component that uses the Schema Registry REST APIs to keep registered schemas fully compatible with any changes in the message dictionary.
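The domain translation code in the list above can be sketched as a thin mapping layer between an internal payment model and ISO 20022-shaped element names. A few pacs.008-style names (MsgId, IntrBkSttlmAmt, Dbtr, Cdtr) are flattened into a dict here for brevity; the internal `Payment` fields are hypothetical, and only the translation pattern is the point.

```python
# Sketch: a domain-translation layer between an internal model and
# ISO 20022-style field names. Internal field names are illustrative.
from dataclasses import dataclass

@dataclass
class Payment:
    payment_id: str
    amount: str
    currency: str
    debtor: str
    creditor: str

def to_iso20022(p: Payment) -> dict:
    """Internal domain object -> external ISO 20022-style fields."""
    return {
        "MsgId": p.payment_id,
        "IntrBkSttlmAmt": {"Ccy": p.currency, "value": p.amount},
        "Dbtr": {"Nm": p.debtor},
        "Cdtr": {"Nm": p.creditor},
    }

def from_iso20022(msg: dict) -> Payment:
    """External ISO 20022-style fields -> internal domain object."""
    return Payment(
        payment_id=msg["MsgId"],
        amount=msg["IntrBkSttlmAmt"]["value"],
        currency=msg["IntrBkSttlmAmt"]["Ccy"],
        debtor=msg["Dbtr"]["Nm"],
        creditor=msg["Cdtr"]["Nm"],
    )

p = Payment("P-42", "250.00", "EUR", "Alice Ltd", "Bob GmbH")
assert from_iso20022(to_iso20022(p)) == p  # round-trips cleanly
```

Keeping this mapping in one place means a change to the message dictionary touches only the translation layer, not every microservice that consumes payment events.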
Any organization adopting a standard will find areas of operations that either cannot use it immediately, or can never use it because of a competing standard.
Fortunately, Confluent's publish/subscribe, event-driven architecture provides an easy way to achieve interoperability between formats.
When naming topics, suffixing names with their format is a helpful approach. For example, a FIX topic that holds retail payments could be called retail.payments.fix.
Consumers reading from the topic can then immediately identify the format used and deploy the correct serializers/deserializers.
A Kafka Streams processor can be written to take messages written to one such topic (say, retail.payments.fix) and simultaneously write them to other topics (say, retail.payments.iso20022).
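That dual-write pattern can be sketched as follows, with plain dicts standing in for Kafka topics and a toy FIX tag=value parser. The tag-to-field mapping is illustrative (tag 38 is assumed here to carry the amount); a real bridge would use a full FIX engine and a complete ISO 20022 mapping.

```python
# Sketch: consume a FIX-format payment and write both the original and
# an ISO 20022-style version to format-suffixed topics. Illustrative only.
SOH = "\x01"  # FIX field delimiter

def parse_fix(raw):
    """Split 'tag=value' pairs delimited by SOH into a dict."""
    return dict(pair.split("=", 1) for pair in raw.strip(SOH).split(SOH))

def fix_to_iso(fix_msg):
    """Map a few FIX tags onto ISO 20022-style names (assumed mapping)."""
    return {
        "MsgId": fix_msg.get("11"),     # ClOrdID-style reference
        "InstdAmt": fix_msg.get("38"),  # amount-bearing tag (assumed)
        "Ccy": fix_msg.get("15"),       # Currency
    }

topics = {"retail.payments.fix": [], "retail.payments.iso20022": []}

def process(raw):
    """Consume a raw FIX message; write both formats simultaneously."""
    fix_msg = parse_fix(raw)
    topics["retail.payments.fix"].append(fix_msg)
    topics["retail.payments.iso20022"].append(fix_to_iso(fix_msg))

process("11=P-7\x0138=100\x0115=GBP\x01")
print(topics["retail.payments.iso20022"][0]["Ccy"])  # GBP
```

In Kafka Streams the same logic would be a topology consuming retail.payments.fix and producing to retail.payments.iso20022, so consumers on either topic see the format their serializers expect.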
Establishing a data governance function owned by a functional or operations persona can be helpful when building serialization libraries for interoperability.
Such an owner can identify the nuances of mapping between domain objects and industry formats like FIX, FpML, and XBRL, and keep those mappings up to date.
Building a developer experience that accounts for these format changes can also be a beneficial long-term investment.
Confluent has various architectures that enable data transport between enterprises, including:
Using Confluent Replicator, which can move data in a controlled set of topics to a different cluster. Replicator is especially useful here because it copies data byte for byte and is therefore format- and schema-agnostic.
Shared clusters which can be created across organizations
Using REST Proxy or HTTP connectors to publish or subscribe to REST endpoints
In all cases, access can be appropriately restricted and data can be secured and encrypted as necessary.
Modernizing payments architecture to meet the ISO 20022 standard for data interchange is no longer optional or merely advantageous: it's a competitive imperative. Financial companies that risk non-compliance won't just face penalties and reputational damage; they'll also miss out on valuable business opportunities (e.g., real-time use cases like fraud detection, security operations, and open banking APIs).
Confluent provides organizations in the payments industry with the scalability, high availability, and seamless integrations they need to meet ever-increasing consumer demand for real-time transactions, all while overhauling payments infrastructure ahead of the November 2025 deadline.
However, real-time data streaming alone won’t be enough to make this transition successfully. These organizations will also need to leverage Confluent Cloud’s real-time processing capabilities, schema management, security features, and expert support to build a robust, efficient, and flexible ISO 20022 messaging system that can adapt to each organization’s evolving needs.
Reach out to our team today to get compliance ready. We can help you design, develop, and manage your ISO 20022 implementation before the new regulation comes into effect.