As tokenisation moves from experimentation into live market infrastructure, attention is increasingly shifting toward the operational workflows supporting modern capital markets. Richard Baker, Founder and CEO of Tokenovate, explores why firms are rethinking post-trade coordination, reconciliation and settlement in increasingly real-time environments.
Capital markets infrastructure is under growing pressure. Settlement cycles are shortening, regulators expect greater operational resilience, and firms have far less tolerance for delays or processing errors than they did even a few years ago.
At the same time, trading environments are becoming more continuous. Markets that once operated within clearly defined trading and settlement windows are moving toward near real-time environments, exposing the limitations of post-trade systems built around slower operational processes.
Much of the post-trade environment still operates through fragmented systems and duplicated records. Counterparties, custodians, settlement venues and internal operations teams often maintain their own separate version of the same transaction, creating a constant need for reconciliation throughout the trade lifecycle.
In practice, that means firms are still spending significant time coordinating lifecycle events across disconnected infrastructure, delayed messaging systems and manually managed expectations. Operational consistency is achieved through repeated reconciliation between participants rather than through a shared, synchronised view of the transaction itself.
The operational cost of that fragmentation remains significant. More than 20% of settlement failures are linked to data and matching issues, while broader post-trade processing costs are estimated at between $6 billion and $9 billion annually across the industry. The planned move to T+1 settlement in the UK and Europe by 2027 will shorten settlement windows, leaving firms with even less time to identify and resolve operational breaks before trades fail.
Post-trade infrastructure is under increasing pressure
Many post-trade workflows were designed for an environment built around batch processing, limited trading hours and longer settlement windows. While trading infrastructure has modernised rapidly over the past two decades, post-trade operations have evolved more gradually.
These pressures are visible across multiple post-trade functions:
- Trade matching and affirmation still frequently rely on fragmented platforms, proprietary schemas and manual exception handling, increasing the risk of discrepancies and settlement delays.
- Regulatory reporting is often assembled downstream from multiple systems, requiring retrospective reconciliation to establish an accurate trade state and increasing operational overhead.
- Corporate actions and distributions continue to involve complex coordination across systems and teams, creating delays, disputes and processing inefficiencies.
- Novations, amendments and resets require lifecycle changes to be synchronised consistently across counterparties and infrastructures, often under significant operational pressure.
- Settlement transfers and market event processing remain heavily dependent on manual coordination across fragmented infrastructures, increasing the risk of failed trades, liquidity inefficiencies and operational uncertainty during periods of market stress.
Those processes were workable when firms had longer settlement windows and more time to resolve breaks manually. As markets move faster, the operational margin for error is shrinking.
From tokenised assets to tokenised workflows
It is within this context that tokenisation is beginning to move from experimentation into live market infrastructure.
Over the past year, institutions including Swift, Lloyds, NYSE and LSEG have all announced initiatives involving tokenised settlement, funding and post-trade operations. What was previously confined to pilots is now being integrated into regulated market environments and core infrastructure layers.
The conversation is also moving beyond the tokenisation of assets themselves. Most market participants already accept that assets can be represented digitally. The bigger challenge is whether the operational infrastructure surrounding those assets can support markets that are becoming faster, more connected and increasingly continuous.
This is driving greater focus on workflows, interoperability and standards, with firms placing more emphasis on standardised lifecycle data and synchronised workflow processing capable of operating consistently across systems and participants.
Industry initiatives such as the Common Domain Model (CDM) are becoming increasingly important in supporting shared operational standards across post-trade infrastructure. The CDM is already being used as the basis for ISDA’s Digital Regulatory Reporting (DRR) initiative, which transforms industry interpretations of derivatives reporting requirements into machine-executable code capable of operating consistently across participants and jurisdictions.
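The principle behind machine-executable reporting rules can be illustrated with a minimal sketch. The field names, checks and thresholds below are hypothetical, chosen for illustration only; they are not the actual CDM types or the DRR rule set, which are far richer:

```python
from dataclasses import dataclass

# Hypothetical, simplified trade report record; illustrative only.
@dataclass
class TradeReport:
    trade_id: str
    notional: float
    currency: str
    counterparty_lei: str  # Legal Entity Identifier

def validate_report(report: TradeReport) -> list[str]:
    """Apply reporting rules as executable checks rather than written guidance.

    Because each rule lives in code, every participant applies identical
    logic. These specific rules are illustrative, not actual regulatory
    requirements.
    """
    errors = []
    if not report.trade_id:
        errors.append("trade_id is required")
    if report.notional <= 0:
        errors.append("notional must be positive")
    if len(report.currency) != 3:
        errors.append("currency must be an ISO 4217 three-letter code")
    if len(report.counterparty_lei) != 20:
        errors.append("counterparty LEI must be 20 characters")
    return errors

# A report that fails two of the checks:
bad = TradeReport("T-1001", -5_000_000.0, "USD", "ABC")
print(validate_report(bad))
```

The design point is that an interpretation of a reporting requirement, once expressed as code, produces the same pass/fail outcome for every firm that runs it, which is what removes the need for each participant to interpret the rule text independently.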
Without common standards and interoperable lifecycle data, tokenised markets risk recreating the same fragmentation that already exists across traditional post-trade infrastructure.
Instead of each participant maintaining and reconciling its own version of a trade, firms are starting to explore models built around shared lifecycle data and coordinated processing across the transaction lifecycle. The objective is to reduce the need for retrospective reconciliation by improving synchronisation from the outset.
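The contrast with today's model can be sketched in a few lines. In this illustrative example (hypothetical structure, not any specific platform's data model), lifecycle events are appended to a single shared record and every participant derives the current trade state from the same event log, so there are no separate copies to reconcile afterwards:

```python
from dataclasses import dataclass, field

@dataclass
class LifecycleEvent:
    sequence: int
    event_type: str   # e.g. "execution", "amendment", "reset"
    payload: dict

@dataclass
class SharedTradeRecord:
    trade_id: str
    events: list = field(default_factory=list)

    def apply(self, event_type: str, payload: dict) -> None:
        # Events are appended in order; prior state is never edited in place.
        self.events.append(LifecycleEvent(len(self.events), event_type, payload))

    def current_state(self) -> dict:
        # Every participant derives state the same way from the same log,
        # so there is nothing to reconcile retrospectively.
        state = {"trade_id": self.trade_id}
        for ev in self.events:
            state.update(ev.payload)
        return state

trade = SharedTradeRecord("IRS-2024-001")
trade.apply("execution", {"notional": 10_000_000, "rate": 0.042})
trade.apply("amendment", {"notional": 8_000_000})
print(trade.current_state())
```

In the fragmented model, each counterparty would hold its own mutable copy of this trade and periodically compare fields; here, synchronisation is a property of the data structure itself rather than an after-the-fact process.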
The next phase of market infrastructure
Importantly, this evolution is taking place within existing legal and regulatory frameworks. Institutions are not replacing market structures overnight. They are introducing infrastructure capable of supporting greater interoperability, operational resilience and automation within existing environments.
That is what makes recent developments across exchanges, banks and market utilities particularly significant. They reflect growing confidence that tokenisation can support live settlement and post-trade environments at an institutional scale, provided the operational workflows surrounding those assets evolve alongside them.
The next phase of market modernisation is likely to depend less on the tokenisation of assets alone and more on whether post-trade infrastructure can operate with the same level of synchronisation, interoperability and resilience as the markets it supports.
About Richard Baker
Founder and Chief Executive Officer at Tokenovate
Richard Baker is the Founder and CEO of Tokenovate, a UK fintech company that unifies legal and trade data into executable workflows, enabling efficient collateral and post-trade lifecycle management. Previously, Baker co-founded Cleartrade Exchange, a regulated commodities futures exchange in Singapore, which was acquired by the European Energy Exchange, a Deutsche Börse company, in 2016.
He is an electronics engineer with a long career establishing foundational technologies, including optical networks, voice-over-IP, streaming audio/video protocols, blockchain technologies, domain-specific language models and messaging systems in modern financial markets. He holds several patents and actively coaches and advises start-up and scale-up founders in the UK tech industry.
About Tokenovate
Tokenovate is a UK fintech providing programmable post-trade infrastructure for capital markets. Its platform automates the lifecycle of derivatives and securities financing, transforming legal agreements and trade data into standardised, machine-readable workflows.
