
Technology-Neutral Tokenization?

By Manoj Ramia, General Counsel | May 1, 2024

From a certain perspective, debating the regulatory treatment of tokenized traditional financial instruments is odd. No one (as far as can be discovered) debated whether the regulatory treatment of a financial instrument should change when records shifted from paper to electronic databases. The financial instrument remained the same, so its regulatory treatment remained the same. Today, another shift is underway in how financial instruments are recorded: from static, siloed databases to dynamic, distributed databases (i.e., blockchains) through tokenization.

Broadly, the term tokenization captures the idea of creating natively digital financial instruments. But tokenization is not a simple analog-to-digital conversion. Instead, it uses blockchain technology to, first, encode the workflows of an asset, automating what was previously manual, and, second, synchronize those workflows and their underlying data among the various parties to a transaction, eliminating costly and inefficient reconciliation. An asset is no longer separate from its workflows; rather, the workflows become part of the asset itself. With tokenization, an asset goes from static to dynamic.
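
To make the distinction concrete, here is a minimal sketch in Python. All names are hypothetical and the model is deliberately simplified: a static database record holds only data, while a tokenized instrument carries its coupon workflow with it, so every party sharing the ledger executes the same logic.

```python
# Minimal sketch (hypothetical names, simplified model) contrasting a
# static record with a tokenized instrument whose workflow travels with it.
from dataclasses import dataclass, field

# Static model: the record holds data only. Coupon-payment logic lives in
# each institution's separate back-office system and must be reconciled.
static_bond_record = {"isin": "XS0000000000", "principal": 1_000_000,
                      "coupon_rate": 0.05}

@dataclass
class TokenizedBond:
    """Data *and* workflow together as one asset."""
    isin: str
    principal: int
    coupon_rate: float
    payments: list = field(default_factory=list)

    def pay_coupon(self) -> int:
        # The workflow is part of the asset: every party sharing the
        # ledger runs this same logic, so there is nothing to reconcile.
        amount = int(self.principal * self.coupon_rate)
        self.payments.append(amount)
        return amount

bond = TokenizedBond("XS0000000000", 1_000_000, 0.05)
print(bond.pay_coupon())  # 50000, recorded identically for all parties
```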

Why should this technological shift warrant a new regulatory approach? Why is this technological innovation so special as to demand regulatory innovation? Embedded in these questions is a critical assumption built on a seemingly anodyne idea: technology neutrality. Like Switzerland in the geopolitical arena, regulators studiously avoid picking sides when it comes to the technology choices of the institutions they regulate. And that is as it should be. Financial regulators don’t regulate technology choices. Financial regulators regulate financial risk.

But discussions emerging around the proper regulatory treatment of tokenized financial instruments challenge this understanding. The most obvious example is the Basel Committee on Banking Supervision’s December 2022 Prudential Standards for the Treatment of Cryptoasset Exposures. These standards make clear that choice of technology will impact the capital treatment of a tokenized financial instrument. Notably, in December 2023, the Basel Committee stated that financial instruments tokenized using “permissionless” blockchain networks would receive unfavorable regulatory treatment. Viewed against the ingrained backdrop of technology neutrality, it seems fair to ask whether the Basel Committee made a serious error in its approach.

To answer this, we should take a closer look at what tokenization entails. Because tokenization is enabled by blockchain technology, we need to unpack the term “blockchain,” a label that belies meaningfully different technological approaches.

Blockchains range from those where all participants possess (and can see) all data (i.e., permissionless blockchains) to those where only parties permissioned to see certain data possess that data (i.e., permissioned blockchains). For regulated financial institutions working with tokenized financial instruments, permissioning is critical. This is evident in the fact that even where institutions use foundationally permissionless blockchains to tokenize financial instruments, they do so only after bolting on additional layers that provide the necessary permissioning (commonly referred to as sidechains, rollups, or other “Layer 2s”).
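
The visibility difference can be sketched in a few lines. This is a hypothetical, simplified model, not any particular network’s API: on a permissionless chain every node holds every record, while on a permissioned chain a record is distributed only to its stakeholders.

```python
# Hypothetical, simplified model of data visibility on permissionless
# versus permissioned networks. All party names are illustrative.

contract = {"id": "trade-123", "payload": "10y IRS, notional 50mm",
            "stakeholders": {"BankA", "BankB"}}

ALL_NODES = {"BankA", "BankB", "HedgeFundC", "AnonymousValidator"}

def visible_to(node: str, record: dict, permissioned: bool) -> bool:
    # Permissionless: every participant stores and can see every record.
    if not permissioned:
        return True
    # Permissioned: only the record's stakeholders receive it at all.
    return node in record["stakeholders"]

for node in sorted(ALL_NODES):
    print(node, visible_to(node, contract, permissioned=True))
# Only BankA and BankB see the trade; the others never hold the data.
```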

But though a permissionless blockchain with bolted-on permissioning layers may look to be on equal footing with a blockchain that is at its foundation permissioned—they are both, after all, “permissioned”—these are quite divergent paths and the consequences compound as the network grows.

Bolting on a permissioning layer means, in essence, creating a new island. But on the island itself, there is no granular permissioning; you are either on the island or off it. And as with any island, connecting back to the “mainland,” or primary layer, requires a bridge. Unfortunately, bridges preclude individual transactions from settling atomically because of the reconciliation required between the layers, reintroducing the very financial risk tokenization was supposed to eliminate. And with data siloed across multiple islands, the promise of composing transactions to reduce cost and risk is lost. Moreover, even with a permissioned layer, an institution ultimately remains dependent on the pool of unknown validators, and their consensus mechanism, at the core permissionless layer. All of this not only undermines the ability of blockchain to solve problems for capital markets but also calls into question the ability of financial institutions using these networks to meet critical requirements such as settlement finality and AML/CFT compliance.
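
A simplified sketch illustrates why a bridge precludes atomic settlement. The names and the mechanism shown (a lock-and-mint transfer) are illustrative assumptions: moving an asset across a bridge is two separate state changes on two separate ledgers, with a window in between where the layers disagree.

```python
# Hypothetical, simplified sketch of a bridge transfer. Moving an asset
# between layers is two separate commits on two separate ledgers, so the
# move is never atomic and a failure mid-transfer requires reconciliation.

layer1 = {"alice_asset_locked": False}   # base, permissionless layer
layer2 = {"alice_asset_minted": False}   # bolted-on permissioned "island"

def bridge_transfer():
    # Step 1: lock the asset on the base layer. This commits immediately.
    layer1["alice_asset_locked"] = True
    # <-- Here the two ledgers are out of sync. If the bridge operator
    #     fails at this point, the asset is locked on layer 1 but absent
    #     on layer 2, and someone must reconcile the discrepancy manually.
    # Step 2: mint a representation on the permissioned layer.
    layer2["alice_asset_minted"] = True

bridge_transfer()
```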

When a blockchain is permissioned at its foundation, however, connecting various applications and moving tokenized financial instruments across them (only to the extent the operators of those applications choose to connect and move them) is built into the blockchain architecture. No additional layers are required, and thus no bridges are required. Individual transactions can be easily composed and settled atomically. Moreover, institutions can choose which parties validate their transactions. Settlement finality and AML/CFT compliance can be readily demonstrated.
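
By contrast, here is a minimal, hypothetical sketch of atomic composition when interoperability is native to the ledger: a delivery-versus-payment transaction either commits both legs or neither, so no half-settled state ever needs reconciling.

```python
# Hypothetical sketch of atomic composition across two applications on a
# foundationally permissioned ledger: both legs of a delivery-versus-
# payment settle in one transaction, or the whole transaction aborts.

cash_app = {"buyer_cash": 100, "seller_cash": 0}
bond_app = {"buyer_bonds": 0, "seller_bonds": 1}

def settle_dvp(price: int):
    # Both legs are validated together; any failure aborts the whole
    # transaction, so there is never a half-settled state to reconcile.
    if cash_app["buyer_cash"] < price or bond_app["seller_bonds"] < 1:
        raise RuntimeError("DvP aborted: no partial settlement occurs")
    cash_app["buyer_cash"] -= price; cash_app["seller_cash"] += price
    bond_app["seller_bonds"] -= 1;   bond_app["buyer_bonds"] += 1

settle_dvp(price=100)  # cash and bond legs settle atomically
```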

How does all of this relate back to tokenization? What’s novel about tokenized financial instruments is an intrinsic relationship between the financial instrument and its underlying technology. A tokenized financial instrument is ultimately a product of the blockchain technology utilized to create it, and as seen above, not all blockchain technologies are the same. This is true even when the same label (such as “permissioned”) is used to describe them. If not all blockchain technologies are the same, and if certain characteristics of tokenized financial instruments are dependent on their underlying blockchain technology, it follows that not all tokenized financial instruments are the same even if the underlying financial instruments are. Rather, the risk profile of a tokenized financial instrument is a product of its underlying technology.

With all this said, we come back to the term that started this discussion: technology neutrality. Though historically this may have been an unobjectionable principle of financial regulation, transposing it to the new domain of tokenization is not a simple application of old rules. The principle of technology neutrality is no doubt still important, but given the intrinsic relationship between tokenized financial instruments and their underlying blockchain technology, financial risk is no longer independent of technology choice. Rather, that interdependence means regulators should be expected to take technological architectures into account when constructing regulatory frameworks for tokenized financial instruments. Thus, the Basel Committee was right in its approach and should not be criticized for violating technology neutrality.

As discussions on the proper regulatory treatment of tokenized financial instruments progress, we should take care not to wield technology neutrality as a shield against legitimate concerns about the technological architectures underlying tokenized financial instruments. Technology neutrality cannot and should not mean technology blindness.