Part I — Foundations · Chapter 01 · Free to Read · v1.0 · March 2026

What Is Securities Tokenization?

~1,620 words · ~8 min read · by Dave Hendricks

1.1 Definition and Core Concepts

Let's start with what a token actually is, because the industry has spent eight years confusing itself on this point, and the confusion has cost real money.

A token is a cryptographic record of ownership rights maintained on a distributed ledger. It is not the security. It is not the asset. It is a representation of the asset, recorded in a format that a blockchain network can read, verify, and transfer without requiring a central intermediary to update a proprietary database. Think of it as the digital equivalent of a stock certificate, except the ledger that records who owns it is maintained by a distributed network rather than a single custodian's back-office system.

This distinction matters enormously in practice. When Vertalo closed what we believe was the first natively tokenized Reg D offering in March 2018, the security itself was a membership interest in a Delaware LLC. The token was an ERC-20 smart contract on Ethereum that recorded who held that membership interest, in what amount, and subject to what transfer restrictions. The security was the LLC interest. The token was the record. Conflating the two produces bad legal analysis and bad product decisions.

Tokenized security, security token, and cryptocurrency are three different things, and getting them wrong in a legal document or a pitch deck can create serious problems.

A tokenized security is a traditional financial instrument, already recognized as a security under applicable law, that has been represented on a blockchain.

A security token is a term the industry used during the 2018 STO era to describe crypto assets that were structured to comply with securities law, sometimes involving new instruments and sometimes involving tokenized versions of existing ones. The term has fallen somewhat out of fashion precisely because it conflated two distinct things.

Cryptocurrency refers to crypto assets like Bitcoin and Ether that are not securities, that exist natively on blockchain networks, and that derive their value from network effects and utility rather than from the managerial efforts of an identifiable enterprise.

These are structurally different instruments, regulated under different legal frameworks, and treated differently by custodians, exchanges, and tax authorities. The terminology matters for every document you draft, every system you build, and every regulatory conversation you have.

1.2 Why Tokenize? The Practical Case

The practical case for tokenization rests on five concrete capabilities that distributed ledger technology makes possible. Each of them represents a real improvement over the analog alternative, though none of them is magic, and not all of them are fully realized today.

Fractional ownership means that a $10 million commercial real estate asset can be divided into 10,000 tokens at $1,000 each, or 100,000 tokens at $100 each, without the legal and administrative complexity that fractional ownership traditionally required. The token itself enforces the fractional interest; the smart contract holds the cap table. In practice, this expands the addressable investor base for illiquid assets dramatically. A Reg A+ offering for a tokenized fund can reach retail investors who could never have accessed the asset class through traditional private placement mechanics.
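The fractionalization arithmetic above is simple enough to state as a one-line function. This is an illustrative sketch, not any platform's actual API; the function name is invented for this example.

```python
def price_per_token(asset_value_usd: int, token_count: int) -> float:
    """Price per token when an asset is split into equal fractional interests."""
    return asset_value_usd / token_count

# The two splits from the text: a $10 million asset divided into
# 10,000 tokens or 100,000 tokens.
assert price_per_token(10_000_000, 10_000) == 1_000.0
assert price_per_token(10_000_000, 100_000) == 100.0
```

The point is not the division; it is that the token contract, rather than a lawyer-drafted co-ownership agreement, is what makes each of those 100,000 fractional interests cheap to issue, track, and transfer.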

24/7 settlement is the capability that institutional players care about most. Traditional securities settlement runs on T+1 or T+2 cycles, operates during business hours, and stops at weekends and holidays. Tokenized securities can settle in minutes, any time, because the settlement mechanism is the blockchain itself rather than a clearinghouse that operates on Wall Street hours. For cross-border transactions, this is transformative. For domestic institutional trading, it reduces counterparty risk and frees up capital that would otherwise sit in settlement limbo.

Programmable compliance is where tokenization gets genuinely novel. A smart contract can encode transfer restrictions directly into the token: accredited investor verification, jurisdictional restrictions, lockup periods, maximum holder counts, and KYC/AML gates. When a transfer is attempted, the smart contract checks the conditions before the transfer executes. The compliance isn't enforced by a back-office team reviewing spreadsheets after the fact; it's enforced by code that runs before the transfer happens. Vertalo's platform is built around this capability. When we manage the cap table for a Reg D issuer with 506(b) restrictions, those restrictions are encoded in the token's transfer logic, not just in a spreadsheet that somebody has to check manually.
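The pre-transfer gating described above can be sketched in a few dozen lines. This is a hedged, off-chain model, not Vertalo's implementation and not a real smart contract; every class, field, and rejection string here is invented for illustration, and on-chain versions (e.g., restricted-token standards in the ERC-1404 spirit) express the same idea in contract code.

```python
import time
from dataclasses import dataclass


@dataclass
class Investor:
    investor_id: str
    accredited: bool
    jurisdiction: str       # e.g. "US"
    kyc_passed: bool
    lockup_until: float = 0.0   # unix timestamp; 0.0 means no lockup


class RestrictedToken:
    """Illustrative model of a token whose transfer logic enforces compliance gates."""

    def __init__(self, allowed_jurisdictions, max_holders):
        self.allowed_jurisdictions = set(allowed_jurisdictions)
        self.max_holders = max_holders
        self.balances = {}  # investor_id -> token count

    def check_transfer(self, sender, receiver, amount, now=None):
        """Return a rejection reason, or None if the transfer may proceed."""
        now = time.time() if now is None else now
        if self.balances.get(sender.investor_id, 0) < amount:
            return "insufficient balance"
        if now < sender.lockup_until:
            return "sender is inside the lockup period"
        if not receiver.kyc_passed:
            return "receiver has not cleared KYC/AML"
        if not receiver.accredited:
            return "receiver is not an accredited investor"
        if receiver.jurisdiction not in self.allowed_jurisdictions:
            return "receiver jurisdiction not permitted"
        new_holder = receiver.investor_id not in self.balances
        if new_holder and len(self.balances) >= self.max_holders:
            return "maximum holder count reached"
        return None

    def transfer(self, sender, receiver, amount, now=None):
        """Execute a transfer only after every compliance check passes."""
        reason = self.check_transfer(sender, receiver, amount, now)
        if reason is not None:
            raise ValueError(f"transfer blocked: {reason}")
        self.balances[sender.investor_id] -= amount
        self.balances[receiver.investor_id] = (
            self.balances.get(receiver.investor_id, 0) + amount
        )
```

Note the ordering: `check_transfer` runs before any balance changes, which is the structural difference from back-office compliance, where the review happens after the fact.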

Global distribution means that a tokenized security can be offered to investors in multiple jurisdictions through a single issuance, with jurisdiction-specific compliance rules enforced by the smart contract rather than by parallel administrative processes. A Reg D/S offering, which combines US accredited investor exemptions with offshore Reg S provisions for non-US persons, is a natural fit for tokenization because the compliance bifurcation can be built into the token's transfer restrictions from day one.
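The bifurcation a Reg D/S offering requires can be expressed as a small routing table baked into the token's transfer rules. The sketch below is purely illustrative: the tranche names, rule fields, and lockup values are placeholders for whatever counsel actually prescribes, not a statement of the real regulatory holding periods.

```python
# Hypothetical rule table for a single issuance with two compliance tranches.
# Values are illustrative placeholders, not legal guidance.
TRANCHE_RULES = {
    "reg_d": {"investor_type": "US accredited", "resale_lockup_days": 365},
    "reg_s": {"investor_type": "non-US person", "resale_lockup_days": 365},
}


def tranche_for(jurisdiction: str) -> str:
    """Route an investor into the Reg D or Reg S tranche by jurisdiction."""
    return "reg_d" if jurisdiction == "US" else "reg_s"


assert tranche_for("US") == "reg_d"
assert tranche_for("DE") == "reg_s"
```

Because the routing is part of the token's transfer logic from issuance, the two tranches never require the parallel administrative processes a paper-based Reg D/S offering does.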

Reduced intermediary cost is the promise that's probably been oversold most aggressively. Tokenization does reduce the need for some intermediaries in some workflows. It does not eliminate all intermediaries. You still need a transfer agent. You still need a broker-dealer for the distribution of most offerings. You still need legal counsel. You still need a custodian that can hold tokenized securities in a way that satisfies your investors' compliance requirements. What tokenization reduces is the friction and cost in the workflows where those intermediaries are duplicating each other's record-keeping. A transfer agent who manages a tokenized cap table isn't reconciling against a broker's records and a custodian's records and a clearing firm's records. The blockchain is the single source of truth.

1.3 The "So What?" Test: What Has Actually Changed

Here's the honest version. In 2026, tokenization has demonstrably changed several things for early-adopting issuers and intermediaries. It has not yet changed the fundamental experience for most retail investors, and anyone who tells you the liquidity problem is solved is selling you something.

What has genuinely changed: issuers using tokenized cap table management have faster, cheaper, and more accurate shareholder records than issuers using legacy systems. Transfer agents running on tokenized infrastructure can process corporate actions, distributions, and transfers in real time rather than in batch processes that take days. Institutional asset managers like BlackRock and Franklin Templeton have demonstrated that tokenized fund structures can attract real capital at scale and that the operational advantages are not theoretical. The DTC's December 2025 no-action letter, which created a 3-year pilot for tokenized securities on existing settlement infrastructure, is a genuine inflection point for mainstream adoption.

What hasn't changed: most retail investors can't buy tokenized securities through their existing brokerage accounts. Secondary market liquidity for tokenized private securities remains thin. The custody infrastructure for tokenized securities is still immature relative to the custody infrastructure for traditional securities or for mainstream cryptocurrencies. Mainstream adoption is a 5-to-10-year project, not a 2026 event.

What I learned in March 2018, when we closed that first natively tokenized Reg D offering, is that tokenization makes the record-keeping better before it makes anything else better. The offering itself didn't settle faster, because the investors weren't using blockchain wallets as their primary financial accounts. The compliance was better because the transfer restrictions were encoded in the token. The cap table was cleaner because the blockchain was the authoritative record. But the investor experience, the distribution mechanics, and the secondary market were essentially unchanged from a traditional Reg D offering. That's still largely true today, with meaningful but not transformative exceptions.

The value of tokenization, today, is most fully realized at the infrastructure layer: cleaner cap tables, programmable compliance, real-time corporate actions, and interoperability between platforms. The investor-facing benefits, including genuine secondary liquidity and seamless global distribution, are coming. They are not yet here at scale.

Understanding that gap — between what tokenization can do in theory and what it does in practice today — is the foundation of everything that follows in this book. Part I is about the regulatory and structural environment that makes the gap narrower or wider. Part II is about the market infrastructure that determines whether the benefits actually reach issuers and investors. Part III is about what practitioners need to know to operate compliantly and effectively in the world as it is, not as the pitch decks promised it would be.

The history of how we got here is essential context. It explains why the regulatory framework looks the way it does, why certain early players succeeded or failed, and why 2025 and 2026 represent a genuine shift rather than another hype cycle.
