Current State and Future of Fund Tokenization in 2026

Tokenized funds set for significant growth by 2029

  • 15% of asset managers already offer tokenized funds, and 41% plan to launch them soon (Broadridge survey).
  • Tokenized fund AUM is projected to reach $235 billion by 2029, a 58× increase from 2024 (Calastone).
  • Early traction is strongest in money-market funds and private-asset vehicles, where fractionalization and automation are especially valuable.
  • Regulatory and industry initiatives—such as Project Guardian, the UK Digital Securities Sandbox, and Europe’s evolving rulebooks—are pushing pilots into live-market trials.

Adoption Rates of Tokenized Funds

Tokenized funds are no longer a fringe experiment confined to innovation labs. By early 2026, market signals suggest a meaningful—if still early—shift toward operational deployment. A Broadridge survey indicates that 15% of asset managers already offer tokenized funds, while 41% say they plan to launch them soon. Taken together, those figures point to a market where tokenization is moving from “possible” to “planned,” and from “planned” to “shipped.”

That adoption curve matters because fund tokenization is not simply a new distribution wrapper. It changes how fund units can be issued, held, transferred, and serviced—potentially compressing timelines and reducing manual steps across the lifecycle. The willingness of asset managers to commit to launches implies they see tokenization as more than a marketing narrative; it is increasingly treated as a product and operating-model decision.

Still, the gap between “offering” and “operating at scale” remains wide. Many firms can launch a tokenized share class or a limited-access product, but scaling requires coordination across custodians, transfer agents, fund administrators, and distributors. In practice, adoption is as much about ecosystem readiness as it is about an individual manager’s ambition.

The current adoption numbers also hint at a two-speed market. A minority has already crossed the threshold into live offerings, while a much larger cohort is preparing to follow—suggesting competitive pressure may build quickly once early movers demonstrate repeatable workflows. If tokenized funds deliver measurable improvements in settlement, servicing, or investor access, the “plan to launch” segment could accelerate into implementation.

Key takeaway: The most important signal in 2026 is not just that tokenized funds exist—it’s that a large share of managers are actively preparing launches, implying tokenization is entering mainstream product roadmaps.

Projected Growth of Tokenized Assets

The most concrete forward-looking indicator in the market is the projection from Calastone: assets under management in tokenized funds could reach $235 billion by 2029, representing a 58× increase from 2024. Even allowing for the uncertainty inherent in forecasts, the magnitude of that multiple is striking. It suggests tokenized funds are expected to move from a niche base to a material segment of digital-asset-enabled investment products within a few years.
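To make the projection concrete, the implied starting base and growth rate can be derived from the two stated figures. The calculation below is illustrative arithmetic only; the implied numbers are not stated in the source.

```python
# Back-of-the-envelope check on the Calastone projection cited above:
# $235B tokenized fund AUM by 2029, a 58x multiple of the 2024 base.

target_2029 = 235e9   # projected tokenized fund AUM, USD
multiple = 58         # stated growth multiple vs. 2024
years = 5             # 2024 -> 2029

implied_base_2024 = target_2029 / multiple    # roughly $4.05B
implied_cagr = multiple ** (1 / years) - 1    # roughly 125% per year

print(f"Implied 2024 base: ${implied_base_2024 / 1e9:.2f}B")
print(f"Implied CAGR: {implied_cagr:.0%}")
```

The implied compound growth rate of roughly 125% per year is what makes the "58×" figure so aggressive: it assumes the market more than doubles every year for five consecutive years.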

What would drive that kind of expansion? The underlying logic is that tokenization can introduce efficiencies and new capabilities across the value chain. For investors, tokenization can support features such as fractional ownership and more automated processes. For asset managers, it can streamline issuance and servicing. For service providers, it can modernize workflows and enable new infrastructure services. The projection implicitly assumes that these benefits will translate into real adoption—not just pilots.

The 2029 figure also aligns with what the industry is doing structurally: shifting from experimentation toward commercial deployment across more asset classes. Early tokenization efforts often focused on alternatives, but interest is now described as expanding into tokenized money-market funds, UCITS funds, private credit, and ETFs. A broader asset-class footprint increases the addressable market for tokenized fund structures, which is consistent with a steep growth trajectory.

However, growth is unlikely to be uniform across regions. The regulatory environment is described as evolving but fragmented, and that fragmentation can slow cross-border scaling. The most plausible path to the projected AUM is therefore a patchwork of growth: faster in jurisdictions and market segments where regulatory clarity and infrastructure readiness converge, slower where classification and supervision questions remain open.

A final point: the projection is specifically about tokenized funds AUM, not tokenized assets in general. That distinction matters because funds sit at the intersection of securities regulation, investor protection, and operational servicing. If tokenized funds reach that scale, it would signal not only investor demand but also that the supporting market infrastructure—administration, custody, distribution, and compliance—has matured enough to handle tokenized units as a routine instrument.

Early Adopters: Money-Market Funds and Private Assets

Two categories stand out as early adopters: money-market funds and private-asset vehicles. The reason is less about hype and more about fit. These products map well to what tokenization can do: improve liquidity mechanics, enable fractionalization, and support automation in processes that are often operationally heavy.

Money-market funds are particularly suited because they are designed around liquidity and cash-like behavior. Tokenization can make it easier to represent fund units digitally and potentially streamline how investors subscribe, redeem, and transfer holdings—especially when paired with automated workflows. In a market where operational efficiency and speed are prized, money-market funds provide a natural proving ground for tokenized fund rails.

Private assets, meanwhile, have long faced structural frictions: limited liquidity, complex ownership records, and high minimums that restrict access. Tokenization’s ability to fractionalize ownership and automate parts of the lifecycle makes private-asset vehicles a logical early target. Even when underlying assets remain illiquid, tokenized fund units can improve how ownership is recorded and serviced, and can support more flexible participation structures—within the constraints of the product and its rules.
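The fractionalization mechanics described above can be sketched as a toy fund-unit register. Everything here is hypothetical, class and method names included: a real tokenized fund would record units on a blockchain token standard with custody, compliance, and transfer-agent controls that this sketch omits.

```python
from decimal import Decimal

class FundUnitLedger:
    """Toy register of fractional fund-unit holdings (hypothetical)."""

    def __init__(self, unit_price: Decimal):
        self.unit_price = unit_price           # NAV per whole unit
        self.holdings: dict[str, Decimal] = {}

    def subscribe(self, investor: str, cash: Decimal) -> Decimal:
        """Issue fractional units for a cash amount (no minimum lot)."""
        units = cash / self.unit_price
        self.holdings[investor] = self.holdings.get(investor, Decimal(0)) + units
        return units

    def transfer(self, sender: str, receiver: str, units: Decimal) -> None:
        """Move fractional units between investors on the register."""
        if self.holdings.get(sender, Decimal(0)) < units:
            raise ValueError("insufficient units")
        self.holdings[sender] -= units
        self.holdings[receiver] = self.holdings.get(receiver, Decimal(0)) + units

# A $250 subscription buys 0.25 of a $1,000 unit, and fractions can move freely.
ledger = FundUnitLedger(unit_price=Decimal("1000"))
ledger.subscribe("alice", Decimal("250"))
ledger.transfer("alice", "bob", Decimal("0.1"))
```

The point of the sketch is the access change: with fractional units, the minimum investment is decoupled from the unit price, which is exactly the barrier high-minimum private-asset vehicles face today.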

The broader trend described in the market is a shift “quickly from experimentation toward commercial deployment across more asset classes.” That matters because early adopters often define the playbook. If money-market and private-asset tokenized vehicles demonstrate repeatable operational models—covering issuance, transfer, administration, and investor reporting—other fund types can follow with less uncertainty.

Interest is also described as expanding into UCITS funds, private credit, and ETFs. That expansion suggests tokenization is not being treated as a one-off solution for alternatives, but as a potential operating layer for both active and passive strategies. UCITS and ETFs, in particular, sit within established regulatory and distribution frameworks; tokenization in these categories would require careful alignment with existing rules and market practices, but it also offers the potential for scale if the model works.

Note: Early adoption does not necessarily mean immediate mass-market access. Many tokenized fund structures can begin in controlled channels—limited investor groups, specific platforms, or defined jurisdictions—before expanding distribution.

Project Guardian: Enhancing Market Robustness

Among the most prominent institutional initiatives is Project Guardian, led by the Monetary Authority of Singapore (MAS). It is described as a flagship effort that brings together over 40 financial institutions, policymakers, and associations, with a focus on enhancing liquidity, efficiency, and market robustness. In a field where fragmented standards can stall progress, the scale and composition of this group are themselves a signal: tokenization is being treated as market infrastructure, not merely product innovation.

Project Guardian’s relevance to fund tokenization lies in its emphasis on real-world use cases and institutional-grade design. The initiative’s Asset & Wealth Management group has delivered multiple real-world use cases across both private and public blockchains. That dual approach matters. Private blockchains can offer controlled environments and governance structures that institutions often prefer, while public blockchains can offer broader interoperability and network effects. Demonstrating use cases across both suggests the industry is exploring practical trade-offs rather than committing to a single ideological model.

The initiative also reflects a broader pattern: tokenized markets require collaboration across competitors and across the public-private boundary. Asset managers alone cannot “tokenize the market” if custodians, administrators, distributors, and regulators are not aligned on how tokenized units are issued, held, transferred, and supervised. Project Guardian’s multi-stakeholder structure is designed to address that coordination problem.

In practical terms, initiatives like this can accelerate adoption by reducing uncertainty. When market participants see that regulators and major institutions are actively testing and validating approaches, it becomes easier to justify investment in modernization. It also helps shape emerging standards—technical, operational, and governance-related—that can later be reused across products and jurisdictions.

Project Guardian also sits alongside other programs—such as the UK Digital Securities Sandbox and the EU DLT Pilot Regime—that collectively push the market toward clearer rules and more mature infrastructure. The combined effect is not a single global standard overnight, but a set of credible pathways for firms to move from pilots to production.

UK Digital Securities Sandbox: A Testing Ground

The UK Digital Securities Sandbox is positioned as a regulatory framework that allows market participants to trial tokenized securities and associated market infrastructure in live environments. That “live” element is crucial. Many tokenization projects can succeed in controlled pilots but stumble when exposed to real operational complexity: onboarding, compliance checks, corporate actions, reconciliations, and the messy edge cases that appear only at scale.

A sandbox structure aims to bridge that gap by enabling experimentation without requiring the entire market to change at once. For tokenized funds and tokenized securities more broadly, this can provide a pathway to validate how blockchain-based issuance, trading, and settlement interact with existing legal and operational frameworks. It also creates a venue where regulators can observe real behavior and risks, rather than regulating purely from theory.

The sandbox’s focus on “associated market infrastructure” is also telling. Tokenization is not just about putting an asset on a blockchain; it is about the plumbing around it—custody models, settlement processes, reporting, and the interfaces that connect tokenized instruments to the rest of the financial system. A live testing ground can help identify where interoperability breaks down and where standards are needed.

In the context of fund tokenization, the UK initiative complements other global efforts by providing a jurisdiction-specific route to market. Firms operating in or through the UK can test tokenized instruments in a supervised setting, potentially accelerating confidence among institutional participants. It also contributes to the broader theme of cross-border alignment: as different jurisdictions run their own pilots and sandboxes, lessons learned can inform future harmonization.

At the same time, a sandbox is not a guarantee of immediate scale. It is a mechanism for learning and de-risking. The market still needs clarity on classification, investor protection expectations, and operational responsibilities—issues that remain open in several regions. But by enabling real trials, the UK framework can help convert abstract debates into concrete implementation patterns.

Regulatory Framework: The MiCA Initiative

In Europe, the Markets in Crypto-Assets (MiCA) framework is described as providing “more structured guidance for digital assets and service providers.” In a global environment that remains fragmented, that structure is meaningful: it offers a clearer baseline for how certain digital-asset activities are expected to be conducted and supervised.

For fund tokenization, however, the brief highlights that gaps still exist, particularly around how tokenized fund units should be classified and supervised. That is not a minor detail. Classification determines which rules apply, which authorities oversee activity, and what obligations fall on issuers and intermediaries. If tokenized fund units sit ambiguously between existing categories, firms may hesitate to scale offerings beyond limited pilots.

The broader regulatory landscape is evolving in multiple regions. In the United States, the GENIUS Act, enacted in 2025, establishes rules for payment stablecoins, while the proposed CLARITY Act seeks to delineate oversight responsibilities between the SEC and CFTC. Even though those measures are not European, they illustrate a shared global challenge: digital-asset markets touch multiple regulatory domains, and clarity often requires explicit boundary-setting.

MiCA’s structured approach can help reduce uncertainty for service providers operating in Europe, but tokenized funds sit at the intersection of crypto-asset rules and traditional fund regulation. That intersection is where the “open questions” remain most acute: risk management expectations, investor protection, interoperability requirements, and compliance obligations across the lifecycle.

From an industry perspective, the key issue is not whether regulation exists, but whether it is sufficiently specific to support repeatable, scalable operating models. Tokenized funds require coordinated processes among asset managers, custodians, transfer agents, administrators, and distributors. If regulatory expectations differ across jurisdictions—or if they are unclear within a jurisdiction—interoperability and cross-border distribution become harder.

Key takeaway: MiCA provides structure for digital assets in Europe, but tokenized fund units still raise classification and supervision questions—precisely the areas that matter most for scaling beyond pilots.

Operational Challenges in Tokenization

Tokenization’s promise is often described in terms of efficiency and automation, but the operational reality is that realizing those benefits requires “foundational change across the operational lifecycle.” Asset managers and their partners—including custodians, transfer agents, fund administrators, and distributors—must modernize technology stacks and workflows. This is where many tokenization initiatives either stall or differentiate: the technology may work, but the operating model must be rebuilt to use it.

One reason operational change is so demanding is that funds are not standalone products. They are serviced through a network of specialized entities, each with established systems and responsibilities. Tokenization alters how records are maintained, how transfers are processed, and how reconciliation is performed. If one part of the chain remains on legacy processes while another moves to tokenized rails, friction can increase rather than decrease.
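The reconciliation friction described above is easy to illustrate: when a legacy register and a tokenized ledger both claim to be authoritative, breaks must be detected investor by investor. The sketch below is a hypothetical minimal check, not a production reconciliation engine, and the sample balances are invented.

```python
def reconcile(legacy: dict[str, int], onchain: dict[str, int]) -> dict[str, tuple[int, int]]:
    """Compare a legacy register of fund-unit holdings against a tokenized
    ledger. Returns the breaks: investor -> (legacy, onchain) wherever the
    two records disagree.
    """
    breaks = {}
    for investor in legacy.keys() | onchain.keys():
        legacy_units = legacy.get(investor, 0)
        onchain_units = onchain.get(investor, 0)
        if legacy_units != onchain_units:
            breaks[investor] = (legacy_units, onchain_units)
    return breaks

# Hypothetical example: bob's balances diverge, carol never migrated,
# dave exists only on-chain.
legacy = {"alice": 100, "bob": 50, "carol": 25}
onchain = {"alice": 100, "bob": 45, "dave": 10}
breaks = reconcile(legacy, onchain)
```

Even this toy version shows why "half-migrated" states are costly: every investor who exists in only one system, or whose balances drift, becomes a break that someone must investigate and resolve.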

The brief also emphasizes the enabling role of technology providers. Because tokenization represents a fundamental shift in how products are manufactured and operated, vendors are not merely supplying tools—they are shaping the architecture that determines whether tokenized funds can be interoperable and scalable. A “collaborative approach” is described as essential for achieving interoperability and scalable adoption, underscoring that no single firm can solve the entire stack alone.

Operational readiness also includes the ability to support multiple blockchain environments. Project Guardian’s work across private and public blockchains hints at a market where different infrastructures may coexist. That increases the need for standardized processes and interfaces, so that tokenized fund units can be serviced consistently even if the underlying ledger technology differs.

Another operational challenge is the transition from experimentation to commercial deployment. Pilots can tolerate manual workarounds; production cannot. Commercial tokenized funds must integrate with compliance processes, investor onboarding, reporting, and ongoing servicing. The market’s shift toward deployment across more asset classes—money-market funds, UCITS, private credit, ETFs—means operational models must handle diverse product rules and investor expectations.
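One concrete example of the compliance integration mentioned above is transfer gating: tokenized units should only move between onboarded investors. The sketch below is a hypothetical illustration; a real system would wire KYC/AML services, jurisdiction rules, and distribution restrictions into this hook rather than a simple set.

```python
class AllowlistTransferControl:
    """Toy pre-transfer compliance gate (hypothetical).

    Eligibility here is simply membership in an onboarded-investor set;
    production systems would consult KYC/AML and distribution rules.
    """

    def __init__(self, onboarded: set[str]):
        self.onboarded = onboarded

    def check(self, sender: str, receiver: str) -> bool:
        """Both parties must be onboarded before units can move."""
        return sender in self.onboarded and receiver in self.onboarded

# Transfers to unonboarded parties are rejected before they ever settle.
gate = AllowlistTransferControl({"alice", "bob"})
```

The design point is that the check runs before transfer, not after: in a pilot a failed check can be handled manually, but in production it must be an automated, auditable control.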

Ultimately, operational modernization is the gating factor for scale. The adoption numbers show intent, and the AUM projections show ambition. Whether the market reaches those targets depends on whether the industry can industrialize tokenized fund operations—making them repeatable, auditable, and compatible with existing financial-market responsibilities.

Security Considerations in Tokenized Markets

Tokenization introduces new cybersecurity requirements, even as it can enhance certain aspects of security when designed well. The brief is explicit: robust safeguards, hardened smart-contract architectures, and continuous monitoring are essential to mitigate vulnerabilities. In other words, moving fund units onto blockchain rails does not eliminate risk; it changes the risk profile.

Smart contracts—where used—become part of the operational core. That elevates the importance of secure design and governance. A flaw in contract logic, weak access controls, or insufficient monitoring can create vulnerabilities that are different from those in traditional systems. The requirement for “continuous monitoring” reflects the reality that tokenized markets can operate with high speed and programmability; detecting anomalies quickly becomes a core control.
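As a minimal illustration of what "continuous monitoring" can mean in practice, the sketch below flags transfers that deviate sharply from recent activity. It is a toy z-score check under assumed thresholds, not a production monitoring control, and real systems would combine many more signals.

```python
from collections import deque
from statistics import mean, stdev

class TransferMonitor:
    """Toy continuous-monitoring control (hypothetical).

    Flags transfer amounts far above the recent rolling average. Real
    monitoring would also track transfer velocity, counterparty patterns,
    and smart-contract events.
    """

    def __init__(self, window: int = 20, z_threshold: float = 4.0):
        self.recent = deque(maxlen=window)  # rolling window of amounts
        self.z_threshold = z_threshold      # std-devs that count as anomalous

    def observe(self, amount: float) -> bool:
        """Record a transfer; return True if it looks anomalous."""
        anomalous = False
        if len(self.recent) >= 5:           # need a minimal history first
            mu, sigma = mean(self.recent), stdev(self.recent)
            if sigma > 0 and (amount - mu) / sigma > self.z_threshold:
                anomalous = True
        self.recent.append(amount)
        return anomalous

monitor = TransferMonitor()
for amount in [95, 100, 105, 98, 102, 97, 103]:
    monitor.observe(amount)          # normal activity, not flagged
alert = monitor.observe(10_000)      # an outsized transfer is flagged
```

Because tokenized rails can settle quickly, the value of a control like this lies in when it runs: anomalies must be detected as transfers happen, not discovered in a batch report the next day.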

At the same time, the brief notes that blockchain-based tokenization can enhance security through design—though it does not enumerate specific mechanisms. The key point is the dual nature of the shift: tokenization can reduce certain categories of risk even as it introduces others. For institutional markets, that means security must be treated as an end-to-end discipline: technology controls, operational controls, and governance.

Security considerations also intersect with regulatory expectations. Market participants are awaiting clearer guidance on risk management, investor protection, interoperability, and compliance expectations. Security is embedded in all of these. Investor protection depends on secure custody and transfer mechanisms. Risk management depends on resilience and monitoring. Interoperability can expand the attack surface if interfaces are poorly designed.

Finally, security is not only about preventing hacks. It is also about ensuring market integrity and operational continuity. Tokenized funds, especially if they scale into money-market products or widely distributed vehicles, will require institutional-grade resilience. That includes incident response readiness and clear accountability across the ecosystem—asset managers, service providers, and technology vendors.

Note: In tokenized markets, security is not a feature you add at the end. It is a prerequisite for institutional adoption, because the ledger and its smart-contract logic can become part of the system of record.

Trends Shaping the Next Phase

The direction of travel in 2026 is clear: tokenization is moving from pilots toward commercial deployment, and from a narrow set of experiments toward broader asset-class coverage. The brief points to expanding interest in tokenized money-market funds, UCITS funds, private credit, and ETFs, suggesting that both active and passive strategies may adopt tokenized structures as operational models mature.

A second trend is the growing role of industry working groups and regulatory programs in shaping market structure. Project Guardian, the UK Digital Securities Sandbox, and the EU DLT Pilot Regime are all described as accelerating regulatory clarity, technological standardization, and cross-border alignment. This matters because tokenized funds are not just products; they are instruments that must fit into market infrastructure. The more these initiatives converge on workable patterns, the easier it becomes for firms to scale without reinventing the wheel.

Third, the market is likely to see continued emphasis on interoperability. The brief highlights that collaboration is essential for interoperability and scalable adoption, and it notes that Project Guardian has delivered use cases across both private and public blockchains. That implies a future where tokenized funds may need to operate across multiple ledger environments, or at least connect to multiple networks and service providers. Interoperability becomes a competitive necessity, not a technical nice-to-have.

Fourth, regulatory evolution will remain a defining factor. Europe’s MiCA provides structured guidance for digital assets and service providers, but questions remain around classification and supervision of tokenized fund units. In the US, measures like the GENIUS Act and the proposed CLARITY Act indicate movement toward clearer rules, at least in adjacent areas such as stablecoins and oversight boundaries. The near-term trend is therefore not global uniformity, but incremental clarity: jurisdiction by jurisdiction, topic by topic.

Finally, security and operational resilience will increasingly differentiate serious tokenization efforts from superficial ones. The brief stresses hardened smart-contract architectures and continuous monitoring. As tokenized funds scale, institutional participants will demand controls that match or exceed traditional market standards. The winners are likely to be those who treat tokenization as a full-stack modernization—technology, operations, governance, and compliance—rather than a thin digital wrapper.

Conclusion: Embracing the Future of Fund Tokenization

The Transformative Potential of Tokenization

By 2026, fund tokenization is showing early signals of scale. With 15% of asset managers already offering tokenized funds and 41% planning launches, the market is moving beyond curiosity into execution. The projected rise to $235 billion in tokenized fund AUM by 2029—a 58× increase from 2024—captures the industry’s expectation that tokenization can become a meaningful channel and operating model, not just a niche product format.

The strongest momentum so far is in money-market funds and private-asset vehicles, where liquidity mechanics, fractionalization, and automation align naturally with tokenization’s capabilities. At the same time, expanding interest in UCITS, private credit, and ETFs suggests the concept is broadening toward mainstream fund categories.

The path to scale runs through infrastructure, not slogans. Tokenization requires foundational operational change across custodians, transfer agents, administrators, and distributors, and it depends on technology providers to enable interoperability. Regulatory progress is real but uneven: Europe’s MiCA offers structure, yet tokenized fund units still raise classification and supervision questions, while other jurisdictions continue to develop their own frameworks.

Security is the other non-negotiable. Tokenized markets introduce new cybersecurity requirements—demanding hardened smart contracts and continuous monitoring—even as well-designed blockchain systems can reduce certain risks.

The next phase will be defined by whether the industry can turn successful use cases—supported by initiatives like Project Guardian and the UK Digital Securities Sandbox—into repeatable, institutional-grade operating models. If it can, tokenized funds may shift from “emerging” to “expected” far sooner than traditional market cycles would suggest.
