BlackRock’s reported engagement with the Office of the Comptroller of the Currency (OCC) over its proposed framework for tokenized reserve assets under the GENIUS Act signals a broader structural negotiation between traditional asset managers and emerging digital financial infrastructure.
At the center of the debate is the OCC’s suggested 20% cap on tokenized reserve assets and the scope of eligible instruments permitted within regulated tokenization frameworks. BlackRock’s push to remove or relax this constraint reflects both strategic positioning and a deeper ideological tension over how rapidly tokenized finance should be integrated into the regulated banking system.
Tokenization, in this context, refers to the representation of real-world financial assets—such as cash equivalents, treasuries, or other reserves—on distributed ledger systems. These instruments are increasingly being explored by large financial institutions as a way to improve settlement efficiency, reduce counterparty friction, and enable programmable liquidity.
However, regulators remain cautious about systemic risk, operational integrity, and liquidity mismatches that could emerge if tokenized instruments scale faster than oversight frameworks. The OCC’s proposed 20% cap appears designed as a prudential safeguard.
By limiting the proportion of tokenized reserves that regulated entities can hold, the regulator aims to contain potential volatility spillovers from digital asset markets into the traditional banking system. From a supervisory perspective, this constraint also provides a controlled environment for experimentation, allowing institutions to adopt tokenized assets incrementally while regulatory tools and risk models mature.
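To make the mechanics of such a cap concrete, the sketch below checks whether a hypothetical reserve portfolio stays within a 20% tokenized-asset limit. All asset names and figures are illustrative; the OCC proposal does not prescribe any particular implementation.

```python
# Hypothetical sketch: checking a reserve portfolio against a 20% cap
# on tokenized assets. Names and numbers are illustrative only.

def tokenized_share(holdings: dict[str, tuple[float, bool]]) -> float:
    """Return the fraction of total reserve value held in tokenized form.

    holdings maps an asset name to (market_value, is_tokenized).
    """
    total = sum(value for value, _ in holdings.values())
    tokenized = sum(value for value, is_tok in holdings.values() if is_tok)
    return tokenized / total if total else 0.0

def within_cap(holdings: dict[str, tuple[float, bool]], cap: float = 0.20) -> bool:
    """True if the tokenized share of reserves does not exceed the cap."""
    return tokenized_share(holdings) <= cap

portfolio = {
    "T-bills":              (700.0, False),
    "tokenized MMF shares": (150.0, True),
    "bank deposits":        (150.0, False),
}

print(tokenized_share(portfolio))  # 0.15
print(within_cap(portfolio))       # True
```

The point of the sketch is simply that a hard proportional cap is a binding portfolio constraint: any growth in tokenized holdings must be matched by roughly four times as much growth in conventional reserves to remain compliant.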
BlackRock’s opposition, however, highlights a different interpretation of risk—one grounded in market structure evolution rather than containment. As one of the largest global asset managers, BlackRock is increasingly embedded in the digital asset ecosystem through tokenized money market funds, ETF innovations, and blockchain-based settlement experiments. From this vantage point, restrictive caps may artificially suppress liquidity and blunt the efficiency gains of tokenization.
The firm’s position suggests that risk is not necessarily amplified by tokenization itself, but by fragmented or constrained implementation that prevents markets from achieving sufficient depth and interoperability.

The GENIUS Act framework adds another layer to this policy debate. While still evolving in its legislative interpretation, it is broadly understood as an attempt to create clearer federal guidelines for digital asset issuance, custody, and reserve backing standards.
Within that structure, the definition of eligible assets becomes critical. A narrow definition—favoring only highly liquid, short-duration instruments—would prioritize safety and regulatory clarity. A broader definition would enable innovation in structured tokenized products but could introduce complexity in risk assessment and supervision.
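The narrow-versus-broad distinction can be sketched as a simple eligibility filter. The criteria below (a 93-day maturity limit and same-day liquidity) are assumptions chosen for illustration, not thresholds drawn from the GENIUS Act text or the OCC proposal.

```python
# Hypothetical sketch contrasting a narrow eligible-asset definition
# with a broader asset universe. Criteria are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    days_to_maturity: int
    daily_liquid: bool  # can be liquidated same-day at minimal discount

def eligible_narrow(a: Asset) -> bool:
    # Narrow reading: only short-duration, highly liquid instruments qualify.
    return a.days_to_maturity <= 93 and a.daily_liquid

assets = [
    Asset("overnight repo", 1, True),
    Asset("13-week T-bill", 91, True),
    Asset("2-year note", 730, True),
    Asset("structured tokenized product", 180, False),
]

print([a.name for a in assets if eligible_narrow(a)])
# ['overnight repo', '13-week T-bill']
```

Under a broader definition, the filter would admit longer-dated or less liquid instruments, which is precisely where the supervisory complexity the text describes would enter: each additional asset class needs its own liquidity, valuation, and haircut treatment.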
BlackRock’s call to expand eligible assets therefore represents an attempt to widen the design space of tokenized finance. It implicitly argues that regulatory architecture should accommodate market evolution rather than pre-emptively constrain it.
This aligns with a broader trend in financial markets where large incumbents are increasingly advocating for regulatory frameworks that are flexible, principles-based, and interoperable with blockchain-native systems.

The dispute is less about a single percentage cap and more about governance philosophy. The OCC’s approach reflects a cautious, incremental integration of tokenization into the banking system.
BlackRock’s stance reflects a conviction that tokenized assets will become foundational to future capital markets and therefore require regulatory structures that scale with, rather than lag behind, adoption. How this balance is struck will shape not only the trajectory of tokenized reserves but also the broader architecture of regulated digital finance in the years ahead.