Bloomberg and Kaiko Bring Data License On-Chain
Bloomberg and Kaiko have launched a joint initiative to make Bloomberg’s Data License offerings accessible on-chain via Kaiko’s infrastructure. The initial focus is tokenized U.S. Treasuries and repo workflows on the Canton Network, a privacy-enabled blockchain designed for institutional finance.
The goal is to reduce reconciliation headaches in tokenized markets by letting entitled participants reference a single, verifiable data source directly on-chain.
A pathway for Bloomberg data to be used on-chain
Under the initiative, Bloomberg’s trusted reference data would be delivered on-chain through Kaiko’s “data on-ramp” service. Kaiko says the design securely writes off-chain market data on-chain while maintaining intellectual property ownership, licensing compliance, and auditability.
Bloomberg says institutional clients increasingly want the same data they already use in traditional markets to be available in on-chain environments, without lowering standards around governance and entitlements.
Canton + entitlements are the core market-structure play
A key detail here is entitlement control: the initiative is built so only licensed participants can access Bloomberg data, aligning on-chain access with existing data licensing frameworks.
In other words, this isn’t “publish a price feed and hope for the best.” The collaboration is trying to make on-chain workflows compatible with how institutional data distribution actually works today.
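To make the idea concrete, here is a minimal sketch of what entitlement-gated data access could look like. This is purely illustrative: the class, dataset names, and hash-based integrity check are hypothetical and are not Kaiko's or Bloomberg's actual API or the Canton Network's data model. It simply shows the pattern the companies describe, where reads are gated by a license registry and every access attempt leaves an audit trail.

```python
import hashlib
import json
import time

class EntitlementGatedFeed:
    """Toy model of a licensed data on-ramp: a publisher writes
    integrity-checked records, and reads are gated by a license
    entitlement registry with an append-only audit log."""

    def __init__(self):
        self.entitlements = {}   # participant_id -> set of dataset names
        self.records = {}        # (dataset, key) -> stored record
        self.audit_log = []      # append-only trail of access attempts

    def grant(self, participant, dataset):
        # Mirror an off-chain licensing agreement as an on-chain entitlement.
        self.entitlements.setdefault(participant, set()).add(dataset)

    def publish(self, dataset, key, payload):
        # A content hash stands in for a real signature/attestation,
        # letting readers verify the record hasn't been tampered with.
        body = json.dumps(payload, sort_keys=True)
        digest = hashlib.sha256(body.encode()).hexdigest()
        self.records[(dataset, key)] = {"payload": payload, "hash": digest}

    def read(self, participant, dataset, key):
        entitled = dataset in self.entitlements.get(participant, set())
        # Log the attempt whether or not it succeeds, for auditability.
        self.audit_log.append({"who": participant, "dataset": dataset,
                               "key": key, "granted": entitled,
                               "ts": time.time()})
        if not entitled:
            raise PermissionError(f"{participant} not licensed for {dataset}")
        return self.records[(dataset, key)]

# Example: only the licensed participant can read the evaluated price.
feed = EntitlementGatedFeed()
feed.grant("bank-a", "ust-eval-pricing")
feed.publish("ust-eval-pricing", "T 4.25 05/2035", {"px": 99.87})
record = feed.read("bank-a", "ust-eval-pricing", "T 4.25 05/2035")
```

The key design point is that the entitlement check and the data live in the same system, so access control, licensing, and the audit trail cannot drift apart the way they can across separate off-chain pipelines.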
What data is included in the first phase
Bloomberg and Kaiko say Bloomberg Data License offerings available through the on-ramp would include security master data and evaluated pricing. The companies say that could allow counterparties to reference one consistent source, helping reduce disputes caused by timing mismatches and fragmented pipelines.
Kaiko frames this as foundational infrastructure for tokenized securities markets, particularly for institutional use cases that require high-quality valuation and reference data.
Expansion beyond Treasuries is on the roadmap
While the pilot starts with tokenized U.S. Treasuries and repo workflows, the initiative is designed to expand to additional asset classes and use cases depending on client demand and how tokenized markets evolve.
Kaiko and Bloomberg also position this as a “build the rails first” effort—creating the data foundation institutions need to run collateral management and repo trading workflows on-chain with confidence in data quality and compliance.
Why it matters for crypto
- Tokenized markets don’t scale without institutional-grade reference data and pricing that counterparties can agree on.
- Entitlement-controlled data access is a big step toward making on-chain workflows compatible with real institutional compliance constraints.
- Treasuries and repo are “serious plumbing” use cases; if this works there, it’s a strong signal for broader tokenization.
- It strengthens Canton’s positioning as an institutional tokenization network focused on privacy and interoperability.
- It’s another example of TradFi-grade vendors extending into on-chain capital markets, rather than crypto building in isolation.
What to watch next
- Beta program updates and signs of real institutional participation in the on-chain data access pilot.
- Whether tokenized Treasury + repo workflows move from proofs-of-concept into repeatable production processes on Canton.
- Expansion into additional asset classes beyond Treasuries, which the companies say is planned based on client demand.
- How the market handles auditability and entitlement enforcement in practice (and whether it satisfies data licensors).
- Whether other major data providers follow with similar “licensed on-chain access” models.
Source: Bloomberg Press Release