Gnosis and Zisk with Ethereum Foundation announce Ethereum Economic Zone 

Gnosis and Zisk, with co-funding from the Ethereum Foundation, have announced the Ethereum Economic Zone (EEZ) at EthCC in Cannes.

This is a new L1-to-L2 framework designed to tackle Ethereum’s L2 fragmentation: not a pure scaling issue, but the problem of dozens of isolated rollups creating siloed liquidity, duplicated infrastructure, and reliance on slow, costly bridges. A new L2 launches roughly every 19 days on average, fragmenting capital and complicating user and developer experiences across an ecosystem that now counts over 20 active L2s securing tens of billions of dollars in TVL.

The EEZ aims to make participating rollups feel like extensions of a single unified Ethereum rather than separate islands. Smart contracts on an EEZ rollup can directly call contracts on Ethereum mainnet or other EEZ rollups in a single atomic transaction, with the same execution guarantees as on L1. No bridges, no multi-step processes, no trust assumptions for cross-chain calls.

Liquidity pools, stablecoins, DeFi positions, etc., become shared across the zone. Capital can flow and be used composably without fragmentation. Developers deploy once and interact seamlessly; apps don’t need replication across chains. No new tokens introduced; everything settles back to Ethereum’s security.
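
The atomicity described above can be illustrated with a toy model: a cross-rollup transfer either updates both ledgers or neither, with no intermediate bridged state. The names and structure below are purely illustrative and do not reflect the actual EEZ protocol, which has not yet published technical specifications.

```python
# Toy illustration of atomic cross-rollup execution: a transfer either
# commits on both ledgers or reverts entirely. Hypothetical sketch only.

class Ledger:
    def __init__(self, balances):
        self.balances = dict(balances)

def atomic_cross_rollup_transfer(src: Ledger, dst: Ledger,
                                 account: str, amount: int) -> bool:
    """Apply a debit on `src` and a credit on `dst` as one unit.

    With a bridge, the debit could land while the credit is delayed or
    lost; here both changes are staged and committed together.
    """
    staged_src = dict(src.balances)
    staged_dst = dict(dst.balances)

    if staged_src.get(account, 0) < amount:
        return False  # whole transaction reverts; neither ledger changes

    staged_src[account] -= amount
    staged_dst[account] = staged_dst.get(account, 0) + amount

    # Commit both ledgers together, or not at all.
    src.balances, dst.balances = staged_src, staged_dst
    return True

mainnet = Ledger({"alice": 100})
rollup = Ledger({"alice": 0})
assert atomic_cross_rollup_transfer(mainnet, rollup, "alice", 60)
assert mainnet.balances["alice"] == 40 and rollup.balances["alice"] == 60

# Insufficient funds: the call fails and neither ledger changes.
assert not atomic_cross_rollup_transfer(mainnet, rollup, "alice", 500)
assert mainnet.balances["alice"] == 40
```

The contrast with bridging is that there is no window in which funds have left one chain but not arrived on the other, which is where bridge trust assumptions normally live.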

Real-Time Zero-Knowledge Proofs

The EEZ is powered by Zisk’s proving stack from Jordi Baylina, creator of Circom and co-founder of Polygon zkEVM. This enables fast verification of cross-rollup and mainnet interactions without compromising security or introducing intermediaries. Gnosis co-founder Friederike Ernst put it simply: “Ethereum doesn’t have a scaling problem. It has a fragmentation problem. The EEZ is designed to do the opposite. One Ethereum, not a hundred islands.”

It positions itself as distinct from or complementary to efforts like Optimism’s Superchain, Polygon’s AggLayer, or the Ethereum Foundation’s earlier Interop Layer proposals. The emphasis on real-time ZK proving aims for stronger atomicity and fewer trust assumptions. Existing L2s and new rollups could opt into the framework, and Gnosis Chain itself is mentioned as potentially integrating.

EEZ Alliance: an informal collective to coordinate standards and adoption. Founding members include Aave, Flashbots, Nethermind, Centrifuge, Safe, CoW Swap, Titan, Beaver Build, Monerium, and xStocks, spanning DeFi, RWAs, infrastructure, and more. The zone is governed as credibly neutral infrastructure via a Swiss non-profit, all software will be free and open-source, and the effort is co-funded by the Ethereum Foundation, notable given its recent grant pauses.

Technical specifications and performance benchmarks are expected in the coming weeks. If successful, the EEZ could significantly improve UX (no more bridge headaches when moving assets or using apps across chains), boost capital efficiency in DeFi and beyond, strengthen Ethereum as the shared settlement and security layer, and make the overall EVM ecosystem more competitive against monolithic or more unified alternatives.

It doesn’t solve all of Ethereum’s challenges; debates over sequencer decentralization, for example, remain active. But it directly targets one of the biggest pain points in the rollup-centric roadmap. Adoption will depend on how many projects join the alliance and how quickly the ZK tech delivers on low-latency synchronous calls. This fits into broader Ethereum coordination efforts.

Builders are actively working on making the fragmented L2 landscape feel more like one cohesive economic zone. Early reactions on X and in coverage highlight excitement around reduced fragmentation, though execution details will matter. Technical deep dives should emerge soon.

Kalshi Secures Regulatory Approval to Offer Margin Trading Targeting Institutional and Professional Investors 

Kalshi has secured regulatory approval to offer margin trading, primarily targeting institutional and professional investors.

This development comes via Kalshi’s affiliate, Kinetic Markets LLC, which received registration as a Futures Commission Merchant (FCM) with the National Futures Association (NFA) on or around March 24. Traditional prediction markets like Kalshi’s event contracts typically require fully collateralized positions: traders must post 100% of the potential payout upfront.

Margin trading changes this by allowing users to control larger positions with only a fraction of the capital as collateral, similar to leverage in futures or derivatives trading. This improves capital efficiency, which is especially appealing to hedge funds, prop desks, and other sophisticated players who want to deploy more capital without tying it all up.

Kalshi CEO Tarek Mansour has indicated that a margin product is coming soon, with capital efficiency for institutions as a key priority. The feature is expected to launch first for institutional clients and possibly debut on new or non-core products rather than immediately on existing event contracts. Additional CFTC approvals for rulebook changes to permit non-fully collateralized trading are still needed.

This move positions Kalshi more like a traditional derivatives platform, helping it attract larger institutional flows amid growing prediction market volumes. Some reports suggest it could unlock significant additional liquidity. Kalshi operates under CFTC oversight as a prediction market, treating contracts as commodity derivatives.

The FCM status expands its capabilities but keeps it distinct from unregulated or gambling-framed platforms. While this approval is positive, Kalshi faces legal pushback in some states like recent action from Washington state alleging illegal gambling, though the platform maintains it complies with federal commodity rules.

This is a notable step in maturing prediction markets, blending them closer with conventional financial derivatives. Retail users likely won’t see margin right away—it’s focused on pros for now. Details on exact leverage ratios, eligible contracts, or rollout timeline aren’t fully public yet and will depend on further CFTC sign-off.

Margin trading allows you to control larger positions than your available cash would normally permit by borrowing funds or using reduced collateral from the broker or exchange. In traditional stocks, this means borrowing money to buy more shares.

In futures-style markets like the one Kalshi’s affiliate is preparing for via its FCM registration, it means posting only a fraction of the contract’s notional value as initial margin, with the rest effectively leveraged. This boosts capital efficiency—especially useful for institutions hedging or scaling positions on prediction market event contracts—but it significantly increases risk compared to fully collateralized trading.

Gains and losses are magnified by the leverage ratio. A small adverse move in the contract price can wipe out your entire margin and more. Example: With 10x leverage, a 5% move against your position could result in a ~50% loss of your posted margin. A 10–20% adverse move could liquidate your position entirely.
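
The arithmetic in that example is simply the leverage ratio applied to the price move; a short sketch makes the magnification explicit:

```python
# With L-times leverage, a price move of p percent against the position
# changes the posted margin by roughly L * p percent.

def margin_pnl_pct(leverage: float, price_move_pct: float) -> float:
    """Return profit or loss as a percentage of posted margin."""
    return leverage * price_move_pct

# 10x leverage, 5% adverse move: 50% of posted margin is lost.
assert margin_pnl_pct(10, -5) == -50

# A 10% adverse move at 10x wipes out the posted margin entirely.
assert margin_pnl_pct(10, -10) == -100
```

The same formula works in the trader's favor on winning positions, which is exactly why leveraged outcomes in either direction are so much larger than the underlying price moves.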

In volatile prediction markets, this happens quickly. If your account equity falls below the maintenance margin level (a minimum equity percentage set by the broker or exchange), you receive a margin call: a demand to deposit additional funds or collateral immediately. Failure to meet it promptly can lead to automatic liquidation of your positions, often at unfavorable prices.

In fast-moving or illiquid markets, you might not even get a chance to respond before positions are closed. You could end up owing money beyond your initial deposit if the deficit is large. Unlike fully funded trades where your maximum loss is typically limited to what you put in, leveraged positions can create a deficit. You remain liable for any shortfall after liquidation. This is explicitly highlighted in CFTC/FCM risk disclosures for futures trading.
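
A small worked example (with hypothetical numbers, not Kalshi's actual margin parameters) shows how a leveraged position can produce a deficit beyond the initial deposit:

```python
# Sketch of how a leveraged position can lose more than the posted
# margin, because losses accrue on the full notional value.

def equity_after_move(margin: float, notional: float,
                      price_move_pct: float) -> float:
    """Remaining account equity after a price move on the full notional."""
    return margin + notional * price_move_pct / 100

# $1,000 of margin controlling $10,000 notional (10x leverage).
# A 15% adverse move produces a $1,500 loss: the $1,000 margin is gone
# and the trader owes a further $500 after liquidation.
equity = equity_after_move(1_000, 10_000, -15)
assert equity == -500
```

This negative-equity scenario is the shortfall liability the CFTC/FCM risk disclosures warn about; in a fully collateralized trade the loss would have stopped at the amount posted.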

Margin loans often carry interest, which accrues daily and can erode profits or deepen losses over time, especially on longer-held positions. Leverage heightens emotional stress, which can lead to poor decisions like over-trading, ignoring stop-losses, or doubling down on losing positions. In prediction markets, where outcomes are binary or event-driven, sudden shifts amplify this.

In thinner markets or during high volatility, exiting or adjusting positions can involve slippage. Leverage makes these small execution gaps far more costly. Even with FCM oversight, issues with clearinghouses or settlement of event contracts could arise, though Kalshi emphasizes its CFTC regulation. Prediction contracts depend on clear, objective resolution.

Disputes or ambiguous outcomes could affect marked-to-market values and trigger margin issues before final settlement. Kalshi has noted that its current fully collateralized model limits exposure, while margin introduces standard futures-style risks like those in the official CFTC risk disclosure statements.

Today on Kalshi, positions are fully funded, so your risk is generally capped at the amount you allocate, less any partial recovery if you close early. Margin trading shifts this toward futures-style dynamics: lower upfront capital but ongoing monitoring requirements, potential margin calls, and higher tail risk of large or total losses.

Set strict stop-losses and position sizes based on account equity. Monitor positions closely, especially around event resolution dates. Understand the exact margin requirements, maintenance levels, and liquidation procedures once Kalshi details them, likely starting with institutional clients and specific contracts.

Margin trading is not suitable for everyone and is generally aimed at experienced, well-capitalized traders or institutions. Regulators require clear risk disclosures precisely because losses can be rapid and substantial. Always review the platform’s specific rules, CFTC disclosures, and consider consulting a financial advisor.

Quantum Tech Startups Brave Market Turmoil to Go Public, Betting on Breakthroughs and Commercial Push

Even as global markets reel from geopolitical tensions and a fresh wave of volatility, a handful of quantum technology companies are pushing ahead with public listings this year.

They are determined to tap fresh capital and accelerate the long-promised shift from laboratory curiosity to real-world applications, according to a CNBC report.

The latest example came Friday when Toronto-based Xanadu Quantum Technologies, a pioneer in photonic quantum computing and a partner of chipmaker Nvidia, began trading on both the Nasdaq and the Toronto Stock Exchange following a merger with blank-check company Crane Harbor Acquisition Corp. Shares had a rocky open but rallied to close up about 15 percent in New York trading at around $11.50, though they gave back some ground in after-hours action.

Xanadu’s debut followed closely on the heels of Singapore’s Horizon Quantum Computing, which started trading on Nasdaq under ticker HQ after merging with dMY Squared Technology Group and raising roughly $120 million. Earlier in February, neutral-atom specialist Infleqtion became the first company focused on that approach to list publicly via a SPAC deal with Churchill Capital, pulling in more than $550 million.

These moves echo the path blazed by IonQ back in 2021, when dMY Technology Group took the pure-play quantum computing firm public. SPACs, shell companies that raise money through an IPO and then merge with a private target, have emerged as the favored shortcut for quantum startups seeking faster access to public markets with lighter regulatory hurdles than a traditional IPO.

Markets remain jittery over the ongoing U.S.-Israeli conflict with Iran, oil price spikes, and broader economic uncertainty — precisely the sort of environment that usually punishes speculative, long-horizon bets like quantum computing. Xanadu’s shares slipped more than 10 percent after hours Friday, Horizon Quantum has shed around 18 percent since its debut, and Infleqtion’s stock has plunged over 30 percent since mid-February.

Yet company executives argue the window is actually ideal. “It’s an interesting time to be entering the public markets, of course, with everything happening in the world,” Horizon Quantum founder and CEO Dr. Joe Fitzsimons told CNBC. “But for quantum computing, it’s actually a very ideal time to be coming out. We’re really starting to hit something of an inflection point.”

That inflection stems from a string of tangible scientific advances over the past 18–24 months. Researchers have demonstrated meaningful progress in quantum error correction — the critical technique for protecting fragile quantum information from noise and decoherence. Milestones include higher qubit counts, longer coherence times, and early signs that logical qubits (error-protected units of quantum information) can outperform raw physical ones in real computations.

Industry observers point to demonstrations of up to dozens of logical qubits with error rates low enough to show “beyond break-even” performance. Bain & Company partner Velu Sinha noted that practical quantum advantage, where quantum machines solve useful problems faster or better than classical supercomputers, could emerge around 100 logical qubits by 2028–2029.

Commercially transformative applications in drug discovery, materials science, logistics optimization, or financial modeling will likely require 1,000 to 10,000 logical qubits, a threshold many expect in the mid-2030s.

“The narrative has shifted from science project to commercial trajectory,” Sinha said. “Quantum is one of a small number of technology categories investors view as structurally inevitable.”

At full maturity, the addressable market could reach $100–250 billion, giving patient capital reason to look past near-term volatility.

Early revenue pathways are already appearing. Horizon Quantum has concentrated on software tools that bridge classical and quantum systems, allowing developers to prepare code that can run on future hardware while generating income today. The company plans to expand its research team and roll out an early version of its platform to select users later this year.

Xanadu, which uses photons (particles of light) for its qubits, has developed cloud-based platforms where developers can experiment with quantum algorithms on existing hardware. The firm has set an ambitious goal of helping deliver one of the world’s first practical quantum data centers by 2030 and boasts strong engagement from partners in automotive, aerospace, and finance.

Infleqtion’s CEO Matthew Kinsella emphasized that neutral-atom technology is moving from pure scientific progress toward commercial relevance, particularly in quantum sensing and networking alongside computing.

“Going public gives us the capital to accelerate commercialization and invest behind the markets where we already see customer demand,” he said. “We think commercialization will happen in stages.”

Governments have long bankrolled the heavy lifting in quantum R&D, with the U.S., China, and the European Union each committing billions to secure strategic edges in computing power and cybersecurity. National labs and universities remain central. But the current wave of listings signals a clear pivot: private companies are now chasing market traction, even if widespread consumer-facing quantum devices remain decades away.

As Counterpoint Research director Marc Einstein put it, quantum computers could one day perform trillions of operations instantaneously, revolutionizing fields from cryptography to complex simulations. Large organizations are far more likely to own or access the hardware as a service in the coming years than individuals will ever have desktop quantum machines.

For now, these newly public quantum firms must prove they can convert scientific momentum into sustainable revenue while navigating the unforgiving realities of public markets. The road from lab breakthrough to broad commercialization is still long and capital-intensive. But with error correction improving, qubit scales rising, and real customer interest emerging in optimization and simulation, the sector’s leaders are betting that going public is the best way to fund the next leg of the journey.

Midas Raises $50M in a Series A Funding Led by RRE Ventures and Creandum 

Midas, a German startup specializing in tokenized real-world assets (RWA) and on-chain investment products, has raised $50 million in a Series A funding round.

The round was led by RRE Ventures and Creandum, with participation from a strong mix of crypto-native and traditional finance investors, including: Framework Ventures, HV Capital, Ledger Cathay, Franklin Templeton, Coinbase Ventures, M1 Capital, Anchorage Digital, FJ Labs, North Island Ventures and GSR.

This brings Midas’ total funding to about $58.75 million, following an $8.75 million seed round in 2024. Midas turns institutional-grade yield strategies such as those involving treasuries or other assets into blockchain-based tokens often called mTokens. This provides on-chain transparency, composability, and yield while bridging traditional finance with decentralized infrastructure.

The company has already issued over $1.7 billion in assets, with more than $500 million in TVL and $37 million in yield paid out to 20,000+ holders. A major pain point in tokenized RWAs has been liquidity and redemption delays—many vault-style structures lock up capital, forcing slow liquidations or long wait times when investors want to exit.

To address this, Midas is launching the Midas Staked Liquidity (MSL) system—a standalone liquidity layer that enables instant redemptions by using pre-allocated funds, without needing to unwind underlying positions gradually. The initial capacity for MSL is up to $40 million, and the new capital will help scale this infrastructure.

This positions Midas to support broader institutional adoption of on-chain products by offering faster exits while maintaining transparency and yield. The involvement of Coinbase Ventures (crypto infrastructure) and Franklin Templeton (a major traditional asset manager active in digital assets and tokenization) highlights growing convergence between TradFi and crypto in the RWA/tokenization space.

It’s a notable raise for a relatively young European company in a market that’s still maturing. Midas’ CEO and co-founder, Dennis Dinkelmeyer, emphasized advancing on-chain investing with full transparency, instant redemptions, and native composability.

This announcement comes amid broader interest in tokenized assets, where liquidity remains a key bottleneck for scaling beyond early adopters. Midas Staked Liquidity (MSL) is a dedicated, standalone liquidity layer designed to solve one of the biggest pain points in tokenized real-world assets (RWAs) and on-chain yield products: slow or illiquid redemptions.

Traditional vault-style tokenized strategies often require unwinding underlying positions when investors want to exit, which can take days or weeks, introduce slippage, or force queues. MSL decouples liquidity provision from the core investment strategy, enabling instant, atomic redemptions while keeping the strategy capital fully deployed and yield-generating.

MSL acts as a centralized but on-chain and transparent liquidity facility shared across all Midas mTokens. Capital providers (LPs) deposit into MSL. This capital is not idle cash — it is actively allocated to low-risk, investment-grade strategies, primarily U.S. Treasury bills and prime lending markets. This generates a base yield for MSL participants.

Instant (atomic) redemptions: when a user chooses instant redemption for an mToken, the smart contract checks the current Net Asset Value (NAV), burns the user’s mTokens, and immediately transfers stablecoins from the MSL pool to the user’s wallet, all in a single on-chain transaction. This happens at par (NAV), net of a fixed instant redemption fee.

No need to wait for underlying asset settlement or gradual liquidation. It’s atomic: settlement occurs instantly without counterparty or settlement risk. Liquidity comes from the external and shared MSL pool, not by holding unproductive cash inside each product. Higher net yields for token holders and better capital efficiency.
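
The redemption flow described above can be sketched as follows. The class name, the 0.10% fee, and the capacity check are illustrative assumptions for this sketch, not Midas’s actual contract parameters.

```python
# Hypothetical sketch of an MSL-style instant redemption: burn mTokens,
# pay out stablecoins at NAV minus a fixed fee, drawing on a shared pool
# with a hard capacity limit. Illustrative only.

class MSLPool:
    def __init__(self, stablecoin_liquidity: float, fee_bps: int = 10):
        self.liquidity = stablecoin_liquidity  # shared instant-liquidity pool
        self.fee_bps = fee_bps                 # fixed instant-redemption fee

    def instant_redeem(self, mtoken_amount: float, nav: float) -> float:
        """Burn `mtoken_amount` mTokens and return the stablecoin payout.

        Raises ValueError if the pool cannot cover the redemption,
        mirroring an on-chain capacity check.
        """
        gross = mtoken_amount * nav
        payout = gross * (1 - self.fee_bps / 10_000)
        if payout > self.liquidity:
            raise ValueError("instant redemption exceeds MSL capacity")
        self.liquidity -= payout  # pool shrinks; LPs replenish over time
        return payout

pool = MSLPool(stablecoin_liquidity=1_000_000)

# Redeem 10,000 mTokens at a NAV of 1.02, net of a 0.10% fee.
payout = pool.instant_redeem(10_000, nav=1.02)
assert round(payout, 2) == 10_189.80
assert round(pool.liquidity, 2) == 989_810.20
```

Because the payout comes from the shared pool rather than from unwinding the strategy, the underlying positions stay deployed and yield-generating throughout.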

This layered approach prevents over-reliance on any single source and maintains robustness. Instant liquidity for investors without compromising yields. No settlement risk — fully on-chain, no third-party dependency for the instant leg. mTokens become better collateral in DeFi because of reliable redemption.

Shared pool avoids fragmenting liquidity across every product. As of recent data, MSL has around $12.89M in instant liquidity available, with per-token capacities. Initial targeted capacity mentioned in announcements was up to $40M, which the Series A funding helps expand.

Instant redemption: Immediate, subject to available MSL capacity, with a fixed fee. Toggleable in the interface; smart contract enforces capacity limits and compliance checks. Background verifications run continuously and can cause transactions to fail if conditions aren’t met.

MSL itself assumes no underlying investment risk from the mToken portfolios — it functions purely as a liquidity provider. Allocations stay in high-quality, liquid assets. Capacity limits, fees, and layered backstops manage drawdown risk. Full on-chain transparency and audited smart contracts.

This setup positions Midas products as more liquid yield tokens, combining real yield strategies with DeFi-like immediacy and composability, which is attractive to both retail and institutional users.

Starcloud Raises $170m to Hit $1.1bn Valuation as Investors Bet on Orbital AI Infrastructure


Space-compute startup Starcloud has vaulted into unicorn territory, securing a $1.1 billion valuation in one of the fastest climbs from Y Combinator demo day to billion-dollar status.

The company’s Series A round, led by Benchmark and EQT Ventures, raised $170 million and brought total funding to $200 million, according to the company and investors. The fundraise comes just 17 months after its YC debut, underscoring the surging investor appetite for infrastructure plays tied to the artificial intelligence boom.

The enthusiasm has remarkably been sustained by a bold thesis to move power-hungry AI data centers off Earth and into orbit.

As hyperscalers race to build capacity for generative AI workloads, terrestrial data center expansion is increasingly constrained by power shortages, land scarcity, water use concerns, and regulatory bottlenecks. Starcloud is pitching orbit as a solution, where near-continuous solar power and the vacuum of space could, in theory, transform the economics of compute.

The company has already moved beyond concept.

In November 2025, Starcloud launched its first satellite carrying an Nvidia H100 GPU, becoming one of the first companies to deploy a state-of-the-art terrestrial AI chip in orbit. The satellite was used to perform inference and early AI model training tasks in space, a milestone that helped validate the technical premise behind orbital computing.

“An H100 is probably not the best chip for space, to be honest, but the reason we did it is we wanted to prove that we could run state of the art terrestrial chips in space,” Starcloud chief executive Philip Johnston told TechCrunch.

Later this year, Starcloud plans to launch a second, more advanced spacecraft equipped with multiple GPUs, including an Nvidia Blackwell chip, an Amazon Web Services server blade, and even a bitcoin-mining computer.

That second mission is designed less as a demonstration and more as an engineering testbed, particularly for thermal management and power systems. Cooling remains one of the most difficult problems in orbital computing because high-performance chips generate significant heat and cannot rely on conventional air-based cooling systems.

Johnston says the next-generation spacecraft will carry what is expected to be the largest deployable radiator yet flown on a privately owned satellite, a critical step toward making space-based computing viable at scale.

The longer-term ambition is far larger.

Starcloud is developing Starcloud 3, a three-ton, 200-kilowatt orbital data center spacecraft intended for deployment via SpaceX’s Starship system. The design is meant to fit the launch company’s “pez dispenser” deployment architecture originally built for Starlink satellites.

If launch costs fall to roughly $500 per kilogram, Johnston believes the platform could deliver electricity costs near five cents per kilowatt-hour, placing it in direct competition with land-based data centers.

That assumption, however, rests heavily on Starship becoming commercially operational by 2028 or 2029. This is where the investment case becomes more speculative.

Starship has yet to begin routine commercial flights, and many analysts believe the high-frequency launch cadence required to make orbital data centers economically viable may not emerge until the 2030s. Until then, the cost of lofting powerful compute hardware into orbit remains a significant barrier.

Johnston acknowledges as much, saying the company will continue deploying smaller systems on SpaceX’s Falcon 9 if Starship timelines slip.

“If it ends up being delayed, we’ll just carry on launching the smaller versions on Falcon 9,” Johnston said. “We’re not going to be competitive on energy costs until Starship is flying frequently.”

The economics also highlight how early this market remains. While Starcloud’s ambitions include an 88,000-satellite compute constellation, the entire global installed base of advanced GPUs in orbit is still measured in the dozens. By contrast, Nvidia is estimated to have shipped nearly four million advanced GPUs to terrestrial hyperscalers in 2025 alone.

The gap is even starker in power terms.

SpaceX’s Starlink constellation, currently the world’s largest satellite network with roughly 10,000 spacecraft, is estimated to generate around 200 megawatts of power. On Earth, more than 25 gigawatts of data-center capacity are under construction in the United States alone.

This makes Starcloud less a direct competitor to terrestrial hyperscalers today and more a strategic infrastructure bet on where AI computing may go next.

Its near-term business model reflects that reality.

Rather than immediately replacing ground-based cloud services, the company is focused first on selling processing power to other spacecraft operators. One example already in use is the processing of Earth observation data from Capella Space’s radar satellites.

In the longer term, Starcloud hopes to position itself as an energy and compute infrastructure provider to hyperscalers seeking overflow or distributed AI workloads.

Competition is intensifying.

Alongside Starcloud, companies such as Aethero, Aetherflux, and Google’s Project Suncatcher are exploring adjacent orbital infrastructure models. Meanwhile, SpaceX itself has reportedly sought regulatory approval for a million-satellite distributed compute network, potentially making it the most formidable rival in the field.

That looming presence is the elephant in the room.

Still, Johnston argues the two companies are addressing different markets, with SpaceX likely prioritizing internal workloads tied to xAI’s Grok and Tesla systems, while Starcloud positions itself as an independent infrastructure player.

“They are building for a slightly different use case than us,” he told TechCrunch. “They’re mainly planning on serving Grok and Tesla workloads. It may be at some point that they offer a third party cloud service, but what I think they are unlikely to do is what we’re doing [as] an energy and infrastructure player.”

Investors are betting that if launch costs collapse and orbital compute becomes technically scalable, Starcloud could sit at the intersection of two of the decade’s biggest themes, space infrastructure and AI.