Cerebras Systems has officially filed to go public, positioning the startup as one of the most ambitious challengers yet to Nvidia’s near-monopoly in high-performance AI hardware.
CEO Andrew Feldman has long described the company’s technology as “the fastest AI hardware for training and inference,” and the IPO filing marks the latest step in Cerebras’ push to prove that claim in the public markets.
The move comes after an earlier 2024 IPO attempt was delayed by a federal review of an investment from Abu Dhabi-based G42 and ultimately withdrawn. Since then, Cerebras has moved aggressively to strengthen its balance sheet and customer roster.
It closed a $1.1 billion Series G last year and followed that with a $1 billion Series H in February that valued the company at $23 billion, according to the Wall Street Journal. Those back-to-back mega-rounds have given it the resources to compete at the highest levels of the AI infrastructure race.
Two recent deals underscore the momentum. Cerebras reached an agreement with Amazon Web Services to deploy its chips inside Amazon data centers, giving it a foothold with one of the world’s largest cloud providers. Even more striking is its reported pact with OpenAI, said to be worth more than $10 billion.
In a recent interview with the Wall Street Journal, Feldman was characteristically direct about what that win meant. He said: “Obviously, [Nvidia] didn’t want to lose the fast inference business at OpenAI, and we took that from them.”
The financial picture in the filing shows real traction. Cerebras generated $510 million in revenue for 2025. On a GAAP basis, it reported net income of $237.8 million, though on a non-GAAP basis, excluding certain one-time items, it posted a net loss of $75.7 million. The numbers reflect the classic pattern of a high-growth hardware company: heavy investment in research, manufacturing scale-up, and customer deployments today in exchange for what it hopes will be dominant economics tomorrow.
At the heart of Cerebras’ pitch is its Wafer-Scale Engine, a single silicon wafer the size of a dinner plate that packs hundreds of thousands of AI cores. Unlike traditional systems that link dozens or hundreds of smaller GPUs together, with all the attendant latency, power, and software complexity, Cerebras’ approach keeps the entire workload on one massive chip. That design delivers the extreme speed and memory bandwidth required for the largest AI models, a niche where even Nvidia’s powerful clusters can struggle.
The IPO comes at a time when demand for AI compute remains insatiable, and the biggest players are actively hunting for alternatives that can deliver more performance per dollar or per watt. OpenAI’s decision to hand a reported $10 billion-plus contract to a startup rather than stick exclusively with Nvidia sends a powerful signal about the market’s willingness to embrace new architectures.
The AWS partnership is further evidence that Cerebras is moving beyond lab demonstrations into real production environments.
Still, uncertainties remain. Nvidia's ecosystem advantage is formidable: its CUDA software platform, its vast developer community, and years of accumulated optimization. Cerebras will need to keep proving that its wafer-scale chips are not only faster but also easier to program and more reliable at scale. Manufacturing such enormous chips at volume also carries technical and supply-chain risks, even with strong foundry partners.
The company has not yet disclosed how much it hopes to raise or the exact timing beyond a target of mid-May. But the filing itself is already a milestone. After navigating regulatory hurdles, raising more than $2 billion in the past year, and landing blue-chip customers, Cerebras is stepping onto the public stage at a moment when investors remain hungry for pure-play AI infrastructure stories.
If the offering succeeds, it could provide the capital needed to accelerate manufacturing scale, expand the software stack, and push deeper into both training and inference workloads.
For the broader AI ecosystem, a successful Cerebras IPO would be more than just another hardware listing. It would demonstrate that meaningful competition to Nvidia is not only possible but already winning major contracts from the industry's most prominent customers.



