Cerebras is seeking a valuation of up to $26.62 billion in its revived U.S. initial public offering, positioning the chipmaker at the center of Wall Street’s accelerating bet that the artificial intelligence infrastructure boom is still in its early stages.
The Sunnyvale, California-based company plans to sell 28 million shares priced between $115 and $125 each, aiming to raise as much as $3.5 billion in what could become one of the year’s most closely watched technology listings.
The offering marks Cerebras’ second attempt to go public after it withdrew an earlier IPO filing last October amid market volatility and investor caution surrounding high-growth technology stocks.
This time, however, the backdrop is dramatically different. Global spending on AI infrastructure has exploded as hyperscalers, governments, and enterprises race to secure the computing power required to train and deploy increasingly sophisticated AI models. That frenzy has transformed advanced semiconductors into one of the world’s most strategically important technologies, pushing investors back toward AI-linked offerings even as concerns persist over valuations and sustainability.
Cerebras is attempting to position itself as one of the few credible challengers to Nvidia, whose dominance in AI chips has made it one of the most valuable companies in history.
Unlike traditional chipmakers, Cerebras has built its identity around its Wafer-Scale Engine processors, enormous chips designed to deliver extreme computational performance for AI training and inference workloads. Rather than splitting computing tasks across many smaller processors, the company's architecture concentrates massive compute and memory resources onto a single piece of silicon.
“By bringing massive compute and memory onto a single piece of silicon and integrating it into a purpose-built system and software stack, we deliver exceptional AI speed for customers on premises and via the cloud,” Cerebras said in its filing.
The company is betting that AI customers increasingly want alternatives to Nvidia’s ecosystem as demand for compute power intensifies and infrastructure costs soar. That strategy is gaining traction because the AI boom has exposed vulnerabilities in the global semiconductor supply chain. Access to advanced AI chips has become a critical bottleneck for technology firms, cloud providers, and governments seeking to compete in generative AI.
“Nvidia remains dominant as the market leader for AI inference as well as training infrastructure, however, Cerebras is pitching the idea that there is room for specialist chip companies if they can offer clear speed or cost advantages,” said IPOX Research Associate Lukas Muehlbauer.
Cerebras’ public debut also reflects a broader shift in investor psychology. Earlier in the AI cycle, much of the market’s enthusiasm centered on applications such as chatbots, copilots, and generative content. Increasingly, however, investors are moving deeper into the foundational layers of the AI economy, including chips, networking equipment, data centers, and energy infrastructure.
That transition has helped fuel enormous capital expenditure commitments across the technology sector. Major cloud providers, including Alphabet, Microsoft, Amazon, and Meta, are collectively expected to spend more than $700 billion this year on AI infrastructure, according to analyst estimates.
Cerebras has sought to capitalize on that spending wave through aggressive partnerships and fundraising. Earlier this year, the company raised $1 billion in a late-stage funding round led by Tiger Global, valuing the business at $23 billion. Investors in the round included AMD, Benchmark, Fidelity Management, and Coatue.
The company also secured a major agreement with OpenAI valued at more than $20 billion, under which the ChatGPT maker agreed to deploy 750 megawatts of Cerebras’ AI compute infrastructure over multiple years. That deal is particularly significant because it signals growing demand for alternatives to Nvidia’s hardware stack among leading AI developers. OpenAI, Anthropic, and other frontier AI companies are increasingly searching for ways to diversify supply chains and reduce dependence on a single chip vendor as global competition for compute intensifies.
The IPO also arrives during a notable rebound in the broader U.S. listings market. Investor sentiment has improved sharply in recent months as equity markets hover near record highs and fears surrounding the Iran war and energy disruptions have eased somewhat. Bankers say AI-focused listings are now viewed as some of the most attractive opportunities in the market because the sector remains tied to long-term structural demand rather than short-term consumer spending cycles.
“Cerebras is an important signal deal for the IPO market as a test of whether public investors are ready to fund high-growth AI infrastructure companies after a softer start to the year,” Muehlbauer said.
“There is also a race to get deals done before SpaceX. The SpaceX IPO will be so large and high-profile that there are concerns it could absorb a lot of investor attention and capital,” he added.
SpaceX reportedly filed to go public last month, setting up what could become one of the largest and most consequential IPO periods in years.
Cerebras is entering public markets at a time when investors are beginning to ask tougher questions about whether AI infrastructure spending can continue at its current pace indefinitely. While demand for compute remains enormous, some analysts warn that valuations across the AI ecosystem increasingly assume years of uninterrupted growth.
That scrutiny has intensified after reports that some AI firms are struggling to meet internal revenue expectations even as they commit tens of billions of dollars to data centers and long-term compute contracts.
Cerebras’ financials, however, are likely to strengthen its pitch to investors. The company reported revenue of $510 million for the year ended December 31, up from $290.3 million a year earlier. It also posted earnings of $1.38 per share, reversing a loss of $9.90 per share the previous year.
The turnaround is notable because many AI infrastructure startups remain deeply unprofitable despite surging valuations. Cerebras is attempting to distinguish itself not only as a technology innovator but also as a company capable of generating real commercial returns from the AI boom.
Still, the company faces formidable challenges. Nvidia’s dominance extends beyond chips into software, developer ecosystems, and customer relationships, areas that are notoriously difficult to disrupt. Large cloud providers are also increasingly developing their own in-house AI chips, intensifying competitive pressure across the industry.