In 2018, Elon Musk attempted to acquire Cerebras Systems, but the founders—Andrew Feldman, Gary Lauterbach, Michael James, Sean Lie, and Jean-Philippe Fricker—declined the offer. At the time, some viewed this decision as a bold stand, while others saw it as a risky move. The company had already invested three years and substantial resources into developing a chip design that many in the tech industry believed was impossible to achieve.
Fast forward to last week, when Cerebras made headlines by filing for an IPO on Nasdaq under the ticker CBRS. According to CNBC, the founders' decision not to sell now looks especially wise.
In its IPO filing, Cerebras reported $510 million in revenue for 2025, a 76% jump from the previous year. The company also turned a significant corner, swinging from a $481 million net loss to an $87.9 million profit. Major financial institutions, including Morgan Stanley, Citigroup, Barclays, and UBS, are leading the underwriting, though the share price and size of the offering have yet to be disclosed.
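Those two figures pin down a third: the prior year's revenue is implied by the 2025 number and the stated 76% growth rate. A quick sanity-check sketch (figures are those reported above; the derived prior-year value is an inference, not a number from the filing):

```python
# Back out the implied prior-year revenue from the reported figures.
revenue_2025_m = 510.0  # $ millions, 2025 revenue as reported
growth = 0.76           # reported year-over-year growth

revenue_prior_m = revenue_2025_m / (1 + growth)
print(f"Implied prior-year revenue: ~${revenue_prior_m:.0f}M")  # ~ $290M
```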
Cerebras also disclosed a substantial backlog of $24.6 billion in project commitments as of December 31.
Currently, most AI computing relies on huge clusters of individual GPU chips, each roughly the size of a postage stamp, operating in power-hungry data centers. As the industry's focus shifts toward AI inference, the cost and power limits of that approach are becoming harder to ignore. Cerebras has opted for a different route.
Its Wafer-Scale Engine (WSE) is the first processor built from an entire silicon wafer. At 46,225 square millimeters, it packs a staggering 4 trillion transistors and 900,000 cores, and keeping everything on a single piece of silicon simplifies connectivity between them.
The result is memory bandwidth roughly 7,000 times that of standard GPUs, along with inference speeds that have attracted clients as diverse as Argonne National Laboratory and GlaxoSmithKline. The current CS-3 system packages the WSE with power, cooling, and networking into a single deployable unit.
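The scale gap is easiest to see with a back-of-envelope comparison. The WSE figures below are from the article; the conventional GPU die area (~800 mm², roughly a top-end data-center part) is an assumption added for illustration:

```python
# Rough scale comparison: WSE vs. a single conventional GPU die.
WSE_AREA_MM2 = 46_225                 # wafer-scale die area (from the article)
WSE_TRANSISTORS = 4_000_000_000_000   # 4 trillion (from the article)
WSE_CORES = 900_000                   # (from the article)
GPU_DIE_AREA_MM2 = 800                # assumed large data-center GPU die

area_ratio = WSE_AREA_MM2 / GPU_DIE_AREA_MM2
transistors_per_core = WSE_TRANSISTORS / WSE_CORES

print(f"WSE spans ~{area_ratio:.0f}x the silicon area of one GPU die")
print(f"Average of ~{transistors_per_core:,.0f} transistors per core")
```

In other words, a single WSE occupies the silicon area of dozens of conventional GPU dies, which is what lets it keep compute and memory on one surface instead of spreading them across a networked cluster.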
NVIDIA dominates the competition: its GPUs power the majority of global AI infrastructure, and its market value exceeded $4.5 trillion by early 2026. AMD and Broadcom are notable rivals, and more startups focused on AI inference are emerging.
Cerebras' situation is particularly interesting because AMD, a competitor, participated in its latest funding round. An investment like that from a rival can signal a desire to hedge against potential disruption, a belief in Cerebras' technology, or both.
The Series H round closed in February 2026, led by Tiger Global, valuing Cerebras at $23 billion, nearly triple the $8.1 billion valuation set by its Series G just five months earlier. Long-time backer Benchmark Capital, an investor since the initial $27 million Series A in 2016, put an additional $225 million into its stake. In total, Cerebras has raised about $2.8 billion across all funding rounds.
The IPO arrives as many AI companies are gauging public-market appetite for the infrastructure behind the AI surge, and Cerebras is the first to take the step. How the market responds could set the tone for future AI hardware offerings.
