
Cerebras Systems’ monster debut on Thursday didn’t just place it among tech’s biggest-ever IPOs — it was a crystal-clear signal of unstoppable demand for chips to power AI, as tech giants scramble to find alternatives to the costly, sold-out graphics processing units made by Nvidia.
Cerebras closed its first day of trading on Wall Street with a market cap just below $100 billion, putting it within reach of the select group of companies, such as Facebook parent Meta and Alibaba, that finished their debuts above that mark. The stock traded lower on Friday, its first full day of trading.
Here’s what you need to know about this hot Nvidia competitor.
Cerebras makes a different type of chip than the classic Nvidia GPU, and it’s the size of a dinner plate.
“We build the biggest chips in the semiconductor industry,” Cerebras CEO and co-founder Andrew Feldman told CNBC’s “Squawk Box” on Thursday. “Big chips process more information in less time and deliver results more quickly.”
Until now, Nvidia has been winning the AI chip race because its GPUs serve as general-purpose workhorses, excelling at the parallel math necessary for training large models. But we’ve now arrived at the era of agentic AI, where inference is key. While training teaches the AI model to learn from patterns in large amounts of data, inference uses the AI to make decisions based on new information.
Inference can happen on less powerful chips programmed for more specific tasks, such as Cerebras’ WSE-3. It falls into a category of chips known as custom ASICs — application-specific integrated circuits. It’s an increasingly crowded space, with in-house ASICs now made by the likes of Google, Amazon, Meta and Microsoft.
Cerebras said the WSE-3 is 57 times larger than the largest GPU, and has 50 times the number of transistors.
The most advanced AI chips are made using Taiwan Semiconductor Manufacturing’s 2-nanometer process node, currently only possible in Taiwan. Cerebras’ chip is also made at TSMC, but on its less advanced 5-nanometer node.
Founded in Silicon Valley in 2016, Cerebras first filed to go public in 2024, but withdrew that submission when it faced scrutiny for its heavy reliance on a single customer, Microsoft-backed AI firm G42 in the United Arab Emirates.
With the firm’s successful IPO on Thursday, Feldman and hardware technology chief Sean Lie, two of its co-founders, became billionaires based on their holdings.
Cerebras one-week stock chart.
For years, Cerebras sought to sell its chips directly to companies, but it now largely operates the chips inside its own data centers as a cloud service, pitting it against cloud providers Google, Microsoft, Oracle and CoreWeave.
Cerebras and OpenAI announced a $20 billion cloud deal in January that expires in 2028, while Amazon Web Services announced in March that it’s using Cerebras chips in its data centers.
“For our fast inference product, there’s so much demand that our biggest challenge is actually trying to supply it. We are adding as much manufacturing and data center capacity as we possibly can, and we’re still sold out into 2027,” Cerebras CFO Bob Komin told CNBC Thursday.
While hyperscalers make their own in-house ASICs, Cerebras more closely competes with firms that specialize in making them for others. Chief among those is Groq. In its largest purchase to date, Nvidia paid $20 billion for Groq’s tech in December, then announced custom Groq Language Processing Units at GTC in March.
SambaNova and D-Matrix are two other notable Cerebras competitors looking to capitalize on unprecedented AI chip demand.
SambaNova counts Hugging Face and Meta among the customers of its SN50 chips, while Intel participated in a $350 million funding round for SambaNova in February. Intel CEO Lip-Bu Tan has served as SambaNova’s chairman since 2017.
Cerebras’ IPO also paves the way for other custom ASIC start-ups looking to go public, such as Rebellions.
The South Korean chipmaker raised $400 million from the likes of Samsung, at a valuation of $2.34 billion, in March as it prepares for an IPO.
Watch: Breaking down AI chips, from Nvidia GPUs to ASICs by Google and Amazon