Original | Odaily Planet Daily (@OdailyChina)
Author | Wenser (@wenser2010)
Last night, Cerebras (CBRS), dubbed "the next NVIDIA," officially began trading. Shortly after the open, its stock surged from the $185 issue price to $350 and touched an intraday high of $385, a gain of more than 108%. Although it has since pulled back to around $311, it is still up more than 68%. Cerebras CEO Andrew Feldman had earlier told CNBC: "Our chips are the size of dinner plates and 20 times faster than NVIDIA's chips."
What gives this chipmaker, which has raised $5.5 billion, the confidence to claim its chips are "faster than NVIDIA's"? How did it win a $20 billion order from OpenAI amid fierce competition? And will its stock keep climbing in the short term? Odaily Planet Daily offers its answers to these questions below.

Cerebras' Confidence Against NVIDIA: Opening a New AI World with Wafer-Scale Chips
As the gap between AI computing demand and supply widens, strong market demand has propelled NVIDIA to become the world's most valuable publicly listed company.
Recently, NVIDIA's stock hit new highs, with its market capitalization briefly breaking $5.5 trillion. That figure would rank third behind only the GDPs of the United States and China, and far exceeds the GDPs of major economies such as Germany and Japan, making NVIDIA truly "wealthy enough to rival nations."

Unlike a long-established giant such as NVIDIA, however, Cerebras (CBRS) is a rising star in chip manufacturing.
In 2016, chip-industry veterans Andrew Feldman, Gary Lauterbach, Sean Lie, Michael James, and JP Fricker co-founded Cerebras Systems, headquartered in Sunnyvale, California. Where NVIDIA focuses on general-purpose GPUs that serve the broadest possible market, Cerebras' core innovation is the Wafer Scale Engine (WSE), currently the world's largest AI chip.

Cerebras founding team (2022)
Their core products include:
- WSE-3: approximately 46,225 mm² in area (roughly the size of a dinner plate), containing 4 trillion transistors and 900,000 AI-optimized cores, delivering 125 petaflops of compute. Unlike traditional multi-GPU setups, it turns an entire wafer into a single giant processor, avoiding the multi-GPU interconnect bottleneck, with up to 44GB of on-chip SRAM and extremely high memory bandwidth.
- CS-3 system: an AI supercomputer built around the WSE-3 that supports both training and inference. Cerebras sells not only chips but also cloud services (Cerebras Inference), dedicated data centers, and on-premises deployment support.
In terms of business model, Cerebras primarily provides ultra-low-latency inference to clients such as OpenAI, Meta, Perplexity, Mistral, GSK, and Mayo Clinic. Cerebras is expected to post $510 million in revenue in 2025 (up 76% year over year), has turned profitable, and is backed by substantial orders, including a long-term contract with OpenAI for hundreds of megawatts of computing.

Cerebras WSE-3 chip illustration
On its IPO day, May 14, Cerebras CEO Andrew Feldman also addressed the company's operations, technical moat, and market outlook in an interview on CNBC's "Squawk Box":
- First, Feldman called the IPO "the right way to fund our growth," emphasizing that the company has matured and that the public market can support its enormous growth opportunity. He described the listing as the result of a decade of effort, expressed great pride, and said the market "understands our story and responds positively."
- Second, he repeatedly stressed that Cerebras is the only company in 70 years to have successfully manufactured "giant chips," with every other attempt having failed, so "the technical moat is wide and deep." He said Cerebras' chips are 58 times larger than competitors' chips, such as NVIDIA's, and run 15-20 times faster, significantly accelerating both AI inference and training.
- Lastly, responding to concerns about the sustainability of AI spending, Feldman said demand is "huge and continuously growing." The company's chips enable transformative AI experiences (faster response times, real-time agents, and more). He also cited important collaborations with OpenAI and AWS and expressed optimism about the overall environment for AI hardware.
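Feldman's "58 times larger" figure is easy to sanity-check against the WSE-3's reported die area. A minimal sketch, assuming a flagship GPU die of roughly 800 mm² (approximately the class of NVIDIA's H100; the exact figure is an assumption, not from the article):

```python
# Sanity check of the "58x larger" claim using the WSE-3 die area
# reported above and an assumed flagship-GPU die area.
WSE3_AREA_MM2 = 46_225   # WSE-3 die area, per Cerebras
GPU_AREA_MM2 = 800       # assumed flagship GPU die area (~H100 class)

ratio = WSE3_AREA_MM2 / GPU_AREA_MM2
print(f"WSE-3 is roughly {ratio:.0f}x larger than a flagship GPU die")  # → roughly 58x
```

The arithmetic lines up: an entire 300mm wafer's usable area divided by a conventional reticle-limit die lands near the 58× figure Feldman cites.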

On a side note, echoing Elon Musk's earlier bet with Anthropic on "space data centers" (recommended reading: "Musk and Anthropic, going to space to find power"), Feldman boldly predicted that "within 15 years, data centers in space are likely to become a reality," signaling his confidence in long-term AI infrastructure build-out and rapid expansion.
Thus, as a "speed geek" of the AI chip field, Cerebras has broken through by focusing on extreme performance for large-scale models, emerging as a serious challenger to NVIDIA in large-model inference and ultra-large-scale training.
OpenAI's $20 billion order gives that trajectory ample backing, and the cooperation between the two companies goes far beyond a simple "chip maker" and "chip buyer" relationship.
The Complex Relationship Between Cerebras and OpenAI: Customer, Creditor, and Potential Major Shareholder
The relationship between Cerebras and OpenAI goes back a long way. Beyond company-level cooperation, OpenAI founders Sam Altman and Greg Brockman were early angel investors in Cerebras and hold small stakes, which may be an important reason for the deep ties between the two today.
In December 2025, OpenAI extended Cerebras a $1 billion working-capital loan, establishing a creditor relationship between the two.
In January this year, Cerebras and OpenAI officially unveiled a 750MW inference-computing procurement agreement, later noting that the collaboration includes an option to expand to 2GW; this was reconfirmed in April. According to media reports, OpenAI plans to spend more than $20 billion over the next three years on servers powered by Cerebras chips and will acquire equity in the company as part of the deal, making OpenAI Cerebras' largest customer, bar none.

Image source: @Xingpt
Cerebras' subsequent S-1 registration statement and IPO filings indicate that OpenAI is expected to receive approximately 33.44 million Cerebras warrants at an extremely low strike price of $0.00001 per share, with some warrants subject to vesting conditions, including milestones tied to computing-capacity delivery dates and Cerebras' market value exceeding $40 billion.
If all warrants are exercised and the conditions are met, OpenAI could hold approximately 10%-11% of the equity (the exact ratio depends on the post-IPO share count). At the roughly $56 billion IPO valuation, that stake would be worth about $5-6 billion; at the current market value (approaching $95 billion after the first-day close), it would be worth around $10.3 billion. Even though the warrants have not yet been fully exercised, OpenAI can already fairly be called a "potential major shareholder" of Cerebras.
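The arithmetic behind those figures can be reproduced in a few lines. This is a back-of-envelope sketch using the reported numbers; the 10.8% midpoint used for the first-day-close figure is an illustrative assumption, and the actual ownership ratio depends on the final post-IPO share count:

```python
# Illustrative valuation of OpenAI's Cerebras warrant package,
# based on the figures reported from the S-1 coverage above.
WARRANTS = 33_440_000    # warrants reportedly granted to OpenAI
STRIKE = 0.00001         # strike price per share, USD

def stake_value(valuation: float, ownership: float) -> float:
    """Stake value net of the (negligible) total exercise cost."""
    return valuation * ownership - WARRANTS * STRIKE

ipo_low  = stake_value(56e9, 0.10)    # lower bound at ~$56B IPO valuation
ipo_high = stake_value(56e9, 0.11)    # upper bound at ~$56B IPO valuation
current  = stake_value(95e9, 0.108)   # assumed ~10.8% at ~$95B first-day close

print(f"At IPO pricing: ${ipo_low/1e9:.1f}B to ${ipo_high/1e9:.1f}B")   # ≈ $5.6B to $6.2B
print(f"At first-day close: ${current/1e9:.1f}B")                       # ≈ $10.3B
```

Note that the total exercise cost (33.44 million shares at $0.00001 each) is only about $334, which is why the warrants are effectively a free equity grant.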

Image source: @Xingpt
Whether Cerebras Can Become the Next NVIDIA is Still Unknown, but the Stock Price May Continue to Rise in the Short Term
Returning to the third question posed at the beginning: can Cerebras become the next NVIDIA?
From an industry-landscape perspective, the answer is almost certainly no, for four main reasons:
- First, the ecosystem gap is immense. As the absolute leader in chip manufacturing, NVIDIA has in CUDA the undisputed industry-standard software stack, with countless developers, frameworks, and toolchains depending on it. Cerebras has its own software stack, but it is far from CUDA's maturity and compatibility, making switching costs extremely high for many developers and enterprises.
- Second, their scale and development paths differ. NVIDIA's 2025 revenue is expected to reach hundreds of billions of dollars, with GPUs covering training, inference, graphics, automotive, data centers, and more; Jensen Huang even claimed at CES 2026 that "the AI chip and infrastructure market may reach $1 trillion by 2027," with NVIDIA holding the largest share. By contrast, Cerebras' 2025 revenue is expected to be only $510 million, and its customer base is concentrated around a single giant, OpenAI, leaving it less resilient to risk.
- Third, their manufacturing economics differ. An ultra-large AI chip brings not only faster speeds but also greater manufacturing complexity and cost. Each of Cerebras' wafer-scale chips consumes an entire wafer, with limited output, significant yield challenges, and high unit prices (a CS-3 system costs far more than a single GPU), whereas NVIDIA gets dozens of GPUs from a single wafer, achieving stronger economies of scale and higher returns.
- Fourth, they face different competitive pressures. Unlike NVIDIA in its entrenched position, Cerebras faces direct competition from players such as Groq, AMD, Google TPU, and AWS Trainium. Despite its current momentum, and constrained by time, capital, and resources, its positioning today is more "high-end niche player" than "market leader."
Based on the above, Cerebras will not grow into an industry giant like NVIDIA in the near term, nor can it upend the existing competitive order. In share-price terms, however, its per-share price has already surpassed NVIDIA's, and thanks to the AI boom and the widening computing-power gap, Cerebras' stock and market value may still have room to rise this year, before OpenAI and Anthropic go public.
Over the next 2-3 years, if Cerebras converts its orders from OpenAI, AWS, and others into actual revenue on schedule, its stock may reach new highs; but if order fulfillment falls short of market expectations, or demand for AI model inference shifts, the stock could come under significant pressure.
In summary, within 1-3 years Cerebras cannot replace NVIDIA, but it can carve out a share of the AI infrastructure niche and become the "speed king of AI chips." The longer-term competitive landscape will take time to verify.
Recommended Reading
A Decade Betting on Cerebras: How the “Wafer-Scale AI Chip” Made It to NASDAQ
Disclaimer: This article represents only the author's personal views and does not reflect the position or views of this platform. It is shared for informational purposes only and does not constitute investment advice of any kind. Any dispute between users and the author is unrelated to this platform. If any article or image on this page infringes your rights, please email the relevant proof of rights and identity to support@aicoin.com, and platform staff will investigate.