
In my youth I didn't know how good Su Ma ("Mama Su," AMD CEO Lisa Su) was, and mistook trash for treasure.

BTCdayu
1 day ago

When I was young, I didn't know how good Su Ma was, and mistook trash for treasure.

At the time, DeepSeek had just been released and US AI stocks plummeted; AMD even fell to $80. In just a few months it has reached $300, which really leaves one speechless.

But there is nothing to be done; missing out comes down to a lack of understanding. And if you don't understand, then learn and practice.

Over the past three years, AI compute has been consumed mainly by "training": OpenAI trained GPT-4, Anthropic trained Claude, Google trained Gemini—all of that is training.

Training demand is one-off, concentrated, and peaky.

But every time you ask ChatGPT a question, every time you use Claude to write code, every time you use Midjourney to generate an image, you are consuming not training compute but inference compute.

Inference demand is continuous, distributed, and long-tailed.

Once training is complete, the model goes live. Once live, it answers hundreds of millions of user requests 24 hours a day. Three months later, the compute that was spent on training barely shows on the overall ledger; all that remains is inference.

A comparison illustrates the scale of this shift.

In 2023, inference accounted for about 20% of AI compute spending; in 2024 the ratio climbed to 50%; by 2026 it stands above 55% and is still rising.

Some more aggressive forecasts put inference at 70-80% by 2030. Note that this is not because training demand is shrinking—absolute spending on training is still rising—but because inference is growing much faster than training.

The true leader on this long inference slope is NVIDIA. NVIDIA's data center revenue for fiscal year 2026 (ending January 2026) was $194 billion; two years earlier the figure was under $50 billion. Semiconductor history has never seen growth like this. With a CUDA ecosystem of five million developers built up over twenty years, NVIDIA holds both the training and inference sides at once—this is a true monopoly.

First place is NVIDIA, second is AMD, and third place is shared by the self-developed ASICs—Google's TPU, Amazon's Trainium, Meta's MTIA. That is the seating at today's table.

What position does AMD hold at this table? The second chair. That chair matters a great deal—without a second chair, the first chair faces no bargaining pressure. But the second chair is not the first chair.

So the real question becomes two sub-questions:

First, can AMD firmly hold this second chair for ten years?

Second, how much is this chair worth once it is secured?

AMD also has a severely underestimated angle: the true story behind Meta's 170,000 MI300X units.

“AMD Research Report: Looking Back Over 10 Years, Is $300 Expensive?”

https://mp.weixin.qq.com/s/jOLAESOTfEdm3a4Xcxc-oA


Disclaimer: This article represents the author's personal views only and does not reflect the position or views of this platform. It is shared for informational purposes only and does not constitute investment advice to anyone. Any dispute between users and the author is unrelated to this platform. If any article or image on this page infringes your rights, please send proof of rights and proof of identity to support@aicoin.com, and platform staff will investigate.
