Author: Nancy, PANews
In the world of cryptocurrency, tokens are certificates of digital rights, carrying various programmable rights and functions. As large language models sweep the globe, a different kind of Token, the minimal unit of text a model processes, is gaining currency: it is forming a new economic narrative and quietly becoming an invisible competition in overseas workplaces.
In this business logic, computing power equates to revenue, and the Token has ascended to become the new currency of the AI world.
Today, overseas tech companies such as Meta, OpenAI, Amazon, Google, and Microsoft are beginning to fold Token usage into their performance evaluation systems. Some have even set up internal leaderboards that visually display the number of Tokens consumed by each team or individual, and those who use too few are often labeled as unproductive or a poor cultural fit.
Currency of the AI Era: A New Token Economics
The Token, as a unit of measurement, has moved to the forefront of the global AI competition. Whether measured by product consumption or by the frequency of API calls, the Token has become the core and most intuitive value metric for AI products: higher usage means the model is called more often and exerts greater economic leverage.
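As a concrete illustration of the Token as a billing unit, the sketch below meters a single model call per Token consumed. The prices are hypothetical placeholders, not any vendor's actual rates.

```python
# Minimal sketch of per-Token metering, the billing model behind the
# "Token as currency" framing. Prices are hypothetical placeholders,
# not real vendor rates.

PRICE_PER_MILLION = {"input": 3.00, "output": 15.00}  # USD per 1M Tokens (assumed)

def inference_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost of one model call, billed per Token consumed."""
    return (input_tokens * PRICE_PER_MILLION["input"]
            + output_tokens * PRICE_PER_MILLION["output"]) / 1_000_000

# A single call with a 2,000-Token prompt and an 800-Token reply:
cost = inference_cost(2_000, 800)
print(f"${cost:.4f} per call")  # $0.0180 per call
```

Output Tokens are typically priced several times higher than input Tokens, which is why agentic workloads that generate long outputs dominate consumption.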

To encourage employees to use AI, major companies are also dispensing Token benefits lavishly. For example, Alibaba is planning to provide employees with a Token quota; Tencent offers an annual Token quota worth up to 220,000 yuan; and NVIDIA even plans to provide engineers with a Token budget equivalent to about half of their base salary.
Moreover, Tokens are being folded into compensation models. An engineering lead on OpenAI's AI coding service Codex revealed that more and more candidates now ask in interviews not just about salary but about how much inference computing they will be given. Prominent venture capitalist Tomasz Tunguz noted that tech companies have begun to count inference costs as a fourth form of engineer compensation, alongside base salary, bonus, and stock options, and suggested that at current inference prices Tokens could account for about one-fifth of total compensation. At the GTC 2026 conference, Jensen Huang stated explicitly that AI Tokens will become an important component of engineer compensation, alongside salary, bonus, and equity. Sam Altman has even envisioned "universal basic compute" eventually replacing "universal basic income."
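Tunguz's one-fifth figure can be sanity-checked with back-of-the-envelope arithmetic. All dollar amounts below are illustrative assumptions, not numbers from the article.

```python
# Illustrative sketch of the "fourth form of compensation" idea.
# All dollar amounts are hypothetical assumptions for demonstration only.

def token_comp_share(base: float, bonus: float, equity: float,
                     inference_budget: float) -> float:
    """Return the inference budget's share of total compensation."""
    total = base + bonus + equity + inference_budget
    return inference_budget / total

# Hypothetical engineer package: $200k base, $50k bonus, $150k equity,
# plus a $100k annual inference (Token) budget.
share = token_comp_share(200_000, 50_000, 150_000, 100_000)
print(f"Token share of total compensation: {share:.0%}")  # 20%, i.e. one-fifth
```

At these assumed figures the Token budget lands at exactly one-fifth of the package, matching the order of magnitude Tunguz describes.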
Some even suggest that AI companies like OpenAI and Anthropic should establish dedicated recruiting websites, listing Token budgets next to salary ranges.
Even investors are starting to make direct payments in Tokens.
Recently, Zhen Fund and Crossing jointly initiated the Token Grant program, providing selected AI startup projects with a grant of 50,000 Tokens. For AI startup teams, a Token quota often alleviates urgent financial needs more than cash on hand.
In the AI era, a new Token economic narrative is taking shape.
Boosting Value with Tokens? Silicon Valley Engineers Engage in a Volume Game
An expensive new identity game is being played.
Recently, Kevin Roose, a technology columnist for The New York Times, pointed out that Tokenmaxxing is sweeping Silicon Valley, becoming a new kind of volume game for engineers.
He shared that one OpenAI engineer processed 210 billion Tokens in the past week, roughly the text of 33 copies of the English Wikipedia and the highest figure among all employees; at Anthropic, one user of the company's AI coding tool Claude Code spent over $150,000 in a single month. In this Tokenmaxxing game, programmers are scrambling to prove their efficiency and capability.
Tokens were originally used as tools to measure productivity but seem to be gradually evolving into performances of productivity. Especially with the emergence of Agentic tools like Claude Code and OpenClaw, this computing power competition has been further amplified. Engineers can operate AI sub-agents continuously, working overnight on different tasks, thus greatly increasing Token consumption.
The proliferation of Agents has driven an explosive leap in Token consumption, and AI giants are reaping the benefits.
For example, Anthropic's annual recurring revenue (ARR) recently surpassed $19 billion, nearly tripling since the end of last year; OpenAI's official data shows that Codex's weekly active users have surpassed 2 million, with user count and usage up 3x and 5x respectively since the beginning of this year; and OpenClaw's Token consumption has skyrocketed, burning through 13.7 trillion Tokens in the past month alone.
Token budgets are essentially a way for companies to supply computing resources that push employees to deploy AI agents and raise productivity; they are an investment in employee capability. But when Tokens become a KPI or a status symbol, does spending more necessarily mean better work? Not necessarily.
Consumption ≠ Success: Beware of False Demand Packaged as AI
On folding Token billing into compensation, former venture capitalist Jamaal Glenn argues that the logic of "more Tokens, more efficiency, more money" holds only when the interests of employees and employers are fully aligned, which is not the case for most employees. Tokens may look like a benefit but can in practice be a way of repackaging salary; unlike cash or equity, they carry no value into negotiations for the next job. He suggests asking about Token budgets in interviews, much like asking about hardware or development tooling, but under no circumstances accepting them in an offer as part of compensation.
If AI is set to executing meaningless repetitive tasks or endlessly polishing already-finished projects, there is no real output, and genuine work efficiency is obscured.
Research firm Gartner has also poured cold water on the idea. It notes that Token consumption is increasingly treated by AI vendors as a signal of scale, adoption, and market leadership, but rapid growth in Token consumption does not imply long-term viability; Token counts are structurally unsuited to assessing AI success and may mislead organizational decision-makers.
Gartner points out that what truly determines long-term viability is monetization principles, sustainable profit margins, and enterprise penetration. Leaders responsible for AI should de-emphasize Token metrics and instead evaluate AI vendors on solution capabilities, decision-making empowerment, cost predictability, and quantifiable business outcomes.
Furthermore, as Tokens become a hard currency, the explosion of demand and rising costs are occurring simultaneously.
Take OpenRouter, the world's largest AI model API aggregation platform: last week, the weekly call volume of Chinese large models reached 4.69 trillion Tokens, surpassing the US for the second consecutive week.
JPMorgan projects that from 2025 to 2030, the compound annual growth rate of Token consumption in China will reach 330%, roughly a 370-fold increase over five years. IDC predicts that by 2030 there will be 2.216 billion active AI agents globally, with annual Token consumption soaring from 0.0005 Peta Tokens in 2025 (1 Peta = 10^15) to 152,000 Peta Tokens, an increase of more than 300 million times.
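The scale of the IDC projection can be checked with simple arithmetic. This is only a sketch using the Peta-Token figures quoted above.

```python
# Sanity-check of the IDC projection cited above (figures from the article).
start_peta = 0.0005   # annual Token consumption in 2025, in Peta Tokens
end_peta = 152_000    # projected annual Token consumption in 2030

growth = end_peta / start_peta
print(f"Growth factor: {growth:,.0f}x")  # Growth factor: 304,000,000x

# With 1 Peta = 1e15, the 2030 figure in raw Tokens:
tokens_2030 = end_peta * 1e15
print(f"2030 consumption: {tokens_2030:.3e} Tokens")  # 1.520e+20 Tokens
```

The ratio of 304 million is consistent with the article's "more than 300 million times" claim.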
As call volume shifts from experimentation to scaled application, cost pressure is forcing the industry to adjust prices to varying degrees. Overseas giants like Amazon and Google are raising prices, and even budget-friendly domestic large models are struggling under the surging call volume, leading Alibaba Cloud, Tencent Cloud, and Zhipu to raise prices in succession.
With usage continuing to rise, if major model vendors stop subsidizing prices, many startup projects and workflows that rely heavily on extensive Tokens will face a severe cost crisis.
This "Token volume game" will continue for a while, and once the tide recedes, it will be clear which engineers were swimming naked.
Disclaimer: This article represents only the personal views of the author and does not represent the position or views of this platform. It is for information sharing only and does not constitute investment advice to anyone. Any dispute between users and the author is unrelated to this platform. If any article or image on this page involves infringement, please send proof of rights and identity to support@aicoin.com, and staff of this platform will verify it.