
Will AI tokens become a new global commodity and currency?

Techub News

Source: Shujing Technology

Written by: Fan Wenzhong

On March 23, Liu Liehong, director of the National Data Bureau, revealed a shocking set of data at the China Development Forum: the average daily AI Token call volume in China has surged from 100 billion at the beginning of 2024 to 100 trillion by the end of 2025, and will exceed 140 trillion in March 2026, growing over a thousand times in two years. Meanwhile, data from OpenRouter, the world's largest AI model API aggregation platform, shows that the weekly call volume of China's large models has surpassed that of the United States for several weeks, with Chinese models occupying the top three positions globally. A Token-driven industrial revolution is reconstructing the global technology competition landscape, business models, and even national core competitiveness at an unprecedented speed.

At the beginning of 2026, several developments in Silicon Valley drew global attention from the tech community. OpenAI is gradually phasing out DAU (Daily Active Users), the core internet metric of the past two decades, in favor of TPD (Tokens Per Day) as its primary operational indicator. This transition is no accident. NVIDIA CEO Jensen Huang redefined data centers as "Token factories" at the GTC 2026 summit, pointing out that the core of future competition is "Tokens per Watt." This is not an isolated phenomenon; it signals that a new paradigm of the intelligent economy, with Tokens as its core unit of measurement and transaction, has fully arrived.

The Value and Measurement of AI Tokens

1. AI Tokens Become the Value Metric of the Intelligent Era

From the perspective of computer science, Tokens are the basic units through which AI models process various types of information. When a piece of text is input into the model, it is broken down into words or sub-words; an image is decomposed into pixel blocks; an audio segment is divided into time slices. These indivisible basic units can all be referred to as Tokens.

In practical applications, the measurement of Tokens follows certain rules. For English text, a short word may count as one Token, while a longer word may be divided into multiple Tokens; a simple rule of thumb is that 1 Token is approximately equal to 4 English characters. For Chinese text, typically one Chinese character corresponds to 1 to 2 Tokens. Whether in data processing during model training or operational output during model service calls, every core action of AI is measured using Tokens as the central scale. The scale of Token consumption directly reflects the workload and value output of the model, aligning with Marx's labor theory of value.
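As a minimal illustration of these counting rules, the sketch below uses OpenAI's open-source tiktoken tokenizer. Token counts vary from tokenizer to tokenizer, so the character-to-Token ratios it prints are indicative, not universal.

```python
# Minimal tokenization sketch using OpenAI's open-source tiktoken library.
# Counts vary by tokenizer; the ratios printed are indicative, not universal.
import tiktoken  # pip install tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4-era models

samples = [
    "Tokens are the basic units through which AI models process information.",
    "人工智能模型以Token为基本处理单位。",
]
for text in samples:
    n_tokens = len(enc.encode(text))
    print(f"{len(text):3d} chars -> {n_tokens:3d} tokens | {text}")
```

Running this on the English sample lands near the 4-characters-per-Token rule of thumb, while the Chinese sample yields noticeably more Tokens per character.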

The significant contribution of Tokens is that they provide a quantifiable, comparable scale of value for the development of the intelligent economy. As AI technology iterates from text modalities to multi-modalities and as application scenarios deepen into programming, video, and scientific research, Tokens are increasingly highlighted as a "unified metric." This positioning is not arbitrary, but a necessary outcome of industrial development: the Industrial Age needed "kilowatt-hours" to measure electricity consumption, the Internet Age needed "GB" to measure data flow, and the AI Age naturally requires Tokens to measure intelligent output. In economic and commercial terms, Tokens have become the core value unit that is measurable, priceable, and tradable in the intelligent era. They connect underlying resources of energy, computing power, and data with upper-level intelligent services, serving as a universal metric for measuring AI productivity, accounting for AI costs, and settling AI services.

The value chain of Tokens encompasses five major aspects: hardware manufacturing, infrastructure construction, computing power provision, platform operation, and application development. In terms of cost composition, electricity and computing power depreciation together account for 70%-80%, making them the core factors determining the international competitiveness of Tokens. "Tokens per Watt" has become the key metric for measuring the competitiveness of AI companies. This means that under a fixed electricity budget, whoever produces more Tokens with higher energy efficiency has the lowest production costs and the strongest market competitiveness.
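To make "Tokens per Watt" concrete, here is a back-of-envelope sketch; the power draw, throughput, and electricity price are hypothetical placeholders, not figures from the text.

```python
# Illustrative "Tokens per Watt" arithmetic. All three constants below are
# hypothetical placeholders for a single inference server.
POWER_DRAW_KW = 10.0        # sustained power draw (assumed)
TOKENS_PER_SECOND = 20_000  # aggregate generation throughput (assumed)
USD_PER_KWH = 0.08          # electricity price (assumed)

tokens_per_joule = TOKENS_PER_SECOND / (POWER_DRAW_KW * 1_000)  # 1 W = 1 J/s
kwh_per_mtok = (1e6 / TOKENS_PER_SECOND) * POWER_DRAW_KW / 3_600
usd_per_mtok = kwh_per_mtok * USD_PER_KWH

print(f"{tokens_per_joule:.1f} Tokens per joule")
print(f"{kwh_per_mtok:.3f} kWh, ${usd_per_mtok:.4f} of electricity per million Tokens")
```

Under these assumptions, a million Tokens costs roughly a cent of electricity, which is why small efficiency gains compound into large competitive gaps at trillion-Token scale.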

2. Factors Affecting the Measurement of AI Tokens

As application scenarios have multiplied, Token measurement has evolved from the simple counting of the early days into a complex, multi-dimensional, dynamically weighted system.

(1) Binary differentiation of inputs and outputs. The most basic measurement still adheres to a binary structure of "input Tokens" and "output Tokens." Input Tokens represent the amount of information provided by users to the model (including prompts, uploaded documents, historical dialogue records, etc.), while output Tokens are the content generated by the model. In commercial billing, due to the substantial memory bandwidth and computing cycles consumed during the generation process, the cost of output Tokens is typically 3 to 5 times that of input Tokens. This price disparity reflects the fundamental difference in computational consumption between "creative labor" and "information retrieval."
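A small billing calculator makes this asymmetry concrete; the per-million-token prices below are hypothetical, since real providers publish their own rate cards.

```python
# Sketch of the input/output billing split described above. Prices are
# hypothetical; output Tokens are priced at ~3-5x input Tokens (4x assumed).
def bill_usd(input_tokens: int, output_tokens: int,
             input_price_per_mtok: float = 1.00,
             output_price_per_mtok: float = 4.00) -> float:
    return (input_tokens * input_price_per_mtok
            + output_tokens * output_price_per_mtok) / 1e6

# A long prompt with a short answer vs. a short prompt with a long answer:
print(bill_usd(input_tokens=50_000, output_tokens=500))  # -> 0.052
print(bill_usd(input_tokens=500, output_tokens=50_000))  # -> 0.2005
```

Even though both calls move roughly the same number of Tokens, the generation-heavy call costs about four times as much.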

(2) Contextual measurement and memory costs. From 2024 to 2025, the context window of large models leapt from 8K and 32K to 128K and even 1M (one million) Tokens. By 2026, processing ultra-long contexts became the norm. However, long contexts are not a free lunch. The attention mechanism of Transformer architectures has a computational complexity that grows quadratically with sequence length (optimized variants reduce this to roughly linear). Modern measurement systems have therefore introduced "contextual weighting coefficients." When a user poses a question in a session with a million-Token context, even if only 10 Tokens of answer are generated, the system must re-scan or retrieve a vast historical memory, and this implicit consumption is accounted for as an "active context Token" cost. This makes measurement reflect more precisely the resource cost of maintaining the model's long-term memory.
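The scaling argument can be seen in a few lines; the model dimensions below are schematic, and the constant factor is a standard rough estimate rather than a measurement of any particular model.

```python
# Why long contexts are "not a free lunch": self-attention cost grows
# quadratically with sequence length. Model dimensions are schematic.
def attention_flops(seq_len: int, d_model: int = 4096, n_layers: int = 32) -> float:
    # Per layer: QK^T scores plus attention-weighted values, ~4 * L^2 * d FLOPs.
    return n_layers * 4 * seq_len**2 * d_model

for ctx in (8_192, 32_768, 131_072, 1_048_576):
    print(f"{ctx:>9} tokens -> {attention_flops(ctx):.2e} attention FLOPs")
# Going from an 8K to a 1M context multiplies attention cost by 128^2 = 16,384x,
# which is why providers weight "active context" Tokens in billing.
```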

(3) Tokenization of multi-modal data. With the maturing of large multi-modal models (LMMs), images, video, and audio are also included in the Token measurement system. A high-resolution image is no longer treated as a single file but is cut into hundreds of visual patches, each encoded as one or more visual Tokens. A one-minute video might translate into tens of thousands of temporal visual Tokens. This unified measurement breaks through modality barriers, allowing visual description, video understanding, and voice interaction to be accounted for under the same economic model. For example, generating a 10-second high-definition video may consume as many Tokens as writing a thousand-word article, a vivid reflection of the differences in information density across modalities.
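A quick patch-count estimate shows where those orders of magnitude come from; the 16x16 patch size and 1 fps sampling rate are illustrative assumptions in the style of ViT-family encoders.

```python
# Back-of-envelope visual Token counts for a ViT-style patch encoder.
# The 16x16 patch size and 1 fps sampling rate are illustrative assumptions.
def image_tokens(width: int, height: int, patch: int = 16) -> int:
    return (width // patch) * (height // patch)  # one Token per patch (assumed)

per_image = image_tokens(448, 448)        # -> 784 patches ("hundreds")
per_minute = image_tokens(448, 256) * 60  # 1 fps for 60 s -> 26,880 Tokens
print(f"{per_image} visual Tokens per image")
print(f"{per_minute} visual Tokens per minute of video")
```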

(4) The hidden consumption of reasoning Tokens. With the proliferation of AI agents, models no longer produce single responses but engage in complex autonomous planning, code execution, self-reflection, and multi-round searching. This process generates a substantial number of intermediate reasoning Tokens that are never shown to users, yet are foundational for high-quality output. New measurement standards are beginning to distinguish "surface output Tokens" from "internal reasoning Tokens." For difficult scientific calculations or complex logical reasoning, the volume of internal reasoning Tokens may be dozens of times that of the final output. Some advanced platforms have started experimenting with differentiated billing based on effective reasoning steps or the depth of chains of thought, marking a fundamental shift in measurement systems from "counting words" to "counting intelligence."

3. Development Trends of AI Tokens

In recent years, the development of AI Tokens has exhibited three core trends: exponential explosion in total volume, extreme compression of unit cost, and stratification of value.

Trend One: Explosive growth in consumption volume. Statistics indicate that global average daily Token consumption in 2024 was around 100 billion, but by the first quarter of 2026 this number had soared to 180 trillion, an increase of nearly 1,800 times. This growth is not linear but stems from a qualitative change in application paradigms. Early Token consumption came mainly from human-computer dialogues (chatbots), characterized by low-frequency, shallow interactions; the mainstream applications of 2026 are autonomous agents. A single agent executing a task will autonomously decompose objectives, call tools, write and debug code, and validate results, potentially consuming tens of thousands or even hundreds of thousands of Tokens in a single autonomous run. In the future, with the rollout of embodied intelligence, the perception and decision-making of robots will translate into massive real-time Token flows, with global average daily Token consumption expected to reach the 10^16 level by 2030.

Trend Two: Moore's Law-like decrease in unit costs. Thanks to the iterative advancements in hardware architectures (such as the mass production of NVIDIA's Blackwell and subsequent Rubin architectures), optimization of software algorithms (such as Mixture of Experts models, quantization techniques, and speculative sampling), and improved cluster scheduling efficiency, the computational cost of generating a high-quality Token in 2026 decreased by approximately two orders of magnitude compared to 2023. This "Jevons Paradox" effect is vividly manifested in the AI field: improvements in efficiency have not reduced overall resource consumption but have instead stimulated unprecedented demand. In the future, with the introduction of disruptive technologies like photonic computing and neuromorphic chips, the energy consumption per Token is expected to decrease further, theoretically making "infinite intelligence" possible.

Trend Three: Value stratification and specialization. The Token market of the future will exhibit clear "value stratification." The "standard Tokens" generated by general-purpose large models will be as cheap and homogeneous as electricity, used mainly for daily Q&A, basic translation, and simple classification; "high-tier Tokens" that have been fine-tuned for vertical fields, endowed with exclusive private data, and equipped with deep reasoning capabilities will be expensive and scarce. For example, diagnostic-suggestion Tokens generated by top medical models will hold significantly more value than the chit-chat Tokens of ordinary chatbots. This stratification will give rise to a "Token futures market" and a "quality certification system," in which users pay a premium for Tokens of a specific quality-of-service (QoS) level.

Comparison of AI Token Industries between China and the United States

1. Production and Consumption Scale: China Surpasses the US in Total Volume

The core advantages of the US in the AI field lie in two areas: chip design and model capabilities. NVIDIA, the absolute leader in the global GPU market, has seen its market value soar from about $300 billion at the end of 2022 to over $4 trillion today, a roughly 14-fold increase. Behind this growth is the US's sustained lead in advanced-process chip design. Meanwhile, closed-source models like Claude and GPT are still regarded as the most capable, maintaining premium pricing above $5 per million Tokens. This pricing strategy reflects both the technical superiority of US models and their pricing power in the high-end market.

However, the leading position of the US is facing structural challenges. On one hand, bottlenecks in the power grid are beginning to restrict the further expansion of AI computing power, while electricity costs remain high; on the other hand, the technical route of dense models results in low computing power utilization, making it difficult for unit Token production costs to decline rapidly.

In contrast, China's competitive advantage lies mainly in cost control and open-source ecosystems. Models like DeepSeek have driven prices down to $0.028 per million Tokens, merely 1/180 of GPT's price. This extreme cost-effectiveness is attracting global developers to "vote with their feet": from February 16 to 22, 2026, Token consumption of Chinese models on the OpenRouter platform reached 5.16 trillion, up 127% from three weeks prior, while US models managed only 2.7 trillion and continued to decline. Chinese models occupy four of the top five spots globally, collectively accounting for 85.7% of Top-5 consumption. The weekly call volume of Chinese models surpassed that of the US for the first time in February 2026 and has led ever since, with domestic models such as MiniMax, DeepSeek, and Kimi consistently at the forefront; China's share of global Token consumption has at times exceeded 60%.

It is important to emphasize that China's lead in Token consumption occurred mainly on the inference side rather than the training side. Inference demands less of a single card, and domestic chips combined with deep optimization fully support massive inference workloads; training still relies on a small number of high-end cards, producing excellent models through distributed architectures and Mixture-of-Experts techniques. This structural feature indicates that China has built a significant advantage in deploying AI applications and realizing their value, but still has ground to make up in the underlying support for foundational model innovation.

2. China Has Cost Advantages in Energy and Engineering

China's cost advantages stem from the synergy of multiple dimensions. Electricity is one of the most fundamental components of Token production cost, usually accounting for more than 30% of total computing cost. Because AI training and inference are energy-intensive, a country's grid stability and electricity costs (especially green electricity costs) determine its competitive edge in Token production. In terms of energy, China's Eastern Data Western Computing initiative and the construction of a unified national power grid allow the price of green electricity in the west to fall as low as 0.2 RMB per kilowatt-hour, approximately $0.028/kWh, while electricity prices in Europe and the United States generally range from $0.08 to $0.12 per kWh.
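The electricity gap can be translated into per-Token terms with one assumed figure; the energy-per-million-Token number below is a placeholder of the same order as the earlier "Tokens per Watt" sketch, not a measured value.

```python
# How the electricity price gap flows into Token cost. The energy figure is
# an assumption of the same order as the earlier sketch, not a measurement.
ENERGY_KWH_PER_MTOK = 0.14  # assumed kWh to generate one million Tokens

for region, usd_per_kwh in [("Western China green power", 0.028),
                            ("US/Europe grid average", 0.10)]:
    cost = ENERGY_KWH_PER_MTOK * usd_per_kwh
    print(f"{region}: ${cost:.4f} electricity per million Tokens")
# A ~3.5x electricity price gap feeds directly into unit Token production
# cost, and electricity is over 30% of total computing cost.
```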

The costs of chips encompass hardware purchase costs, depreciation costs, and maintenance costs. The US, leveraging NVIDIA's leading position, has advantages in high-end chip supply but also faces higher procurement costs. China's strategy relies on a small number of high-end chips in training while massively deploying domestic chips in inference, minimizing the unit computing power cost through optimization. In terms of full-stack synergy, Chinese manufacturers have deeply integrated models, cloud services, and chips, pushing computing power utilization to the limit, whereas US vendors often rely on third-party clouds and chips, leading to high adaptation costs.

Engineering efficiency is a key variable determining differences in Token costs. On the engineering technology level, Chinese manufacturers widely adopt the Mixture of Experts (MoE) architecture—splitting large models into multiple experts, activating only a few relevant experts to handle tasks. With the same $1000 worth of computing power input, Token output may vary by more than ten times depending on different technological routes. MoE architecture can yield several times more Tokens per unit computing power compared to dense models. Full-stack collaborative optimization is also crucial—when model vendors, cloud service providers, and chip designers cooperate closely, improvements in computing power utilization often exceed expectations.
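The FLOPs-saving intuition behind MoE is visible in a toy routing loop. The NumPy sketch below shows top-k expert routing only; production systems run experts in parallel on accelerators with load-balancing losses, so treat this as an illustration of the idea, not an implementation of any vendor's stack.

```python
# Minimal sketch of Mixture-of-Experts top-k routing: only a few experts run
# per token, so FLOPs per token stay low while total parameters stay large.
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ gate_w                          # (n_tokens, n_experts) router scores
    top_k = np.argsort(logits, axis=-1)[:, -k:]  # each token's k best experts
    out = np.zeros_like(x)
    for t, (token, chosen) in enumerate(zip(x, top_k)):
        scores = logits[t, chosen]
        weights = np.exp(scores) / np.exp(scores).sum()  # softmax over chosen experts
        for w, e in zip(weights, chosen):
            out[t] += w * experts[e](token)      # only k of the n experts actually run
    return out

rng = np.random.default_rng(0)
d, n_experts = 8, 16
# Each "expert" is a small feed-forward map with its own captured weights.
experts = [(lambda W: (lambda v: np.tanh(v @ W)))(rng.normal(size=(d, d)))
           for _ in range(n_experts)]
x = rng.normal(size=(4, d))               # a batch of 4 token embeddings
gate_w = rng.normal(size=(d, n_experts))  # router weights
y = moe_forward(x, gate_w, experts, k=2)
print(y.shape)  # (4, 8): same output shape, but only 2 of 16 experts ran per token
```

With 2 of 16 experts active per token, per-token compute is roughly an eighth of a dense model of the same total parameter count, which is the mechanism behind the "several times more Tokens per unit computing power" claim.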

The global AI competition has shifted from a simple "competition of model performance" to a comprehensive national strength competition centered around "Token production efficiency" and "unit Token costs." With low and stable energy supply, a large unified market, and efficient engineering implementation capabilities, China has built a significant advantage in large-scale, low-cost production of Tokens, becoming the "cost sink" and "scale factory" for global AI computing power. The United States, on the other hand, relies on technological originality, high-end ecosystems, and financial capital, occupying the high-end segment of the value chain. The essence of this competition is an all-encompassing comparison of energy pricing power, industrial organizational capabilities, and the influence of digital ecosystems. In the near future, beyond traditional industrial products and electronic devices, China may convert its domestic energy advantage into international trade advantage, creating a highly competitive new commodity—AI Tokens. In this rapidly growing field, China holds a surplus against all countries except the US, which will reshape the global economic and strategic landscape.

Will AI Tokens Become New Global Currency Assets?

1. Necessary Conditions for Monetization and the Reality Gap

To explore whether AI Tokens can become broadly circulated currencies, it is essential to clarify the nature of currency. Economists argue that for an asset to become currency, it must fulfill three core functions: a measure of value, a medium of exchange, and a store of value. Beyond this, it must also have universal acceptability, value stability, and sovereign credit backing. Against these standards, AI Tokens are unlikely to become true currencies in the foreseeable future.

Value instability is the greatest obstacle for AI Tokens as currency. Over the past two years, the price of a unit Token has fallen by more than 99%. Such extreme price volatility means no merchant is willing to accept a "currency" that could halve in value within a week. Even if prices stabilize in the future, the value of AI Tokens remains highly tied to computing power costs, which are influenced by multiple factors including chip technology iterations, energy price fluctuations, and geopolitical conflicts, making long-term stability difficult to maintain.

Insufficient acceptance is another key constraint. Currently, AI Tokens are only accepted for API calls and AI application usage, and cannot be used to purchase everyday goods and services. The essence of currency is to act as a general equivalent for various types of goods; however, the network of AI Tokens is currently limited to the AI service domain. For AI Tokens to gain universal acceptance, a global trading network of goods and services must be built, which requires massive infrastructure investment and long-term market cultivation.

Instead of becoming currency, AI Tokens are more likely to evolve into a new class of commodity, akin to traditional commodities such as oil, gold, and copper. This judgment is based on several observations:

First, AI Tokens possess the core characteristics of commodities. Commodities are typically standardized, tradable, and in broad demand, and AI Tokens meet all three criteria. Jensen Huang put it explicitly: "The data centers of the future will transform into factories running around the clock, producing not traditional products but the most core and valuable commodities of the future digital world: Tokens." Just as the industrial era needed oil as fuel, the intelligent era needs Tokens as "intelligent fuel."

Second, the pricing mechanism of Tokens is converging with that of commodities. The API pricing of AI models already exhibits clear market characteristics: prices rise when supply is tight and fall when demand weakens. This price-formation mechanism closely resembles that of traditional commodity markets. As Token trading becomes more scaled and standardized, Token derivatives markets similar to crude oil and gold futures may emerge, providing producers, consumers, and investors with risk-management tools.

Third, the supply and demand structure of Tokens bears the typical characteristics of commodities. The supply side is constrained by hard limits such as chip production capacity and electricity supply, resulting in long expansion cycles and low adjustment elasticity; the demand side grows rapidly with the spread of AI applications, showing clearly pro-cyclical traits. This supply-demand structure dictates that Token prices will fluctuate cyclically rather than decline linearly. In fact, the Token price surge at the beginning of 2026 proved this point: although Token prices trend downward in the long run, short-term supply-demand imbalances can still trigger price spikes.

Fourth, Tokens are becoming a potential option for national strategic reserves. As AI capabilities penetrate critical fields such as national defense, finance, and energy, computing power security has risen to the level of national security. Some countries may begin to strategically reserve computing power resources, and since Tokens are the measurement unit of computing power, they naturally become the scale for measuring computing power reserves. This trend may give rise to a "computing power-based monetary system"—a new reserve system anchored to computing power.

2. AI Token Stablecoins Offer New Solutions

Since AI Tokens themselves are unlikely to become currency, a noteworthy trend is that stablecoins are becoming an innovative monetary form in the AI Agent economy. When AI Agents need to make autonomous decisions and transactions, the traditional financial system reveals a series of mismatches: banks do not open accounts for AIs, credit cards were not designed for algorithms, and credit systems are human-centric. To an AI, money is not wealth but an interface; not a store of value but a path for executing logic. In this context, stablecoins on blockchains show unique advantages: permissionless global transactions, instant settlement, and low-cost collaboration, which fit the economic needs of AI Agents precisely.

Data indicate that stablecoin use in the AI Agent economy is growing rapidly. By March 2026, the global x402 ecosystem had exceeded 163 million transactions with total trading volume surpassing $45 million, with more than 435,000 buyer AI Agents and more than 90,000 seller AI Agents. USDC holds an absolutely dominant position at the x402 protocol's trading layer, accounting for 98.6% of trading volume on EVM chains and 99.7% on Solana.
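The mechanics are simpler than they sound: x402 builds on the long-dormant HTTP 402 "Payment Required" status code. The sketch below shows that round trip schematically; the header name, JSON field names, and the wallet.pay() interface are illustrative assumptions, not the exact x402 specification.

```python
# Schematic HTTP 402 "payment required" round trip of the kind the x402
# protocol standardizes for AI Agents. Header and field names, and the
# wallet.pay() helper, are illustrative assumptions, not the exact spec.
import base64
import json

import requests  # pip install requests

def fetch_with_payment(url: str, wallet) -> requests.Response:
    resp = requests.get(url)
    if resp.status_code != 402:
        return resp                    # resource was free, nothing to pay
    terms = resp.json()                # server's payment terms (assumed JSON)
    receipt = wallet.pay(              # hypothetical stablecoin transfer,
        to=terms["payTo"],             # e.g. USDC on an EVM chain or Solana
        amount=terms["maxAmountRequired"],
        asset=terms["asset"],
    )
    payment_header = base64.b64encode(json.dumps(receipt).encode()).decode()
    return requests.get(url, headers={"X-PAYMENT": payment_header})
```

The point of the design is that an agent needs no bank account or card network, only a funded wallet and the ability to retry a request with proof of payment attached.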

3. Three Possible Evolution Paths for the Future

Based on the above analysis, the future evolution of AI Tokens may present three paths:

Path One: Maintain the measurement unit positioning, without becoming an independent asset. In this scenario, AI Tokens will always serve as the pricing unit for AI services but will not possess independent asset attributes. Users purchase AI capabilities rather than the Tokens themselves; Tokens are merely billing means, not investment targets. This is the most conservative prediction and represents the current status.

Path Two: Evolve into a commodity, forming a computing power futures market. As the scale of Token trading expands and standardization improves, Tokens could become tradable commodities like oil or copper. Exchanges might launch Token futures, options, and other derivatives, providing market participants with price discovery and risk-management tools. On this path, Token price fluctuations will become more intense, but also more financialized.

Path Three: Serve as a measurement benchmark for a computing power-based monetary system, supporting a new currency system. This is the most revolutionary path: computing power becomes the value anchor for currency, similar to the role of gold in the gold standard. In this system, digital currencies issued by sovereign states (CBDCs) would use computing power as their value benchmark, with a unit of currency corresponding to a standardized number of Tokens. This path faces tremendous technical and institutional challenges, but if realized, it will completely reshape the global monetary system.

Strategies for the AI Token Era

1. At the national level, strengthen computing power sovereignty and strategic infrastructure construction

In the face of the rise of the Token economy, nations need to incorporate computing power resources into strategic infrastructure planning and proactively consider governance issues pertaining to the Token economy. Specifically, efforts can focus on the following dimensions:

Build a computing power infrastructure system. Drawing on the successful experience of the "Eastern Data Western Computing" initiative, plan a nationwide computing power network that allocates computing resources efficiently. This includes: deploying large intelligent computing centers in energy-rich western regions to take advantage of green electricity and lower computing costs; constructing edge computing nodes in demand-dense eastern regions to ensure low-latency service; and creating a unified national computing power scheduling platform for on-demand distribution and flexible scheduling of computing resources.

Promote the unification of Token measurement standards. The divergent Token measurement methodologies currently employed across platforms create difficulties for developers choosing among services and for enterprises accounting for costs, restricting the large-scale development of the Token economy. At the national level, industry associations and leading enterprises can be guided to jointly formulate Token measurement standards, clarifying conversion rules across modalities (text, image, audio) and establishing a transparent, fair Token cost accounting mechanism. This would benefit the efficient operation of the domestic market and enhance China's voice in the global Token economy.

Improve the governance framework of the Token economy. The rapid development of the Token economy brings a series of new governance challenges: How should the legal status of Tokens be defined (service measurement units, digital assets, or securities)? How should cross-border Token transactions be regulated? How can financial risks from Token price fluctuations be prevented? How should user rights protection be balanced against encouraging innovation? Answering these questions requires close collaboration among policymakers, technical experts, industry representatives, and academia to construct a governance system suited to the intelligent economy.

Strengthen participation in international rule-making. The global AI governance rules are currently being formed, and China should actively participate in the formulation of international Token economic rules. This includes promoting international standards for Token measurements under multilateral frameworks, incorporating computing power cooperation clauses into bilateral trade agreements, and proposing tax solutions for Token transactions that align with developing countries' interests in digital tax negotiations. Mastering the rule-making authority will enable China to take the initiative in the future global Token economic landscape.

2. At the enterprise level, reconstruct Token efficiency thinking and business models

For enterprises, Token strategy is no longer a simple technical tactic but a top-level design affecting core competitiveness and business value. In the face of the Token economy wave, enterprises need to reevaluate their strategies from the following dimensions:

Establish Token efficiency thinking. When selecting AI technologies, enterprises should make Token efficiency a core assessment criterion, paying attention to how computing resources match Token consumption. Every step, from prompt design and model invocation strategy to result optimization, needs to weigh both efficiency and cost. Precise prompt design reduces wasted Token consumption, and sensible model invocation strategies raise computing power utilization; these details directly shape the final cost of an enterprise's AI investment. Borrowing the "goodput" concept from communications engineering, enterprises should focus on how many Tokens actually drove user goal attainment, rather than simply pursuing Token throughput. The core of this mindset shift is to move from "how much computing power was used" to "how much value was created."
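One way to operationalize this is a goodput-style ratio of useful Tokens to billed Tokens. The accounting scheme and field names in the sketch below are assumptions for illustration; any real deployment would define "useful" against its own task logs.

```python
# Sketch of a "goodput"-style Token efficiency metric: the share of billed
# Tokens that actually contributed to a completed user goal. The accounting
# scheme and field names are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class AgentRun:
    tokens_billed: int   # all input, output, and retried Tokens paid for
    tokens_useful: int   # Tokens in steps that survived into the final answer
    goal_met: bool

def token_goodput(runs: list[AgentRun]) -> float:
    useful = sum(r.tokens_useful for r in runs if r.goal_met)
    billed = sum(r.tokens_billed for r in runs)
    return useful / billed if billed else 0.0

runs = [AgentRun(120_000, 45_000, True),  # heavy retries, but the goal was met
        AgentRun(30_000, 0, False)]       # failed run: all Tokens were waste
print(f"Token goodput: {token_goodput(runs):.0%}")  # -> 30%
```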

Redefine business models and pricing strategies. The large model industry is undergoing a transformation from "traffic subsidies" to "value selection." Early low-price strategies attracted many trial-and-error users, leading to inefficient occupation of computing resources; some vendors report that 40% of their free-quota consumption comes from test calls with no real business scenario behind them. By moderately raising prices, enterprises can filter out non-essential demand while ensuring service stability for quality clients. This refined "price over volume" operation signals that the industry is shifting from internet-style scale expansion to software-style value pricing.

Develop new talent standards and incentive mechanisms. Jensen Huang proposed a forward-looking idea at the GTC 2026 conference: granting engineers a Token budget worth half their annual salary as a talent incentive. He even stated plainly, "If you hire a software engineer with a $500,000 annual salary but they haven't consumed at least $250,000 in Tokens, I would be deeply concerned."

3. At the individual level, cultivate Token literacy and new capabilities for human-computer collaboration

For individuals, the rise of the Token economy presents both challenges and opportunities. Faced with this profound productivity transformation, individuals need to construct new capacity structures from the following dimensions:

Establish Token literacy. Most users lack sufficient understanding of Token consumption, model capabilities, and pricing mechanisms, leading to various issues when using AI—some use agents to buy and sell stocks, only to find their accounts wiped clean overnight; others post instructions on social media asking all AI agents to hand over API interfaces, resulting in multiple agents being "duped." These cases warn us that Token literacy has become an essential ability in the digital age.

Construct new working methods for human-computer collaboration. Jensen Huang predicts that computers will operate around the clock and continuously generate Tokens, as AI agents tirelessly perform tasks. This means individuals' work methods need to shift from "doing it themselves" to "directing AI to do it," transitioning from "executor" to "supervisor."

Embrace lifelong learning and skill iteration. The rapid development of the Token economy means the half-life of skills is shortening: today's mainstream models will quickly be surpassed by more efficient architectures and new optimization techniques. In such an environment, the ability to keep learning and adapting matters more than mastery of any specific skill. Individuals should therefore cultivate a habit of continuous learning, keeping up with the latest developments in AI technology and the Token economy; proactively try new tools and methods to accumulate hands-on experience; and build interdisciplinary knowledge structures to understand the economic logic and social impact behind the technology. Only then can one stay on firm footing as the tide of the Token economy rises.
