Source: Letter AI
Written by: Miao Zheng
OpenClaw is arguably the hottest topic in the AI world; even a minor change to it can set every AI company on edge. Countless product managers are running creative competitions around it.
Chinese companies with sharp instincts have all sprung into action. What they see is not just the OpenClaw product itself, but the entire agent market it represents.
This market needs cloud servers, model APIs, localized products, and lower-threshold deployment solutions.
If Chinese AI wants a shortcut, it must seize this opportunity, especially now that Anthropic and Google have moved in succession to restrict it.
Tencent Cloud and Alibaba Cloud have launched one-click deployment services. They want to take this opportunity to become the "shovel sellers" in the AI circle.
Moon's Dark Side launched the cloud version of Kimi Claw, and MiniMax quickly followed up with the release of MaxClaw. The reasoning is straightforward: a localized OpenClaw still represents a significant gap in the market.
Zhipu and ByteDance, although they have not explicitly stated their position on OpenClaw, have not been idle. The success of OpenClaw has given Zhipu and ByteDance more confidence in agent products.
01 The Cloud Version of OpenClaw from Moon's Dark Side
Before OpenClaw appeared, large models were used primarily in a "conversational" mode: the user asks a question, the model answers, and each call consumes a limited number of tokens.
But OpenClaw has created an entirely new "model consumption scenario."
A well-configured OpenClaw instance can call the model hundreds or even thousands of times a day, each call carrying the full context. A single OpenClaw user can therefore consume dozens or even hundreds of times as many tokens as a traditional chat user.
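The multiple can be sketched with a back-of-envelope estimate. All figures below are illustrative assumptions, not measured data; the point is only how quickly "many calls × full context" compounds.

```python
# Illustrative back-of-envelope estimate (assumed numbers, not measurements):
# daily token use of an always-on agent vs. a typical chat user.
chat_calls_per_day = 20         # assumption: a typical chat user's questions per day
chat_tokens_per_call = 2_000    # assumption: short prompt plus reply

agent_calls_per_day = 500       # assumption: a busy OpenClaw instance
agent_tokens_per_call = 20_000  # assumption: full context re-sent on every call

chat_daily = chat_calls_per_day * chat_tokens_per_call     # 40,000 tokens/day
agent_daily = agent_calls_per_day * agent_tokens_per_call  # 10,000,000 tokens/day
ratio = agent_daily / chat_daily

print(f"agent/chat daily token ratio: {ratio:.0f}x")  # 250x under these assumptions
```

Even with conservative assumptions the ratio lands in the hundreds, which is consistent with the "dozens or even hundreds of times" claim above.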

Therefore, OpenClaw has become the application with the highest token consumption on OpenRouter. Any model that can integrate into the OpenClaw ecosystem essentially gains a demand pipeline with exponential growth.
When OpenClaw users choose an underlying model, they are not really selecting on raw performance or knowledge; they are choosing a provider that can sustain consistent, stable, high-frequency service.
The model that is durable, cheap, and easy to use is the big winner in the OpenClaw ecosystem.
Hence, Moon's Dark Side and MiniMax have reaped significant rewards amid the tidal wave of OpenClaw.
Let’s first talk about the former; their role in the OpenClaw ecosystem has evolved through two stages.
In January 2026, Kimi K2.5 became the model with the highest call volume on the OpenRouter platform due to its affordability and strong agent capabilities.
Data from OpenRouter shows that Kimi K2.5's weekly token usage grew by as much as 261% over the prior period, with its calls coming primarily from OpenClaw.
The reason lies in Kimi K2.5's ability to support up to 100 sub-agents performing parallel executions and over 1500 tool calls, capabilities that enable it to excel in agent scenarios.
As a result, starting with version 1.30, OpenClaw officially designated Kimi K2.5 as its "first officially free main model": users installing OpenClaw can select the MoonshotAI channel, and can even leave the API Key blank and keep using it, with OpenClaw subsidizing the compute.
This explosive growth directly led to commercial returns.
Driven by a surge in global paying users and API call volume, Kimi K2.5's cumulative revenue in the roughly 20 days after its release exceeded the company's total revenue for all of 2025.
Thanks to OpenClaw, Kimi's overseas paying user base has grown rapidly, and its overseas revenue has surpassed domestic revenue for the first time. According to SimilarWeb, Kimi logged 33 million visits last month, with the share of visits from China falling from 77% last year to just over 60% this year.
At this stage, Moon's Dark Side played the role of a "model supplier," passively providing API services.
However, the Kimi team soon realized that rather than passively providing APIs, they could actively lower the usage threshold for users. A month later, Kimi officially launched Kimi Claw.
This is a cloud-hosted OpenClaw service. Users do not need to deploy locally; they can directly use the complete OpenClaw functionality in their browser.
Kimi Claw eliminates all the complex setup steps of the original OpenClaw. Users with a Kimi Allegretto membership or above can create their own "cloud OpenClaw" with one click on the web; the whole process takes under a minute.
Kimi Claw is built on the Kimi K2.5 model and automatically links users' Kimi Code membership benefits. Users do not need to configure an API Key or worry about a sudden token burn producing a runaway bill.
On the functional level, Kimi Claw directly integrates over 5000 skill libraries from the ClawHub community, allowing users to activate them with one click in the web interface, including high-frequency scenarios such as weather inquiries, web searches, browser operations, and email handling.
The original OpenClaw's skills had to be searched for, installed, and configured manually via the command line, a real barrier for average users. Kimi Claw surfaces these skills in the web interface, where a single click activates them, lowering the barrier to adoption.
The core philosophy of the original OpenClaw is "local first": all dialogue memory and files are stored on the user's own device. This design protects privacy but is inconvenient; when users change devices, they must reconfigure everything, and the assistant's memory does not carry over.
Kimi Claw, however, provides 40GB of cloud storage space, allowing users to switch seamlessly between their office computers, home computers, and mobile phones, with the AI assistant's memory remaining consistent.
This is a genuinely practical feature for users who switch between multiple devices.
02 MiniMax Did the Same
On February 25, MiniMax also launched its own MaxClaw, taking a route similar to Kimi Claw but even more aggressive.
MaxClaw is based on the MiniMax M2.5 model. Although this is a large model with roughly 230 billion total parameters, only about 10 billion are activated per inference, and its API pricing is extremely low.
M2.5's performance on OpenRouter has been just as striking. It topped OpenRouter's popularity list within 12 hours of release and its call-volume chart within a week, with weekly call volume surging to 3.07 trillion tokens, more than Kimi K2.5, GLM-5, and DeepSeek V3.2 combined. Token usage exceeded 3 trillion within seven days of release and reached 4.55 trillion for the month of February, putting it in first place.
Why can M2.5 achieve such astonishing growth in a short period? The answer is also OpenClaw.
MiniMax prices M2.5 for "extreme cost-effectiveness": at a rate of 100 tokens per second, it costs just $1 to run continuously for an hour, and at 50 tokens per second, only $0.30.
This means that running an OpenClaw instance 24/7 on M2.5 can cost roughly one-tenth to one-twentieth of what Claude Sonnet would. For agent scenarios with high-frequency tool calls, that cost gap is decisive.
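The arithmetic behind those figures can be made explicit. The throughput and hourly prices below are the article's own numbers; the conversion to per-million-token cost and daily always-on cost is a straightforward derivation from them.

```python
# Convert the article's throughput + hourly prices for M2.5 into
# per-million-token and always-on daily costs (figures from the article).
SECONDS_PER_HOUR = 3600

def usd_per_million_tokens(tokens_per_sec: float, usd_per_hour: float) -> float:
    """Effective price in USD per million tokens at a given throughput."""
    tokens_per_hour = tokens_per_sec * SECONDS_PER_HOUR
    return usd_per_hour / tokens_per_hour * 1_000_000

m25_fast = usd_per_million_tokens(100, 1.0)   # ~$2.78 per million tokens
m25_slow = usd_per_million_tokens(50, 0.30)   # ~$1.67 per million tokens

# An always-on (24/7) instance at the 100 tokens/sec tier:
daily_cost = 24 * 1.0  # $24 per day

print(f"fast tier: ${m25_fast:.2f}/M tokens, slow tier: ${m25_slow:.2f}/M tokens")
print(f"always-on daily cost at fast tier: ${daily_cost:.0f}")
```

At the article's claimed 10x to 20x multiple, the same always-on workload on Claude Sonnet would imply roughly $240 to $480 per day, which is why the gap matters for continuously running agents.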
Developers in the OpenClaw community quickly recognized this.
On overseas forums, tutorials on "how to configure M2.5 in OpenClaw" proliferated, and developers even wrote guides for "migrating from Claude to M2.5." Word of mouth within the open-source community is more effective than any advertising campaign, and it is worth remembering that OpenClaw owes its rise to that same community.
MiniMax also integrated MaxClaw into its own MiniMax Agent ecosystem, synchronously upgrading with Expert 2.0, forming a complete product matrix of "Conversational AI + agent workflows."
The capital market's reaction was even more direct. On February 20, the first Hong Kong trading day of the Year of the Horse, MiniMax's share price surged 14.52% in a single day, and its market value briefly exceeded HK$304.2 billion, a new high since the company's IPO.

Since its IPO, MiniMax's share price has risen more than 480% cumulatively, including a gain of over 450% since the start of 2026, making it a core AI name on the Hong Kong exchange. JPMorgan has given MiniMax an "overweight" rating with a target price of HK$700. On March 2, MiniMax reported revenue of $79 million for fiscal 2025, up 158.9% year-on-year.
Moon's Dark Side and MiniMax have converted the traffic windfall from OpenClaw into user assets of their own.
Think about it: after a user has run Kimi K2.5 through OpenClaw for a month, they are already accustomed to the model's output style, response speed, and capability boundaries. If you then show them Kimi Claw and say, "No more maintaining your own server, no more configuring an API Key, just use it on our platform and sync across all your devices," what do you think the conversion rate will be?
Can it be low?
The cloud OpenClaw products from these two companies are essentially agent services sold as SaaS: users pay for a low-threshold product. They may not be as extensible as the original OpenClaw, but they win on price and ease of use.
In fact, the vast majority of users do not have such cutting-edge needs. They might just want the AI to help check emails, organize documents, set reminders, and query information.
To return to the earlier point: Kimi and MiniMax understand that an agent's value lies not in the quality of a single conversation but in long-term, continuous, stable task execution.
Thus, they fill a gap in the market perfectly.
03 Tencent, Alibaba, and Baidu's "Shovel-selling" Business
OpenClaw has a high threshold for average users. You need to have your own server, know how to configure the Node.js environment, learn how to apply for API Keys from various models, and understand how to set up messaging channels.
The whole process takes at least half an hour to an hour, and non-technical users usually give up at the sight of the tutorial. A product with such strong geek appeal was destined to circulate only in developer circles and never truly reach the mass market.
What BAT sees is precisely this pain point. Instead of letting users struggle, why not provide a set of ready-to-use solutions directly?
So after the explosive popularity of OpenClaw, BAT almost simultaneously introduced one-click deployment services for OpenClaw.
What these cloud service providers offer is not just a simple server. They have packaged the entire operating environment of OpenClaw, including pre-configured images, automated deployment scripts, already debugged dependencies, and even ready-to-use model API access schemes.
Users only need a few clicks to choose a configuration and pay, then wait a few minutes for a complete OpenClaw instance to come online.
Baidu Intelligent Cloud launched a one-click OpenClaw deployment service and was the first to target mobile; it has also published Baidu AI Search as a skill on the open ClawHub community. Just before the Spring Festival, the Baidu App added official one-click access to OpenClaw, letting users invoke it from the search bar or message center.
Tencent Cloud's solution is relatively concise and straightforward.
They launched the "Cloud Application" function on lightweight application servers, enabling users to finish OpenClaw deployment in three steps. The system defaults to configuring the DeepSeek API as the model provider, but users can freely switch to Kimi, MiniMax, or other domestic models in the Dashboard.
Tencent's official documentation states plainly that "OpenClaw comes from the open-source community; the cloud application is free," then quickly adds that "cloud server and API fees are billed by actual consumption."
Tencent does not profit from OpenClaw itself but from cloud server rental, traffic fees, and model API calls. It does not force users onto its own models, leaving room for choice, but at the infrastructure layer there is no way around it.
Alibaba Cloud's approach is even more "ecological."
After deploying OpenClaw on a lightweight application server, the system guides users to the Alibaba Cloud model console to create an API Key, defaulting to the Tongyi Qianwen series of models.
Alibaba Cloud has also launched a "Coding Plan," a universal subscription plan for all kinds of AI coding tools that is compatible with OpenClaw integration.
In other words, Alibaba intends to promote its own AI programming and model APIs through OpenClaw's installation service.
What Alibaba and Tencent want is to capture the "utilities" of the agent era.

The explosive success of OpenClaw proves a trend: future AI applications are not simply "chatbots," but agents that are online 24/7, can execute complex tasks, and require stable computational support.
When individual users and small to medium-sized enterprises begin deploying agents, they will need not only model APIs but also cloud servers, storage, network bandwidth, integrations with messaging platforms such as Feishu, DingTalk, and WeChat, security sandbox environments, and, not least, concrete execution tools such as AI coding.
This is why Tencent Cloud and Alibaba Cloud provide the "one-click deployment for OpenClaw" service, in order to seize the entrance to this emerging market.
Their logic is clear: today users come because of OpenClaw, but tomorrow they may come for other agent products. However, as long as users become accustomed to deploying agents on their cloud platforms, they become long-term customers.
More importantly, when every enterprise needs to deploy its own agent, those who can provide the most convenient, stable, and localized infrastructure will dominate this trillion-dollar market from the ground up.
The cloud vendors understand very well that selling shovels is often more stable and profitable than digging for gold.
Beyond that, OpenClaw is a signal: agent products will only multiply from here.
The cloud vendors are positioning themselves early, establishing user habits and building ecosystem moats. When agents become standard equipment for enterprises and individuals, whoever provides the most supporting services will have the greater say.
04 Why Zhipu and Doubao Are Not Aggressive
In the competition surrounding OpenClaw, Zhipu and ByteDance's attitude seems somewhat subtle.
However, this does not mean they have fallen behind in the agent race; on the contrary, they have chosen a more unique path.
Zhipu's attitude towards OpenClaw can be summarized as "technologically supportive but not strategically promoted."
Zhipu's GLM-5 explicitly provides an OpenClaw integration guide in its official documentation, and the GLM's Coding Plan package also supports OpenClaw configurations.
Zhipu even launched an "AutoGLM version of OpenClaw," supporting integrated configuration of OpenClaw with Feishu. From these actions, it is clear that Zhipu has not ignored OpenClaw and has provided the basic support expected from a Chinese AI vendor.
Zhipu places greater importance on AutoGLM, which is an agent capable of "Phone Use." Open-sourced in December 2025, AutoGLM can complete dozens of complex operations like placing food delivery orders and booking flights, and it supports over 50 high-frequency Chinese applications, including WeChat, Taobao, and Douyin.
The core technology of AutoGLM is a visual language model that does not rely on traditional APIs but "sees" the screen like the human eye, predicting the next action directly by understanding the semantics of UI elements.
The advantage of this approach is that whatever it can see, it can operate. AutoGLM can thus work with any application, including those without open APIs.
OpenClaw's core scenario is the desktop, and it requires configuring overseas messaging platforms that are not widely used in China. By contrast, AutoGLM executes tasks directly inside the Chinese applications people use most, without depending on overseas messaging platforms, which better fits Chinese users' habits.
Zhipu's reasoning is that since OpenClaw has proven the market demand for agents, the agent truly suited to Chinese users should be one that can operate WeChat, Taobao, and Douyin: namely, AutoGLM.
ByteDance's attitude toward OpenClaw is even more ambivalent. On the surface, only Volcano Engine's one-click OpenClaw deployment, mentioned earlier, exists.
This is because ByteDance's focus on agents is on the mobile end.
Last year, ByteDance collaborated with ZTE Nubia to launch the nubia M153 test phone, which has the "Doubao Phone Assistant Technology Preview" built-in. Its core technology is UI-TARS, a purely visually-driven GUI agent model.
Compared to OpenClaw, the Doubao Phone Assistant has several advantages.

Doubao is integrated directly at the Android system layer, so it can act from below without opening applications and without disturbing whatever the user is currently doing.
OpenClaw, by contrast, has to operate applications through browser control or API calls, which limits both its permissions and its stability.
At the same time, the Doubao Phone Assistant can perform complex cross-application operations, for example, "compare prices across three food-delivery platforms, then order the cheapest." OpenClaw's cross-application capabilities are limited: many tasks cannot span applications, and switching between them is slow.
ByteDance officials have consistently argued that agents should be built into the operating system and able to operate all applications directly, rather than being standalone, manually configured programs running on servers.
This ideological difference means ByteDance will not pour resources into OpenClaw. It keeps its distance because it is building a higher-dimensional solution.
Of course, the strategic choices made by Zhipu and ByteDance come at a cost. They missed a wave of traffic bonuses during the peak popularity of OpenClaw.
However, from a long-term perspective, which choice is more correct still needs time to verify. The agent track has only just begun, and it is too early to draw conclusions now.
Disclaimer: This article represents the author's personal views only and does not reflect the position or views of this platform. It is provided for information sharing only and does not constitute investment advice for anyone. Any dispute between users and the author is unrelated to this platform. If any article or image on this page infringes your rights, please send proof of rights and identity to support@aicoin.com, and platform staff will investigate.