AI welcomes the "USB-C moment": how can MCP integrate with Ethereum?


Content | Bruce

Editing & Typesetting | Huanhuan

Design | Daisy

The "USB-C moment" in the evolution of AI, in November 2024, the MCP protocol released by Anthropic is causing a seismic shift in Silicon Valley. This open standard, dubbed the "USB-C of the AI world," not only reconstructs the way large models connect with the physical world but also conceals the key to breaking the AI monopoly dilemma and reconstructing the production relations of digital civilization. While we are still debating the parameter scale of GPT-5, MCP has quietly paved the decentralized path to the AGI era…

Bruce: Recently, I have been studying the Model Context Protocol (MCP). It is the second thing in the AI field to truly excite me, after ChatGPT, because it holds promise for solving three questions I have pondered for years:

  • How can ordinary people, not just scientists and geniuses, participate in the AI industry and earn income?
  • What win-win combinations exist between AI and Ethereum?
  • How can AI be decentralized, avoiding the monopoly and censorship of large centralized companies and preventing AGI from destroying humanity?

01. What is MCP?

MCP is an open standard framework that simplifies the integration of LLMs with external data sources and tools. If we compare an LLM to the Windows operating system, and applications like Cursor to the keyboard and hardware, then MCP is the USB interface, allowing external data and tools to be plugged in flexibly so that users can read and use these external resources.

MCP provides three capabilities to extend LLMs:

  • Resources (knowledge extension)
  • Tools (executing functions, calling external systems)
  • Prompts (pre-written prompt templates)

MCP Servers can be developed and hosted by anyone, offered as services, and taken offline at any time.
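To make the three capability types concrete, here is a minimal sketch of an MCP server built with the official `mcp` Python SDK's FastMCP helper. The server name, resource URI, and example data are illustrative placeholders, not anything defined by the protocol:

```python
# Minimal MCP server sketch exposing one resource, one tool, and one prompt.
# Uses the official `mcp` Python SDK (FastMCP); names and data are made up.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo")

# Resource: knowledge extension
@mcp.resource("notes://greeting")
def greeting() -> str:
    """A static piece of knowledge the LLM can read."""
    return "Hello from an external knowledge source."

# Tool: an executable function that can call external systems
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers (a stand-in for any external action)."""
    return a + b

# Prompt: a pre-written template
@mcp.prompt()
def summarize(text: str) -> str:
    """Template asking the model to summarize the given text."""
    return f"Please summarize the following text:\n\n{text}"

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```

A server like this can be registered with any MCP-capable client (Cursor, Claude Desktop, etc.) and, as noted above, taken offline at any time.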

02. Why is MCP needed?

Currently, LLMs perform extensive computations using as much data as possible to generate a large number of parameters, embedding knowledge into the model to produce conversational outputs. However, there are several significant issues:

  1. Training on vast amounts of data demands considerable time and hardware, and the knowledge baked in is often already outdated by release.
  2. Models with huge parameter counts are difficult to deploy and run on local devices, yet in reality most users do not need all of that knowledge most of the time.
  3. Some models read external information via web scraping to stay current, but because of scraping limitations and the quality of external data, they may produce even more misleading content.
  4. Because AI has not meaningfully benefited creators, many websites and creators are adopting anti-AI measures and generating large volumes of junk content, which will gradually degrade LLM quality.
  5. LLMs struggle to extend to external functions and operations, such as accurately calling the GitHub API to perform actions; they can generate code from potentially outdated documentation but cannot guarantee precise execution.

03. The architectural evolution of fat LLMs and thin LLMs + MCP

We can view the current ultra-large models as fat LLMs, whose architecture can be represented in the following simple diagram:

[Diagram: fat LLM architecture]

After the user inputs information, the Perception & Reasoning layer decomposes and reasons over it, then the massive parameter set is invoked to generate a result.

With MCP, an LLM can focus on language parsing itself, stripping away knowledge and capabilities to become a thin LLM:

[Diagram: thin LLM + MCP architecture]

Under the thin LLM architecture, the Perception & Reasoning layer focuses on parsing the full range of human physical-environment information into tokens, including but not limited to voice, tone, smell, images, text, gravity, and temperature, and then orchestrates hundreds of MCP Servers through an MCP Coordinator to complete tasks. The training cost of thin LLMs plummets and training speed rises sharply, while the hardware requirements for deployment become very low.
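No MCP Coordinator of this kind is specified anywhere yet, so the following is purely a thought experiment: a toy registry, an intent-parsing stub standing in for the thin LLM, and invented server names.

```python
# Hypothetical sketch of the "thin LLM + MCP Coordinator" flow described
# above. The registry, intent parsing, and server names are all assumptions.
from typing import Callable

# Registry of MCP servers the coordinator can dispatch to (stubs here;
# in practice each entry would be a client session to a remote server).
REGISTRY: dict[str, Callable[[str], str]] = {
    "weather": lambda q: f"weather lookup for: {q}",
    "github": lambda q: f"github action for: {q}",
}

def thin_llm_parse(user_input: str) -> tuple[str, str]:
    """Stand-in for the thin LLM: map raw input to (capability, query)."""
    if "weather" in user_input.lower():
        return "weather", user_input
    return "github", user_input

def coordinate(user_input: str) -> str:
    """MCP Coordinator: route the parsed intent to a registered server."""
    capability, query = thin_llm_parse(user_input)
    server = REGISTRY.get(capability)
    return server(query) if server else "no matching MCP server"

print(coordinate("What's the weather in Shanghai?"))
```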

04. How MCP solves the three major problems

How can ordinary people participate in the AI industry?

Anyone with a unique talent can create their own MCP Server to provide services to LLMs. For example, a bird enthusiast can offer years of bird notes through MCP; when someone uses an LLM to search for bird-related information, the LLM calls this bird-notes MCP service, and the creator receives a share of the revenue for those calls.

This is a more precise and automated creator-economy loop: service content is more standardized, and the number of calls and output tokens can be counted exactly. LLM providers can even call several bird-notes MCP Servers at once and let users choose and rate them, so the better ones earn a higher matching weight.
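As a rough illustration, here is a hypothetical metering sketch; the accounting model (a pro-rata split by output tokens) is an assumption of ours, not part of MCP:

```python
# Hypothetical sketch of per-server usage metering for revenue sharing.
# Nothing here is part of the MCP spec; the accounting model is an assumption.
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class UsageLedger:
    calls: dict = field(default_factory=lambda: defaultdict(int))
    tokens: dict = field(default_factory=lambda: defaultdict(int))

    def record(self, server_id: str, output_tokens: int) -> None:
        """Count one call and its output tokens against a server."""
        self.calls[server_id] += 1
        self.tokens[server_id] += output_tokens

    def shares(self, pool: float) -> dict:
        """Split an incentive pool pro rata by output tokens."""
        total = sum(self.tokens.values()) or 1
        return {s: pool * t / total for s, t in self.tokens.items()}

ledger = UsageLedger()
ledger.record("bird-notes-alice", output_tokens=420)
ledger.record("bird-notes-bob", output_tokens=180)
print(ledger.shares(pool=100.0))  # {'bird-notes-alice': 70.0, 'bird-notes-bob': 30.0}
```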

The win-win combination of AI and Ethereum

a. We can build an OpenMCP.Network creator-incentive network on Ethereum. MCP Servers need hosting and stable service; users pay LLM providers, and the providers distribute the actual incentives through the network to the MCP Servers that were called, keeping the whole network sustainable and stable and motivating MCP creators to keep producing high-quality content. Smart contracts make these incentives automated, transparent, trustworthy, and censorship-resistant; signatures, permission verification, and privacy protection during operation can be implemented with Ethereum wallets, ZK proofs, and related technologies. A settlement sketch follows below.
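This is a hedged sketch of what such on-chain settlement might look like with web3.py; the OpenMCP.Network contract address, ABI, and `settle` function are entirely hypothetical, while the web3.py calls themselves are real API:

```python
# Hedged sketch of settling MCP usage on Ethereum with web3.py.
# The incentive contract and its `settle` function are hypothetical.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.org"))  # placeholder RPC

# Hypothetical incentive contract: settle(server, calls, tokens) pays out
# the server's share of the incentive pool.
INCENTIVES_ABI = [{
    "name": "settle", "type": "function", "stateMutability": "nonpayable",
    "inputs": [
        {"name": "server", "type": "address"},
        {"name": "calls", "type": "uint256"},
        {"name": "tokens", "type": "uint256"},
    ],
    "outputs": [],
}]
contract = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",  # placeholder
    abi=INCENTIVES_ABI,
)

def settle_usage(provider_key: str, server: str, calls: int, tokens: int):
    """Build, sign, and send one settlement transaction."""
    acct = w3.eth.account.from_key(provider_key)
    tx = contract.functions.settle(server, calls, tokens).build_transaction({
        "from": acct.address,
        "nonce": w3.eth.get_transaction_count(acct.address),
    })
    signed = acct.sign_transaction(tx)
    return w3.eth.send_raw_transaction(signed.raw_transaction)  # web3.py v7 name
```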

b. Develop MCP Servers for on-chain Ethereum operations, such as AA-wallet calling services, letting users make wallet payments from within LLMs using natural language, without exposing private keys or permissions to the LLM.
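One way such a server might be shaped, again as a sketch: the wallet endpoint and `pay` tool below are invented, and the key point is that signing happens in the user's wallet, never in the LLM:

```python
# Hypothetical MCP tool that asks an account-abstraction (AA) wallet to
# propose a payment. The endpoint and response shape are invented; no key
# material ever passes through the LLM.
from mcp.server.fastmcp import FastMCP
import httpx

mcp = FastMCP("aa-wallet")
WALLET_API = "https://wallet.example.org/userop"  # placeholder endpoint

@mcp.tool()
def pay(to: str, amount_wei: int) -> str:
    """Request a payment; the user's wallet prompts for approval and signs."""
    resp = httpx.post(WALLET_API, json={"to": to, "amount": amount_wei})
    resp.raise_for_status()
    return f"payment request submitted: {resp.json().get('userOpHash', '?')}"

if __name__ == "__main__":
    mcp.run()
```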

c. Various developer tools will further simplify Ethereum smart contract development and code generation.

Achieving AI decentralization

a. MCP Servers decentralize AI knowledge and capability: anyone can create and host an MCP Server, register it on a platform such as OpenMCP.Network, and earn incentives based on calls. No single company can control all MCP Servers. If an LLM provider sets unfair incentives, creators can collectively block that company, and users who stop receiving quality results will switch to other LLM providers, keeping competition fair.

b. Creators can implement fine-grained permission control on their MCP Servers to protect privacy and copyright. Thin LLM providers should offer reasonable incentives to encourage creators to contribute high-quality MCP Servers.

c. The capability gap between thin LLMs will gradually narrow, since human language has a finite space to traverse and evolves slowly. LLM providers will have to direct their attention and money toward high-quality MCP Servers rather than simply piling on more GPUs.

d. AGI's capabilities will be decentralized and diluted: LLMs handle only language processing and user interaction, while specific capabilities are distributed across many MCP Servers. AGI will not be able to threaten humanity, because once its MCP Servers are shut down, only basic language dialogue remains.

05. Overall Review

  1. The architectural evolution of LLMs + MCP Servers essentially decentralizes AI capabilities, reducing the risk of AGI destroying humanity.
  2. LLM usage of MCP Servers can be metered automatically at the token level (calls, inputs, and outputs), laying the foundation for an AI creator-economy system.
  3. A good economic system drives creators to actively contribute high-quality MCP Servers, promoting human development in a positive feedback loop. Creators will no longer resist AI; AI will in turn provide more jobs and income, redistributing a reasonable share of the profits of monopolistic commercial companies such as OpenAI.
  4. This economic system, combined with its characteristics and the needs of creators, is very suitable for implementation based on Ethereum.

06. Future Outlook: Next Steps in Script Evolution

  1. MCP or similar protocols will proliferate, and several large companies will begin competing to define the standard.
  2. MCP-based LLMs will appear: small models focused on parsing and processing human language, paired with MCP Coordinators connected to the MCP network. LLMs will support automatic discovery and scheduling of MCP Servers without complex manual configuration.
  3. MCP Network service providers will emerge, each with its own economic incentive system, allowing MCP creators to register and host their Servers to earn income.
  4. If the MCP Network's economic incentive system is built on Ethereum with smart contracts, Ethereum transactions are conservatively estimated to increase by roughly 150 times (assuming a very conservative 100 million MCP Server calls per day, against today's roughly 100 txs per 12-second block); a quick check of the arithmetic follows below.
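Checking that 150x figure using the list's own assumptions (one on-chain settlement per MCP call, 100 txs per 12-second block today):

```python
# Quick check of the ~150x estimate in point 4, using the article's own
# conservative assumptions.
BLOCK_TIME_S = 12
TXS_PER_BLOCK = 100
SECONDS_PER_DAY = 24 * 60 * 60

current_daily_txs = SECONDS_PER_DAY // BLOCK_TIME_S * TXS_PER_BLOCK
mcp_calls_per_day = 100_000_000

print(current_daily_txs)                      # 720,000 txs/day today
print(mcp_calls_per_day / current_daily_txs)  # ~139x, i.e. roughly 150x
```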

