A Brief Analysis of McKinsey's Lilli: What Does It Suggest for the Enterprise AI Market?

PANews

McKinsey's Lilli offers key insights for the enterprise AI market, pointing to a potential opportunity in edge computing plus small models. The AI assistant, which integrates 100,000 internal documents, has not only reached a 70% adoption rate among employees but is also used an average of 17 times per week, a level of product stickiness rarely seen in enterprise tools. Below are my thoughts:

1) Enterprise data security is a real pain point: the core knowledge assets McKinsey has accumulated over a century, like the proprietary data held by many small and medium-sized enterprises, are highly sensitive and ill-suited to processing on public clouds. Finding a balance where data never leaves the local environment yet AI capability is not compromised is a pressing market need, and edge computing is one direction worth exploring (see the local-inference sketch below);
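As an illustration of the "data stays local" idea, here is a minimal sketch that queries a small model served entirely on-premises. It assumes a locally running Ollama server on its default port with a small model such as `qwen2.5:3b` already pulled; the model name and the internal-document snippet are placeholders, not details from the Lilli case.

```python
import requests

# Assumption: an Ollama server runs inside the corporate network
# (default port 11434) with a small model already pulled.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask_local_model(question: str, context: str) -> str:
    """Send a question plus internal context to a model hosted on-premises,
    so sensitive documents never leave the local environment."""
    prompt = f"Answer using only this internal context:\n{context}\n\nQ: {question}"
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "qwen2.5:3b", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask_local_model(
        "What were the key recommendations in the 2023 supply-chain review?",
        "Placeholder excerpt from an internal knowledge-base document.",
    ))
```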

2) Specialized small models will replace general large models: enterprise users do not need a "hundred-billion-parameter, all-purpose" general model; they need a specialized assistant that accurately answers questions in a specific domain. There is an inherent tension between a large model's generality and its professional depth, and in enterprise scenarios small models are often valued more (see the retrieval sketch below);
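To make the "specialized assistant over internal documents" idea concrete, here is a minimal retrieval sketch using a compact open-source embedding model (`all-MiniLM-L6-v2` via sentence-transformers). The document snippets are invented placeholders; a real deployment would index the enterprise's own corpus and feed the top hits to a small generator model.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# Placeholder corpus standing in for an internal knowledge base.
documents = [
    "2022 supply-chain diagnostic: consolidate regional warehouses first.",
    "Pricing playbook: anchor renewals to realized value, not list price.",
    "Post-merger integration checklist for mid-market manufacturers.",
]

# A ~22M-parameter embedding model, small enough to run on CPU at the edge.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vectors = encoder.encode(documents, normalize_embeddings=True)

def retrieve(query: str, top_k: int = 2) -> list[str]:
    """Return the documents most relevant to the query by cosine similarity."""
    q = encoder.encode([query], normalize_embeddings=True)[0]
    scores = doc_vectors @ q  # normalized vectors => dot product = cosine
    best = np.argsort(scores)[::-1][:top_k]
    return [documents[i] for i in best]

print(retrieve("How should we set renewal prices?"))
```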

3) The cost balance between self-built AI infrastructure and API calls: although the combination of edge computing and small models requires a larger upfront investment, long-term operating costs drop significantly. If the large model that 45,000 employees use heavily were served entirely through API calls, that dependency would grow with usage scale and feedback, making self-built AI infrastructure the rational choice for medium and large enterprises (a back-of-envelope comparison follows below);
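A rough sense of the crossover point can be sketched with back-of-envelope arithmetic. Every figure below (per-token API price, tokens per query, hardware and operations cost) is an illustrative assumption, not data from the McKinsey case; whether self-hosting wins depends entirely on these inputs.

```python
# Back-of-envelope comparison: API usage vs. self-hosted inference.
# All numbers are illustrative assumptions.

employees = 45_000
queries_per_week = 17               # usage level reported for Lilli
tokens_per_query = 3_000            # prompt + completion, assumed
api_price_per_1k_tokens = 0.015     # USD, assumed blended rate

weekly_tokens = employees * queries_per_week * tokens_per_query
annual_api_cost = weekly_tokens / 1_000 * api_price_per_1k_tokens * 52

# Self-hosted: assumed up-front hardware amortized over 3 years,
# plus yearly power, staffing, and maintenance.
hardware_capex = 2_000_000
annual_ops_cost = 500_000
annual_self_hosted = hardware_capex / 3 + annual_ops_cost

print(f"annual API cost:         ${annual_api_cost:,.0f}")
print(f"annual self-hosted cost: ${annual_self_hosted:,.0f}")
```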

4) New opportunities in the edge hardware market: training large models depends on high-end GPUs, but edge inference has very different hardware requirements. Chip makers such as Qualcomm and MediaTek are chasing this opening with processors optimized for edge AI. As every enterprise tries to build its own "Lilli," edge AI chips designed for low power consumption and high efficiency will become essential infrastructure (one illustration of the difference is the quantization sketch below);
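As one illustration of why edge inference stresses hardware differently from training, the sketch below applies PyTorch dynamic int8 quantization to a small stand-in network and compares serialized sizes. The toy architecture is an assumption for demonstration, not a real edge model, and real deployments would go through vendor NPU toolchains rather than this generic path.

```python
import io

import torch
import torch.nn as nn

def serialized_mb(model: nn.Module) -> float:
    """Size of the model's state_dict when serialized, in megabytes."""
    buf = io.BytesIO()
    torch.save(model.state_dict(), buf)
    return buf.tell() / 1e6

# Toy stand-in for a compact feed-forward block; purely illustrative.
model = nn.Sequential(
    nn.Linear(768, 3072),
    nn.ReLU(),
    nn.Linear(3072, 768),
)

# Dynamic quantization stores Linear weights as int8, cutting memory and
# bandwidth needs -- the constraints that matter for low-power edge chips.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

print(f"fp32 weights: {serialized_mb(model):.1f} MB")
print(f"int8 weights: {serialized_mb(quantized):.1f} MB")
```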

5) The decentralized web3 AI market will also strengthen: once enterprise demand for compute, fine-tuning, and algorithms around small models is unleashed, resource scheduling becomes the challenge. Traditional centralized scheduling will struggle, creating significant demand for decentralized web3 AI fine-tuning networks, decentralized compute marketplaces, and similar services;

While the market is still debating the boundaries of AGI's general capabilities, it is encouraging to see many enterprise users already extracting practical value from AI. Clearly, compared with the past monopoly-driven leaps in computing power and algorithms, shifting the market's focus to edge computing plus small models will bring greater market vitality.

