How Powerful Is the "Digital Services Act"? The EU Abruptly Opens an Investigation into Musk's X Platform


Author: Zhang Feng

On January 26, 2026, the European Commission officially announced the opening of a new formal investigation under the Digital Services Act (DSA) into the X platform (formerly Twitter), owned by Elon Musk. The investigation centers on the recommendation algorithm driven by Grok, the platform's built-in artificial intelligence model, and specifically assesses whether that algorithm risks disseminating illegal content, including sexually explicit deepfake images, child sexual abuse material, and other violations.

The European Commission has made clear that it will continue gathering evidence through several channels, including further information requests, interviews, and on-site inspections. If X fails to make substantive adjustments to its services to meet compliance requirements, the EU can lawfully impose interim measures and issue a non-compliance decision, at which point X could face hefty fines and even severe penalties such as business restrictions.

I. Background and Main Content of the Digital Services Act (DSA)

The EU's investigation into the X platform rests primarily on the Digital Services Act (DSA). Since late August 2023, Amazon's third-party marketplace, Apple's App Store, and 17 other designated internet services have been subject to the DSA's obligations, and the act has applied to all online platforms in the EU since February 17, 2024. The act is a core component of the EU's digital strategy and one of the strictest and most systematic digital-services regulatory frameworks in the world. Its introduction was no accident: it responds to the many governance challenges created by the rapid growth of digital services, and it aims to regulate the conduct of digital service providers, protect user rights, and maintain fairness and security in the digital space.

(1) Background

With the rapid development of internet technology, digital services have deeply integrated into the daily lives of EU citizens, but the chaos in the digital space has also become increasingly prominent, posing urgent challenges for EU regulatory authorities. On one hand, the spread of illegal or inappropriate content on digital platforms has become rampant, endangering citizens' personal rights and dignity, potentially triggering social conflicts, disrupting public order, and even threatening national security. On the other hand, large digital platforms, leveraging their monopolistic positions, abuse market advantages, arbitrarily collect user data, manipulate recommendation algorithms, and restrict competition, harming user privacy rights and hindering innovation in the digital industry.

The Digital Services Act was drawn up precisely to meet these challenges. Its core goal is to build a safe, transparent, and fair digital single market, balancing digital-service innovation against the protection of user rights and the public interest, while strengthening the EU's voice in global digital governance. The DSA establishes unified regulatory rules covering all providers of digital services to users in the EU: whether a provider is headquartered inside or outside the EU, it must comply with the act's requirements, which effectively curbs "regulatory arbitrage."

(2) Main Content

The Digital Services Act addresses every aspect of digital-services regulation and is organized around three principles: tiered regulation, clear responsibilities, and transparency and traceability. It sorts digital service providers into tiers and imposes regulatory obligations that scale with their size and influence.

First, it defines the scope of regulation. The DSA's reach is extremely broad, covering all natural persons, legal entities, and other organizations providing digital services to users in the EU, regardless of where they are headquartered, so long as their services reach EU users. It encompasses digital services of every type, including social platforms, search engines, e-commerce platforms, cloud service providers, and app stores. Among these, "Very Large Online Platforms" (VLOPs) and "Very Large Online Search Engines" (VLOSEs) face the strictest requirements: under the DSA, platforms averaging 45 million or more monthly active users in the EU are designated Very Large Online Platforms, a category that includes X as well as Meta's Facebook and Instagram.
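
To make the 45-million-user line concrete, here is a minimal Python sketch of the threshold check. It is purely illustrative: designation as a VLOP is a formal European Commission decision, and the platform names and user counts below are hypothetical.

```python
# Illustrative only: VLOP designation is a formal Commission decision,
# not a self-executing numeric test.
VLOP_THRESHOLD = 45_000_000  # average monthly active EU users (DSA Art. 33)

def is_vlop_candidate(eu_monthly_active_users: int) -> bool:
    """Return True if a service crosses the DSA's VLOP user threshold."""
    return eu_monthly_active_users >= VLOP_THRESHOLD

# Hypothetical figures chosen for the example, not reported numbers.
platforms = {"ExampleSocial": 105_000_000, "NicheForum": 2_300_000}
for name, mau in platforms.items():
    status = "VLOP candidate" if is_vlop_candidate(mau) else "ordinary platform"
    print(f"{name}: {status}")
```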

Second, it establishes tiered platform responsibilities. The DSA sorts digital service providers into four cumulative tiers by scale and role, each carrying heavier duties than the last: the first tier is basic intermediary services (e.g., internet access providers), which need only meet baseline compliance obligations such as cooperating with regulatory investigations and publishing transparency reports; the second tier is hosting services (e.g., cloud storage, forum hosting), which must additionally operate notice-and-action mechanisms and promptly remove clearly illegal content once notified; the third tier is online platforms (e.g., social platforms, e-commerce marketplaces), which must additionally handle user complaints and offer appeal channels for moderation decisions; the fourth tier is Very Large Online Platforms and Very Large Online Search Engines, which bear the strictest obligations, including regular risk assessments, an independent compliance function, and third-party audits.
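
The cumulative structure of these tiers can be pictured as each tier inheriting all lower-tier duties. The sketch below is a schematic summary in Python; the tier labels and duty lists paraphrase the paragraph above and are not an exhaustive legal checklist.

```python
# Schematic, non-exhaustive summary of the DSA's cumulative tiers.
TIER_ORDER = ["intermediary", "hosting", "online_platform", "vlop"]
TIER_DUTIES = {
    "intermediary": ["cooperate with regulators", "transparency reporting"],
    "hosting": ["notice-and-action for illegal content"],
    "online_platform": ["complaint handling", "moderation appeals"],
    "vlop": ["regular risk assessments", "independent compliance function",
             "third-party audits"],
}

def duties_for(tier: str) -> list[str]:
    """Duties stack: each tier inherits every lower tier's obligations."""
    duties: list[str] = []
    for t in TIER_ORDER[: TIER_ORDER.index(tier) + 1]:
        duties.extend(TIER_DUTIES[t])
    return duties

print(duties_for("vlop"))  # a VLOP carries the duties of all four tiers
```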

Third, it regulates algorithms and content management. This is one of the DSA's core elements and the key basis for the EU's investigation into X. The act explicitly requires digital service providers to govern their recommendation algorithms, ensuring transparency, explainability, and fairness, and it prohibits using algorithms to disseminate illegal content, mislead users, or discriminate. Very Large Online Platforms must additionally disclose, at regular intervals, how their algorithms operate, their recommendation logic, and the risks those algorithms may create, and must take effective measures to mitigate the systemic risks they cause. The act further requires all digital service providers to establish sound mechanisms for reviewing illegal content, to promptly address user complaints about it, and to remove or take down clearly illegal content (such as terrorist propaganda and child sexual abuse material) within a specified timeframe.
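
The notice-and-action duty can be pictured as a queue of complaints with deadlines. The following simplified Python sketch assumes a 24-hour internal review target; the DSA itself mandates "expeditious" action rather than one fixed number of hours, so the figure and all names here are illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Assumed internal service-level target, not a number taken from the act.
REVIEW_SLA = timedelta(hours=24)

@dataclass
class IllegalContentNotice:
    content_id: str
    received_at: datetime

def is_overdue(notice: IllegalContentNotice, now: datetime) -> bool:
    """Flag notices whose assumed review deadline has already passed."""
    return now - notice.received_at > REVIEW_SLA

now = datetime.now(timezone.utc)
stale = IllegalContentNotice("post-123", now - timedelta(hours=30))
print(is_overdue(stale, now))  # True: escalate for immediate review
```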

Fourth, it strengthens the protection of user rights. The DSA puts heavy emphasis on user rights, explicitly granting users the right to be informed, the right to choose, the right to complain, and the right to data privacy. For example, users are entitled to understand the basic logic of a platform's recommendation algorithms and to refuse personalized recommendations; platforms must provide convenient complaint channels and respond to complaints about illegal content or violations within a specified timeframe; and platforms may not arbitrarily collect or use user data, and must strictly comply with the General Data Protection Regulation (GDPR) to protect users' privacy. The act pays special attention to minors and other vulnerable groups, requiring platforms to take dedicated measures to keep minors away from harmful content.
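
The right to refuse personalized recommendations maps naturally onto a feed that falls back to a non-profiled ordering, such as reverse-chronological, when the user opts out. The sketch below is a minimal illustration of that choice; the data structures and scoring are invented for the example and do not describe any platform's real system.

```python
from datetime import datetime

def rank_feed(posts, user_profile=None, personalized=True):
    """Rank posts with profiling, or reverse-chronologically on opt-out."""
    if personalized and user_profile is not None:
        # Stand-in profiling score: the user's affinity for each topic.
        return sorted(posts, key=lambda p: user_profile.get(p["topic"], 0.0),
                      reverse=True)
    # Non-profiled fallback of the kind DSA Art. 38 requires VLOPs to offer.
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)

posts = [
    {"id": 1, "topic": "ai", "created_at": datetime(2026, 1, 25)},
    {"id": 2, "topic": "sports", "created_at": datetime(2026, 1, 26)},
]
profile = {"ai": 0.9, "sports": 0.1}
print([p["id"] for p in rank_feed(posts, profile, personalized=True)])   # [1, 2]
print([p["id"] for p in rank_feed(posts, profile, personalized=False)])  # [2, 1]
```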

Fifth, it establishes a strict enforcement and penalty mechanism. The DSA spells out the enforcement powers of the European Commission and member-state regulators and creates a unified coordination mechanism to ensure the rules are applied effectively. Providers that violate the DSA face penalties graded by severity: minor violations draw warnings and rectification orders, while serious violations can incur fines of up to 6% of global annual revenue, a ceiling well above, for instance, the GDPR's 4% cap, and a strong deterrent. For platforms that persist in violations and refuse to rectify, the EU may impose interim measures, including restricting the platform's functions, suspending its services in the EU, or even forcing it out of the EU market.
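
To put the 6% ceiling in perspective, here is a quick worked calculation; the revenue figure is hypothetical, chosen only to show the arithmetic.

```python
# Worked example of the DSA fine ceiling against a hypothetical turnover.
DSA_FINE_CAP = 0.06   # up to 6% of global annual turnover (DSA Art. 74)
GDPR_FINE_CAP = 0.04  # the GDPR's comparable ceiling, for reference

global_annual_revenue_eur = 3_400_000_000  # assumed figure for illustration
print(f"Max DSA fine:  EUR {DSA_FINE_CAP * global_annual_revenue_eur:,.0f}")
print(f"Max GDPR fine: EUR {GDPR_FINE_CAP * global_annual_revenue_eur:,.0f}")
# Max DSA fine:  EUR 204,000,000
# Max GDPR fine: EUR 136,000,000
```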

II. Analysis of Core Details of the EU Investigation into the X Platform

The new investigation the EU opened on January 26, 2026 is not its first regulatory action against X: at the end of 2025, the European Commission fined the platform 120 million euros for misleading users through the design of its "blue check" verification interface, for a lack of transparency in its advertising repository, and for refusing to open data to researchers.

This new investigation focuses on the AI recommendation algorithm of the X platform, representing the EU's routine regulation of Very Large Online Platforms under the DSA and targeted regulation of AI technology applications on digital platforms.

(1) Regulatory Entities

The investigation into the X platform by the EU features a dual-layer structure of "European Commission leading, member state regulatory agencies collaborating," established according to the DSA's provisions to ensure the unity and efficiency of regulatory work.

The core regulatory entity is the European Commission. Authorized by the DSA, the European Commission is responsible for coordinating digital service regulation across the EU, holding direct regulatory and enforcement powers over Very Large Online Platforms and Very Large Online Search Engines. This investigation is directly initiated by the European Commission, which is responsible for formulating the investigation plan, collecting relevant evidence, assessing the compliance status of the X platform, ultimately determining whether there is a violation, and deciding on the penalties to be imposed.

The collaborating regulators are the member states' digital services coordination agencies. The DSA requires each EU member state to designate a dedicated digital services coordinator responsible for cooperating with the European Commission and for handling digital-services complaints and investigations at home. In this case, Ireland's Coimisiún na Meán, the coordinator for the member state where X has its EU establishment, has worked closely with the European Commission, assisting with evidence collection, conducting on-site inspections, and liaising with X's representative office in Ireland. The coordinators of other member states will provide support as the European Commission requires, ensuring the investigation proceeds smoothly across the EU.

(2) Regulatory Targets

The regulatory targets of this investigation are clearly defined, focusing on the X platform owned by Elon Musk, specifically on two core elements of the X platform: the built-in artificial intelligence model Grok and the recommendation algorithm driven by Grok.

From the platform perspective, X, as a social platform of global reach, has monthly active users far above the DSA's 45 million threshold and is therefore designated a "Very Large Online Platform," subject to the act's strictest obligations; this is a key reason the EU has made it a regulatory priority. Since Musk's acquisition, X has adjusted its features repeatedly, including integrating the AI model Grok into the platform to drive its recommendation algorithm and generate content for users; this investigation targets precisely the compliance risks arising from that change.

From a functional perspective, the core regulatory target is the Grok AI model and the recommendation algorithm it drives. Grok is an artificial intelligence model developed by Musk's xAI; since 2024, X has deployed it in several ways, letting users generate text and images and supplying contextual information for user posts, while also using it to optimize the platform's recommendation algorithm and push personalized content. The EU's investigation focuses on whether the Grok-driven recommendation algorithm risks disseminating illegal content, including sexually explicit deepfake images, child sexual abuse material, and anti-Semitic content, and on whether X has adequately assessed the risks of deploying Grok and taken effective mitigation measures.

The investigation also covers X's risk assessment reporting. The European Commission noted that its review of the risk assessment reports X submitted under the DSA found no mention of the Grok AI model, suggesting that X may never have assessed the risks that Grok itself, or its integration into the platform, poses to EU citizens; this omission is one of the investigation's key focal points. The Commission has also expanded its earlier inquiry into X's recommender system to evaluate the platform's recently announced switch to a "Grok-based recommendation system," determining whether that system has fully identified and mitigated the systemic risks the DSA defines.

(3) Regulatory Basis

The core legal basis for the EU's investigation into X is the Digital Services Act (DSA), which entered into force in late 2022 and has bound Very Large Online Platforms since August 2023. Specifically, the investigation rests on the following key DSA provisions, which set out X's compliance obligations and the EU's regulatory authority.

First, the DSA's obligations for Very Large Online Platforms. Under Articles 34 and 35 of the DSA, Very Large Online Platforms must conduct regular, systematic risk assessments identifying the risks of illegal-content dissemination and user-rights violations their services may create, develop concrete mitigation measures, and compile dedicated risk assessment reports for the European Commission and member-state regulators. The Commission's review found that X did not reflect the Grok AI model in its risk assessment report, which may breach these provisions and amount to a failure to fulfill its risk assessment obligations.

Second, the DSA's provisions on recommender systems. Articles 27 and 38 of the DSA require platforms to explain the main parameters of their recommender systems in plain language and require Very Large Online Platforms to offer at least one recommendation option not based on profiling; combined with the risk-mitigation duties above, platforms may not use algorithms to disseminate illegal content, mislead users, or discriminate, and must take effective measures against the systemic risks their algorithms pose. The core of this investigation is to assess whether X's Grok-based recommendation algorithm meets these requirements and whether it risks spreading illegal content; if so, X would be in breach of these provisions.

Third, the provisions regarding the handling of illegal content in the DSA. The DSA requires all digital service providers to establish and improve mechanisms for reviewing and handling illegal content, promptly addressing user complaints about illegal content, and removing or taking down obvious illegal content within a specified timeframe. The European Commission pointed out that anti-Semitic content, non-consensual deepfake videos of women, and content involving child sexual abuse were found on the X platform, and the X platform may not have taken timely action to address these issues, which could violate the relevant provisions of the DSA regarding the handling of illegal content.

Fourth, the provisions regarding enforcement and penalties in the DSA. Article 66 and other related provisions of the DSA clarify the enforcement powers of the European Commission, including initiating investigations, collecting evidence, taking temporary measures, making non-compliance decisions, and imposing penalties. Based on these provisions, the European Commission has the authority to initiate a formal investigation into the X platform, collecting evidence through information requests, conducting interviews, and implementing on-site inspections; if violations are found and the X platform has not made substantial corrections, the European Commission has the authority to take temporary measures and make a non-compliance decision, imposing fines of up to 6% of the X platform's global annual revenue.

Additionally, the EU's investigation also references previous regulatory records concerning the X platform—at the end of 2025, the EU imposed a fine of 120 million euros on the X platform for issues such as deceptive design, insufficient advertising transparency, and inadequate data access for researchers. This investigation is also a continuation of the oversight of the X platform's compliance status, ensuring that it fulfills all obligations stipulated by the DSA.

III. The Reasonableness and Fairness of the EU's Investigation

The EU's decision to investigate X under the Digital Services Act is an important step in discharging its digital-services regulatory duties, protecting user rights, and keeping the digital space safe, and it has a certain degree of reasonableness. Questions remain, however, about the investigation's targeting, the fairness of the regulation, and its potential negative side effects. The market therefore broadly expects that if the EU ultimately imposes severe penalties, X may sue in the EU courts, challenging the legality of the investigation and the reasonableness of the penalty decision; that would prolong the regulatory process and make the case a focal event in global digital regulation.

(1) Overall Alignment with Regulatory Goals and Industry Development Needs

The reasonableness of the EU's investigation is mainly reflected in the following aspects, aligning with the legislative spirit of the DSA and the trends in global digital governance.

First, it aligns with the legislative goals of the DSA and can effectively prevent risks posed by AI algorithms. The core goal of the DSA is to regulate the behavior of digital service providers, protect user rights, and maintain security and fairness in the digital space. This investigation focuses on the Grok AI model and its recommendation algorithm on the X platform, targeting the risks of illegal content dissemination that may arise from the application of AI technology on digital platforms, which is highly consistent with the legislative goals of the DSA. With the rapid development of AI technology, the risks posed by AI algorithms are becoming increasingly prominent; without strengthened regulation, they could seriously harm user rights and public order. The EU's timely initiation of this investigation can effectively prevent such risks, urge the platform to fulfill its compliance obligations, and regulate the application of AI algorithms, in line with the legislative spirit of the DSA.

Second, it conforms to the trends in global digital governance and promotes the compliance direction of the digital service industry. Currently, the core trend in global digital governance is to strengthen the regulation of digital service platforms, especially regarding AI technology and recommendation algorithms, to protect user rights and maintain security in the digital space. This investigation by the EU is an important practice in global digital governance, which can promote the digital service industry towards compliance and standardization, providing a reference for digital regulation in other countries and regions. For example, many countries and regions are formulating or improving digital service regulatory laws, and the EU's investigation and the implementation experience of the DSA will provide guidance for these countries and regions, promoting the standardization and unification of global digital regulation.

Third, it is clearly targeted at actual violation risks on the X platform. The EU did not launch the investigation blindly; it acted on concrete risks. The European Commission pointed out that anti-Semitic content, non-consensual deepfake videos of women, and child sexual abuse material had been found on X, and that X neither adequately assessed the risks of the Grok AI model nor reflected Grok in its risk assessment report, potentially violating the relevant DSA provisions. The investigation thus targets X's actual violations and risks; it is clearly focused and can effectively push the platform to rectify its practices and protect user rights.

Fourth, it is a necessary measure to fulfill regulatory responsibilities, which can strengthen the regulatory authority of the DSA. As a globally recognized Very Large Online Platform, the compliance status of the X platform has significant demonstrative effects on the entire digital service industry. The EU's initiation of an investigation into the X platform is a necessary measure to fulfill its regulatory responsibilities, which can send a clear regulatory signal to global digital service providers, urging all platforms to strictly comply with the provisions of the DSA, strengthening the regulatory authority of the DSA, and ensuring the effective implementation of regulatory rules.

(2) Controversies Regarding Regulatory Specificity, Fairness, and Impact Balance

Although the EU's investigation has a certain degree of reasonableness, there are also some aspects that raise questions in practice, mainly concentrated in the following areas, sparking widespread discussion within the industry.

First, the investigation's targeting may be skewed: its heavy focus on X invites suspicions of "selective regulation." As a Musk-owned company, X has clashed repeatedly with EU regulators since the acquisition, and the EU has already fined the platform 120 million euros; launching yet another investigation inevitably prompts the charge that the EU is regulating selectively, bearing down on X while policing similar violations by other platforms too lightly. Meta's Facebook and Instagram and Google's YouTube also embed AI models whose recommendation algorithms may risk spreading illegal content, yet the EU has opened no comparable investigations into them, which raises questions of regulatory fairness. Moreover, Grok has been live on X for a relatively short time and its risks have not fully surfaced; opening an investigation now may amount to over-regulation, to the detriment of the platform's capacity to innovate.

Second, the regulatory standards may be too strict, potentially stifling innovation and development in AI technology. The DSA imposes extremely strict regulatory requirements on Very Large Online Platforms, especially regarding AI algorithms, requiring platforms to disclose algorithm logic, conduct risk assessments, and strengthen content review, among other things. While these regulatory requirements can effectively prevent risks, they may also stifle innovation and development in AI technology. The development of AI technology requires a certain amount of trial and error space, and overly strict regulatory requirements may deter platforms from boldly pursuing AI technology research and application, fearing severe penalties for violations. For example, platforms may reduce investment in generative AI and limit the functionality of AI models, which would be detrimental to innovation and progress in AI technology and could impact the vitality of the digital service industry.

Third, the fairness of regulation needs to be considered, as there may be a tendency towards "regional protection." The DSA's regulatory scope covers all platforms providing digital services within the EU, but in actual enforcement, the EU may apply different regulatory standards to EU-based platforms and non-EU-based platforms, showing a tendency towards "regional protection." The X platform is headquartered in the United States, making it a non-EU-based platform, while platforms like Meta and Google, although also headquartered in the U.S., have extensive business operations and greater compliance investments within the EU, which may lead to "special treatment" by EU regulatory agencies; meanwhile, EU-based digital service platforms may enjoy a more lenient regulatory environment, which contradicts the DSA's principle of "unified regulation and fair treatment" and is not conducive to fair competition in the global digital service industry.

Fourth, the potential negative impacts have not been fully considered, affecting user experience and industry development. As mentioned earlier, this investigation may lead to the platform weakening personalized recommendation functions, limiting AI capabilities, and increasing service costs, which would affect user experience and rights; at the same time, excessively high compliance costs may drive small and medium-sized platforms out of the EU market, increasing industry concentration and creating monopolistic patterns, which would be detrimental to fair competition and innovative development in the industry. However, it seems that the EU did not fully consider these potential negative impacts when initiating the investigation, nor did it formulate corresponding countermeasures, which may lead to regulatory outcomes that do not align with expectations, or even produce opposite effects.

Fifth, the lack of transparency in the investigation process may affect the fairness of the investigation results. Although the EU has clearly defined the focus and basis of the investigation into the X platform, the specific procedures of the investigation, methods of evidence collection, and evaluation standards have not been sufficiently disclosed to the public and the X platform, which may affect the fairness of the investigation results. As the subject of the investigation, the X platform has the right to understand the specific circumstances of the investigation and to have ample opportunity for defense, but the insufficient transparency of the EU's investigation process may prevent the X platform from fully exercising its rights, impacting the fairness and reasonableness of the investigation results.
