Digital Services Act: EU Commission Initiates Proceedings Against X (Formerly Twitter)
The European Commission has officially opened formal proceedings against X, formerly known as Twitter, under the Digital Services Act (DSA). These proceedings aim to determine whether X has violated the DSA in areas such as risk management, content moderation, dark patterns, advertising transparency, and the provision of data access for researchers. This is the first time the Commission has launched formal proceedings under the DSA.
Details of the investigation:
The preliminary investigation included an analysis of X’s risk assessment report, its transparency report, and its replies to a request for information concerning, amongst other things, the dissemination of illegal content in the context of Hamas’ terrorist attacks. On that basis, the European Commission has decided to open formal infringement proceedings against X.
The investigation will specifically focus on the following areas:
- Compliance with DSA Obligations on Illegal Content: this includes evaluating the effectiveness of X’s risk assessments and risk mitigation measures, checking the functioning of the Notice and Action (N&A) mechanism that the DSA requires for addressing illegal content, and assessing the resources X dedicates to content moderation.
- Effectiveness of Measures Against Information Manipulation: this involves examining the effectiveness of measures taken by X to address information manipulation on the platform, evaluating the functionality of X’s ‘Community Notes’ system in the EU, and assessing related policies aimed at mitigating risks.
- Transparency Measures: this includes investigating the steps taken by X to enhance the transparency of its platform, examining suspected shortcomings in giving researchers access to X’s data (as required by Article 40 of the DSA), and assessing suspected shortcomings in X’s ads repository.
- User Interface Design Concerns: investigating suspected deceptive design of the user interface, notably in relation to the checkmarks linked to certain subscription products, the so-called ‘Blue checks’.
Next steps:
After initiating formal proceedings, the EU Commission will continue to gather evidence through requests for information, interviews, and inspections. The investigated conduct is suspected of violating Articles 34(1), 34(2), 35(1), 16(5), 16(6), 25(1), 39, and 40(12) of the DSA:
- Articles 34(1), 34(2) and 35(1): pursuant to these articles, very large online platforms (VLOPs), such as X, are obliged to diligently identify, analyse, and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, or from the use made of their services. VLOPs are obliged to put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified.
- Articles 16(5) and 16(6): according to these articles, online platforms must notify users of content moderation decisions without undue delay, providing information on the possibilities for redress in respect of those decisions. Platforms must also take such decisions in a timely, diligent, non-arbitrary and objective manner.
- Article 25(1): online platforms must not design, organize, or operate their online interfaces in a way that deceives or manipulates their users or in a way that otherwise materially distorts or impairs the ability of the users of their service to make free and informed decisions.
- Article 39: VLOPs must compile and make publicly available a repository containing specific information about the advertisements presented on their platforms, and keep that information available until one year after an advertisement was presented for the last time.
- Article 40(12): VLOPs have to provide researchers with effective access to platform data.
Potential consequences:
If the Commission determines that X’s conduct violates the DSA, it may impose fines of up to 6% of X’s global turnover and order the platform to take corrective measures. Such a decision could also trigger an enhanced supervision period to ensure that X adheres to the measures intended to rectify the breach. In addition, if X fails to comply with the decision, the Commission may impose periodic penalty payments of up to 5% of the average daily worldwide turnover for each day of delay in complying with remedies, interim measures, or commitments.
As a last resort, if the infringement persists and causes serious harm to users, or involves criminal offences posing a threat to life or safety, the Commission can also request the temporary suspension of the service in the EU through a specific procedure.
Implications for Other Companies Affected by the DSA:
All providers covered by the DSA, including cloud-service providers and online marketplaces, must adhere to its rules. These companies are obliged to comply with measures such as those outlined above, focusing on transparency, consumer protection, and cooperation with national authorities.
For smaller platforms that, unlike X, are not classified as VLOPs, the DSA establishes a network of Digital Services Coordinators, and penalties are determined under national law. Each Member State will set proportionate penalties, with the DSA capping the maximum fine at 6% of the provider’s global turnover; for supplying incorrect information or failing to comply with an information request, fines may reach 1%. Enforcement also includes immediate action in cases of serious harm, binding commitments from platforms, and the right of individuals to seek compensation for damages resulting from DSA infringements.
The initiation of formal proceedings against X underscores the significance of compliance with the DSA and the need for all affected companies to prepare for the regulatory framework. All affected service providers must fully comply with their obligations under the DSA by 17 February 2024.
How can Logan & Partners help?
If you need support complying with the DSA, or want to check whether your business is affected by the regulation, feel free to reach out to us for a free consultation.