Global digital platforms are facing heightened regulatory attention as policymakers work to balance freedom of expression with the need to reduce online harm. Approaches vary by region, ranging from Europe's active regulatory stance to more decentralized or market-driven models in the Americas and parts of the Asia-Pacific region. Among these, the European Union's Digital Services Act (DSA) stands out as one of the more mature and broad-reaching regulatory frameworks.
Now in force, the DSA offers a case study in how regulatory approaches may affect users and online platforms. One of its most significant shifts is the transformation of the user redress process: it introduces enforceable rights to appeal content moderation decisions, both within platforms and through independent, out-of-court dispute settlement (ODS) bodies. For online platforms, this has triggered a cascade of policy, operational and product challenges, making compliance not just a legal obligation but also a key factor in maintaining user trust and operational continuity in global markets.
The DSA, adopted by the European Union in 2022, empowers users to appeal content moderation decisions made by online platforms and search engines. This landmark legislation requires platforms to explain their moderation decisions through statements of reasons, to offer internal channels for appeal and to engage in good faith with certified out-of-court dispute settlement bodies.
Over a year into the DSA's enforcement, the scale of platform moderation actions and user redress opportunities is becoming clear. During this time, online platforms have shared more than 20 billion statements of reasons for their content moderation decisions with users. From those decisions, a single very large online platform (VLOP) may receive upwards of 10 million internal appeals annually. The sheer number and diversity of violation and restriction types underscore the potential challenge of managing volume at scale. Now, platforms must also engage with an emerging ecosystem of Article 21 ODS bodies, presenting users with an opportunity to seek independent, online redress and presenting platform trust and safety teams with added operational complexity.
Article 21 represents a particularly dramatic change for online platforms to contend with: once an ODS body has been certified by an EU member state's digital services coordinator, platforms are obligated to “engage, in good faith” with it. To be certified, these bodies must meet specific criteria, such as being impartial and independent of platforms, having the necessary expertise and operating under clear and fair rules of procedure.
As of mid-2025, six certified ODS bodies — with distinct scopes — have created operational complexity for online platforms, which faced over 4,500 appeals in the first three months of the year.
| ODS body | Date of certification | Certification jurisdiction | Area of expertise | Scope | Fee model |
| --- | --- | --- | --- | --- | --- |
| Adroit | July 10, 2024 | Malta | Online platforms, marketplaces, gaming and trading platforms (and more) | All online platforms under DSA | Subscription-based models and case-by-case pricing |
| User Rights GmbH | Aug 12, 2024 | Germany | Online platform disputes | Instagram, TikTok, LinkedIn | Case-by-case pricing |
| Online Platform Vitarendező Tanács | Aug 29, 2024 | Hungary | All types of disputes | All online platforms under DSA | Standard flat fee per case |
| Appeals Centre Europe | Sep 26, 2024 | Ireland | Online platform content policy disputes | Facebook, TikTok, YouTube | Case-by-case pricing |
| RTR-GmbH, Fachbereich Medien | Oct 24, 2024 | Austria | Data privacy and protection, fraud, online platform harms (and more) | All online platforms under DSA | Unknown |
| ADR Center | Dec 18, 2024 | Italy | Data privacy and protection, harmful products and services, IP and legal disputes, online platform harms (and more) | All online platforms under DSA | Flexible service packages |
Source: European Commission, Online Platform Statement of Reasons
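To make the fragmentation concrete, here is a minimal sketch, in Python, of how a trust and safety team might encode the registry above and check which bodies could hear a dispute about a given platform. The data mirrors the table; the `OdsBody` structure, its field names and the `eligible_bodies` helper are illustrative assumptions, not any platform's or the Commission's actual tooling.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class OdsBody:
    name: str
    jurisdiction: str
    fee_model: str
    platforms: frozenset[str] | None  # None = all online platforms under the DSA

# Illustrative registry drawn from the table above.
REGISTRY = [
    OdsBody("Adroit", "Malta", "subscription / per case", None),
    OdsBody("User Rights GmbH", "Germany", "per case",
            frozenset({"Instagram", "TikTok", "LinkedIn"})),
    OdsBody("Online Platform Vitarendező Tanács", "Hungary", "flat fee", None),
    OdsBody("Appeals Centre Europe", "Ireland", "per case",
            frozenset({"Facebook", "TikTok", "YouTube"})),
    OdsBody("RTR-GmbH, Fachbereich Medien", "Austria", "unknown", None),
    OdsBody("ADR Center", "Italy", "flexible packages", None),
]

def eligible_bodies(platform: str) -> list[OdsBody]:
    """Return the ODS bodies whose certification scope covers a platform."""
    return [b for b in REGISTRY if b.platforms is None or platform in b.platforms]

# A Facebook dispute could currently be heard by five of the six bodies,
# each with its own rules of procedure and fee model.
print([b.name for b in eligible_bodies("Facebook")])
```

Even this toy lookup hints at the operational cost: every body a platform must engage with adds another set of procedures, intake channels and fee arrangements to support.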
For years, online platforms have struggled to provide fair, meaningful user redress at scale. Each ODS body sets its own rules of procedure, outlined during the certification process, which increases complexity: online platforms have to engage with each body on different terms.
This approach has fragmented engagement mechanisms in several areas, from certification scope and rules of procedure to intake channels and fee models. For online platforms, that fragmentation presents additional operational challenges.
For ODS bodies, widespread lack of user awareness is one of the core challenges in guiding users to the appeals process after adverse content moderation outcomes. According to PwC's 2025 Trust and Safety Survey, 75% of individuals report having limited or no awareness of prominent social media legislation in their country of origin.
“When it comes to dealing with fragmentation in the Article 21 space, there are many types of bodies that have been certified or are in the process of becoming certified. It’s still too early to know which of these models will be successful.”
Richard Early, Manager of Governance Programs, Meta

After receiving a content moderation decision, users are required to exit the platform and locate an ODS body to assess their case. The European Commission maintains a list of certified ODS bodies and provides links to each intake form. One of these bodies, the Ireland-based Appeals Centre Europe, has taken steps to improve public understanding and access. Its efforts include public-facing media engagement, targeted outreach and encouraging platforms to offer clearer in-platform direction on user rights under Article 21.
“The right to an appeal is an entirely new user right. It hasn’t existed before. People are not aware of its existence…and platforms at the moment are not doing enough to inform their users of it. There’s a real need for clearer signposting.”
Thomas Hughes, CEO, Appeals Centre Europe

Upon receiving appeals, ODS bodies must review and coordinate with online platforms to confirm eligibility, which can be difficult as each ODS body has a unique scope of certification. With social media platforms operating in real time, timely responses are critical. For platforms, designing a model that can scale efficiently presents added considerations, particularly since they are required to fund the operations of ODS bodies through fees, regardless of the outcome of individual cases.
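Timeliness, at least, is a tractable engineering problem. Below is a minimal sketch of an intake tracker that flags ODS referrals nearing an internal response deadline; the `OdsCase` structure and the 14-day target are assumptions for illustration (an assumed internal service level, not a figure taken from the DSA).

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Assumed internal service level for responding to an ODS referral;
# not a deadline mandated by the DSA.
RESPONSE_TARGET = timedelta(days=14)

@dataclass
class OdsCase:
    case_id: str
    ods_body: str
    received: date
    responded: bool = False

    def due_date(self) -> date:
        return self.received + RESPONSE_TARGET

    def overdue(self, today: date) -> bool:
        return not self.responded and today > self.due_date()

# Made-up referrals to exercise the tracker.
cases = [
    OdsCase("ACE-0001", "Appeals Centre Europe", date(2025, 3, 1)),
    OdsCase("UR-0042", "User Rights GmbH", date(2025, 3, 20), responded=True),
]

today = date(2025, 3, 18)
backlog = [c.case_id for c in cases if c.overdue(today)]
print(backlog)  # ['ACE-0001'] is flagged for escalation
```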
While decisions made by ODS bodies are non-binding, platforms must report on decision outcomes as part of mandatory transparency reporting and must consider those decisions in their risk mitigation obligations under Article 35 of the DSA. This creates operational pressure to evaluate the quality of each review and decide whether to act on the ODS body's recommendation, such as whether to remove or reinstate content. Article 21 bodies, with their ability to address cross-platform issues, can help surface areas where users contest platform enforcement, contributing insights to civil society through mandated transparency reporting.
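Because platforms must both report on ODS outcomes and decide whether to act on each recommendation, a simple aggregation over decision records can feed transparency reporting. A minimal sketch follows, with made-up records; the field names and outcome categories are assumptions for illustration.

```python
from collections import Counter

# Made-up ODS outcome records; fields and values are illustrative only.
decisions = [
    {"body": "Appeals Centre Europe", "recommendation": "reinstate", "platform_action": "reinstated"},
    {"body": "Appeals Centre Europe", "recommendation": "reinstate", "platform_action": "upheld_removal"},
    {"body": "User Rights GmbH", "recommendation": "uphold", "platform_action": "upheld_removal"},
]

# Tally (recommendation, platform action) pairs for a transparency report.
outcomes = Counter((d["recommendation"], d["platform_action"]) for d in decisions)

# Count how often the platform followed the ODS body's recommendation.
agreement = {("reinstate", "reinstated"), ("uphold", "upheld_removal")}
followed = sum(
    1 for d in decisions
    if (d["recommendation"], d["platform_action"]) in agreement
)

print(outcomes)
print(f"followed ODS recommendation in {followed}/{len(decisions)} cases")
```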
Given fragmentation across the appeals ecosystem, regulators, platforms and ODS bodies are placing increased focus on the development of shared standards. The European Board for Digital Services has formed a working group to explore areas related to trusted flaggers and the function of Article 21 ODS bodies. Platforms have expressed interest in collaborating with industry peers to develop standards, and ODS bodies have launched the ODS Network, which advocates for best practices in dispute resolution. The commission clearly expects civil society and academia to be involved in these discussions, and it’s likely that all these parties will play a part in helping to establish standards within the Article 21 ecosystem.
The DSA's requirements around content moderation appeals and ODS have intensified the pressure on platform regulatory compliance teams, stretching limited resources. Yet this evolving regulatory environment also presents strategic opportunities. With the DSA leading the way in global trends toward safer online spaces, regulatory compliance teams should act now to help shape safer digital environments and build sustainable, compliant capabilities that reinforce trust and confidence with users.
Organizations should prepare for a set of cross-functional challenges spanning policy, operations and product.
Online platforms should align quickly across functions — mobilizing the right experience and guidance to activate and scale programs that meet emerging demands. Embrace DSA compliance. It’s a trust-building, user-first opportunity. Get ahead of the complexity. Build for what’s next.