On February 4, 2025, the Dutch Authority for Consumers and Markets (ACM) was officially designated as the supervisory body (Digital Services Coordinator) for the enforcement of the Digital Services Act (DSA) in the Netherlands. This development marks a significant step in the enforcement of the European digital regulatory framework.
The DSA aims to create a safer and more transparent digital environment by imposing strict obligations on online service providers, such as hosting providers, online platforms and search engines. While the European Commission has primary responsibility for the largest online platforms and search engines, national regulators, such as the ACM, are responsible for enforcing the DSA with respect to other digital service providers, based on the location of the service provider's main establishment – that is, where the provider has its head office or registered office. Other member states had already appointed their Digital Services Coordinators. The ACM is also designated for enforcement of the Platform to Business Regulation, the Digital Markets Act and the Data Governance Act, giving it a broad range of regulatory enforcement powers over digital markets.
This article provides an overview of the ACM’s supervisory role and the challenges associated with enforcing the DSA.
The Layered Structure of the Digital Services Act
The DSA is designed to provide a harmonised legal framework for digital service providers within the European Union. The regulation builds upon the principles of the e-Commerce Directive and introduces stricter rules regarding transparency, consumer protection, and tackling illegal online content. The DSA employs a risk-based approach, subjecting platforms with a greater societal impact – referred to as Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) – to more stringent obligations than smaller online service providers.
On April 25, 2023, the European Commission designated the first cohort of VLOPs and VLOSEs. For the Netherlands, AliExpress, Booking.com, Wikipedia and Snapchat are particularly relevant, as their main EU establishment is in this country; all four have been designated as VLOPs. The VLOP designation imposes specific due diligence obligations on the largest platforms (and search engines) concerned, such as identifying, analysing and assessing systemic risks associated with their services. These risks include illegal content, fundamental rights violations, public safety concerns and threats to electoral integrity. Additionally, designated VLOPs must implement measures to mitigate these risks, including adjustments to their service design, operational mechanisms and recommendation systems. The European Commission has exclusive competence to enforce the due diligence obligations that apply specifically to VLOPs (and VLOSEs). However, Digital Services Coordinators (i.e. the designated authority in each member state) retain shared competence to enforce the other obligations that the DSA imposes on such service providers.
The Supervisory Role of the ACM
With the entry into force of the Dutch Implementation Act for the DSA, the ACM has been designated as the Digital Services Coordinator and is responsible for monitoring compliance with the DSA by digital service providers established in the Netherlands. To familiarise businesses with the DSA, the ACM has published guidelines. As Digital Services Coordinator, the ACM is the one-stop shop for complaints about digital service providers and has the authority to enforce compliance with the DSA, except for the VLOP- and VLOSE-specific obligations that fall within the European Commission's exclusive competence.
Three Focus Areas of Supervision
The ACM has indicated that it will focus on three core areas:
1. Oversight of Web Hosting Services
The ACM will examine the extent to which web hosting providers comply with the DSA’s obligations, such as the duty to act against illegal content on their servers once they become aware of it. A key challenge here is determining the responsibility of hosting providers for the content uploaded by their end users. The DSA makes it clear that hosting providers (including online platforms) must remove illegal content once they become aware of it – or they risk being held liable. Excessively strict requirements could lead to over-removal of content (also known as “overblocking”), potentially restricting freedom of expression. The DSA defines illegal content broadly: any information, irrespective of its form, that is itself illegal under the applicable law or that is rendered illegal because it relates to illegal activities. Boundaries blur when, for instance, an eyewitness records a video of a potential crime. Such content should not automatically be considered illegal merely because it depicts an illegal act, especially when recording or disseminating such a video to the public is not illegal under national or European law. The ACM will need to strike a careful balance.
2. Ensuring Basic Compliance by Online Platforms
Service providers, including online platforms, are required to establish a central contact point and provide an effective mechanism for users to report illegal content. The ACM will assess whether companies respond appropriately to such reports and whether they publish transparency reports on content removal. Online platforms are required to take more elaborate measures than other service providers and, consequently, to report more extensively on the action taken. This requires service providers to implement clear procedures for receiving and processing reports of illegal content. A critical issue here is that the effectiveness of these procedures largely depends on the resources and priorities of the provider. There is a risk that a service provider takes only minimal measures to comply with the obligations, without making substantive improvements. The ACM will therefore need to assess not only the presence of these mechanisms, but also their actual effectiveness in practice. The transparency reports will assist in that assessment, as they describe the provider's actions in detail.
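By way of illustration, the sketch below shows the kind of data a notice-and-action intake could capture, loosely following the elements listed in Article 16(2) DSA. It is a minimal sketch only; all type, field and function names are hypothetical, not a prescribed format:

```typescript
// Hypothetical sketch of a notice-and-action intake, loosely following the
// elements listed in Article 16(2) DSA. Names are illustrative assumptions.
interface IllegalContentNotice {
  explanation: string;         // substantiated explanation of why the content is allegedly illegal
  contentLocation: string;     // exact electronic location, e.g. the URL of the item
  notifierName?: string;       // name and email of the notifier; optional, since certain
  notifierEmail?: string;      // notices (e.g. on child sexual abuse material) may be anonymous
  goodFaithStatement: boolean; // confirmation that the notice is submitted in good faith
  submittedAt: Date;
}

// Article 16(4) requires a confirmation of receipt without undue delay; this
// stub returns a reference id for the later decision notification (Art. 16(5)).
function acknowledgeNotice(notice: IllegalContentNotice): string {
  const id = crypto.randomUUID(); // available in modern browsers and Node.js 19+
  console.log(`Notice ${id} received at ${notice.submittedAt.toISOString()}`);
  return id;
}
```

A provider would additionally need internal workflows for assessing each notice and communicating a reasoned decision to the notifier, which the sketch only hints at – and it is precisely the substance of those workflows, rather than the mere existence of an intake form, that the ACM will scrutinise.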
3. Protection of Minors
A crucial aspect of the DSA is the protection of minors, both from advertisements based on profiling and from content that may impair their physical, mental or moral development. The ACM will investigate the extent to which online platforms effectively implement measures to protect minors. Several questions arise in this context. How is the age of users determined? Are the age verification methods used reliable and privacy-friendly? How can it be ensured that platforms do not indirectly serve personalised ads to minors? Addressing these concerns requires a thorough analysis of both the technical and legal implementation of the DSA’s provisions in this area.
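To make the last question concrete, the following is a minimal sketch of the gating logic that the Article 28(2) DSA prohibition implies for ad selection, assuming the platform already has some age signal; the type names and the confidence threshold are illustrative assumptions, not a prescribed design:

```typescript
// Hypothetical ad-selection guard reflecting Article 28(2) DSA: ads based on
// profiling may not be presented to users the platform is aware, with
// reasonable certainty, are minors. The age signal and the 0.9 threshold
// are assumptions for the sake of illustration.
type AgeSignal = { isMinor: boolean; confidence: number };

function selectAdStrategy(signal: AgeSignal): "contextual" | "profiled" {
  // If the user is a minor, or the platform cannot rule that out with
  // reasonable certainty, fall back to contextual (non-profiled) advertising.
  if (signal.isMinor || signal.confidence < 0.9) {
    return "contextual";
  }
  return "profiled";
}
```

Note that this sketch is deliberately conservative: it also withholds profiled ads where the age signal is uncertain, which is one way a platform might avoid indirectly serving personalised ads to minors.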
Future Challenges and Expectations
The enforcement of the DSA by the ACM presents several challenges. One key issue is compliance by platforms and digital service providers with limited resources, as they may struggle to meet the stringent requirements due to financial, technical and organisational constraints. These challenges are particularly acute for providers that have only just exceeded the micro and small enterprise thresholds and therefore face a disproportionate compliance burden compared to larger, well-resourced competitors.
The ACM, for its part, may face operational and strategic challenges in effectively enforcing the DSA, such as ensuring sufficient expertise and resources to monitor complex digital markets. However, these challenges are to some extent mitigated by the possibility of aligning DSA supervision with existing frameworks under the Platform to Business Regulation, the Digital Markets Act, and the Data Governance Act. This integrated approach can create synergies in enforcement, reducing duplication of efforts and fostering a more coherent regulatory strategy. Furthermore, the ACM can leverage cooperation mechanisms with other Digital Services Coordinators within the EU, facilitating the exchange of best practices and consistent enforcement across Member States. While coordination with national authorities, such as the Dutch Data Protection Authority, remains important – particularly in areas where the DSA overlaps with data protection issues – the structured framework for regulatory cooperation within the DSA helps to minimise regulatory fragmentation and ensures greater legal certainty for businesses.
Technological developments, such as AI-driven recommendation systems and the emergence of new digital platforms, require a flexible and proactive regulatory approach. The rapid evolution of digital services means that regulatory frameworks must continuously adapt to new risks and industry practices. Moreover, cross-border enforcement remains a complex issue, requiring international cooperation between regulatory authorities to ensure coherent and effective oversight.
Finally, the broader impact of the DSA on the digital economy – more specifically, on European digital competitiveness at a global level – will become clearer in the coming years. While the regulation aims to foster a safer and fairer online environment, it also places significant compliance burdens on businesses. Striking the right balance between regulation and innovation will be crucial in determining the long-term success of the DSA. The ACM’s ability to enforce these rules effectively while adapting to an ever-changing digital landscape will be a decisive factor in shaping the future of digital services in the Netherlands and beyond.
For more information about the ACM’s enforcement of the DSA, please contact Pauline Kuipers, Joost van Roosmalen, Roelien van Neck and Frederik Westbroek.