Cullen International has updated a benchmark on national initiatives on age-verification systems to control or restrict the exposure of minors to harmful content (such as pornographic content and gratuitous violence) on internet platforms.
Where initiatives exist (beyond the mere transposition of EU-level rules), the benchmark shows:
- their aim and scope of application;
- whether a specific type of technology is mandated or foreseen (photo identification matching, credit card checks, facial age estimation…);
- whether the system applies to service providers established in other member states;
- a description of the main aspects/rules.
Background
The Audiovisual Media Services (AVMS) Directive requires audiovisual media services and video-sharing platforms (VSPs) to take appropriate measures to protect minors from content that can impair their physical, mental or moral development. For VSPs, these measures should consist of, as appropriate, setting up and operating age-verification systems or easy-to-use systems to rate content. It is up to the relevant regulatory authorities to assess whether the measures are appropriate, considering the size of the video-sharing platform service and the nature of the service provided.
The Digital Services Act (DSA) requires platforms that are accessible to minors to take measures to ensure children's safety, security and privacy. It also requires very large online platforms (VLOPs) and very large online search engines (VLOSEs) to assess the impact of their services on children’s rights and safety, and take mitigating measures if needed, which may include age verification.
Findings
The benchmark shows that among the countries covered, France, Germany, Italy, Ireland, Spain and the UK have initiatives (in place or proposed).
Among other findings, the benchmark shows that France is tackling the question at both the technical and the regulatory level. On the technical level, the ministry in charge of the digital sector announced the launch of a test phase of a technical solution (based on the double-anonymity principle) specifically aimed at blocking minors' access to pornographic content. On the regulatory level, it is a criminal offence in France for any provider not to have a reliable technical process in place to prevent minors from accessing offending content. Legislation has also been adopted to adapt French law to the DSA, which has allowed Arcom to set binding standards on the technical requirements for age-verification systems.
In Germany, legislation prohibits pornographic content (as well as content that is listed as, or clearly, harmful to minors) from being distributed online unless providers ensure that only adults can access this content by creating closed user groups. Providers use age-verification systems to control access to these closed user groups.
In Ireland, the rules apply to VSPs that host adult content and the focus is on ensuring that age verification is effective rather than on specifying a particular method.
For more information and access to the benchmark, please click on "Access the full content", or on "Request Access" if you are not subscribed to our European Media service.