Cullen International has updated a benchmark on national initiatives concerning age-verification systems to control or restrict minors' exposure to harmful content (such as pornographic content and gratuitous violence) on internet platforms.
Where initiatives exist (beyond the mere transposition of the EU-level rules), the benchmark shows:
- their aim and scope of application;
- whether a specific type of technology is mandated or foreseen (photo identification matching, credit card checks, facial age estimation…);
- whether the system applies to service providers established in other member states;
- a description of the main aspects and rules.
Background
The Audiovisual Media Services (AVMS) Directive requires audiovisual media services and video-sharing platforms (VSPs) to take appropriate measures to protect minors from content that may impair their physical, mental or moral development. For VSPs, such measures may consist of, as appropriate, setting up and operating age-verification systems or easy-to-use systems allowing users to rate content. It is up to the relevant regulatory authorities to assess whether the measures are appropriate, considering the size of the video-sharing platform and the nature of the service provided.
The Digital Services Act (DSA) requires platforms that are accessible to minors to take measures to ensure their safety, security and privacy. It also requires very large online platforms (VLOPs) and very large online search engines (VLOSEs) to assess the impact of their services on children’s rights and safety and to take mitigating measures where needed, which may include age verification.
Findings
The benchmark shows that among the countries covered, France, Germany, Italy, Ireland, Spain and the UK have initiatives (in place or proposed).
Among other findings, the benchmark shows that France is tackling the question at both the technical and the regulatory level. At the technical level, the ministry in charge of the digital sector announced the launch of a test phase for a technical solution (based on the double anonymity principle) specifically aimed at blocking minors’ access to pornographic content. At the regulatory level, it is a criminal offence in France for any provider not to have a reliable technical process in place to prevent minors from accessing such content. Legislation has also been adopted to adapt French law to the DSA, which has allowed Arcom to set binding standards on the technical requirements for age-verification systems.
In Germany, legislation prohibits the online distribution of pornographic content, as well as content that is listed as or is clearly harmful to minors, unless providers ensure that only adults can access it by creating closed user groups. Providers use age-verification systems to control access to these closed user groups.
In Ireland, the rules apply to VSPs that host adult content and the focus is on ensuring that age verification is effective rather than on specifying a particular method.
For more information and access to the benchmark, please click on “Access the full content”, or on “Request Access” if you are not subscribed to our European Media service.