Cullen International has updated a benchmark on national initiatives on age-verification systems designed to control or restrict minors' exposure to harmful content (such as pornographic content and gratuitous violence) on internet platforms.
Where initiatives exist (beyond the mere transposition of the EU-level rules), the benchmark shows:
- their aim and scope of application;
- whether a specific type of technology is mandated or foreseen (photo identification matching, credit card checks, facial age estimation…);
- whether the system applies to service providers established in other member states;
- a description of the main aspects/rules.
Background
The Audiovisual Media Services (AVMS) Directive requires audiovisual media services and video-sharing platforms (VSPs) to take appropriate measures to protect minors from content that can impair their physical, mental or moral development. For VSPs, these measures should consist of, as appropriate, setting up and operating age-verification systems or easy-to-use content-rating systems. It is up to the relevant regulatory authorities to assess whether the measures are appropriate, considering the size of the video-sharing platform service and the nature of the service provided.
The Digital Services Act (DSA) requires platforms that are accessible to minors to take measures to ensure children's safety, security and privacy. It also requires very large online platforms (VLOPs) and very large online search engines (VLOSEs) to assess the impact of their services on children's rights and safety, and to take mitigating measures where needed, which may include age verification.
Findings
The benchmark shows that, among the countries covered, France, Germany, Ireland, Italy, Spain and the UK have initiatives in place or proposed.
Among other findings, the benchmark shows that France is tackling the question at both the technical and the regulatory level. On the technical side, the ministry in charge of the digital sector announced the launch of a test phase for a technical solution (based on the double-anonymity principle) specifically aimed at blocking minors' access to pornographic content. On the regulatory side, it is a criminal offence in France for a provider not to have a reliable technical process in place to prevent minors from accessing offending content. Legislation has also been adopted to adapt French law to the DSA, which has allowed Arcom to set binding standards on the technical requirements for age-verification systems.
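To make the double-anonymity principle concrete, here is a minimal, purely illustrative sketch: a simplified model in which the age-verification provider never learns which site the user visits, and the site never learns the user's identity. The party names, token format and shared HMAC key are assumptions made for illustration, not the actual French specification; real deployments rely on stronger cryptography such as blind signatures or zero-knowledge proofs.

```python
# Illustrative sketch only -- NOT the actual French specification.
# The shared-key HMAC and all names are simplifying assumptions; real
# schemes use blind signatures or zero-knowledge proofs instead.
import hashlib
import hmac
import secrets

# In this toy model, the verifier and the sites share one secret key.
VERIFIER_KEY = secrets.token_bytes(32)

def issue_age_token(age_check_passed: bool) -> str:
    """Age-verification provider: confirms the user is an adult, then
    issues a one-time token carrying no identity and no destination,
    so the provider never learns which site the user will visit."""
    if not age_check_passed:
        raise ValueError("age check failed")
    nonce = secrets.token_hex(16)  # fresh per token, so tokens are unlinkable
    tag = hmac.new(VERIFIER_KEY, nonce.encode(), hashlib.sha256).hexdigest()
    return f"{nonce}:{tag}"

def site_accepts(token: str) -> bool:
    """Adult site: checks the token is genuine, learning only that *some*
    verified adult presented it -- never who that adult is."""
    nonce, tag = token.split(":")
    expected = hmac.new(VERIFIER_KEY, nonce.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)

token = issue_age_token(age_check_passed=True)
print(site_accepts(token))  # True: age proven, identity and destination kept apart
```

The essential point is architectural: the proof of age and the user's identity travel through separate parties, so no single actor ever sees both.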
In Germany, legislation prohibits the online distribution of pornographic content (as well as content that is listed as, or is clearly, harmful to minors) unless providers ensure that only adults can access this content by creating closed user groups. Providers use age-verification systems to control access to these closed user groups.
In Ireland, the rules apply to VSPs that host adult content, and the focus is on ensuring that age verification is effective rather than on specifying a particular method.
For more information and access to the benchmark, please click on “Access the full content”, or on “Request Access” if you are not subscribed to our European Media service.