Cullen International has updated its benchmark of national initiatives on age-verification systems to control or restrict minors' exposure to harmful content (such as pornographic content and gratuitous violence) on internet platforms.
Where initiatives exist (beyond the mere transposition of EU-level rules), the benchmark shows:
- their aim and scope of application;
- whether a specific type of technology is mandated or foreseen (photo identification matching, credit card checks, facial age estimation…);
- whether the system applies to service providers established in other member states;
- a description of the main aspects and rules.
Background
The Audiovisual Media Services (AVMS) Directive requires audiovisual media services and video-sharing platforms (VSPs) to take appropriate measures to protect minors from content that can impair their physical, mental or moral development. For VSPs, measures should consist of, as appropriate, setting up and operating age-verification systems or (easy-to-use) systems to rate content. It is up to the relevant regulatory authorities to assess whether measures are appropriate, considering the size of the video-sharing platform service and the nature of the service provided.
The Digital Services Act (DSA) requires platforms that are accessible to minors to take measures to ensure children's safety, security and privacy. It also requires very large online platforms (VLOPs) and very large online search engines (VLOSEs) to assess the impact of their services on children’s rights and safety, and take mitigating measures if needed, which may include age verification.
Findings
The benchmark shows that among the countries covered, France, Germany, Italy, Ireland, Spain and the UK have initiatives (in place or proposed).
Among other findings, the benchmark shows that France is tackling the question at both a technical and a regulatory level. On the technical level, the ministry in charge of the digital sector announced the launch of a test phase of a technical solution (based on the double-anonymity principle) specifically aimed at blocking minors' access to pornographic content. On the regulatory level, it is a criminal offence in France for any provider not to have a reliable technical process to prevent minors from accessing offending content. Legislation has also been adopted to adapt French law to the DSA, which has allowed Arcom to set binding standards on the technical requirements for age-verification systems.
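The double-anonymity principle mentioned above separates two roles: an age-verification provider confirms the user's age without learning which site the proof will be shown to, and the site checks the proof without learning the user's identity. A minimal toy sketch of this split is below; all names are illustrative assumptions, and a symmetric HMAC stands in for the public-key signatures a real deployment would use (this is not the actual French technical solution).

```python
import hmac
import hashlib
import json
import secrets
import time

# Key held only by the hypothetical age-verification provider.
PROVIDER_KEY = secrets.token_bytes(32)


def issue_age_token(user_is_adult: bool):
    """Provider side: verifies the user's age out of band and, if the user
    is an adult, issues a signed token. The token names neither the user
    nor the destination site, so the provider cannot tell where it is used."""
    if not user_is_adult:
        return None
    payload = {
        "claim": "over_18",
        "nonce": secrets.token_hex(16),       # prevents token reuse tracking
        "expires": int(time.time()) + 300,    # short-lived proof
    }
    msg = json.dumps(payload, sort_keys=True).encode()
    payload["sig"] = hmac.new(PROVIDER_KEY, msg, hashlib.sha256).hexdigest()
    return payload


def site_accepts(token) -> bool:
    """Site side: checks the signature and expiry. The token carries no
    identity attributes, so the site learns only 'over 18'. (A real scheme
    would verify a public-key signature, so the site never holds the
    issuing key.)"""
    if token is None:
        return False
    payload = {k: v for k, v in token.items() if k != "sig"}
    msg = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["expires"] > time.time()
```

Usage: `site_accepts(issue_age_token(True))` returns `True`, while a minor (`issue_age_token(False)`) yields no token and access is refused.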
In Germany, legislation prohibits pornographic content (as well as content that is listed as, or clearly, harmful to minors) from being distributed online unless providers ensure that only adults can access this content by creating closed user groups. Providers use age-verification systems to control access to these closed user groups.
In Ireland, the rules apply to VSPs that host adult content and the focus is on ensuring that age verification is effective rather than on specifying a particular method.
For more information and access to the benchmark, please click on “Access the full content” - or on “Request Access”, in case you are not subscribed to our European Media service.