France is one of five EU countries to test age-verification app for sensitive online content
The European Commission seeks to protect children from risks such as grooming, harmful content, addictive behaviours, and cyberbullying
Users will soon be able to prove their age when accessing restricted online content
Five European countries, including France, will be testing “a prototype of an age-verification app” to protect minors from sensitive online content, the European Commission announced.
“Making sure our children and young people are safe online is of paramount importance... Platforms have no excuse to be continuing practices that put children at risk,” said Henna Virkkunen, European Commission Executive Vice-President for Tech Sovereignty, Security and Democracy, in a press release.
The new app, presented on July 14, will allow users to easily prove they are over 18 years old when accessing restricted adult content online, such as pornography, without revealing their identity or any other personal details.
Brussels states the tool is “based on open-source technology and designed to be robust, user-friendly, and privacy-preserving.”
Denmark, Greece, Italy and Spain will join France in testing the mobile application, which could be integrated into an existing national app or released as a stand-alone tool based on the EU prototype. Each member state will set its own rules and age restrictions.
“In France, children under the age of 15 will no longer be able to create an account on a social network that is inappropriate for their age,” wrote Clara Chappaz, France’s Minister for Artificial Intelligence and Digital Technology, on social media.
“We will be doing our utmost to translate these developments into national law from the start of the new academic year,” she added.
The announcement comes after French President Emmanuel Macron expressed his desire to ban social media for under-15s in France, with or without the EU.
Mr Macron took to social media following yesterday’s news, describing it as “a victory for the protection of our children.”
The European Commission suggests this age-verification system could eventually be extended and adapted to different age-restricted situations, such as purchasing alcohol.
Guidelines to protect minors
A series of guidelines has been published alongside the app's development to encourage online platforms to prioritise young people's safety.
“The guidelines set out a non-exhaustive list of proportionate and appropriate measures to protect children from online risks such as grooming, harmful content, problematic and addictive behaviours, as well as cyberbullying and harmful commercial practices,” the European Commission states.
The new recommendations are set to apply to all online platforms accessible to minors, with the exception of micro and small enterprises.
Guidelines include:
Setting minors' accounts to private by default to protect their personal information from strangers.
Modifying the platforms’ algorithms and content recommendation systems to lower the risk of children encountering harmful content.
Enabling children to block and mute any user and ensuring they cannot be added to groups without their explicit consent.
Preventing other users from downloading or taking screenshots of content posted by minors.
Disabling by default features that contribute to excessive use, such as communication streaks (which reward consistent daily use), ephemeral content, read receipts (which show when a user has read a message) and autoplay, and putting safeguards around AI chatbots integrated into online platforms.
Ensuring that children’s lack of commercial literacy is not exploited and that they are not exposed to commercial practices that may be manipulative.
Introducing measures to improve moderation and reporting tools.
Brussels recommends applying these guidelines in tandem with age-verification methods.