UK Children’s Commissioner Calls for Ban on AI ‘Nudification’ Apps
The Children’s Commissioner, a United Kingdom regulator tasked with promoting and protecting the rights of children in the U.K., issued a report calling for the immediate ban of artificial intelligence (AI) apps that enable “deepfake sexual abuse of children.”
AI-generated “deepfakes,” synthetic images that realistically depict a real person in a fabricated scenario, have become an increasing problem in the age of AI. Many people have been victimized by AI deepfakes, including celebrities and, more concerning, children. Various deepfake scandals have rocked schools, including a situation last fall in Pennsylvania that ultimately forced a school to close temporarily.
The Children’s Commissioner’s report cites children who told regulators they are afraid of becoming deepfake victims, and some kids are deeply worried that someone, whether a classmate, friend, or total stranger, could create sexualized deepfake images of them.
“Girls have told me they now actively avoid posting images or engaging online to reduce the risk of being targeted by this technology,” says Children’s Commissioner Dame Rachel de Souza. “We cannot sit back and allow these bespoke AI apps to have such a dangerous hold over children’s lives.”
“The online world is revolutionary and quickly evolving, but there is no positive reason for these particular apps to exist. They have no place in our society. Tools using deepfake technology to create naked images of children should not be legal and I’m calling on the government to take decisive action to ban them, instead of allowing them to go unchecked with extreme real-world consequences,” de Souza continues.
Dame de Souza is calling on the U.K. government to immediately introduce a total ban on all apps that use AI to generate sexually explicit “deepfake” images of children.
While the U.K. already criminalized the creation of sexual deepfakes in 2024, and made it illegal to possess or distribute some sexually explicit deepfake material earlier this year, de Souza notes that offending “nudification” apps remain readily available on major search engines and app platforms.
“While it is illegal to create or share a sexually explicit image of a child, the technology enabling them remains legal — and it is no longer confined to corners of the dark web but now accessible through large social media platforms and search engines,” de Souza says.
The commissioner warns that the impacts of being a deepfake victim are significant, especially for children. The report claims that there is a link between deepfake abuse and suicidal ideation.
She says that urgent action is required, including banning nudification apps altogether, creating legal responsibilities for app developers, ensuring that it is much easier to remove sexually explicit deepfake images of children from the internet, and formally recognizing deepfake sexual abuse material as a form of violence against women and girls. The Children’s Commissioner wants the law to take seriously the damage that deepfakes can do to children.