A smartphone app that automatically deletes children's nude selfies from their phones has been developed in Japan in an effort to prevent sexual exploitation.

The app, which uses artificial intelligence, was created in response to a sharp increase in cases of adults contacting children through social media and soliciting nude photos of them.

It was born of a collaboration between Tokyo-based app developer Smartbooks Inc., Fujita Health University in Aichi Prefecture, and the Nakamura police station in Nagoya, the prefectural capital.

Not only does the app delete nude photos that it deems sexually exploitative, but it also sends warning messages to the children's guardians.

Photo: Fujita Health University students learn about the dangers of sexual abuse through social media at a workshop in Toyoake, Aichi Prefecture, on June 30, 2022. (Kyodo)

The app is expected to be ready for public use by the end of the year.

The question now for the app's collaborators is how to encourage children to install it on their smartphones.

At a workshop at Fujita Health University, around 70 students learned about the dangers of sexual abuse through social media, then considered how children could be encouraged to install the app on their phones.

Suggestions ranged from having it preinstalled on smartphones to offering student discounts to those who download it.

Naoto Tomita, 24, co-founder of Smartbooks, said his goal is for every child to download the app to protect themselves, a goal shared by the Nakamura police.

Nobuhiro Suzuki, a deputy manager at the police department's community safety division, said, "The relationship between children and social media is extremely entwined. We would like to use this app to prevent sexual abuse."