Reports of deepfake-related digital sex crimes in South Korea more than tripled in 2024, driven by the rapid development and increasing accessibility of artificial intelligence (AI) technology.
The Digital Sex Crime Victim Support Centre received 1,384 reports of manipulated images used in digital sex crimes last year.
This marked a sharp increase compared to the 423 reports in 2023, according to data released on Thursday by the Ministry of Gender Equality and Family and the Women’s Human Rights Institute of Korea.
Young people in their teens and twenties made up the overwhelming majority of deepfake victims, accounting for 92.6 per cent of cases.
Park Sung-hye, who leads deletion support at the advocacy centre, highlighted a disturbing trend of young people using AI-generated deepfake images for entertainment, The Korea Times reported.
“We’re seeing a growing number of reports involving elementary school students because of how easily accessible AI tools have become, even for children under the age of 10,” Park said, according to the Korea JoongAng Daily.
She added that young perpetrators sometimes used photos of people they knew, such as friends or even teachers.