Digital Sex Crimes In South Korea Surge With Rising AI Abuse
In 2024, over 10,000 individuals sought help from South Korea's Digital Sex Crime Victim Support Centre, marking an alarming rise in cases related to digital sexual abuse.
This represents the highest number of victims since the centre's establishment in 2018, driven largely by the misuse of artificial intelligence (AI) in creating explicit content such as deepfake pornography.
Deepfake Content Drives Sharp Increase In Cases
The most concerning trend highlighted in the report by South Korea’s Ministry of Gender Equality and Family is the dramatic rise in cases involving AI-generated deepfakes.
The number of such incidents soared by 227% from 423 cases in 2023 to 1,384 in 2024.
Most of these victims were under the age of 30, many of them teenagers or young adults.
Table showing the number of digital sex offence cases involving deepfakes and illegal filming. (Source: Kyunghyang Shinmun)
The ministry expressed concern over the increasing accessibility of AI tools capable of producing disturbing synthetic content, including deepfakes targeting minors.
A ministry official, commenting on the risks posed by the technology, said:
“Teens are especially vulnerable as they are frequent users of social media and digital platforms.”
Young Victims Make Up The Majority Of Cases
Young people, particularly teenagers and those in their twenties, have become the prime targets of these digital sex crimes.
Victims in their twenties accounted for 50.9% of cases, while those in their teens made up 27.8%.
This shift in the demographics of victims is concerning, as many of these young people experience abuse via social media platforms and messaging apps.
Authorities believe the actual number of teenage victims could be higher still, owing to significant underreporting.
Even so, the figures show a clear trend: young people are increasingly vulnerable to digital sexual exploitation.
Explosive Growth In Illicit Filming And Content Distribution
Beyond deepfakes, the issue of illicit filming and content distribution continues to escalate.
The number of illegal filming cases rose from 2,927 in 2023 to 4,182 in 2024, a concerning increase of 42.9%.
In total, more than 300,000 pieces of illegal content were removed in 2024, a 22.3% rise from the previous year.
Despite these efforts, most of the illicit websites hosting these materials are based overseas, presenting significant challenges for authorities trying to combat this issue.
Centre Provides Crucial Support To Thousands Of Victims
The Digital Sex Crime Victim Support Centre (DSC, also known as the Di-Sung Centre), a key government body dedicated to assisting victims of digital sex crimes, provided over 332,000 support services to 10,305 victims last year.
Chart giving an overview of the support provided to victims over the past seven years. (Source: Yonhap News)
The number of victims seeking help rose by 14.7% from the previous year, accompanied by a corresponding increase in the services provided.
Chart showing that the total number of victims supported increased by 14.7%: in 2023, 36.5% were continuing victims and 63.5% were new victims; in 2024, the figures were 36.9% and 63.1% respectively. (Source: Yonhap News)
Victims received support in various forms, including counselling, assistance with content removal, and referrals for legal and medical aid.
The most commonly reported issues were anxiety about the distribution of explicit material, illegal filming, and threats of distribution.
Of those who sought assistance, 72.1% were female and 27.9% were male, with women making up the large majority of victims.
78.8% of victims were aged under 30, with about half in their 20s. (Source: daum.net)
Government Plans To Strengthen Efforts Against Digital Sex Crimes
To address the growing issue of digital sex crimes, South Korean officials have outlined plans to enhance support for victims and strengthen preventive measures.
The government aims to provide targeted educational content for children and adolescents to raise awareness of the risks associated with digital platforms and AI-generated abuse.
Posters offering help for victims of digital sex crimes and information on preventing deepfake abuse are displayed outside the Korea Women’s Human Rights Advancement Center in Seoul.
Additionally, the Ministry of Gender Equality and Family has pledged to expand the budget and personnel at the DSC to ensure that victims can access support services around the clock.
Vice Minister Shin Young-sook emphasised:
"We will continually seek measures to proactively support victims of digital sexual crimes."
AI Tools Fuel Growing Threat To Vulnerable Individuals
As AI technology continues to evolve, the risks posed by synthetic media and digital abuse are likely to escalate.
With the growing availability of tools that can create explicit fake content, authorities are concerned about the impact on minors, especially those under 10 years old, who have increasingly become targets of these crimes.
The South Korean government has promised to work closely with platform operators and enhance regulations to curb the spread of harmful content.
On September 6, activists in Seoul held a rally demanding government action against illegal deepfake content, with banners that read ‘regulate internet platforms that encourage deepfake sexual crimes’. (Source: Associated Press)
Shin Bora, head of the Korea Women’s Human Rights Advancement Center, noted:
“We will work to provide faster and more effective support to victims of digital sexual crimes by ensuring the stable establishment of centralized functions and improving the quality of victim support services.”
The government’s ongoing efforts to combat digital sex crimes reflect the urgent need to protect vulnerable individuals, particularly the younger generation, from the growing threat of AI-fueled exploitation.