South Korea investigates Telegram over alleged sexual deepfakes


Activists wearing eye masks hold posters reading “Repeated deepfake sex crimes, the state is an accomplice too,” during a protest against sexually abusive deepfakes in Seoul, South Korea, on Aug. 30.

Anthony Wallace/AFP via Getty Images



SEOUL, South Korea — Students of all ages, teachers, soldiers and now journalists. More and more ordinary South Korean women are finding out they are targets of a fast-growing form of digital sex abuse: deepfakes.

South Korean authorities are scrambling to respond after local media and crowdsourced efforts recently uncovered large numbers of chat rooms on the messaging app Telegram that distribute fake sexual images and videos made with artificial intelligence.

The Korean National Police, which last week announced a crackdown on sexually abusive deepfakes, said Monday that it started an investigation into Telegram over potential charges of aiding and abetting the spread of sexually explicit deepfakes on its platform.

The agency says this is the first time South Korean law enforcement has investigated the company, whose founder, Pavel Durov, was arrested and indicted in France last month over alleged illegal activity on the platform.

Telegram spokesperson Remi Vaughn told NPR that the company “has been actively removing content reported from Korea that breached its terms of service and will continue to do so.”

The Korea Communications Standards Commission, the government’s media watchdog, said that Telegram complied with its request and removed 25 deepfakes specified by the commission.

An overwhelming majority of deepfake victims in South Korea are women and teenage girls, according to journalists and activists who have monitored some of the chat rooms.

Local media reports say perpetrators grab victims’ images from social media without their knowledge or consent. Or they secretly take pictures of women around them at home or in school. They then alter the pictures using artificial intelligence and share the results on Telegram with strangers or users who know the victim.

Some of the chat rooms, which come up in searches for terms like “mutual acquaintance room” or “humiliation room,” have thousands of participants.

The number of such chat rooms operating on Telegram and the scope of their alleged abuses are unclear. Many chat rooms are closed and accessible only with an invitation link or permission from the chat room administrator, and some have reportedly shut down since activists and media started tracking them.

In a post on Telegram Thursday, the platform’s founder and CEO Durov said the company has been “committed to engaging with regulators to find the right balance” between privacy and security, while acknowledging that the platform has become “easier for criminals to abuse.”

But data from South Korean law enforcement and government agencies shows a steep increase in digital sex crimes involving fake images in the country.

The government media watchdog said it received nearly 6,500 requests to tackle sexually abusive deepfake videos between January and July of this year — four times the volume of requests from the same period last year.

According to police, 297 cases of crimes involving sexually explicit deepfakes were reported in the first seven months of this year, up from 180 in all of 2023.

Many of the victims and perpetrators are teenagers. Of the 178 suspects the police booked during the seven-month period, 74% were ages 10 to 19, up from 65% in 2021. And more than half the deepfakes traced and erased this year by the government-run Advocacy Center for Online Sexual Abuse Victims involved minors.

Perpetrators are bullying women

South Korea has long battled sex crimes including illegal filming, nonconsensual dissemination of sexually explicit images, online grooming and sexual blackmailing.

The creators behind the kind of deepfakes that are rampant on Telegram often target women they know personally, rather than random strangers, according to experts on online sex crimes.

To victims, the damage of such assaults by someone they know goes beyond violating their privacy, says Chang Dahye, a research fellow at the Korean Institute of Criminology and Justice in Seoul, who has studied online sexual assaults.

“They lose trust in their communities,” says Chang. “They fear they can no longer maintain their everyday life with the people around them. Essentially, their trust in social relationships collapses.”

What also differentiates sexually abusive deepfakes from other crimes, according to Chang, is their purpose.

Some perpetrators are motivated by money or a grudge.

But, Chang says, “for most men consuming these contents, the goal is to belittle women in general.”

She explains that deepfakes emphasize identifiable faces and are often accompanied by verbal sexual harassment.

“It’s a form of expressing misogyny and anger toward women. By mocking and belittling women, they get affirmation from each other,” Chang says.

In a joint statement last week, women’s rights groups said the “root cause” of recurring digital sexual abuses is sexism. They blamed President Yoon Suk Yeol’s government for failing to recognize that and letting the problem grow.

Yoon has said that “structural sexism no longer exists” in South Korea and pledged to abolish the country’s Ministry of Gender Equality and Family.

The minister’s position has remained vacant since February, and the ministry’s budget for preventing violence against women and aiding victims was cut significantly this year. In the recently announced budget proposal for next year, funding for the Advocacy Center for Online Sexual Abuse Victims, which deletes online sexual abuse material, decreased from the previous year, despite the center’s surging workload.

Despite those cutbacks, a multiagency government emergency task force and the governing People Power Party recently vowed to strengthen investigations and punishment for deepfake crimes and increase support for victims.

The laws pertaining to digital sex abuse have developed piecemeal as lawmakers have tried to catch up with new types of crime emerging from new technologies. According to Chang, of the Korean Institute of Criminology and Justice, that leaves a constant gap between what victims perceive as damage and what the law sees as crime.

Even when an action is prosecutable under the law, which currently criminalizes doctored or fake materials that “may cause sexual desire or shame” and are created “for the purpose of dissemination,” perpetrators often evade punishment.

The arrest rate for crimes involving fake sexual material last year was 48%, far lower than the rate for other forms of digital sexual assault, police statistics show.

And according to an analysis by South Korean broadcaster MBC, even if the perpetrators are tried in court, about half of them get suspended sentences.

Chang says the legal system still struggles to recognize digital sex abuse as a serious crime with actual victims. “In many cases, judges think the damage is not as severe as in sexual violence involving direct physical contact,” she says.

NPR correspondent Anthony Kuhn contributed reporting from Seoul, South Korea.
