Like any other day, middle schooler Kim logs onto Instagram. As she opens the app, she receives an explicit video with her face plastered onto it. With it comes an ominous message: “Send me more photos, or I will publish this on the Internet for your friends and family to see.” Stricken with shame and with nobody to lean on, Kim has no choice but to comply with the criminal’s demands.
Deepfakes — AI-generated images and videos that graft real people’s faces onto sexually explicit content — have recently plunged Korea into a national crisis, with over 2,000 victims affected by the deceptive technology.
Such illicit videos originally victimized celebrities. Hyunsang Lee from the Big Wave AI Data Analysis Team said, “With the popularity of K-pop culture, young stars and celebrities have been subject to deepfakes. However, with the further proliferation of social media, the target range has expanded to the general public.”
However, as more people flock to this technology, these crimes have begun to creep into the lives of unprotected adolescents, especially young girls. As of Aug. 26, deepfake crimes have affected over 200 schools nationwide. Criminals capture photos from social media accounts and overlay them onto mature content. They then publicly distribute the content on platforms such as Telegram.
Although the public education system and the international school sphere in Korea share few commonalities, deepfake crimes have reached this demographic as well. A student at an international school in Korea photoshopped classmates’ faces onto adult material and shared the images with friends.
The widespread nature of these crimes leaves many young women in fear. “I kept reloading my account to see if I was on the list because I noticed the middle school I attended was included on the list… I was afraid of being involved in that situation, I was afraid of being a victim of the crime,” sophomore Lucy Kim said. “The thought of my face displayed in deep-faked videos and seen by multiple people was terrifying.”
Isabel Bae in seventh grade said, “It’s really easy to create deepfake content. I would be really scared. I was scared. After hearing about the incident I sure did make my account private and took down a lot of my posts.”
Telegram’s accessibility and anonymity provide a haven for perpetrators to distribute and share illicit content. Kim said, “[T]he problem not only arises from mere privacy invasion but it being used in explicit videos without consent.”
“Such crimes occur in the virtual world where transactions between criminal entities are private,” Lee said. “This makes it extremely difficult to monitor and prevent… As the quality of deepfake videos improves — which it will — detection becomes increasingly difficult.”
Cultural factors amplify the burden victims face as well. Korea’s Confucian norms discourage discussion of “scandalous” topics such as sexual assault, even among family members. Consequently, victims feel as if they are the ones to blame and hesitate to reach out; over 80% of cases go unreported.
Kim said, “Although I’m close with my family, I would be reluctant to tell my parents… I’m not sure why. Perhaps I was worried about being misjudged and feeling ashamed.”
The taboo around such topics causes superficial, outdated sex education and silences calls for precautionary measures. “I wasn’t told anything about it at school,” Bae said. “But I do think that it should be more talked about because most students are active on social media [and] not a lot of teachers know about it.”
Kim critiques the school’s daily advisory sessions that aim for social-emotional learning and digital citizenship: “I believe [advisory periods] need to focus on subjects and events closely related to what’s happening in the real world. I don’t really get why this crime hasn’t been officially mentioned.”
“Solutions that can be implemented now are limited, but they still exist. Recognition and filtering systems that detect artificially made content, automatic watermarks or differentiation codes that are detected to be made from deepfakes, and the fortified protection of users are some measures that can be taken,” Lee said.
While politicians have begun legislative efforts, users can protect themselves by restricting who can view their social media accounts, regularly monitoring for impersonations, and applying digital watermarks to their photos.
But more importantly, schools must actively fight against the cultural taboo and educate students about these crimes to foster a supportive environment where victims can reach out for help. In the absence of strict regulation laws, vigilance and education remain the best defense.