Student Misuse of AI-Powered “Undress” Apps
HAI Seminar with Riana Pfefferkorn
About this Event
Visit our website to learn more about the event agenda, speakers, and other details.
AI-generated child sexual abuse material (AI CSAM) carries unique harms. When generated from a photo of a clothed child, it can damage that child’s reputation and cause serious distress. AI CSAM has become easier to create with the proliferation of generative AI programs, commonly called “nudify,” “undress,” or “face-swapping” apps, that are purpose-built to let unskilled users make pornographic images. Since 2023, multiple schools in the U.S. and elsewhere have experienced incidents in which male students victimized their female peers using these apps.
In our paper, “AI-Generated Child Sexual Abuse Material: Insights from Educators, Platforms, Law Enforcement, Legislators, and Victims,” we assess how educators, platforms, law enforcement, state legislators, and AI CSAM victims are thinking about and responding to AI CSAM. Drawing on 52 interviews conducted between June 2024 and May 2025, along with a review of documents from four public school districts and of state legislation, we find that the prevalence of AI CSAM in schools remains unclear but does not yet appear overwhelmingly high. Schools thus still have a chance to proactively prepare their AI CSAM prevention and response strategies.
Details:
Time: 12:00 pm - 1:15 pm PT
Location: Gates Computer Science Building, Room 119, 353 Jane Stanford Way, Stanford, CA 94305.
| Ticket Information | Ticket Price |
|---|---|
| In-Person Ticket | Free |
Get Tickets