Westfield High School Student Seeks Justice in Federal Court Over AI-Generated Nude Images

Westfield, New Jersey – A high school student from Westfield, New Jersey, is taking legal action against a classmate who allegedly shared AI-generated nude images of her online. The girl, identified as Jane Doe to protect her identity, has filed a lawsuit in federal court along with her parents. They claim that the boy, identified only by his initials, violated her privacy by circulating the images.

The lawsuit states that evidence gathered by Westfield High School could not be used in the investigation because the boy and other potential witnesses refused to cooperate or provide access to their electronic devices. Shane Vogt, the attorney representing Jane Doe, highlights the lasting emotional and psychological harm that nonconsensual pornography inflicts on its victims. Vogt emphasizes that victims suffer not only from the creation of such images but also from the constant fear that the images can resurface and be viewed by countless individuals.

This legal battle has gained international attention and prompted U.S. Representative Tom Kean Jr., a Republican from New Jersey, to cosponsor the Preventing Deepfakes of Intimate Images Act. The case came to light when school officials informed Jane Doe’s parents that she had been confirmed as a victim of nonconsensual pornography that was distributed among students. The suit alleges that the boy downloaded images of the girl, as well as other girls, from social media sites, used an AI app, likely ClothesOff, to manipulate them, and shared the resulting images through Snapchat.

The lawsuit contends that the images, because of Jane Doe’s age at the time they were taken, qualify as child pornography and are therefore not protected by the First Amendment. Christopher Adams, the boy’s attorney, argues that, contrary to the claims of Jane Doe’s legal team, current laws do not regulate or prohibit the creation and dissemination of “deepfake” images. However, Vogt and John Gulyas, the girl’s local attorney, assert that the images constitute illegal and actionable “morphed” pornography. They emphasize that the images depict a real person and that whether they are computer-generated is beside the point.

Jane Doe and her family are seeking compensation of $150,000 for each nude image that was posted online; the lawsuit does not specify how many images are involved. The case will likely shed light on the issue of nonconsensual pornography and the regulation of “deepfake” images. It remains to be seen how the court will interpret and address this complex and evolving area of digital privacy and personal rights.

This article was written by Mike Deak, a reporter for mycentraljersey.com covering Somerset and Hunterdon counties. For inquiries, contact [email protected].