The hashtag #ProtectTaylorSwift started trending on X on Thursday after AI-generated nude images of the star flooded the internet. First uploaded to Telegram, the images were quickly reposted across social media and viewed millions of times; on some platforms they have yet to be deleted. In response, X appeared to disable searches for Swift’s name days after the images surfaced, but from a company that has cut one-third of its content moderators over the last two years, the effort is too little, too late. The block is easy for users to circumvent, and it is unclear whether it would be replicated for non-celebrities. And while unsearchability is an important option for victims, making people unsearchable for indefinite periods without their consent could contribute to the broader silencing effect that online harassment already has on them.
The prospect that anyone can be “virtually stripped” in seconds is frightening, but there is a lot we can do to prevent it. We can start by addressing the truth that lies beneath the technology: we don’t care enough about people’s consent, and women’s consent in particular.
There’s a name for creating and distributing fake sexual images of people without their consent: “image-based sexual abuse.” It predominantly targets women and girls, and according to independent researcher Genevieve Oh, it is increasing exponentially. Last year, the number of fake nudes circulating online increased tenfold and became even more realistic, thanks to new artificial intelligence models trained on images of women scraped from the internet without their consent. When the fake sexually explicit images of Swift began circulating online, it was no surprise to activists who have been sounding the alarm about so-called “deepfake porn”…