If you Google my name, the phrase “revenge porn” is bound to pop up. Go ahead and do it, I’ll wait. There’s a good reason for this association: I led a protest march across the Brooklyn Bridge called “March Against Revenge Porn,” and just a few years ago, I spoke at a TEDx event about my journey from victim to activist.
But today, I’d never use those words to describe my experience. I’m not a victim of “revenge porn” — I’m the victim of child sexual abuse material, or CSAM, and image-based sexual violence, or IBSV.
And these distinctions matter. Using the correct terms is crucial to raising awareness of a problem that is still traumatizing thousands, and to getting lawmakers and tech companies to take action. Pornography is produced by consenting adults and has nothing in common with CSAM, which depicts the sexualization, rape, and assault of children and teenagers.
As I’ve come to understand how to accurately categorize my abuse, I’ve also learned more about the broader landscape of CSAM. It’s not an exaggeration to describe this as an epidemic of harm that is shattering childhoods worldwide. In 2022, the National Center for Missing and Exploited Children’s CyberTipline received 32 million reports of CSAM. The vast majority of those reports were submitted by Google, WhatsApp, Facebook, Instagram and Omegle. (Omegle shut down in 2023.) In 2023, NCMEC says it received a record 36 million reports.
Missing from this list is Apple, with its almost entirely unregulated iCloud.
In 2022, while other large tech companies reported and worked to remove millions of pieces of CSAM, Apple reported just 234 pieces of content. That’s because, unlike many of its competitors, the company refuses to voluntarily monitor for and flag such content when it is uploaded. Reporting indicates that Apple’s disclosures are merely the tip of an iceberg of…