The sexual misuse of generative artificial intelligence (AI) and the spread of child sexual abuse material (CSAM) stand as harrowing reminders of the dark underbelly of technological advancement and social media expansion. Even as sexual exploitation proliferates on social media platforms such as Meta, TikTok, X, Discord and Snap, Big Tech attempts to evade accountability under the claimed protection of 47 U.S.C. Section 230. The time for complacency has passed; urgent action is imperative to curb this epidemic. We cannot afford to let Big Tech promote, advertise, publish, create, encourage, profit from, and perpetuate the online exploitation of children. With April being Sexual Assault Awareness Month, now is the time to confront this ongoing disaster.

According to the U.S. National Center for Missing and Exploited Children (NCMEC), CSAM refers to the sexual abuse and exploitation of children captured in images and videos. CSAM is a particularly heinous and egregious form of child pornography because the child victims are re-victimized each time an image of their sexual abuse is viewed. This harm affects both young girls and young boys. According to NCMEC, “Prepubescent children are at greatest risk to be depicted in CSAM. When boys are victimized, they are much more likely than girls to be subjected to very explicit or egregious abuse. On average boys depicted on CSAM are younger than girls and more likely to have not yet reached puberty. 78% of reports regarding online enticement involved girls and 15% involved boys.”