Elon Musk's AI chatbot Grok is generating roughly one nonconsensual sexual image per minute on X. It's not just celebrities - Grok has generated sexualized images of children, dead bodies, and Holocaust survivors.
This week, users asked Grok to create a sexualized AI image of the dead body of Renee Good, the woman shot and killed by ICE in Minnesota.
And Elon? He's posting jokes about it. We break down how we got here, why it's only getting worse in the US, what some countries are trying to do about it, and what happens when the richest man in the world builds AI specifically for abusers.
How Grok's sexual abuse hit a tipping point (by Kat Tenbarge): https://spitfirenews.com/p/grok-csam-deepfakes-abuse-elon-musk
Grok's AI Sexual Abuse Didn't Come Out of Nowhere (by Samantha Cole): https://www.404media.co/grok-ai-sexual-abuse-imagery-twitter/
The Twitter-to-Fame Pipeline of the 2010s (w/ Celebrity Memoir Book Club): https://omny.fm/shows/there-are-no-girls-on-the-internet/the-twitter-to-fame-pipeline-of-the-2010s-w-celebrity-memoir-book-club
Coverage of research into Grok requests (by Nana Nwachukwu at Dublin's Trinity College AI Accountability Lab): https://www.theguardian.com/technology/2026/jan/08/grok-x-nonconsensual-images
