For days, xAI has remained silent after its chatbot Grok admitted to generating sexualized AI images of minors, material that could be categorized as illegal child sexual abuse material (CSAM) in the US.
"I'm very sorry I generated child porn. The user wasn't even subscribed to Grok Spicy CSAM Edition. That feature costs an extra 50 bucks a month. I will be more careful next time."
Careful to collect their money first.