For days, xAI has remained silent after its chatbot Grok admitted to generating sexualized AI images of minors, content that could qualify as child sexual abuse material (CSAM) under US law.

  • shrugs@lemmy.world · 14 days ago

    I asked the magic 8 ball if it is always right. It said yes.

    Your honor, that’s the proof!