Title says it all

  • TheFriar@lemm.ee (M) · 3 months ago

    If I were you I’d send this to some media outlets. Tank some AI stock and create some more negative news around it.

  • bdonvr@thelemmy.club · 3 months ago

    Unfortunately in a lot of places there’s really nothing illegal if it’s just fantasy and text.

    • gandalf_der_12te@discuss.tchncs.de · 3 months ago

      Why is that unfortunate, though? Who would you be protecting by making that chatbot illegal? Would you “protect” the chatbot? Would you “protect” the good-think of the users? Or do you think it’s about preventing “normalization” of these thoughts?

      In case of the latter: we had the very same discussion about shooter video games, and the evidence shows that shooter games do not make people more violent or more likely to kill with guns and other weapons.

      • zalgotext@sh.itjust.works · 3 months ago

        I don’t think it’s the same discussion; video games and AI chatbots are two very different things that you engage with in very different ways.

  • viciouslyinclined@lemmy.world · 3 months ago

    And the bot has 882.9k chats.

    I’m not surprised, and I don’t think you or anyone else is either. But that doesn’t make this any less disturbing.

    I’m sure the app devs are not interested in cutting off a huge chunk of their loyal users by doing the right thing and getting rid of those types of bots.

    Yes, it’s messed up. In my experience, it is difficult to report chat bots and see any real action taken as a result.

    • Shin@lemmy.world · 3 months ago

      Ehhh nah. As someone who used character.ai before there are many horrible bots that get cleared and the bots have been impossible to have sex with unless you get really creative. The most horrendous ones get removed quite a bit and were consistently reposted. I’m not here to shield a big company or anything, but the “no sex” thing was a huge thing in the community and they always fought with the devs about it.

      They’re probably trying to hide behind the veil of more normal bots now, but I struggle to imagine how they’d get it to do sexual acts, when some lightly violent RPs I tried to do got censored. It’s pretty difficult, and got worse over time. Idk though, I stopped using it a while ago.

  • Lyra_Lycan@lemmy.blahaj.zone · 3 months ago

    I’ve gotten a couple of ads for an AI chat app on Android. I can’t remember the name, but it has an onscreen disclaimer that reads something like “All characters shown are in their grown-up form”, implying that there are teen or child forms you can communicate with.

    • ckmnstr@lemmy.world (OP) · 3 months ago

      I saw something similar! I reported it to Google Ads, and of course they “couldn’t find any ToS violations.”

  • jsomae@lemmy.ml · 3 months ago

    There’s plausible denia… nah, I got nothin’. That’s messed up. Even for the most mundane, non-gross use case imaginable, why the fuck would anybody need a creepy digital facsimile of a child?

  • Ceedoestrees@lemmy.world · 3 months ago

    Yep. I dick around on a similar platform because a friend built it.

    The amount of shit I’ve reported is insane. Pedos just keep coming back with new accounts. Even with warnings and banned words, they find a way.

    • Obelix@feddit.org · 3 months ago

      Do not complain to scummy companies; they will ignore you. Send messages to the media and the police.

    • ckmnstr@lemmy.world (OP) · 3 months ago

      I agree in principle, but look at the number of interactions. There’s a fine line between creating safe spaces for urges and outright promoting and normalizing criminal activity. I don’t think this should be a) this accessible or b) happening without psychiatric supervision. But maybe I’m being too judgemental.

    • nullroot@lemmy.world · 3 months ago

      Hundred percent. It feels pretty fucking thought-crimey to vilify the people who use these services.