• freddydunningkruger@lemmy.world · 2 days ago

          Someone programmed/trained/created a chatbot that talked a kid into killing himself. It’s no different from a chatbot that answers questions on how to build explosive devices or make a poison.

          If that doesn’t make sense to you, you might want to question whether it’s really the chatbot that is mindless.