cally [he/they]

what are you doing in my lemmy profile

  • 5 Posts
  • 468 Comments
Joined 2 years ago
Cake day: September 14th, 2023



  • How would that work? What if I gain access to the AI and use it to predict my own choices? Would the AI be able to predict that I am using it, and still come to a conclusion even though its conclusions would change my behavior?

    Let’s say the AI says that I’ll do thing A; I see that and choose to do thing B instead, so the AI is wrong.

    But if the AI had predicted thing B, I, the smartass, would’ve chosen to do the opposite, thing A, so the AI is still wrong.

    How intelligent would it need to be to realize that my behavior depends on its output, and that it could control me with its predictions? Maybe the AI predicts that I’ll use it, so it deliberately shifts its predictions in a way that makes me act in its favor somehow…

    Is there a name for this kind of paradox? Can a machine predict itself?

    This is the issue I have with machines that predict the universe: if the machine itself influences the universe (even in a relatively small way), it would have to include a copy of itself in its simulation, and that simulated machine would also have to predict itself, and so on. This seems like it’d require infinite computing power. So by extension, if the super-intelligence wants to predict my actions while I have access to it, the machine would need to predict itself.
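
    A tiny toy sketch of that loop (just an illustration in Python; all the names are made up, nothing here is a real system): any predictor whose answer I can read before choosing is guaranteed to be wrong, and a predictor that tries to win by simulating me consulting it never bottoms out.

    ```python
    # Hypothetical illustration of the "do the opposite" paradox above.

    def contrarian(prediction: str) -> str:
        """My strategy: whatever the AI says I'll do, do the other thing."""
        return "B" if prediction == "A" else "A"

    def check(predictor) -> bool:
        """True if the predictor correctly predicted my choice."""
        prediction = predictor()
        return prediction == contrarian(prediction)

    print(check(lambda: "A"))  # False: it said A, so I did B
    print(check(lambda: "B"))  # False: it said B, so I did A

    def simulating_predictor(depth: int = 0) -> str:
        """A predictor that tries to be right by simulating me -- but since
        I consult it before acting, simulating me means simulating itself,
        which means simulating itself simulating itself, and so on."""
        if depth > 10:  # stand-in for needing infinite computing power
            raise RecursionError("the simulation never bottoms out")
        return contrarian(simulating_predictor(depth + 1))

    try:
        simulating_predictor()
    except RecursionError as e:
        print(e)  # the infinite regress from the comment
    ```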



  • The only decent use of LLMs I have seen is a bot on Mastodon that automatically generates alt-text for the posts of users who opt in, and shows how much power was used to generate the text.

    Other than that use case, I hate LLMs. They have never been useful for me, and they just seem like overkill for most things; a search engine would do fine, you just have to use your brain a little bit.

    Image generation also sucks. Stock images exist, and you can just draw, take a photo, hire someone, etc. AI makes people less willing to learn because they think AI can just do it better, and it makes them miss out on the joy of art.

    AI’s job is to lie about being human. I fundamentally disagree with that and think its existence is unacceptable. If AI-generated output were somehow always tagged as AI-generated, then maybe I would change my mind about this, but that is completely unenforceable because AI copies and mimics human output.