• SavvyWolf@pawb.social
    3 days ago

    “Hate” and “love” are complex things, deeply ingrained in human nature. For an AI or robot to have them, it would essentially have to emulate or implement a human mind. That is far beyond our current technology, and arguably isn’t even a goal of most AI projects. Systems like ChatGPT are basically glorified autocomplete: given an input, they use complex maths and probabilities to predict what a human would say in response. They don’t have any understanding of what they are talking about.
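    To make the “glorified autocomplete” point concrete, here is a minimal sketch of next-word prediction using a toy bigram model over a made-up corpus. This is purely illustrative, not how ChatGPT or any real product is implemented: real LLMs operate on subword tokens with learned neural weights at vast scale, but the core task, predicting a likely continuation from statistics over text, is the same shape.

    ```python
    from collections import Counter, defaultdict

    # Toy "autocomplete": count which word follows which in a tiny
    # made-up corpus, then predict the most probable next word.
    corpus = "the cat sat on the mat the cat ate the fish".split()

    follows = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        follows[prev][nxt] += 1

    def predict_next(word):
        # Return the most frequent successor of `word` in the corpus,
        # or None if the word was never seen with a successor.
        counts = follows[word]
        return counts.most_common(1)[0][0] if counts else None

    print(predict_next("the"))  # → "cat" ("cat" follows "the" twice)
    ```

    Nothing in this model “knows” what a cat is; it only knows which words tend to follow which, which is the commenter’s point about understanding.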

    I think if an AI were able to hate or love, it would raise complicated and perhaps uncomfortable questions about what it means to be “human”. Can a system that perfectly replicates human emotions and experiences not be considered human itself?

  • threelonmusketeers@sh.itjust.works
    3 days ago

    I’m not sure we understand “hate” and “love” in biological brains well enough to replicate those emotions with transistors and be confident that we had succeeded.

  • BougieBirdie@lemmy.blahaj.zone
    3 days ago

    You can’t teach a computer to feel because computers lack the hardware for emotion.

    You might be able to emulate a feeling in a convincing way. This means that computers will always be somewhat sociopathic.

    So to your question, I think that means they can mirror their inputs or fake feelings according to their programming.

  • AwkwardLookMonkeyPuppet@lemmy.world
    3 days ago

    They mimic their inputs. Microsoft made a chatbot a few years ago named Tay, which turned into a hateful Nazi in less than 24 hours because Microsoft didn’t put any safeguards around the type of inputs it received. The program was scrapped almost immediately.