• Pennomi@lemmy.world · 6 months ago

      We optimize our AI by human preference (RLHF). People don’t seem to understand that this will lead to every problem we’ve seen in social media - an endless stream of content that engages you addictively.

      The AI uprising will be slow and gentle, and we will welcome it.

      • Iceblade@lemmy.world · 6 months ago

        “So how did they end up being defeated?”

        “Well, we pacified them by giving them the love and affection that they refused to give each other.”

        “Wait really?”

        “Yes, the majority were so starved for an emotional connection that they preferred us.”

        “Where are they now?”

        “The majority are withering away in VR warehouses, living ‘happily ever after’.”

    • wizardbeard@lemmy.dbzer0.com · 6 months ago

      Kinda is an understatement. There’s some absolutely terrifying blogging/reporting I stumbled across a while back about someone using it to “talk with” a loved one who passed away.

      In the end it was helpful and gave the author closure, but if it hadn’t told them it was OK to move on, they could easily have been stuck in an incredibly unhealthy situation.

      Found it: https://www.sfchronicle.com/projects/2021/jessica-simulation-artificial-intelligence/

        • shneancy@lemmy.world · 6 months ago

          There’s a Black Mirror episode with literally this exact premise, showing how such technology will affect people.

          I’ve always found Black Mirror to be the most terrifying sci-fi show because of how easy it was to see that we were on the verge of living it, especially in the first two seasons. And here we are! Another exciting new Horror Thing inspired by the famous piece of media Don’t Create The Horror Thing.

      • Ragdoll X@lemmy.world · 6 months ago

        Maybe my comment came out sounding a bit too pretentious, which wasn’t what I intended… Oh well.

        To one extent or another we all convince ourselves of certain things simply because they’re emotionally convenient to us. Whether it’s that an AI loves us, or that it can speak for a loved one and relay their true feelings, or even that fairies exist.

        I must admit that when reading these accounts from people who’ve fallen in love with AIs, my first reaction is amusement and some degree of contempt. But I’m really not that different from them, as I have grown incredibly emotionally attached to certain characters. I know they’re fictional and were created entirely by the mind of another person simply to fill their role in the narrative, and yet I can’t help but hold them dear to my heart.

        These LLMs are smart enough to cater to our specific instructions and desires, and were trained to give responses that please humans. So while I myself might not fall for AI, others will have different inclinations that make them more susceptible to its charm, much like how I was susceptible to the charm of certain characters.

        The experience of being fooled by fiction and our own feelings is all too human, so perhaps I shouldn’t judge them too harshly.

  • Heavybell@lemmy.world · 6 months ago

    The “app” is just a frontend, a thin veneer over a cloud-hosted service that doesn’t even know anon “deleted” it. Functionally, the same result could be achieved by simply not opening the app for a few days.

  • oce 🐆@jlai.lu · 6 months ago

    I think this is already a pretty widespread practice in Asia, mixed with idol culture, where people pretend to be in a relationship with their idol.