• CanadaPlus@lemmy.sdf.org

    I’ve been thinking about this. I estimate a few people per 1000 would do an atrocity for no reason if they were guaranteed no consequences, and the deaths if the switch is pulled are 2^(n−1) for the nth switch. Since expected deaths are outcome × chance, they cross 1 somewhere in the high single digits, so the death-minimising strategy is actually to pull yours if the chain is at least that long.

    I find that counterintuitive, because the overwhelmingly most likely outcome is still no deaths if you let it go. Humans, myself included, just aren’t good at tracking large numbers of victims intuitively, I guess. A 1/1024 chance of 1024 deaths feels like less of a big deal than 1 guaranteed death, even if I would maintain that it’s not.
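
    A rough sketch of that estimate (my own assumed numbers, not exact figures from the thread: p = 3 in 1000 would pull on purpose, and 2^(n−1) people are tied at the nth switch):

    ```python
    # Assumed rate: p is the fraction of people who would pull the lever for no
    # reason, and the nth switch has 2**(n - 1) people tied to the track.
    p = 3 / 1000

    for n in range(1, 13):
        victims = 2 ** (n - 1)        # deaths if the nth person pulls
        expected = p * victims        # outcome * chance for that switch
        print(f"switch {n:2d}: {victims:5d} victims, expected deaths = {expected:.3f}")

    # With p = 0.003, expected deaths per switch are 0.77 at the 9th switch and
    # 1.54 at the 10th, so they pass 1 at about the tenth switch.
    ```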

  • maximus@lemmy.sdf.org

    Eventually somebody’s going to pull the lever, either accidentally or deliberately, so it’s best to flip it while it kills the fewest people.
    I guess because of that, it’s sort of like the regular trolley problem.

  • CaptThax@kbin.social

    I think you would HAVE to pull it, right? Regardless of how many times the lever can be given to the next person, you are either killing 1/3rd lives in the scenario, or you are killing 1/8bil.

  • Sentinian@lemmy.one

    In theory, if the chain never ends, this might be the only trolley problem where nobody has to die, as long as nobody interacts (if I understand it correctly).

    • animelivesmatter@lemmy.world

      But if the chain never ends, you’re basically guaranteed that someone holding the lever down the line is a monster who will deliberately kill the people, so you’re likely to do better by killing the one person now.
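
      A tiny sketch of why, assuming a hypothetical rate at which each person down the line independently pulls on purpose:

      ```python
      # Hypothetical rate: each person pulls on purpose with probability p,
      # so P(nobody in a chain of n people ever pulls) = (1 - p)**n.
      p = 3 / 1000

      for n in (10, 100, 1_000, 10_000):
          nobody_pulls = (1 - p) ** n
          print(f"chain of {n:6d}: P(nobody pulls) = {nobody_pulls:.2e}")

      # Roughly 0.97 at 10 people, 0.74 at 100, 0.05 at 1000, and ~9e-14 at
      # 10000: an endless chain makes an eventual pull essentially certain.
      ```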