• Daxtron2@startrek.website · 2 months ago

    Well yeah, if you passed a reference, then once the original is destroyed it would be null. The real trick is to make a copy and destroy the original reference at the same time; that way it never knows it wasn’t the original.
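    The reference-versus-copy distinction in that comment can be sketched in Python (a toy illustration only; the dict keys here are made up for the example):

```python
import copy

original = {"name": "me", "memories": ["first day of school"]}

# Passing a reference: both names point at the same object,
# so destroying or mutating one affects the other.
alias = original
alias["memories"].append("shared memory")
assert original["memories"][-1] == "shared memory"

# The "trick": make an independent deep copy, then drop the
# original reference. The copy has no way to tell it apart.
clone = copy.deepcopy(original)
del original
clone["memories"].append("post-upload memory")
```

After `del original`, only `clone` remains, and it is bit-for-bit indistinguishable from what the original was at copy time.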

    • dwemthy@lemdro.id · 2 months ago

      I want the Transmetropolitan approach: burning my body to create the energy to boot up the nanobot swarm that my consciousness was just uploaded to.

  • Digital Mark@lemmy.ml · 2 months ago

    It’s still a surviving working copy. “I” go away and reboot every time I fall asleep.

    • jkrtn@lemmy.ml · 2 months ago

      Why would you want a simulation version? You will get saved at “well rested.” It will be an infinite loop of being put to work for several hours and then deleted. You won’t even experience that much; your consciousness is gone.

      • Digital Mark@lemmy.ml · 2 months ago

        Joke’s on them, I’ve never been “well rested” in my life or my digital afterlife.

  • Kyrgizion@lemmy.world · 2 months ago

    I think SOMA made it pretty clear we’re never uploading jack shit, at best we’re making a copy for whom it’ll feel as if they’ve been uploaded, but the original remains behind as well.

    • Dasnap@lemmy.world · 2 months ago

      A lot of people don’t realize that a ‘cut & paste’ is actually a ‘copy & delete’.

      And guess what ‘deleting’ is in a consciousness upload?
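      The “copy & delete” point is literal for files: moving a file across devices really is a copy to the destination followed by deletion of the source. A minimal sketch (filenames here are temporary and made up for the example):

```python
import os
import tempfile

# Create a source file standing in for the "original".
src = tempfile.NamedTemporaryFile(delete=False, suffix=".txt")
src.write(b"consciousness")
src.close()

dst_path = src.name + ".moved"

# "Cut & paste" implemented the only way it can be across devices:
with open(src.name, "rb") as f_in, open(dst_path, "wb") as f_out:
    f_out.write(f_in.read())   # the "copy" step
os.remove(src.name)            # the "delete" step: the original is gone

with open(dst_path) as f:
    moved = f.read()
os.remove(dst_path)            # clean up the example file
```

The destination file never “knows” it went anywhere; from its side, it simply came into existence with the right contents while the source ceased to exist.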

  • ☆ Yσɠƚԋσʂ ☆@lemmy.ml · edited · 2 months ago

    I think that really depends on the implementation details. For example, consider a thought experiment where artificial neurons can be created that behave just the same as biological ones. Then each of your neurons is replaced by an artificial version while you are still conscious. You wouldn’t notice losing a single neuron at a time; in fact, this regularly happens already. Yet over time all your biological neurons could be replaced by artificial ones, at which point your consciousness would have migrated to a new substrate.
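    The gradual-replacement argument hinges on the whole staying the “same thing” while every part is swapped in place. A toy Python sketch of that distinction (the list stands in for the brain; this is an analogy, not a claim about neuroscience):

```python
# Five "biological neurons" in one container.
brain = ["bio"] * 5
original_id = id(brain)   # identity of the whole, not of the parts

# Replace one component at a time, in place.
for i in range(len(brain)):
    brain[i] = "artificial"

# Every part was replaced, yet it is the same object throughout --
# unlike building a fresh copy, which would have a new identity.
assert id(brain) == original_id
assert all(n == "artificial" for n in brain)
```

Contrast with `brain = ["artificial"] * 5`, which produces a new object: that is the “copy” scenario from the earlier comments, whereas in-place replacement is the substrate migration described here.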

    Alternatively, what if one of your hemispheres was replaced by an artificial one? What if an artificial hemisphere was added into the mix in addition to the two you have? What if a dozen artificial hemispheres were added, or a thousand? Would the two original biological ones still be the most relevant parts of you?