astercontrol:

phaeton-flier:

lilietsblog:

argumate:

centrally-unplanned:

phaeton-flier:

phaeton-flier:

This isn’t the same as a rigorous philosophical argument, but my suspicion about the cloning/teleporter problem is that if such a device were created, everyone would drop their issues within 5 minutes. I think you see your friend for lunch a week after they casually travel to Mars, they’re the same, and you stop caring regardless of whether they’re the same de re. They certainly seem to think they’re the same!

Eventually the desire to go to a destination wedding outweighs whatever abstract fear you have about this being some abstract death; you walk through, and you get told that, due to an accident, you were actually created 10 seconds ago with a collection of memories no one ever experienced. You are a traveler from nowhere whose most cherished memories and friends never existed.

Ma'am- Ma'am, there is no wedding. Those people don’t exist. I’m sorry but I cannot call them because they never existed in the first place. Your memories just happen to cohere with themselves but not with reality.

Normally we’d try and call a friend or emergency contact but that’s not really an option for you. There’s a hotel we can book you at for a bit.

This is typically the way of things for almost all technology, but I do actually think this one will give people pause. For everything else it is like “oh, is your sexy AI companion ~real~?” and it is like bro, who cares, that doesn’t actually affect me in the day-to-day, and the ways that it does I will deal with in a granular, context-abundant fashion. For teleporters the challenge is “This device will fucking murder you. It will shoot a bullet into your skull and you will be dead forever.” You can’t ease your way into that, and people really care about that specific outcome. And all the ways that the technology “mitigates” the problem by showing you, after the fact, happy and healthy are in fact exactly part of the defined problem space: if that happy, healthy person isn’t you, you don’t care in the slightest.

That isn’t to say that it won’t have uptake; I just think this tech will in fact be something like a religious split, with large swaths of the populace unwilling to use teleporters. Based on how “necessary” they become for things like high-status career tracks, etc., you will see more or less uptake, and I am open to a new generation of people who “grew up with them” and used them as babies eventually leading to mass adoption. But it will be a generational process compared to, say, cellphone adoption or w/e.

(though this kind of teleporter is so high-tech it might be impossible to even consider what a society that has it really looks like; this is a “randomly shows up tomorrow in our world” kind of deal)

Ironically, that Christopher Nolan movie (spoilers) had the best representation of repeated teleportation: the protagonist duplicates himself dozens of times and kills the clone each time, so as a result he “remembers” surviving teleportation dozens of times, implying that each clone is increasingly shocked to find himself dying.

I maintain that “what will society with teleporters be like” is an extremely silly question, because by the time the science can do that, the science will be extremely aware of whether or not the teleporter murders you. Like, it won’t be philosophical. They’ll know how teleportation works.

No? Knowing how it works, in terms of physics, is not the same as knowing that it doesn’t kill you, philosophically.

Yeah, it’s the philosophical question of, “if my current body is destroyed and replaced with an exact copy of me as I was just before death, including all memories duplicated exactly, does that count as death in any way that matters to me?”

And even knowing all the science, different people will have different opinions of that.

An additional thought experiment that people might use to figure out their own opinion is: “What if my original body didn’t die right away, but coexisted with the copy for a while, and THEN the first one died and the other lived on?”

This is different emotionally from the experience of just vanishing and popping back into existence “as” the copy, because, as long as it’s painless, that feels indistinguishable from simply continuing as the same person. But if you persist for some time after the copy is made, you then have to face something that really feels like your own death.

Personally, I think that if I were the original in that scenario, I would feel less okay about dying. But I would be able to reconcile myself to it, with the argument that it’s functionally the same as just going on existing but losing a short time’s worth of memories, to be replaced with the same span of memories from a different viewpoint. (Altering my memories is an idea I really don’t like, but it doesn’t scare me as much as death, assuming it’s not enough to alter my whole identity.)

…and, as for the fear of the scenario OP suggested, where the process spontaneously creates a person who never existed before, with memories of things that never happened? That’s fucking terrifying.

No one you know was ever real. You have nobody. What would your life even be from then on? What could you even do, knowing that any skills and knowledge you had might be complete nonsense, inapplicable to the real world?

…But this would not be avoidable on an individual level just by choosing not to teleport. Even if the person in the example “had chosen not to” take the teleporter to the wedding, that would just be a slight difference in a set of spontaneously hallucinated memories. It wouldn’t affect the fact that the person was generated with that set of memories, or how thoroughly fucked they are now.