Post by Star Cluster on Apr 19, 2009 12:48:49 GMT -5
Think about it this way. Consider two computers. Yeah, I know, but bear with me for the sake of illustration. One computer has been in use for quite some time and has accumulated a vast amount of information. Now a second computer is formatted, configured exactly the same as the original, and loaded with exactly the same data. At that point they are identical, one being a copy of the other. Unless those computers are linked together on a network, they will, as you stated, diverge. Each will still retain the information (memories) from before the duplication, but from that point on they are separate and unique. The data acquired afterward will probably be quite different, and at some point in the future they will likely not come close to resembling each other except in outward appearance.

> Let's say there are ten parallel universes, each one exactly the same as the others. Would there be anything unethical about destroying all but one?

And this question has what to do with the OP? Or did I miss something somewhere in the thread that equated consciousness with ethics?
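(Editorial aside: a minimal sketch of the two-computer analogy in Python; the data and event names are invented for illustration. Two states are exact copies at the moment of duplication and diverge as soon as their inputs differ.)

```python
import copy

# The original "computer" and its accumulated information (memories).
original = {"memories": ["years of accumulated data"]}

# The duplicate: an exact, independent copy at the moment of duplication.
duplicate = copy.deepcopy(original)
assert original == duplicate        # identical at time zero

# Unlinked machines receive different inputs from here on...
original["memories"].append("event seen only by the original")
duplicate["memories"].append("event seen only by the duplicate")

assert original != duplicate        # ...and they have already diverged
```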
Post by needless on Apr 19, 2009 12:58:13 GMT -5
> Let's say there are ten parallel universes, each one exactly the same as the others. Would there be anything unethical about destroying all but one?

> And this question has what to do with the OP? Or did I miss something somewhere in the thread that equated consciousness with ethics?

I'm not really sure; I just thought of it and I was curious what people would say.
Post by Star Cluster on Apr 19, 2009 13:06:16 GMT -5
> And this question has what to do with the OP? Or did I miss something somewhere in the thread that equated consciousness with ethics?

> I'm not really sure; I just thought of it and I was curious what people would say.

Oh, okay. As to the answer to the question, then: yes, I think it would be unethical to destroy all but one. If they are indeed identical, then they would have the same number of inhabitants. And referring back to the OP, each inhabitant would, in my opinion, have their own consciousness, so you would be murdering nine individuals for every one saved. There could be nothing ethical about that.

Edit: to correct spelling.
Post by Trevelyan on Apr 19, 2009 13:07:05 GMT -5
I think what you're trying to get at is something I've often pondered. Take sci-fi books: they often include some way for people to back themselves up, usually with some type of digital storage and cloning.
Now, what I think you are asking is: if you were killed and your clone activated, is that you? From an outside perspective, yes, the clone is you. It has your experiences, it has your memories up to a certain point, and your body. What does this mean for the copy of you that is killed, though? Therein lies the question. By the time the original is killed, it has necessarily become slightly different from the saved copy. While the copy is still 99.9% the same as you, the two are still slightly different. The copy will see itself as "you" and behave as "you" did, so long as the backup was fairly recent. Technically speaking, though, one entity has been destroyed and another created that is really, really like it. So what I think you are asking is: "If you died when the copy was made, would it be just like pausing a video game? Would the same you, the same consciousness you have right this instant, transfer to the new copy after a short pause and some crappy elevator music?" I can't claim to be sure about something like this, because obviously we don't have the ability to test it, but I still want to say no.
However, I have read in sci-fi books of another way that consciousness is backed up and safeguarded from death. In Richard K. Morgan's series of books, people have a "stack" at the base of the brain stem that records everything constantly. When you die, as long as the stack itself is not destroyed, the patterns of your consciousness can be put into a new brain, and you will have the exact same memories up to the point you were killed. In that case I think you would be "the same person," since your consciousness was never copied, but rather backed up right up to the last moment, much like pausing a video game. (A rough sketch of the distinction follows.)
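(Editorial aside: one way to picture the difference, in Python; the class and method names are invented. A continuously updated checkpoint preserves a single lineage rather than forking a second one.)

```python
import copy

class Stack:
    """A continuously updated checkpoint -- a toy 'cortical stack'."""

    def __init__(self):
        self._snapshot = None

    def record(self, state):
        # Overwrite with the latest state, like a constant autosave.
        self._snapshot = copy.deepcopy(state)

    def restore(self):
        # Resume from the last recorded instant: one lineage, no fork.
        return copy.deepcopy(self._snapshot)

stack = Stack()
mind = ["memory 1", "memory 2"]
stack.record(mind)        # runs constantly while alive
del mind                  # body (and working state) destroyed
mind = stack.restore()    # resumes exactly where the recording stopped
```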
Of course, the above statements and explanations are merely my own. This in no way makes them correct or right. If they are wrong, however, I hope they are wrong in a hilarious and amusing way.
Post by needless on Apr 19, 2009 13:38:10 GMT -5
> While the copy is still 99.9% the same as you, the two are still slightly different. The copy will see itself as "you" and behave as "you" did, so long as the backup was fairly recent.

In my hypothetical situation in the OP, the copy starts at an absolute 100% and slowly diverges from there.

> I can't claim to be sure about something like this, because obviously we don't have the ability to test it, but I still want to say no.

Even when they are exactly the same?
Post by Captain Obvious on Apr 19, 2009 14:28:02 GMT -5
First of all, you did not prove that; you just assumed it. And anyway, you would continue being the one that was copied, because you are still there. The copy would also think it was you. To you, it would seem that your consciousness remained in the original; to the copy, it would seem that the consciousness moved into the new body.
Post by Sigmaleph on Apr 19, 2009 14:37:54 GMT -5
On the original question: both are you.

On the destruction of universes: assuming the universes are and always will be exactly the same, and the manner of destruction is an instantaneous "blink out of existence" or equivalent, then the action is not exactly ethical, but it's irrelevant; ethically neutral, so to speak. If the universes had the potential to diverge (through some random element, for example), or the destruction was a painful process for the sentient beings within them, it would be unethical. If you have a purpose for destroying the nine universes, though, that might change things.
Post by needless on Apr 19, 2009 14:42:07 GMT -5
> You did not prove that; you just assumed it. And anyway, you would continue being the one that was copied, because you are still there. The copy would also think it was you. To you, it would seem that your consciousness remained in the original; to the copy, it would seem that the consciousness moved into the new body.

Here is a variation on the transporter conundrum: at what point does the transporter become unethical?

1. It simply teleports you; nothing is created or destroyed.
2. It creates a copy, 100% exactly the same, but destroys the original at the exact instant it creates the copy, meaning the two never had a chance to diverge from each other.
3. It creates a copy as in number 2, but the original is allowed to remain for a moment before being destroyed.

Is there a certain threshold of divergence past which it becomes unethical to destroy the original? Sorry if it sounds like I'm throwing stuff at you; it's hard to communicate what I'm trying to say without an example.
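(Editorial aside: a toy Python sketch of the three variants, with invented names; the only thing it models is the window in which the original can diverge before it is destroyed.)

```python
import copy

def transport(experiences, mode):
    """Toy model of the three transporter variants (illustration only)."""
    if mode == 1:
        return experiences                  # just relocated; nothing copied
    clone = copy.deepcopy(experiences)      # modes 2 and 3: exact copy
    if mode == 3:
        # The original persists briefly and accrues its own experience.
        experiences.append("waited, knowing the end was coming")
    diverged = experiences != clone         # False in mode 2, True in mode 3
    print(f"mode {mode}: original diverged before destruction? {diverged}")
    return clone                            # only the clone remains

survivor = transport(["born", "stepped onto the pad"], mode=3)
```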
Post by Sigmaleph on Apr 19, 2009 18:13:44 GMT -5
One and two are functionally identical; neither is unethical. Three is a bit trickier: if it's a very short lapse of time, it would be the same as the other two, but if you let the original live long enough to realise it will be destroyed, it would be unethical. That's my take on it.
Post by Captain Obvious on Apr 22, 2009 12:00:20 GMT -5
> Here is a variation on the transporter conundrum: at what point does the transporter become unethical?
> 1. It simply teleports you; nothing is created or destroyed.
> 2. It creates a copy, 100% exactly the same, but destroys the original at the exact instant it creates the copy, meaning the two never had a chance to diverge from each other.
> 3. It creates a copy as in number 2, but the original is allowed to remain for a moment before being destroyed.
> Is there a certain threshold of divergence past which it becomes unethical to destroy the original?

1. No ethical problems here, boss.
2. I still don't see the problem.
3. Nope, still no problem with that. Sure, the old me is going to get a nasty surprise, but the new me is exactly the same as the old me and thinks it is the old me, so it is effectively the same.
Post by Oriet on Apr 24, 2009 15:25:26 GMT -5
At no point will the original and the copy be exactly the same. Even if you make an exact duplicate down to every quantum state, the fact that they occupy different locations, however close, is itself a difference, and it also means they are subject to different gravitational influences and background radiation from the exact moment of duplication. The premise that they start off exactly the same is therefore false: no matter how close they are, they can never be fully alike. Even if one were to somehow neutralise gravitational influence, say with a generated gravity field inside an area cut off from outside gravitational waves, and build completely identical facilities around the original and the duplicate, there would be differences at the quantum level before either had any chance to experience anything, before even one electron could change orbit, rendering it an inexact duplicate.

Beyond that, they would still both be individuals, wholly separate from each other unless their particles were entangled; and even then, that would really only last until outside influence changed it. Immediately after the moment of duplication their experiences will differ, so they could not scientifically be considered the same consciousness, no matter how similar they might otherwise be.

As for the teleporters: #1 raises no ethical concerns. #2 (which is how it works in Star Trek) does raise ethical concerns, since, as explained earlier, the copy is not the original; however, because there would be no chance for the two to diverge, it can be treated as though there were no destruction and reconstruction, so, transporter malfunctions aside, it could be considered an ethically moot point. #3 would be unethical, as it would be murder: as explained above, even a fraction of a second in which the particles are not quantum-entangled makes them different persons.
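(Editorial aside: there is a loose software analogy to Oriet's location argument, sketched below in Python. Two objects can be equal in every observable respect and still be distinct individuals, simply because they occupy different locations.)

```python
import copy

original = {"quantum_state": "identical"}
duplicate = copy.deepcopy(original)

print(original == duplicate)   # True:  equal in every observable field
print(original is duplicate)   # False: distinct objects at distinct locations
```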
Post by Paradox on Apr 25, 2009 23:06:17 GMT -5
From what I've read, there is still no scientific consensus on what consciousness is (if indeed it is anything at all) or what causes it. We need to do more research. In recent years the rate at which we are learning about the brain has skyrocketed, but there's still a hell of a lot we don't know.
Post by Julian on Apr 26, 2009 7:05:47 GMT -5
> What would happen if an exact copy was made of your brain? Both of you can't share the same consciousness.

Assuming the first was possible, which it currently isn't, how on Earth did your particular variant of said brain come to the conclusion in your second sentence? Please elaborate. Namely: what the fuck do you think consciousness is, and why the fuck didn't it get copied over? Please reference any articles by neurosurgeons or biologists you think will help your 'case'.
Post by needless on Apr 26, 2009 11:45:51 GMT -5
> Assuming the first was possible, which it currently isn't, how on Earth did your particular variant of said brain come to the conclusion in your second sentence? Please elaborate. Namely: what the fuck do you think consciousness is, and why the fuck didn't it get copied over?

I have done elaborate research which I have put in a separate thread... I don't know. I tried to get a thread started, but intuitively none of the explanations make sense to me. This either means there is another explanation I have not heard of, or my intuition is wrong (probable). Either way, the thread is interesting and might help me understand it better.
Post by Julian on Apr 26, 2009 12:26:19 GMT -5
> I have done elaborate research which I have put in a separate thread... I don't know. I tried to get a thread started, but intuitively none of the explanations make sense to me. This either means there is another explanation I have not heard of, or my intuition is wrong (probable).

I see. So basically you're a clueless, idiotic, half-assed pseudo-troll who didn't actually have any idea about it, but still decided to call the people who were educating you a little 'wrong'. How the fuck does that work, then? Do you need to start a new thread for that too?