43 comments:
For one, it is impossible for science to distinguish between origin and expression. A perfect copy of the physical structure may only produce a mimic, an imperfect image of the original.
Not a worry.
It is not my mind that I wish to 'upload' into the MileyCyrus3000.
I am Laslo.
n.n said...
For one, it is impossible for science to distinguish between origin and expression.
Science is awful, but can you do it with magic and ghosts?
A perfect copy of the physical structure may only produce a mimic, an imperfect image of the original.
So, a perfect copy is imperfect - whatever that means.
Mind is a spiritual concept. Brain is a physical tissue concept. Computer is a digital signal concept.
Althouse Blog is a salon where the three overlap and cross transfer in a moment that is eternal, except for certain bad comments.
I'd worry about going nuts from lack of sensory input or developing a terrible itch in a ghost appendage.
Read Scalzi's Old Man's War. In Sci-Fi, anything is possible.
That's what backups are for.
Wouldn't fit, anyways. Then I'd have to decide what to save and what to throw out. I'm not good at that.
‘Prediction is very difficult, especially about the future.’
~ Niels Bohr
"He is not the original Haruki."
The android looked at me. "Everything is data. Two copies of the same data are indistinguishable. They are the same."
"Uh, no, you don't get it. Haruki died. It's not the same guy. He's still dead. Your guy is a clone. We humans have this concept called a soul."
Yuki said nothing, so I went on.
"Uhm, let me try to explain this. Ok, here's an analogy. Let's pretend that you have a mind transfer machine. You know, a machine like that one in the movie Young Frankenstein. Say that you zapped my mind into Frankenstein, and you swapped his mind into mine."
She didn't react so I kept going.
"Ok, now you pick up a gun. You shoot me, my body I mean. I'm dead. So here's the question: Who is left alive now?"
"You."
"No! I died. My body died. That's just a copy, a replica."
"No. It is the same."
"No it is not! I died. Someone else is taking my place. If I was in heaven I'd probably be pretty pissed off about it."
"I do not understand."
Kyonko finally spoke up in my defense. "You're saying it is not the same 'you'. It's a different person."
"Yeah. Thanks for finally helping me out here."
"I'm sorry, Kyon. I want to help you, but I'm not saying that I agree with you. In fact I think I don't."
"Aw c'mon!" I pounded my chest. "This is me. This stuff right here. These atoms that make up my body, my brain, are me. If you shoot my brain I'm dead. If you can somehow magically recreate a new brain, and give it the same neurons, the same interconnections, it's just a clone. It looks like me, talks like me, walks like me, quacks like me, but it is not me. It is a different duck!"
Yuki was being patient with me. "That is not logical. Your 'clone', as you call it, is made of exactly the same data. Therefore it is the same. Two copies of the same data are indistinguishable."
Kurosawa leaned into Kyonko and whispered, "He's a materialist."
I overheard him. "Dang right I am! If you shoot me in the head, I'm dead. Forever. My soul, or whatever you want to call it, is kaput."
I was getting worked up. "Look, I don't believe in heaven. I only believe in what I can see. And even if I did believe in it, I am pretty sure that I would be pretty pissed off when I looked down from my fluffy white cloud. I mean, down there is some other guy, some imposter jerk, down there pretending to be me, taking my place!"
Kyonko leaned in my direction. "Kyon, I'm sorry, but your view is simplistic. If you try to base your identity, your 'you', solely on your physical body, it won't work."
I banged my chest again. "Yes it does! This is my body, my stuff, this is me!"
"You really think so?"
"Well, yeah!" Then I crossed my arms. "I'm not changing my mind on this."
Yuki explained, "Kyon, every atom, every molecule in your physical brain, is replaced approximately every three years through metabolism and waste elimination."
"What?"
"Yes. It is true."
"Huh?"
Kurosawa giggled. "She means your brain gets flushed down the toilet every three years."
"You're kidding."
Yuki continued to blast away my argument. "Yes. It's replaced. Completely. The physical 'stuff' of your brain, as you call it, is gone. So where is the 'you' now? Where is your 'soul', so to speak?"
I admitted I didn't know.
"It is data combined with an actualizer. It is your DNA coding sequence combined with the enumeration of your neural connections between your axons and synapses. Then it is actualized. All within the system. You understand?"
I didn't.
A lack of imagination is a terrible thing.
To his credit, Kurosawa tried to help me out. "Let me try to explain what Yuki is saying." He used an analogy. "You remember Star Trek, the transporter on that TV show?"
"Uh, yeah.."
"You remember Leonard 'Bones' McCoy, the ship's doctor? He always hated the transporter. He called it 'that infernal contraption that's always flinging my atoms all over the place'. The transporter converted matter to energy, transmitted it, and then converted the energy back into matter."
"Yeah. He really hated that thing."
"Right. Well, you are like McCoy. The transporter is actually just transmitting data, along with the raw energy to reassemble it back into matter. The atoms in the source pad were destroyed and turned into pure raw energy. It's Einstein's basic equation, E equals m c squared. Then on the planet the transporter ran the equation backwards to create different atoms, new atoms, in the same configuration using that raw energy. But the energy itself is always fluid, undifferentiated."
"So you are saying they weren't the same atoms?"
"That's right. So in McCoy's view the transporter killed him. Then it created a clone of him on the planet surface. That's your view too, Kyon."
"I guess so.."
Kyonko chimed in. "Don't you see? Yuki is saying that it doesn't matter. Your identity is tied to your data, not to the particular atoms in your body. In her view you are just transported. It is still the same 'you'."
"Wait. So, let's say you cloned me somehow, and you copied my mind too. You copied it exactly. Now tell me, what happens? Are there two of 'me' now? I mean, two bodies with the same soul?"
"Yes Kyon. There would now be two of 'you'."
This was nuts.
She went on. "But as soon as that happens they diverge. Each clone has a different experience. The neural configuration changes. They quickly become two separate people."
"You mean their, uh, souls, split?"
"That is semantics. But I believe that the answer would be yes."
Wow. I need to tell McCoy about that. Version 342 that is.
I remembered an episode of Star Trek where the transporter malfunctioned. It beamed up two copies of Kirk, the good Kirk and the evil Kirk. So they were in fact two entirely separate people. They both had unique identities. Unique 'souls' in my point of view. And when Scotty fixed the transporter to re-merge the two halves, the normal Kirk popped out. It was the same soul as the original.
I understood. The soul and the body were separate. They weren't the same. I remembered that French philosopher, that René Descartes guy. The guy who said 'I think, therefore I am.' He believed in, what was it, oh yeah, Cartesian Dualism. That was it. Dualism. The soul and the body are separate entities.
So your body could die but your soul could remain.
Hmm.. so in a sense maybe Haruki really did go to Heaven after all? I mean, if his data was reconstituted into a new body. Maybe at that higher level. Or maybe his Creator would then inject him into a new reality down at our level. Let's call that Heaven, or Paradise, or Elysium, or the Happy Hunting Grounds, or whatever.
I wish my Creator would give me a new body. My back always hurts.
"Ok guys.. I'm starting to track you. So the 'backup copy' is just as real as the original."
I waited for Yuki to say "Yes".
But she didn't.
It will never be possible etc.
How do you know?
A rehash of the ever unsettled mind/body problem used to excite freshmen into thinking philosophy is interesting.
They'll perfect the technology two weeks after my death.
Never say never. Stick around long enough as a species and anything is possible. It's all a matter of having enough addressable storage.
There is one solution to the Fermi paradox that posits an inevitable transition from wet intelligence to dry intelligence, wet being biological. The speculation is that once dry minds replace wet ones the host civilization goes dark as far as radio emissions are concerned, that dry minds are incurious and non-aggressive. Since a computer's personality, its "soul" if you will, can be saved indefinitely a dry mind civilization will not have the incentive to expand that any civilization of mortal wet minds must have as a consequence of thermodynamics if not political philosophy.
This seems plausible to me because at this instant I am seated before a machine that by the standards of 1980 must qualify as a super computer, and even though it has more RAM than a gerbil's brain has neurons, it is noticeably less curious about the world than my tropical fish.
"Never" is a very long time. "Computer" is a term that has changed in meaning very dramatically during my and my father's lifetimes. "Mind" is hard to define, and my phone and web browser seem to know a lot more about me already than most of the women I dated for less than a month.
There is also a hypothesis, speculation really, about intelligence, generally called emergent complexity, which holds that once a brain reaches a critical level of complexity of connectedness, i.e. any given neuron, or functional equivalent, having direct synaptic communication with 2^n other neurons and indirect communication with 2^n+n other neurons, n being some unspecified non-zero positive integer, then what we recognize as intelligence spontaneously arises, which also seems plausible to me given the paucity of other testable ideas. If emergent complexity has any validity then we should see a genuine machine intelligence any time with the next five years.
typo alert: within the next five years
I remember reading a story about a doctor who refused to enter a transporter because he didn't want to die. Finally he was prevailed upon because his skills were needed and the fate of some incredibly important thing (I forget what) depended on it, so he went in as his duty. Before he went in, though, he wrote a note and put it in his pocket: "Remember me to her," it said.
I always figure that his was the correct way to look at the matter.
It's time to watch "The Fly" again. The movie ends before we know which creature has the soul in it... or can a soul become schizophrenic? If so, how many people/insects can it be spread among?
Beldar wrote: "Never" is a very long time.
And "nowhere" is a very tiny place. If there is only one wet intelligence per galaxy as technologically capable as our own then that means 100 billion such intelligences at a minimum. That's 100 billion chances for an "uploaded" dry intelligence to emerge somewhere in the Universe, and once such an intelligence emerges it will only be a matter of time before it dominates.
Re: The Fly
What about teleportation? A teleportation machine must totally annihilate the body of whomever is to be teleported, else the teleportation machine is really a human duplicator rather than a means of travel. If an "uploading" is impossible, then teleportation is also impossible.
But will it be possible to upload a computer into your mind?
Clarke's first law:
When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
Seems like a better option would be the fax machine paradigm, where the original stays and a perfect copy is sent on a mission to someplace the original couldn't go anyway. And the original would never be needed there again as long as the copy lived. If the copy dies and is needed again, send a new one. If the original dies, send a copy. After all, an engineer like Scotty doesn't come along every day.
Take a clone, transfer a perfect copy of the current state of your brain into it and then, post transfer, with full awareness, be killed. Who survived?
What if the post-transfer death was contemporaneous with the transfer? What if you the original were drugged so that between the transfer and death no spark of awareness existed?
You can't make a copy of entanglements.
It will never be possible to upload your mind — your consciousness — into a computer.
First they would have to figure out what consciousness and mind are. Not an easy task considering that many centuries of philosophizing have produced no provable answers.
BTW, if anyone is new to an interest in philosophy my recommendation is to start with “The Story of Philosophy,” by Will Durant. It is written in an easy to read style but without condescension.
It will take a lot of computing and storage power, but in the final analysis, human beings are still finite state machines. So you need to emulate a person's entire collection of quantum eigenstates. You can estimate that number using the Bekenstein bound. It's big in human terms, but still just a single exponential. Single exponentials don't scare computer scientists—they're our bread and butter. So this argument is flawed, unless your hypothesis is "a human being is not exhaustively described by their quantum eigenstates," in which case you aren't mounting a scientific argument (not that there's anything wrong with that; it's just important to be clear about our metaphysics).
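The Bekenstein-bound estimate mentioned above can be sketched in a few lines. The bound caps the information content of a region at I ≤ 2πcmR/(ħ ln 2) bits; the brain mass (~1.5 kg) and radius (~6.7 cm, treating it as a sphere) used here are illustrative assumptions, not measurements.

```python
import math

# Physical constants (CODATA values)
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def bekenstein_bits(mass_kg: float, radius_m: float) -> float:
    """Upper bound on information content: I <= 2*pi*c*m*R / (hbar * ln 2) bits."""
    return 2 * math.pi * C * mass_kg * radius_m / (HBAR * math.log(2))

# Illustrative assumption: a ~1.5 kg brain modeled as a sphere of radius ~6.7 cm
bits = bekenstein_bits(1.5, 0.067)
print(f"~{bits:.1e} bits")  # on the order of 10^42 bits
```

Around 10^42 bits is enormous, but as the comment says, it is a single exponential: the state count is 2^(10^42), not some tower of exponentials.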
It certainly will never happen with our current concept and design of computers. Even neural nets don't mimic in the smallest way how the human brain actually works.
But even if we had the right computer, how would you go about "uploading" your consciousness? That's what never makes any sense.
There are some people for whom uploading their belief system into a graphical formulation would be a trivial operation. But then is the computer really holding their consciousness, or merely acting like they would act given their beliefs?
Fernandinande:
Do you believe in magic and ghosts? One or both may exist, but I am not aware of scientific evidence to establish either. So, I'll defer to your beliefs, until there is cause to change my position.
The definition of a mimic is unambiguous.
I don't mistake inference (i.e. created knowledge) to lead to any logical domain other than philosophical, which may branch to another domain, including: faith, fantasy, and science.
The popular belief that a human mind or consciousness originates in the brain is at best an article of faith. It cannot be established within the narrow scope of the scientific domain. However, perhaps it does not matter for practical purposes. After all, there are people who will not even agree when human life (i.e. process) begins. Many of them believe in the "magic and ghosts" of spontaneous conception, and god-like storks, too.
Fernandinande:
A "perfect copy" is for some intents and purposes indistinguishable from the original.
Look at how scientists create penises for transgendered women. If that's the best they've got, how soon until they can create machines that teleport people? We are still in the dark ages.
Video means the message is wasted.
Can't index it, and it's fragile content.
Ideas that matter should be expressed in text, for the Internet.
Not a video of two dudes talking.
Re: The Fly
What about teleportation?
Actually the best movie on the duplication/teleportation issue is "The Prestige." You have to watch it twice to see all that's going on, but that's OK because Scarlett Johansson is in it.
When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.
Neither of those guys debating is a scientist; they're both philosophers with an interest in skiffy stuff. Almost as annoying as that pompous ass Tyson bloviating about the uselessness of philosophy, isn't it?
"Right. Well, you are like McCoy. The transporter is actually just transmitting data, along with the raw energy to reassemble it back into matter. The atoms in the source pad were destroyed and turned into pure raw energy. It's Einstein's basic equation, E equals m c squared. Then on the planet the transporter ran the equation backwards to create different atoms, new atoms, in the same configuration using that raw energy. But the energy itself is always fluid, undifferentiated."
The problems with a Star Trek transporter include:
1. Consider the ramifications of E=mc^2. With c being the speed of light, it doesn't take a lot of matter (m) for energy (E) to become very large. If you take roughly 50 grams of matter (any kind) and convert it to energy at 100% efficiency, you get the same energy as a 1 megaton nuclear weapon. So, converting 100kg (say a pudgy late model Kirk) to energy would result in 2000 megatons of energy. As an approximation, 2000 megatons is probably about the energy of the entire US nuclear arsenal. You'd better have a damned good containment field or things will get very unpleasant very fast, but not for long.
2. You'd need to map all of the body's cells with extreme accuracy before converting them to energy. This means you'd need to map not only the billions of neurons in the brain but all of the connections between them. There are a lot of connections. At the other end, you'd have to regenerate both the cells and the connections with equal precision or the person's memories (and everything else) will be scrambled.
3. Nothing is 100% efficient. Not all of the cells would likely be converted into pure energy and some energy will be lost in buffering and transmission. This wouldn't matter if the cells being lost were fat cells in the gut but would be an issue with things like brain cells.
4. You'd need a machine with equal capabilities at the receiving end. You step on the transporter pad, it destroys you to convert you to energy and data, and beams that data to another location. How is it reassembled with perfect efficiency if there is no machine there?
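The arithmetic behind points 1 and 2 checks out and can be verified with a quick sketch. The body mass, neuron count, and synapses-per-neuron figures below are rough illustrative assumptions, not data from the comment:

```python
C = 2.99792458e8          # speed of light, m/s
J_PER_MEGATON = 4.184e15  # TNT equivalent of one megaton, in joules

def rest_energy_megatons(mass_kg: float) -> float:
    """E = m*c^2, expressed in megatons of TNT."""
    return mass_kg * C**2 / J_PER_MEGATON

# Point 1: converting matter to energy at 100% efficiency
print(f"{rest_energy_megatons(0.05):.1f} Mt")  # ~50 g is about 1 megaton
print(f"{rest_energy_megatons(100):.0f} Mt")   # 100 kg is about 2000 megatons

# Point 2: raw scale of the mapping problem, using rough figures of
# ~8.6e10 neurons with ~1e4 synapses each
synapses = 8.6e10 * 1e4
print(f"~{synapses:.0e} connections to map")   # on the order of 10^15
```

So the quoted figures are the right order of magnitude: 50 g of matter is roughly one megaton, a 100 kg person is roughly two thousand, and the wiring diagram alone runs to something like 10^15 connections.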