
Re: [nanotech] "vitality"

  • j.lark@att.net
    Message 1 of 40 , May 10, 2002
      Dear perfect nano descriptor
      I agree with you in most of your justified rantings.
      However, I think (actually) that we should create a base
      for consideration, before creating the straw monkey to
      fling globs of dung at before any conclusion. Please do
      not advocate conclusion before consideration.
      Lark
      > In a message dated 5/9/2002 10:39:54 PM Pacific Daylight Time,
      > mgubrud@... writes:
      >
      > Ah, here we go again. Note to fellow forum members: if more people would
      > prefer this thread to be removed from the forum than continued, I'll take it
      > to private discussion. However, despite the circular semantics issues that
      > keep cropping up, there is a valid core to this argument I think should be
      > observed.
      >
      > What we have here is a deeply reactionary side that believes in an unchanging
      > humanity, that certain technologies should be--MUST be--banned to prevent
      > changing or replacing the human race with something artificial. The other,
      > deeply progressive side maintains that changing/replacing ourselves with
      > something we design is a highly desirable goal, and that banning technology
      > has never worked and will not work. While certainly not everyone on this
      > forum agrees with the progressive position, I strongly suspect that almost
      > all prefer it to the reactionary one.
      >
      > This isn't a discussion of whether or not certain technologies will be
      > possible; it's a discussion of the implications of these technologies if they
      > should happen to come to be. As the technologies permitting the development
      > of artificial/enhanced beings begin to arrive, the reactionary voice may gain
      > strength out of proportion to its numbers (much like the religious right in
      > its practice of international terrorism or its opposition to abortion). I
      > maintain that it's better to argue the issues now and observe the reactionary
      > mind to better understand how it thinks, so that it can be handled
      > appropriately as technology develops.
      >
      > However, as I said, I'll take the thread off this public board if most would
      > prefer I do so. My own interest in discussing this topic is magnified by
      > previous exposure to so-called "creation scientists." These fundamentalist
      > Protestants almost managed to ban the teaching of evolution in my high
      > school, believing such secular information to be a dangerous threat to
      > humanity, scientific evidence be damned. They almost succeeded because few
      > cared to argue the point. Since the fundamentalist mindset eerily parallels
      > the reactionary position with regard to human enhancement, I believe the issue
      > deserves to be confronted.
      >
      > > > it's essentially what
      > > > everyone (except the original person, who dies) perceives.
      > >
      > > They perceive the "transfer" of a "mind"? Or just the creation of a
      > > likeness? Perhaps, upon encountering the likeness, and with the other
      > > person dead, they might "fill in the blanks" with the image of a "mind"
      > > [how again is this different from a "soul"?] having flown from one body
      > > to another. But this would not have been directly perceived, would it?
      > > Nor would it be a reality, would it?
      >
      > Semantics again. The reality is that the biological body dies and a perfect
      > duplicate (plus enhancements) takes its place. (A soul, as I understand it,
      > is a religious concept referring to a spiritual essence. I'm referring to an
      > identical copy of a mind, nothing more. I have stated this over and over
      > again, yet you persist in insisting I am referring to a soul. If you define a
      > soul as nothing more than an identical copy of a mind, then this is just
      > another semantics issue, but I doubt any spiritual leader would agree with
      > your definition.) The perception to others is that the last time they saw
      > you, you were biological. Now you are not.
      >
      > > Suicide is always considered wrong, except perhaps self-sacrifice to
      > > save the lives of others. That wouldn't be the case here. It would
      > > just be needless self-destruction.
      >
      > Yes, and the world was always considered to be flat, and the sun revolved
      > around the Earth. Perceptions change. What seems horrifying to you now will
      > likely seem normal enough in a few decades. If you want a glimpse of how the
      > people of the future will regard a technological issue, the best guide is to
      > talk to the people on the progressive end of the spectrum. If there are
      > perceived advantages to a particular technology--and there certainly are
      > perceived advantages to transhuman enhancement--it will eventually become
      > acceptable to society as a whole. So it's been with the combustion engine,
      > TV, computers, test tube babies, in utero scanning, etc., and so it's
      > becoming with genetic engineering and even cloning. Actually...is there ANY
      > technology that can offer personal benefit that has been successfully made to
      > go away or never be developed?
      >
      > > > we all experience conscious interruption when we sleep anyway
      > >
      > > True, but since when is sleep the same as death?
      >
      > The short answer is that it does not matter. If you die in your sleep, you
      > are unaware of the fact. And as long as an identical copy of your personality
      > continues, who's to know? Who's to care?
      >
      > > > nor is there any information of any worth lost
      > >
      > > Of any worth to whom?
      >
      > Strange question. Of worth to anyone.
      >
      > > If "the original" is dead, how are any continuing thought processes
      > > hers? Thought processes, personality, memories, etc. in another body
      > > would be those of that other person, wouldn't they?
      >
      > Yes and no. Again, it's a semantics issue. Yes, the mind would be in another
      > body and therefore be another person. However, since a perfectly duplicated
      > mind would continue to think just like the original mind, it would also be
      > enough like the original that no one need know the difference. It could plug
      > directly into the original's life and continue on in its stead. Therefore it
      > would not be like a different person.
      >
      > > When I sleep, I dream, and I am conscious of dreaming. True, I also
      > > sleep without dreaming, but to me (it is unclear what you mean by "to
      > > consciousness" if it is different from "to the person") there is a very
      > > clear difference between sleep and death.
      >
      > Certainly there is...but not to you when you are not conscious. Between
      > dreams, there's little difference to your perception between sleep and death,
      > since both states offer no sense of awareness. Again, this is a semantics
      > issue.
      >
      > > > The only reason we fear death is the perceived lack of continuance
      > >
      > > Wrong! We fear death because we evolved to avoid being killed.
      >
      > Again, a meaningless disagreement over semantics. I could say I want to eat a
      > steak because I'm hungry, and you could insist that my body is just
      > expressing a desire for fuel. Yes, we fear death because we evolved to avoid
      > being killed, but the REASON we fear death is the lack of continuance. I
      > don't disagree with what you said; it just has no bearing on what I said.
      >
      > > > For some people, continuance of a copy of consciousness in a
      > > > construct is a perfectly adequate substitute for continuance
      > >
      > > And some people strap explosives to their body anticipating heavenly
      > > rewards... Some people are wrong about some things.
      >
      > I personally don't approve of suicide bombings, but without sufficient
      > evidence, I don't have the temerity to say they are wrong in believing they
      > will go to paradise for their acts. This is not intended to sound flip; it
      > addresses the core of this discussion. Different people have different
      > beliefs. One could just as easily say you are wrong for trying to
      > influence--even force--others to allow their minds to die of old age, rather
      > than have a copy continue indefinitely. That would put you on equal--or
      > worse--standing with suicide bombers. I'd rather not accuse you of being
      > wrong purely on the basis of your value system; I prefer using the evidence
      > to point out the flaws in your logic.
      >
      > > Well, to review, first of all, there is no "thing" to "transfer"; we are
      > > talking about maybe making xeroxes of people and killing them in the
      > > process. This doesn't offer YOU anything besides nonexistence, which
      > > you can have easily enough anyway. However, it would pose a threat to
      > > humanity's future, since these machines would then assert claims to the
      > > rights of human beings, legal rights of citizenship, ownership, immunity
      > > from destruction, etc. We would thus have formally raised technology to
      > > moral equivalence with humanity, and if the capabilities of that
      > > technology then exceeded human capabilities, and was furthermore
      > > constituted with human-like drives and motives, it would pose a serious
      > > threat of competition for, and perhaps takeover, of the resources which
      > > are otherwise the birthright of the human species. So if you hate
      > > yourself, if you hate people, then perhaps this may appeal to you, but
      > > in that case I say you are sick, and I am hopeful that you may be cured;
      > > possibly all you need is a little love. But if you cannot be cured, and
      > > you threaten to do things which pose a danger to humanity, then you will
      > > need to be restrained.
      >
      > I agree with most of your above statement. However, I don't know where you
      > get the idea that anything is the birthright of the human species. That which
      > we call human rights is not a universal mandate of rightness, but merely a
      > set of rules we would like to apply as a minimum standard to others of our
      > species. I don't hate myself, nor people in general, but I don't consider
      > Homo sapiens to be particularly special, nor do I consider the species to be
      > the end of the line. I find nothing wrong with regarding our species as a
      > work in progress, and one that is worth improving through means other than
      > the blind force of evolution. Clearly you are opposed to this line of
      > thought, though at this point you've only presented an insistence that it is
      > wrong, not WHY it's wrong.
      >
      > > One possible resolution which I have previously proposed, for those who
      > > are absolutely incurable of the "uploading" delusion, would be to
      > > stipulate that "uploads" can be set aside some limited fraction of the
      > > Solar system's resources, enough to support some ridiculous amount of
      > > computation, so that the "uploads" can "live" in some really jazzy
      > > virtual worlds, but subject to the restriction that this system never be
      > > connected to an output device of any type. I don't see that any harm is
      > > going to be done by molecular transistors switching on and off, although
      > > it does seem rather pointless.
      >
      > Oh, wonderful! And we could put humans in concentration camps where they
      > won't do any harm to the rest of the world, too. No, Mark, just because an
      > uploaded mind wouldn't be the original mind wouldn't make it any less
      > deserving of the rights we reserve for other humans.
      >
      > >
      > > > Once more, let me make it clear in as simple terms as I can:
      > > > biological mind is copied to machine. Machine thinks like biological
      > > > mind. Biological mind is disposed of (yes,
      > > > "killed," Mark). Machine takes over the biological's identity.
      > >
      > > Once more, you always slip in some word that means "soul." Here it's
      > > "identity." Somehow this "identity" is left hanging around for the
      > > machine to "take over." Thus you complete the image of soul transfer,
      > > even having avoided claiming that something was transferred in the
      > > creation of the machine (although the word "copied" is also somewhat
      > > suspicious). The point is, you always find some way of suggesting the
      > > image of your soul - your true essence, your true personhood, your true
      > > "identity" - flying from you to this new "embodiment."
      >
      > I am truly stunned. Even with the clearest, simplest, most straightforward
      > language, you appear incapable of understanding the concept. Have you ever
      > heard of someone stealing another person's identity? You don't really think
      > that means the thief has taken a person's "soul," do you? But if the word
      > identity causes you such confusion, I shall try to clarify further: by
      > "identity" I mean ownership and the position one holds in society. That's
      > ALL. The machine would take possession of the biological's house and bank
      > account, socialize with the very same friends, hold the same job, have the
      > same Social Security number, use the same passport, etc. THAT'S what it means
      > to take over a person's identity. Again (for the millionth time), there is NO
      > LITERAL TRANSFER OF "SOUL" IMPLIED. You insist on certain implications for
      > "mind transference" (that the machine mind may think just like the original,
      > but it is "only" a copy, not the original, and that if the original was
      > disposed of, it would be completely dead), and I wholeheartedly agree with
      > those implications, yet you insist that cannot be, and that I must somehow
      > overtly or covertly believe that the "essence that is me" will somehow
      > magically transfer from the biological form to the machine. How can I
      > possibly make it more clear? The identical machine copy would be just
      > that...a COPY. The result would be two distinct, separate minds. The
      > subsequent destruction of the original would NOT result in some magical
      > transference of "essence" to the copy. Yet the copy could inherit all that
      > the original possessed, thereby "taking over the identity" of the original.
      > The concept is really quite simple....
      >
      > >
      > > > I've defined my terms and I will continue to use them for the sake
      > > > of brevity (which is desperately needed here), since you know what I'm
      > > > referring to-
      > >
      > > If you need brevity, without obfuscation, use appropriate terms.
      > > Instead of "mind transfer," which is functionally indistinguishable from
      > > "soul transfer" and does in fact implicitly claim to represent the
      > > transfer of a soul, you could say "brain duplication."
      > >
      >
      > NOW we're getting somewhere. If you object to my use of the term "mind
      > transfer" for the process we're talking about, "brain duplication" is a
      > perfectly adequate substitute. Now go back over everything I've said and
      > insert "brain duplication" where I've used the term "mind transfer," and
      > perhaps you will be able to understand what's being said.
      >
      > As I've been saying all along, this is a semantics issue....
      >
      > Derek
    • Ron Alderton
      Message 40 of 40 , May 10, 2002
        Please remove me from your mailing list. Thanks
        ----- Original Message -----
        From: <j.lark@...>
        To: <nanotech@yahoogroups.com>
        Sent: Saturday, May 11, 2002 12:31 PM
        Subject: Re: [nanotech] "vitality"


        > The Nanotechnology Industries mailing list.
        > "Nanotechnology: solutions for the future."
        > www.nanoindustries.com
        >
        > Your use of Yahoo! Groups is subject to http://docs.yahoo.com/info/terms/
        >
        >