
"Vitality"

  • Ooo0001@aol.com
    Message 1 of 13, May 10 6:06 PM
      Subj: Re: [nanotech] "vitality"
      Date: 4/25/2002 6:25:50 PM Pacific Daylight Time
      From: Ooo0001
      To: nanotech@yahoogroups.com



      In a message dated 4/24/2002 7:17:50 PM Pacific Daylight Time, mgubrud@... writes:


      Okay, now it's my turn to ask for a definition.  If the definition of
      "alive" requires that the object under discussion be "life," then
      "extraterrestrial life" cannot be said to be "alive," but perhaps some
      other term not yet coined would apply.  But now we really are into
      semantics.  The statement that "extraterrestrial life" is "alive," if
      accepted, would not change the fact that it is not "life" if that word
      is understood, as it usually is, as referring to the phenomenon of life
      as it exists on Earth, with its common constitution and common ancestry.


      Like I said, semantics. See the earlier post on how life and alive are synonymous.


      Agreed.  There is no automatic reason to be hostile toward
      extraterrestrial life, should we encounter it, and we would very likely
      consider it to be something of profound interest and value to us,
      assuming it was not a threat to us or to something else that we value.


      Cool! Agreement! B-)

      Let's be clear about this.  The Endangered Species Act does not apply to
      variola (smallpox) virus.  It would not apply to any species which posed
      a mortal threat to human survival or to human freedom or that threatened
      to cause vast damage to the Earth's ecosystem.  Any extraterrestrial
      life or any artificial creation which posed such a threat would become
      the target of every effort to arrange
      its extinction in the wild and very likely in captivity as well.


      Agreement again! We're on a roll!

      I think it is a very serious argument.  If you agree that such a
      "constructed organism" could be capable of suffering, then it would
      become immoral to impose such suffering on it.  Therefore do not create
      any "constructed organisms" that would suffer if not permitted to pose a
      mortal threat to human survival or freedom.


      Hm, odd argument. With that logic one might suggest that humans should not have children, since children are capable of suffering and it would be immoral to impose such suffering upon them. Therefore don't have kids that would suffer if not permitted to pose a threat to their fellow humans (such as through perpetrating crimes, for example).

      > Same thing with genetic engineering--it continues even with a
      > fair amount of protestation.

      Subject to increasing scrutiny and regulation as its potential for
      actual harm increases.  Most of the protest to date has been directed at
      biotech endeavors which do not pose much of a threat of actual harm.
      Hence these protests have not been taken very seriously by society at
      large.  This will change as the technology expands.


      Agreed overall, though I suspect the implied conclusion is wrong. Rock 'n' roll was considered a grave threat to society by many at one time. Now it's mainstream. Cloning of any sort was once considered a grave threat to society by many not too long ago (and some still do). Now cloned animals rarely even make the news. The first cloned humans will cause a big stir too...and then become mainstream as well. Genetically engineered foods caused a stink when they first appeared, but they're gradually being accepted (after all, most of us have unknowingly consumed them already). It's always the same pattern: viability, appalled reaction by many, gradual integration, mainstream. There's no reason to think the genetic engineering of humans will be any different.


      >  I agree that your fear of such constructs rendering unaltered
      > humans obsolete or even eliminating them is a genuine, valid
      > concern...but a simple solution like not making them at all is a
      > bit naive. When have we as a species EVER abandoned potentially
      > threatening technologies--especially when their abilities could
      > prove extremely beneficial?

      I do not oppose the use of advanced artificial intelligence and
      information technology systems for the betterment of humankind.
      However, a certain type of such system should be banned - the type that
      simulates the human mind or that behaves in a self-interested way like a
      human being.  I do not see any actual benefit which could be obtained by
      android systems which cannot be obtained by non-android systems.


      I'm sure there are many who would oppose a ban on the development of an "artificial mind" (the potential benefits and even just the milestone are too alluring), but even if it were banned...how long would it last? The technology would have to be extremely hard for nations to develop, using extremely rare materials, for any ban to hold for long. The impetus for advances in parallel technologies would ensure that someone, sometime, would produce true AI. The field would advance in secrecy...which isn't healthy at all. It's the same danger in trying to ban nanotechnology.

      Derek
    • Mark Gubrud
      Message 2 of 13, May 10 8:52 PM
        Derek wrote:

        > surely we will eventually create anthropomorphic constructs.
        > People will respond to them as real people. I know this to be
        > true because I've seen it happen already with even crude avatars
        > in massively multiplayer online role-playing games.

        People have been responding to "anthropomorphic constructs... as real
        people" ever since the first lonely cave man whacked off fondling the
        first "Venus" doll. Today we've got interactive porno CD-ROMs and
        tomorrow we'll have virtual Penthouse pets... Ummm, that's progress for
        you.

        Fake is fake.

        > yes, I'd consider an android like Data to be living.

        If I understand the premise correctly, Data is very explicitly NOT
        supposed to be living, but rather electromechanical. What would be the
        purpose of erasing this clear distinction?

        > I'd also consider a sentient computer to be living. But I'm a
        > little more liberal with my definition of life than some.

        Would you also redefine the word "mountain" to include shoes, or the
        word "boat" to include motorcycles?

        > You're saying that which is alive is not necessarily life

        No, I insist the word "alive" should be restricted to describing life.

        > > how can any person do anything humans can't do?
        >
        > Easy answer: by becoming more than human.

        You mean less than fully human, or something part person, part not.

        Anyway, I can fly, if I make use of an airplane. You can view the
        airplane and me as forming a system which is more than either alone.
        Part of this system is human, part is not.

        > Yes, the super tennis shoes and plugged-in PDA you mention could
        > probably do many of the things people want. But the real point is
        > something more philosophical, more psychological. Ever wonder
        > why people get permanent tattoos instead of just applying temporary
        > transfers? The effect is really no different, and with temporary
        > transfers you can change your mind and remove the image much more
        > readily. A primary reason tattoos are so appealing is that they are
        > not an attachment; they become a part of you, a part of who you are.

        The psychology of tattoos, piercings and the like is an interesting
        question. I will have to make some assertions that people who have
        piercings and tattoos won't like and will vigorously deny.

        It seems to me that such body modifications signify and symbolize a pact
        with death. It is no accident that most tattoos and piercings in modern
        culture express negative feelings about life, about the state of the
        world, and so on. Tattoos are more popular among the underclass and the
        disenfranchised, and often carry negative messages expressing anger or
        tendencies to violence. The piercing craze started with punk culture,
        as an attempt to express toughness and an "in-your-face" attitude toward
        bourgeois society. Most of the popularity now relates to the
        objectification of the body associated in our culture with sexuality;
        but such modifications do not respect the body as beautiful in itself.
        They are distinct from adornments such as rings, bangles, or removable
        earrings. They are typically contrived to suggest pain, bondage,
        corporeality and mortality.

        All body modifications call attention to corporeality and thus to
        mortality: "I will wear this until death." However, I do not regard
        attention to corporeality as inherently negative. You will note that I
        keep saying we ARE our bodies, nothing more or less... but my feeling is
        that our culture DOES view corporeality from a negative perspective, and
        thus most of us who have tattoos or piercings are expressing negative
        feelings about our existence.

        I would contrast this with tribal cultures, with their beautiful body
        modification arts, in which death is never far from life and is not
        viewed with as much horror as it is in modern cultures. In these
        cultures, the tattoos and other body modifications do not so much
        express negative attitudes, but they still express the idea of bondage
        to the body, to the earth, and to death.

        At the same time, they serve as proof that the person remains alive
        despite the modifications (note that we are not talking about neural
        implants or any of your other comic-book imaginings), and since in these
        cultures it is normal to view persons and the world as made up of
        incorporeal spirits only temporarily inhabiting these physical bodies,
        the body modifications actually suggest the triumph of "life" over
        corporeality, and the immortal spirit over the mortality of the body.

        > I could carry around a magnifying glass to effectively give myself
        > microscopic vision, but carrying around an external tool like that
        > holds little appeal. However, give me the option of having microscopic
        > vision permanently and safely integrated into my eyes (well, at an
        > affordable price), and you've got yourself a sale, buddy. Telescopic
        > vision? Nightvision? Hey, the more, the merrier!

        I suppose with advanced nanotech we could put all of that into a set of
        contact lenses that would also be quite imperceptible either to the
        wearer or to any casual observers. You could wear them permanently, if
        you like. But one nice feature would be the ability to take them off.

        > You may not understand the tremendous appeal of integrating technology
        > into our physiology

        It's a combination of body-loathing and comic-book power-trip fantasies.

        > having the biological body die does not necessarily mean that the
        > most valuable information a person possesses ceases to exist

        Valuable to whom? If the person dies, nothing is of value to her.

        > If that information is copied and run on a machine architecture,
        > very little of value is lost...

        The person loses her life, which is the basis of all other values.

        > Species extinction is how species evolve.

        Not the extinct species.

        > An old, maladapted species evolves into a new species with
        > superior survival characteristics for a changed environment,
        > and in that way the old species goes extinct.

        This is just technically wrong. What typically happens is that
        speciation takes place when one population becomes geographically (and
        thus reproductively) isolated from another; later it may happen that the
        two by now distinct (reproductively isolated by genetic mechanisms)
        species come back into contact and compete for the same niche, with the
        fitter species driving the less fit into extinction. This happens less
        dramatically at the genetic level within a species, where a novel allele
        proves fitter than an earlier version and spreads throughout the
        population.
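The allele-spread dynamics described in the paragraph above can be illustrated with a toy deterministic haploid selection model (a sketch added for illustration, not part of the original exchange; the fitness advantage `s` and starting frequency are arbitrary assumptions):

```python
def allele_frequency_sweep(p0=0.01, s=0.05, generations=400):
    """Deterministic haploid selection: a novel allele with relative
    fitness 1 + s starts rare and spreads through the population,
    as each generation the fitter allele's share grows in proportion
    to its fitness relative to the population mean."""
    p = p0
    history = [p]
    for _ in range(generations):
        # Mean fitness of the population this generation.
        mean_fitness = p * (1 + s) + (1 - p) * 1.0
        # Standard replicator update: fitter allele gains frequency.
        p = p * (1 + s) / mean_fitness
        history.append(p)
    return history

freqs = allele_frequency_sweep()
# The novel allele starts at 1% and rises monotonically toward fixation.
```

With even a 5% fitness edge, the sweep from 1% to near-fixation takes only a few hundred generations, which is the "spreads throughout the population" process described above in miniature.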

        But anyway, this is irrelevant. We do not need to worship at the altar
        of "evolution." There is no need for us humans to will our own
        extinction.

        > Evolution is not a crime. Hell, it's a biological imperative

        Not for us. We have technology, and can use it to mold our environment
        instead of the other way around.

        > Homo sapiens is no more (or less) important to our
        > evolution than was Australopithecus afarensis.

        And "our evolution" is important to what or whom? Why do you raise
        "evolution" to the status of a moral order? You are inventing an
        anti-humanistic religion based on silly ideas and sloppy thinking.

        > Why should our species be the end of the hominid line?

        Why the heck not?

        > Is there anything wrong with taking evolution into our own hands instead?

        Is there anything right with it? What's wrong with being human? Since
        you are human, it's all you can be, anyway... But what's wrong with
        humans surviving and thriving? What's wrong with the idea that our
        species should have a future? You are in such a hurry to replace us?
        Why? In the service of what God or demon?

        > If we develop the means to improve upon ourselves, why shouldn't we?

        Why do you think nonhumans would be an improvement?

        > Homo sapiens is just another step.
        >
        > Call me forward thinking,

        No, your thinking is retrograde, fascistic, or worse, pre-Copernican.
        You believe there is an objective moral order, defined by your
        assumptions about what "evolution" and "improvement" mean. Your worship
        of technology and "progress" is at best quaint. You haven't absorbed
        the essential message of modernity, which is that our existence serves
        no purpose outside of ourselves. We are the source of the only moral
        order that exists, and we are the reference point for all comparisons of
        "better" and "worse." You cannot "improve" upon the human species.
        Anyone who attempts to do so is an enemy of humanity, and will be
        treated accordingly.
      • Mark Gubrud
        Message 3 of 13, May 10 9:06 PM
          Derek wrote:

          > one might suggest that humans should not have children, since
          > children are capable of suffering and it would be immoral to impose
          > such suffering upon them. Therefore don't have kids that would
          > suffer if not permitted to pose a threat to their fellow humans

          Humans are restricted in the threats they are allowed to pose, but most
          importantly humans do not pose the kind of threat that would be posed by
          self-interested artificial intelligences of greater than human
          capability if they were created and allowed (as ethics might then
          dictate) to run rampant in the world. In any case, there is no good
          reason why such things need to be created for any practical purpose. We
          don't need them. There is no need to create them and let them pose a
          threat to ourselves and our children.

          > > I do not see any actual benefit which could be obtained by
          > > android systems which cannot be obtained by non-android systems.
          >
          > I'm sure there are many who would oppose a ban on the development of
          > an "artificial mind" (the potential benefits and even just the
          > milestone are too alluring)

          You did not answer my point. You want your machines to do some kind of
          work that you want to have done. That doesn't require android
          intelligence or willfulness. I think the evolution of applied
          artificial intelligence is not moving toward artificial people, but
          toward highly engineered and specialized systems designed to perform
          useful tasks. The idea that you make an artificial brain and then use
          it just like you use people to do different jobs is a cartoon. In
          practice it isn't so easy and it isn't going to be useful enough to
          justify the associated risks.
        • Ooo0001@aol.com
          Message 4 of 13, May 11 1:10 AM
            In a message dated 5/10/2002 10:38:39 PM Pacific Daylight Time, mgubrud@... writes:


            If I understand the premise correctly, Data is very explicitly NOT
            supposed to be living, but rather electromechanical.  What would be the
            purpose of erasing this clear distinction?


            To prevent the very bigotry you espouse, of course.


            > I'd also consider a sentient computer to be living. But I'm a
            > little more liberal with my definition of life than some.

            Would you also redefine the word "mountain" to include shoes, or the
            word "boat" to include motorcycles?


            Of course not. But my definition of life does not contradict the scientific definition of life...which, contrary to your insistence, is open to much less rigid interpretation.

            > Easy answer: by becoming more than human.

            You mean less than fully human, or something part person, part not.


            No, I mean more than human. A human altered to superior function is more, not less.

            The psychology of tattoos, piercings and the like is an interesting
            question.  I will have to make some assertions that people who have
            piercings and tattoos won't like and will vigorously deny. It seems to me that such body modifications signify and symbolize a pact
            with death.


            Of course they'll deny it, because you are incorrect in your assessment. Tribal tattoos of various cultures are the most popular types of tattoos in OUR society today, representing symbols of life, renewal, fertility and vigor. I personally have a symmetrical set of angular-tribal tattoos that I designed myself across my chest. They're designed to complement the flow of musculature under the skin and in no way symbolize death or even anything counterculture. Your "skull and dagger" view of tattoo society is mired in the 60s.

            I suppose with advanced nanotech we could put all of that into a set of
            contact lenses that would also be quite imperceptible either to the
            wearer or to any casual observers.  You could wear them permanently, if
            you like.  But one nice feature would be the ability to take them off.


            I don't disagree, but again, you miss the point. Contact lenses are not integrated into one's physiology as are implants. There is a significant segment of society for whom "adornments" hold little appeal. Since you don't understand tattoos, you probably won't understand this concept. However, the millions who do will likely prove hard to cajole into not implanting nanotechnological advancements into their physiology, should the technology become available. There will always be large segments of society who will disagree with everything you believe in. Their needs and desires will be no less valid than yours.

            > An old, maladapted species evolves into a new species with
            > superior survival characteristics for a changed environment,
            > and in that way the old species goes extinct.

            This is just technically wrong.  What typically happens is that
            speciation takes place when one population becomes geographically (and
            thus reproductively) isolated from another; later it may happen that the
            two by now distinct (reproductively isolated by genetic mechanisms)
            species come back into contact and compete for the same niche, with the
            fitter species driving the less fit into extinction.  This happens less
            dramatically at the genetic level within a species, where a novel allele
            proves fitter than an earlier version and spreads throughout the
            population.


            Thank you very much for the treatise on evolutionary theory--somehow I must have missed that information when I got my college degree in evolution. ;-) What you described is punctuated equilibrium following the founder effect, as well as the slower gradualism. Read my quote again and you'll see it is indeed technically accurate (although certainly not complete), and it actually applies to both punctuated equilibrium and gradualism.

            But anyway, this is irrelevant.  We do not need to worship at the altar
            of "evolution."  There is no need for us humans to will our own
            extinction.


            True indeed. There's also no need not to.

            Why do you raise
            "evolution" to the status of a moral order?  You are inventing an
            anti-humanistic religion based on silly ideas and sloppy thinking.


            Why not? It's no less valid a position--and no more sloppy or silly--than insisting on the evolutionary stagnation of humanity. I urge you to prove otherwise. You are the one who has stated that evolutionary stagnation is a "moral" issue for you. Fine, I'll happily counter that position by making evolutionary continuance a moral issue for me. Why not? I'll even have the natural world to point to as a model of what's proper. At least it's not as silly as your insistence that we can't be anything other than what we are now.

            > Is there anything wrong with taking evolution into our own hands instead?

            Is there anything right with it?  What's wrong with being human?  Since
            you are human, it's all you can be, anyway... But what's wrong with
            humans surviving and thriving?  What's wrong with the idea that our
            species should have a future?  You are in such a hurry to replace us?
            Why?  In the service of what God or demon?


            No, there's nothing right with it and there's nothing wrong with it. What's wrong with being a normal human is that for some people it's not enough. If all we can be is human, as you insist, then you won't mind if some of us change our physiology to incorporate biotechnological or nanotechnological enhancements, will you? Because we'll still be human, right? B-) Ah, call it what you wish, just don't try to force us to not change and we won't try to force you to change.

            No, your thinking is retrograde, fascistic, or worse, pre-Copernican.
            You believe there is an objective moral order, defined by your
            assumptions about what "evolution" and "improvement" mean.  Your worship
            of technology and "progress" is at best quaint.  You haven't absorbed
            the essential message of modernity, which is that our existence serves
            no purpose outside of ourselves.  We are the source of the only moral
            order that exists, and we are the reference point for all comparisons of
            "better" and "worse."  You cannot "improve" upon the human species.
            Anyone who attempts to do so is an enemy of humanity, and will be
            treated accordingly.


            Retrograde? You'll have to specify which definition of retrograde you mean for that to mean anything in this context. Fascistic? I don't advocate autocratic control over anyone; on the contrary, I'm for freedom of choice. Pre-Copernican? Okay, you got me there--you're right, I actually believe the Sun revolves around the Earth. ;-)

            I do not believe in ANY moral order, but I am certainly not above fighting fire with fire and coming up with some moral order to counter yours. Morals are not universal, and what is moral for one person may not be moral for another.

            Sorry, but the essential message of modernity is that your home computer is out of date before you even get it home. B-) However, I fully agree that our existence serves no purpose, except what we wish to make of it, and that we are the source of the only moral order that exists. That is precisely why I wish to "upgrade" my human physiology--it is my own moral imperative (shared by others, of course, but that is irrelevant). I don't expect it to be yours and I don't much care if you remain stagnant or not.

            But of course we can "improve" upon the human species. I'm surprised you can't see this. A simple example would be, let's say, the creation of a gene that allowed immunity to cancer. If it could be applied to all humans, say through gene therapy, and the genes would be automatically passed on to our offspring, then the human species would indeed be improved. I'm aware you believe that entirely machine versions of humans would not be considered human in your world, but regardless, if such machines could do everything humans could, only moreso, then that would also be an improvement on the human species (though perhaps you personally would have to call it an improvement OVER the human species, considering your beliefs).

            So, now that we've continued to affirm that we simply disagree on some fundamental concepts, let me ask you this: what REALISTIC solution would you have to the "problem" of people like us eventually getting hold of technologies that will allow us to become the things you fear most: creatures or machines with vastly greater capabilities than normal humans?

            Derek
          • Ooo0001@aol.com
            Message 5 of 13, May 11 1:33 AM
              In a message dated 5/10/2002 10:38:51 PM Pacific Daylight Time, mgubrud@... writes:


              Humans are restricted in the threats they are allowed to pose, but most
              importantly humans do not pose the kind of threat that would be posed by
              self-interested artificial intelligences of greater than human
              capability if they were created and allowed (as ethics might then
              dictate) to run rampant in the world.  In any case, there is no good
              reason why such things need to be created for any practical purpose.  We
              don't need them.  There is no need to create them and let them pose a
              threat to ourselves and our children.

              > > I do not see any actual benefit which could be obtained by
              > > android systems which cannot be obtained by non-android systems.
              >
              > I'm sure there are many who would oppose a ban on the development of
              > an "artificial mind" (the potential benefits and even just the
              > milestone are too alluring)

              You did not answer my point.  You want your machines to do some kind of
              work that you want to have done.  That doesn't require android
              intelligence or willfulness.  I think the evolution of applied
              artificial intelligence is not moving toward artificial people, but
              toward highly engineered and specialized systems designed to perform
              useful tasks.  The idea that you make an artificial brain and then use
              it just like you use people to do different jobs is a cartoon.  In
              practice it isn't so easy and it isn't going to be useful enough to
              justify the associated risks.


              Mark, there was no good reason for humans to climb Mt. Everest, explore the Marianas Trench, travel to the moon, ride 145 hp motorcycles, or make porn accessible over the Web. We do such things because we can and because we want to. Once the technology becomes feasible, that's all it takes: someone wanting to do it. And I know I'm not the only one who would very much like to see the development of machine brains that can not only think just like humans, but actually support a duplicate copy of an existing human's mind, enough so that no one could tell the difference between real and Memorex through communication alone. Traveling to the moon WAS a cartoon...up until it actually happened. Hell, porn on the Web still IS a cartoon. But it's not going away any time soon; in fact, it's evolving at a breakneck pace along with the rest of technology. Now add an actual *perceived* benefit to the development of "transhuman" technologies (as you can see, that's not hard to do), and you have a juggernaut of motivation for that development.

              Derek
            • Mark Gubrud
              Message 6 of 13, May 11 9:40 PM
                Derek wrote:

                > bigotry you espouse

                No, you are the bigot, the narrow-minded one, who thinks he knows what
                makes (or would make) an objectively biologically "superior" or
                "improved" brand of humanoid.

                > >You mean less than fully human, or something part person, part not.
                >
                > No, I mean more than human. A human altered to superior function is
                > more, not less.

                Part human, part not. Thus less than fully human.

                Human plus machine tool would be more than just a human, but that is
                apparently not what you mean.

                > Tribal tattoos of various cultures are the most popular types of
                > tattoos in OUR society today, representing symbols of life,
                > renewal, fertility and vigor. I personally have a symmetrical set
                > of angular-tribal tattoos that I designed myself across my chest.
                > They're designed to complement the flow of musculature under the
                > skin and in no way symbolize death or even anything counterculture.

                Such symbols can suggest many things at once. But in any case I don't
                see how you can escape the fact that tattoos call attention to
                corporeality and (assumed) mortality, especially within a culture that
                does not believe in supernatural spirits separable from our bodies. I
                am not condemning tattoos and other body art, although I personally don't
                much like them. One can observe that these are ancient arts which are
                very much a part of our human cultural heritage. They are very
                different in spirit from the idea of getting neural implant chips to
                make you smarter than the other kid, or having power muscles and
                nanofiber-reinforced bones so you can act out your violent fantasies
                from comic books.

                > An old, maladapted species evolves into a new species with
                > superior survival characteristics for a changed environment,
                > and in that way the old species goes extinct.

                It's interesting to know that you have a "college degree in evolution,"
                but I don't think your use of terms in describing what I wrote was
                correct, and in any case the problem with your statement is that it
                ignores the essential fact that evolution works by one species or one
                allele outcompeting and driving another or others to extinction.
                Likewise your proposed "evolution" of humanity would work by driving
                humans to extinction, which as a human I would naturally view as not a
                good thing.

                > You are the one who has stated that evolutionary stagnation is a
                > "moral" issue for you.

                No, I said the future of humanity is a moral issue.

                > Fine, I'll happily counter that position by making evolutionary
                > continuance a moral issue for me. Why not?

                Why?

                > I'll even have the natural world to point to as a model of what's
                > proper.

                What exactly is "proper" about it? What was "proper" about the K-T
                extinction, for example? Nothing! It just happened, that's all. It was
                completely meaningless! There is nothing "proper," there is no meaning,
                outside of human moral evaluation.

                > At least it's not as silly as your insistence
                > that we can't be anything other than what we are now.

                We aren't anything other than what we are now. This is just a fact.
                When we ask about the future, the question is, What will exist then?
                Something like us? Will our kind endure, or not?

                > If all we can be is human, as you insist, then you won't mind if some
                > of us change our physiology to incorporate biotechnological or
                > nanotechnological enhancements, will you? Because we'll still be human,
                > right?

                Whatever remains of you, yes. The "enhancements" wouldn't be human and
                wouldn't be you; in fact, we are probably talking about events
                sufficiently far in the future that I can say even the human part
                wouldn't be you, either.

                > Retrograde? You'll have to specify which definition of retrograde
                > you mean for that to mean anything in this context. Fascistic? I
                > don't advocate autocratic control over anyone; on the contrary,
                > I'm for freedom of choice.

                Your thinking is a throwback to the fascistic futurism of the early 20th
                century, or in any case stuck somewhere between Darwin and Hitler. Only
                jazzed up with microchips and genetic engineering and hypothetical
                nanobots. This "freedom of choice" business is a new twist, but it's
                just a cover for capitalist coercion. You would allow technology to
                assume autonomy, and to seize control of what is properly the birthright
                of humanity, by means of the "free market" but enforced by the usual
                military brutality. This is what I am against. Letting ANY kind of
                machine claim it has rights against those of ANY human being.

                > Pre-Copernican?

                I call you pre-Copernican because you continue to believe in a moral
                order which is given as the structure of the world: Evolution, in your
                view, is objectively upward and good.

                > I fully agree that our existence serves no purpose, except what we
                > wish to make of it, and that we are the source of the only moral
                > order that exists. That is precisely why I wish to "upgrade" my
                > human physiology

                The problem here is that even while putting the word "upgrade" in quotes
                you continue to believe that you know which way is "up." This appears
                to you as a set of objective facts: more intelligence is better, more
                physical strength is better, more visual acuity is better, etc. But
                these are culturally-based prejudices. Sure, we'd like to be able to
                see in the dark, but we have night-vision devices that enable this, so
                what's the big deal? Why would a humanoid that was part human part
                technology be "better" just because it could see in the dark without the
                aid of external night-vision devices that need not be any more of a
                burden than the "implanted" ones would be?

                > of course we can "improve" upon the human species.... A simple
                > example would be, let's say, the creation of a gene that allowed
                > immunity to cancer.

                Not biologically plausible. However, we may expect to be able to cure
                cancer with the use of advanced bio- and nanotechnology. This could be
                done without any need to permanently implant or incorporate any
                technology into the body, nor would there be any advantage to doing so.

                > what REALISTIC solution would you have to the "problem" of people
                > like us eventually getting hold of technologies that will allow us
                > to become the things you fear most: creatures or machines with vastly
                > greater capabilities than normal humans?

                Such freaks will be treated as they deserve. They will NOT have any
                vastly greater capabilities than humans who use technologies while
                preserving the integrity of the human body and the human species.
                However, technology itself is bringing capitalism to a crisis, and the
                issue of who owns the future will have to be addressed by human
                society. The notion of competition, which is what gives rise to a lot
                of this pathological thinking, will increasingly be seen as unneeded in
                the human future. Our purpose is not to be the smartest, the strongest,
                the fastest or the most ruthless. It is just to be, and to be human.

                In any case, machines do not have rights. The best way to avoid the
                issue of ethical treatment of machines is by not permitting the creation
                of machines for which this could be an issue. But in no circumstances
                will we permit such machines to "outcompete" humans under capitalist or
                any other rules of competition.
              • Ooo0001@aol.com
                Message 7 of 13 , May 12 7:54 PM
                  In a message dated 5/12/2002 12:26:49 AM Pacific Daylight Time, mgubrud@... writes:


                  No, you are the bigot, the narrow-minded one, who thinks he knows what
                  makes (or would make) an objectively biologically "superior" or
                  "improved" brand of humanoid.


                  Please, let's not get snippy. While it's true I have a clear idea of what I would consider "superior," I by no means wish to impress it upon anyone else. I want to be made better by my OWN standards. I make no claim for knowing what's better for everyone else, nor am I arrogant enough to insist on forcing others to do my bidding...which is quite unlike your position, which advocates the most profound banning of personal expression: the freedom to change oneself.



                  Part human, part not.  Thus less than fully human.


                  Hmm, so someone with an artificial limb is less than fully human, eh? That's nice, strip the label "fully human" from a person for not being 100% made of original natural ingredients. I guess that makes me not "fully human" either, since I'm part metal (I've had a little dental work in my time, you see). Pardon me for doubting that almost anyone would agree with your position. And for some reason you choose to label me a fascist (you misuse the term, but I'll let it slide), yet you wish to impose the first step history's most famous fascists applied to the Jews: make them seem less than human.

                  Your definition of what makes a human human is simply wrong, at least in our society. As I stated before, the status of "human" is measured by the mind, not the body. Stephen Hawking has a brilliant mind but an almost useless body, yet he is not considered any less human than you or I. A flatlined brain in an otherwise perfect body, on the other hand, IS considered less than human (both socially and legally), even to the extent of being chopped up and used for spare parts. A functioning mind in a wasted body--or an entirely artificial body--is no less human than if it's in a perfectly healthy body. I defy you to dispute that.


                  Such symbols can suggest many things at once.  But in any case I don't
                  see how you can escape the fact that tattoos call attention to
                  corporeality and (assumed) mortality, especially within a culture that
                  does not believe in supernatural spirits separable from our bodies. 


                  That's a curious interpretation of tattoos. But they don't call attention to mortality any more than any other form of art. I know at least a dozen people with tattoos, and almost universally their primary reason for getting them was to permanently adorn their bodies with something esthetically appealing. In effect, it's quite similar to the choice one makes for clothing style, just a little more permanent. Yet nobody considers the choice of clothing style a "call of attention to mortality." Believe what you will, but your interpretation strains the bounds of logic as well as the evidence.

                  I am not condemning tattoos and other body art, although I personally don't
                  much like them.  One can observe that these are ancient arts which are
                  very much a part of our human cultural heritage.  They are very
                  different in spirit from the idea of getting neural implant chips to
                  make you smarter than the other kid, or having power muscles and
                  nanofiber-reinforced bones so you can act out your violent fantasies
                  from comic books.


                  Tattoos are ancient arts, yes...just like technological advancement and even technological enhancements are "ancient arts" (as well as modern arts). But "in spirit" tattoos are indeed very similar to having "power muscles and nanofiber-reinforced bones" implanted. One just mostly enhances esthetics, while the other mostly enhances function.

                  I'm not sure where you get the idea that anyone wants to act out violent fantasies. I personally work out hard to improve my ability to handle all aspects of life, and working out is in effect a minor form of personal enhancement practiced by hundreds if not thousands of millions of people. Why should a more significant form of personal enhancement cause anyone to have violent fantasies? I suspect this is revealing of what you fear about transhumanism...do you see it resulting in a world full of cyber-enhanced teenage boys bringing chaos to the world? If so, perhaps you need to realize that plenty of mature and thoughtful people are also interested in transhumanism. Also, if the advanced technology is available only in wearable form, as you support, it would do nothing to stop those "frightening teenagers" from doing exactly the same things.



                  > An old, maladapted species evolves into a new species with
                  > superior survival characteristics for a changed environment,
                  > and in that way the old species goes extinct.

                  It's interesting to know that you have a "college degree in evolution,"
                  but I don't think your use of terms in describing what I wrote was
                  correct, and in any case the problem with your statement is that it
                  ignores the essential fact that evolution works by one species or one
                  allele outcompeting and driving another or others to extinction.
                  Likewise your proposed "evolution" of humanity would work by driving
                  humans to extinction, which as a human I would naturally view as not a
                  good thing.


                  Yes, I have a degree in evolution (and also English), odd as it may sound. It's only proved useful in receiving a job for a year as a naturalist in the Galapagos Islands, but I love the topic. While it's certainly true that I didn't give a complete treatise on evolutionary theory, I didn't feel it was germane to the point (like saying the Sun is the source of Earth's heat is accurate but incomplete, since it makes no mention of such heat sources as radiation, gravity, convection, etc.). Your statement on how evolution functions was correct, although it too was incomplete. While some evolution (punctuated equilibrium) occurs through the founder effect or genetic drift when a small number of a species is separated from the main population, some evolution (gradualism) can also occur to large populations through slow but constant environmental changes. With gradualism there is also a constant rate of extinction of less fit genes, and competition is not between species but between genes best suited for the current environment. It does not require competition with a different species. Eventually an entire population will become a new species, thereby rendering the previous species extinct.



                  > You are the one who has stated that evolutionary stagnation is a
                  > "moral" issue for you.

                  No, I said the future of humanity is a moral issue.


                  Effectively it's the same thing. If you consider it a moral imperative to prevent the human species from evolving (naturally or otherwise), then you're espousing evolutionary stagnation.


                  Your thinking is a throwback to the fascistic futurism of the early 20th
                  century, or in any case stuck somewhere between Darwin and Hitler.  Only
                  jazzed up with microchips and genetic engineering and hypothetical
                  nanobots.  This "freedom of choice" business is a new twist, but it's
                  just a cover for capitalist coercion.  You would allow technology to
                  assume autonomy, and to seize control of what is properly the birthright
                  of humanity, by means of the "free market" but enforced by the usual
                  military brutality.  This is what I am against.  Letting ANY kind of
                  machine claim it has rights against those of ANY human being.


                  So I gather you are also against eugenics? It's a pity that eugenics was usurped by fascists, but that does not make the concept itself fascist any more than the concept of a military hierarchy is fascist.

                  I've always advocated the freedom of choice. I have no problem with you calling it capitalism, but I have to say you're living in the wrong country if you have something against capitalism. The control over individual improvement you're advocating is closer to socialism...which hasn't exactly proven a successful societal model. Pure capitalism fosters economic inequity between individuals, and presumably that is what you fear most, since unenhanced humans won't be as competitive. Well, the US does not function on pure capitalism; it has laws that help obviate economic inequity. Presumably such laws would be extended to transhumans and machine intelligence to allow them to coexist with unaugmented humans. I certainly don't advocate the "seizing of control" of what is "properly the birthright of humanity" (whatever that means), with or without "military brutality" (where do you come up with such accusations?). Nobody is advocating the elimination of humans, yet you're advocating the proactive elimination of any alternatives we may soon be able to create. I honestly can't see how your position could be any more "right" than mine.


                  > Pre-Copernican?

                  I call you pre-Copernican because you continue to believe in a moral
                  order which is given as the structure of the world: Evolution, in your
                  view, is objectively upward and good.


                  First, believing in a moral order is not "pre-Copernican" any more than believing the Earth is flat is "pre-Clinton"; second, I believe in no moral order; third, evolution is neither good nor evil, but is entirely neutral. You've missed the point entirely. The whole reason I mentioned evolution as a model to use in support of transhumanism is to counter your insistence that the human species remain stagnant. I certainly don't believe that evolution gives us a mandate to alter our species; I need no mandate! My desire to modify myself is mandate enough for me. But you unyieldingly maintain that we have a moral imperative to not allow humans to be modified. On what basis do you make this moral imperative? If there is any moral imperative at all concerning modification of species (and let me make this perfectly clear: I personally do not believe there is ANY moral imperative), then nature shows us that evolutionary change is fundamental to life. Insisting on stagnation runs counter to nature, so whatever you present as a reason for this moral imperative against change has to be stronger than nature. THAT'S my point. So I ask again...what is the reasoning behind your moral imperative for stagnation of the human species?


                  > I fully agree that our existence serves no purpose, except what we
                  > wish to make of it, and that we are the source of the only moral
                  > order that exists. That is precisely why I wish to "upgrade" my
                  > human physiology

                  The problem here is that even while putting the word "upgrade" in quotes
                  you continue to believe that you know which way is "up."  This appears
                  to you as a set of objective facts: more intelligence is better, more
                  physical strength is better, more visual acuity is better, etc.  But
                  these are culturally-based prejudices.  Sure, we'd like to be able to
                  see in the dark, but we have night-vision devices that enable this, so
                  what's the big deal?  Why would a humanoid that was part human part
                  technology be "better" just because it could see in the dark without the
                  aid of external night-vision devices that need not be any more of a
                  burden than the "implanted" ones would be?


                  You appear to have made another incorrect assumption about me. I don't maintain that a humanoid who was part human, part technology would be "better" than a normal human. If the humanoid was equipped to do something better than a normal human, then he would be better at doing that thing, that's all. It would not necessarily mean that he was "better" at anything else, or a "better" person. I put "upgrade" in quotes for precisely the opposite reason from what you deduced: my idea of an upgrade may be very different from what another person might consider an upgrade. It does not matter whether having more intelligence or strength or visual acuity is "better" or not. That's immaterial. What is important to the individual is simply that he may wish to have them. If I think I'd be a better person because of such upgrades, what of it? What do you care whether I think they'd make me better? If someone else thinks he'd be "better" without genetic or implanted augmentations, more power to him.



                  > of course we can "improve" upon the human species.... A simple
                  > example would be, let's say, the creation of a gene that allowed
                  > immunity to cancer.

                  Not biologically plausible.  However, we may expect to be able to cure
                  cancer with the use of advanced bio- and nanotechnology.  This could be
                  done without any need to permanently implant or incorporate any
                  technology into the body, nor would there be any advantage to doing so.


                  Agreed on all counts! But so what? You're missing the point of the thought experiment, which is to show that of course we could "improve" upon the human species, something you've denied being possible.


                  > what REALISTIC solution would you have to the "problem" of people
                  > like us eventually getting hold of technologies that will allow us
                  > to become the things you fear most: creatures or machines with vastly
                  > greater capabilities than normal humans?

                  Such freaks will be treated as they deserve. 


                  Ouch. Remind me again who's the fascist?

                  They will NOT have any vastly greater capabilities than humans who use technologies while
                  preserving the integrity of the human body and the human species.


                  No matter how much you wrap up normal humans with technology, they will still have certain limitations that a pure machine species might avoid much more readily. Humans have a lot of relatively inefficient mass and they can't take much acceleration, for example. These inefficiencies are already forcing limitations on vehicles, for instance. Motorcyclists can account for as much as a third or more of the total loaded weight of their machines, significantly slowing acceleration, braking and handling. Fighter jets are even now capable of accelerating far faster than a human can tolerate. Machines would make better pilots in both instances, as well as permitting vehicles much smaller than a normal human being.

                  However, technology itself is bringing capitalism to a crisis, and the
                  issue of who owns the future will have to be addressed by human
                  society.  The notion of competition, which is what gives rise to a lot
                  of this pathological thinking, will increasingly be seen as unneeded in
                  the human future.  Our purpose is not to be the smartest, the strongest,
                  the fastest or the most ruthless.  It is just to be, and to be human.


                  Capitalism is pathological thinking? And competition is wrong? I wouldn't be so disparaging about competition--it is what has fostered the development of technology, after all. It also underlies almost everything we do--it's how we improve ourselves, whether we compete against others or against ourselves, making competition inherently "human" (inherently "life," too). Even the things we enjoy most--from relationships to games--involve competition, either friendly or adversarial. Eliminate competition and you're left with a species that accomplishes little or nothing. If you actually believe that competition is wrong, then you advocate the stagnation of society and even the human mind. And it's no wonder you advocate the stagnation of human biology.


                  In any case, machines do not have rights.  The best way to avoid the
                  issue of ethical treatment of machines is by not permitting the creation
                  of machines for which this could be an issue.  But in no circumstances
                  will we permit such machines to "outcompete" humans under capitalist or
                  any other rules of competition.


                  I asked for your solution and this is it? Ban the creation of machine intelligence? I asked for a realistic solution--I even capitalized the word for emphasis. Do you honestly think it would be possible to impose a world-wide ban on the development of AI? That would likely prove almost as difficult as imposing a world-wide ban on the development of nanotechnology. Hell, human cloning carries a much greater stigma than does AI, and many are trying to ban it. But it's not being banned in all countries. Even if it were, someone would still attempt it--IS attempting it, as a matter of fact. So let's try again...what would be your REALISTIC solution?

                  Derek
                • Mark Gubrud
                  Message 8 of 13 , May 13 8:16 PM
                    Derek wrote:

                    > so someone with an artificial limb is less than fully human, eh?

                    If you want to say so. I would not. I think, though, that normally
                    someone with an artificial limb would prefer to have their natural limb
                    back. Since that is generally impossible, people with artificial limbs
                    might in some cases assert that they are perfectly happy and proud of
                    their prostheses, thank you. But I wouldn't believe them, even if I
                    might respect their privacy. I would think any person who has lost a
                    limb, or the use of a limb, must mourn such a loss at the deepest
                    personal level.

                    A person who deliberately chose the amputation of a perfectly good limb
                    in order to replace it with an artificial one would be viewed as crazy
                    by most people, and I think this would be an appropriate response even
                    if the artificial limb were made of nanostuff.

                    > I'm part metal (I've had a little dental work

                    Yeah, so have I. We have fillings put in our teeth, in order to keep
                    them as near as possible to an intact condition, and likewise people
                    have prostheses of all kinds in order to maintain as near as possible a
                    normal life and health.

                    > you wish to impose the first step history's most famous fascists
                    > applied to the Jews: make them seem less than human.

                    Your rhetoric is tiresome. You are the proto-fascist here, obsessed
                    with your vision of "becoming" an ubermensch and taking over the
                    universe.

                    > the status of "human" is measured by the mind, not the body.

                    No, the status of "human" is not measured at all. Either you are human,
                    or you are something else, like a dog or a fire hydrant.

                    > A flatlined brain in an otherwise perfect body, on the other hand,
                    > IS considered less than human

                    No, a person who is brain-dead is considered dead, a dead human.

                    > A functioning mind in a wasted body--or an entirely artificial
                    > body--is no less human than if it's in a perfectly healthy body
                    > I defy you to dispute that.

                    I dispute it. A human whose body is wasted and close to death is
                    considered a human whose body is wasted and close to death... and if
                    that person is at all aware, we have sympathy for him, and will try to
                    comfort him, keep him company until the end.

                    On the other hand, a "functioning mind in... an entirely artificial
                    body" is an object which does not exist, except in science fiction, and
                    in most of the movies I have seen such an object is not considered human
                    at all. A human brain kept alive in a box of some kind is generally and
                    properly an object of fear and loathing. Computer simulations of human
                    minds should be viewed likewise, for many reasons which we have by now
                    discussed ad nauseam.

                    > I suspect this is revealing of what you fear about transhumanism...
                    > do you see it resulting in a world full of cyber-enhanced teenage
                    > boys bringing chaos to the world?

                    No, because such critters are easily dealt with by means of ordinary
                    military force. It is the ideology itself which is ugly, fascistic and
                    anti-humanistic, as well as founded on covert superstitions and a
                    failure to comprehend the world and the human predicament as it truly
                    is.

                    > if the advanced technology is available only in wearable form, as you
                    > support, it would do nothing to stop those "frightening teenagers"

                    You can do a lot of damage with an AK-47. Technology is very powerful,
                    and humanity is frail. This is the essential fact which we must
                    respect. All your cyborg crap isn't going to change this fact. The
                    evil is in the superman fantasies, in the belief that you would be
                    something "better" than other people if you had nanofiber bones or night
                    vision or whatever.

                    > With gradualism there is also a constant rate of extinction of less
                    > fit genes, and competition is not between species but between genes

                    Right, that's what I said. The "less fit" genes are driven into
                    extinction. One form drives out another.

                    > Eventually an entire population will become a new species,
                    > thereby rendering the previous species extinct.

                    Right, exactly. Of course, the demarcation is somewhat arbitrary. But
                    my point is the same: there is no reason for us to desire human
                    extinction, whether by sudden catastrophe, being driven to extinction by
                    new species, or by "improving ourselves" out of existence.

                    > I gather you are also against eugenics?

                    If we look at the human genome, we will see many examples of clear
                    "genetic damage," where a functional gene was mutated to a nonfunctional
                    form, and where it is clear that the result is a state of less health
                    than with the gene in the previous functional form (although there are
                    cases such as sickle cell where the mutated form was somewhat
                    advantageous due to external stresses which can also be removed). When
                    it is clear that genetic health has been damaged and can be restored by
                    technological means, it will be considered desirable to do so.

                    This is very different from the attempt to breed superbabies. Again, we
                    have to recognize that technology is rendering the issue of superman
                    moot. No one is ever going to be able to store as much information in
                    her head as exists in google's disk farm. No one is ever going to be as
                    strong as a diesel engine. We don't need superpeople. The game's over.

                    > I've always advocated the freedom of choice. I have no problem
                    > with you calling it capitalism,

                    Actually, "freedom of choice" is just a slogan masking the coercive
                    reality of capitalism.

                    > but I have to say you're living in the wrong country if you have
                    > something against capitalism.

                    Dude, this is my country. I was born here. As for capitalism, it has
                    many virtues, and also many vices. I'm against market fundamentalism,
                    the idea that the market system is a moral value in itself, not
                    something to be judged according to how well it works for people - and
                    not just some people, but all the people.

                    > The control over individual improvement you're advocating is
                    > closer to socialism...

                    Even capitalist countries have laws to protect public safety.

                    > which hasn't exactly proven a successful societal model.

                    Actually, it has. There is no pure capitalism in the world; every
                    country has some sort of public sector. Even so-called failed states,
                    to the extent that they still function socially, have structures such as
                    the village and extended families. Personally, I think the European
                    model, with a larger public sector financed by heavier taxation,
                    compares very favorably with the American model, with our threadbare
                    commons, impoverished underclass and high crime rates. Our capitalism
                    needs a bigger dose of socialism.

                    > Pure capitalism fosters economic inequity between individuals

                    Yes, it does, and without mechanisms of redistribution such as
                    progressive taxation and public spending, wealth will concentrate to the
                    point that the system strangles itself and casts millions into poverty,
                    as happened in the 1930s, for example.

                    > presumably that is what you fear most, since unenhanced humans
                    > won't be as competitive.

                    No, no, no. What I fear is your stupidity in believing that this is the
                    case. You think "I'm going to become superman and win the big game."
                    It is this proto-fascist ideology that concerns me. The reality is
                    this: your nanobones and cyborg eyes aren't going to make you one whit
                    more competitive than an "unaugmented" human being who has MORE MONEY
                    and MORE ACCESS TO TECHNOLOGY than you do. It doesn't matter at all
                    whether the technology is implanted or not; in fact, implanting it can
                    only hamper its performance. The point is that we have to get over this
                    idiotic obsession with competition, because we are all going to be
                    outcompeted by technology. THE GAME IS OVER.

                    > I certainly don't advocate the "seizing of control" of what is
                    > "properly the birthright of humanity" (whatever that means),

                    You would transfer the rights of humanity to technology, creating a
                    competitor to our species. I am all in favor of our continuing to make
                    use of technology, but opposed to letting technology become an end in
                    itself, or worse, a self-interested force in competition with human
                    beings.

                    > with or without "military brutality" (where do you come up with
                    > such accusations?).

                    Capitalist "property rights" are always enforced by as much military
                    brutality as needed in order to suppress any challenge.

                    > Nobody is advocating the elimination of humans, yet you're
                    > advocating the proactive elimination of any alternatives we
                    > may soon be able to create.

                    We do not need to create any "alternatives" to humans, which would
                    threaten the possible "elimination of humans."

                    > you unyieldingly maintain that we have a moral imperative to not
                    > allow humans to be modified. On what basis do you make this moral
                    > imperative?

                    Protecting humanity against mortal threats is a moral imperative.

                    > If there is any moral imperative at all concerning modification of
                    > species (and let me make this perfectly clear: I personally do not
                    > believe there is ANY moral imperative), then nature shows us that
                    > evolutionary change is fundamental to life. Insisting on
                    > stagnation runs counter to nature,

                    Amazing. You make an argument, claim that you do not believe it, then
                    repeat it as proof. I take it as proof of what I said in the first
                    place: You are pre-Copernican in your belief that "nature" tells us what
                    is right and what is wrong.

                    > what is the reasoning behind your moral imperative for stagnation
                    > of the human species?

                    I DO NOT insist on "stagnation." With our technology, our sciences, our
                    arts, our relationships, our expansion into the universe, etc., etc., we
                    have lots of growing and exploring and living and maturing to do. What
                    I insist on is SURVIVAL of the human species. And that what is NOT
                    human is NOT to be accepted as "human," and is NOT to be allowed to
                    usurp the future from humanity, and NO ONE is to be allowed to create
                    such a thing that would or even potentially could usurp the human
                    future.

                    > Capitalism is pathological thinking? And competition is wrong?

                    Capitalist fundamentalism and the obsession with competition are
                    pathological.

                    > I wouldn't be so disparaging about competition--it is what has
                    > fostered the development of technology, after all

                    Cooperation, the opposite of competition, has fostered the development
                    of technology. This is obviously so, since technology represents the
                    cooperative efforts of millions of people over hundreds and thousands of
                    years. Competition has played a role, as well, but one way overstated,
                    and very often a destructive role, as when, for example, competing
                    software companies refuse to accept each other's standards - causing
                    huge levels of waste.

                    > Eliminate competition and you're left with a species that
                    > accomplishes little or nothing.

                    An experiment never performed, and never to be performed; I don't mean
                    to say we should eliminate competition from human life, but the obsession
                    with competition is harmful. We need to tone it down and recognize that
                    the game is over; it no longer matters who is the smartest or the
                    strongest because machines are stronger and smarter than we are. We
                    have to learn how to value each other, and to compete in the spirit of
                    fun, but not to drive each other into poverty, death, or extinction.

                    > Ban the creation of machine intelligence?

                    I never said so. I would ban the creation of self-willed,
                    self-interested machine intelligence, including anything modeled on the
                    human mind.

                    The other night, I wanted to find out the name of a rare bird I had
                    seen... I would like to be able to go to my computer, describe the bird,
                    have it ask me a few questions, and then have it show me a few pictures
                    until I see a picture of the bird I saw. I think that's coming, and I
                    don't see why the damn thing would have to think it was a person and
                    secretly plot to take over the universe in order to be able to help me
                    identify a bird species.

                    Likewise, we may want robots that can navigate and perform various
                    tasks, vehicles that can drive themselves, etc., but we don't need
                    robots that are seeking their own place in the world and we don't need
                    cars that decide for themselves where they'd like to go.
                  • Ooo0001@aol.com
                    Message 9 of 13, May 14 1:00 PM
                      In a message dated 5/13/2002 8:44:32 PM Pacific Daylight Time, mgubrud@... writes:


                      > so someone with an artificial limb is less than fully human, eh?

                      If you want to say so.  I would not.  I think, though, that normally
                      someone with an artificial limb would prefer to have their natural limb
                      back.  Since that is generally impossible, people with artificial limbs
                      might in some cases assert that they are perfectly happy and proud of
                      their prostheses, thank you.  But I wouldn't believe them, even if I
                      might respect their privacy.  I would think any person who has lost a
                      limb, or the use of a limb, must mourn such a loss at the deepest
                      personal level.


                      First, how a person feels about his artificial limb is not the point. The point was, you said in no uncertain terms that a person who is partly made of machine is less human than someone who is 100% normal. Now you're saying that's not the case. If you agree that someone with a prosthesis is no less human than a normal human, then you must also agree that a person with cybernetic augmentations is no less human either--for what are cybernetic augmentations but advanced prostheses?

                      Second, if you feel those who have lost a limb mourn the loss at "the deepest personal level" (something I dispute but won't bother pursuing for now), then having a functionally superior artificial limb would go a long way to easing that profound sense of loss, wouldn't it? I know it would work wonders for me. How do I know that? Well, with glasses I have almost 20/10 vision (and something like 20/200 without). Would I trade that visual acuity for 20/20 vision without glasses? Not on your life, despite how much I hate wearing glasses or contacts. Now make those glasses a permanent implant, and I'd be pleased as punch--the best of both worlds.



                      A person who deliberately chose the amputation of a perfectly good limb
                      in order to replace it with an artificial one would be viewed as crazy
                      by most people, and I think this would be an appropriate response even
                      if the artificial limb were made of nanostuff.


                      People lop off bits of their faces and bodies in the pursuit of nothing more than esthetics! For the same reasons they scar themselves and poke chunks of metal through their flesh. They're willing to use steroids to sacrifice their sex lives and even shorten their full lifespans for greater physical performance. Lopping off limbs to replace them with greater functionality really isn't much of a stretch. The only thing holding people back right now is the lack of functionally superior cybernetic enhancements. Sure, a lot of people call those who pursue plastic surgery, body piercing, scarification and steroid treatments crazy...but just see how effective calling them "crazy" is in an attempt to get them to stop. It's not working. And when decent cybernetic enhancements come along, it likely won't work then, either.



                      > I'm part metal (I've had a little dental work

                      Yeah, so have I.  We have fillings put in our teeth, in order to keep
                      them as near as possible to an intact condition, and likewise people
                      have prostheses of all kinds in order to maintain as near as possible a
                      normal life and health.


                      "Normal life and health" involves decay, sickness and crippling injuries. There's nothing normal to the human condition about filling dental cavities with metal, yet we do it to preserve or regain the function we're accustomed to. As technology progresses, we incorporate more and more of our technology into ourselves. That trend will surely continue, and even accelerate as prostheses exceed the performance capabilities of flesh.

                      Regardless, I don't consider you less than human for having metal fillings, and I appreciate your not considering me less than human for the same reason.



                      > you wish to impose the first step history's most famous fascists
                      > applied to the Jews: make them seem less than human.

                      Your rhetoric is tiresome.  You are the proto-fascist here, obsessed
                      with your vision of "becoming" an ubermensch and taking over the
                      universe.


                      First, I request that you don't resort to ad hominem attacks. Second, if you're going to resort to name-calling, at least be accurate--becoming an "ubermensch" and taking over the universe is not fascist or even "proto-fascist" any more than it's communist or proto-communist. You're misusing the term. Third, I've never made any claim for desiring to take over the universe, although I will concede that I wouldn't mind becoming an ubermensch. Then again, I suspect that admission would be true for most human beings throughout history.

                      The fact remains: I do not judge people as being more or less human based on their physiology. You have admitted that you do.


                      > A functioning mind in a wasted body--or an entirely artificial
                      > body--is no less human than if it's in a perfectly healthy body
                      > I defy you to dispute that.

                      I dispute it.  A human whose body is wasted and close to death is
                      considered a human whose body is wasted and close to death... and if
                      that person is at all aware, we have sympathy for him, and will try to
                      comfort him, keep him company until the end. 


                      I agree entirely with your statement. However, it doesn't even attempt to dispute my statement. Is a person who has a wasted body or artificial limbs considered less than human? No. Would a person who had an artificial body be considered less than human? Again, no. Yet this is what you've claimed all along with statements like "someone who is part human and part artificial is less than human."



                      On the other hand, a "functioning mind in... an entirely artificial
                      body" is an object which does not exist, except in science fiction, and
                      in most of the movies I have seen such an object is not considered human
                      at all.  A human brain kept alive in a box of some kind is generally and
                      properly an object of fear and loathing.  Computer simulations of human
                      minds should be viewed likewise, for many reasons which we have by now
                      discussed ad nauseam.


                      LOL! Of course a functioning mind in an entirely artificial body doesn't exist YET. But that's immaterial, since we are exploring the possibilities that appear on the technological horizon. A human brain kept alive in a box is NOT "properly" considered an object of fear and loathing. The bigotry of your position is appalling. It's not the fact that a mind is disembodied that makes it "evil"; it's the mind itself. Scientists have been portrayed in film as creators of monsters, like Frankenstein, so should all scientists be regarded with fear and loathing? That's just plain silly. It is the mind--whether it's in a human body or a box--that determines what is "evil." A "good" and intelligent mind, in a box or otherwise, is worthy of respect, rights and, yes, humanity. Tell me, would you have thrown rocks at the Elephant Man?


                      > I suspect this is revealing of what you fear about transhumanism...
                      > do you see it resulting in a world full of cyber-enhanced teenage
                      > boys bringing chaos to the world?

                      No, because such critters are easily dealt with by means of ordinary
                      military force.  It is the ideology itself which is ugly, fascistic and
                      anti-humanistic, as well as founded on covert superstitions and a
                      failure to comprehend the world and the human predicament as it truly
                      is.


                      On what evidence do you base any of this position? How is cybernetic enhancement indisputably ugly, fascist, anti-humanistic, covertly superstitious, and ignorant of the world and the human predicament as it truly is? Without a solid argument supporting such a subjective statement, it's nothing more than baseless sanctimoniousness.

                      You can do a lot of damage with an AK-47.  Technology is very powerful,
                      and humanity is frail.  This is the essential fact which we must
                      respect.  All your cyborg crap isn't going to change this fact.  The
                      evil is in the superman fantasies, in the belief that you would be
                      something "better" than other people if you had nanofiber bones or night
                      vision or whatever.


                      Agreed. Except that I don't maintain the position that a cyborg would be "better" than a normal human, just different. A cyborg may be better at certain tasks than a normal human, but not necessarily a better person. You're accusing me of beliefs that simply aren't true.

                      And since you admit that technology is powerful and humanity is frail, that sounds like a viable reason to toughen up humanity. But if you wish to remain frail, be my guest.


                      > Eventually an entire population will become a new species,
                      > thereby rendering the previous species extinct.

                      Right, exactly. Of course, the demarcation is somewhat arbitrary.  But
                      my point is the same: there is no reason for us to desire human
                      extinction, whether by sudden catastrophe, being driven to extinction by
                      new species, or by "improving ourselves" out of existence.


                      Agreed. I've nothing against some humans staying as they are. I've also nothing against others altering themselves into something different, something they would rather become. Neither should you.


                      > I gather you are also against eugenics?

                      If we look at the human genome, we will see many examples of clear
                      "genetic damage," where a functional gene was mutated to a nonfunctional
                      form, and where it is clear that the result is a state of less health
                      than with the gene in the previous functional form (although there are
                      cases such as sickle cell where the mutated form was somewhat
                      advantageous due to external stresses which can also be removed).  When
                      it is clear that genetic health has been damaged and can be restored by
                      technological means, it will be considered desirable to do so. 

                      This is very different from the attempt to breed superbabies.  Again, we
                      have to recognize that technology is rendering the issue of superman
                      moot.  No one is ever going to be able to store as much information in
                      her head as exists in google's disk farm.  No one is ever going to be as
                      strong as a diesel engine.  We don't need superpeople.  The game's over.


                      Again, you keep missing the point. It's very true that we don't need superpeople. It's also true that almost everyone alive wouldn't mind being at least a little bit of a superperson. Surely even you, Mark, have something that you wish you could do a lot better than you can now. NEED has never been the issue. Do we need most of what we have? Not even close. WANT is another matter entirely. Want is what drives our society, and want is what will bring about genetic enhancement, cybernetic enhancement and machine intelligence.

                      > but I have to say you're living in the wrong country if you have
                      > something against capitalism.

                      Dude, this is my country.  I was born here.  As for capitalism, it has
                      many virtues, and also many vices.  I'm against market fundamentalism,
                      the idea that the market system is a moral value in itself, not
                      something to be judged according to how well it works for people - and
                      not just some people, but all the people.

                      > The control over individual improvement you're advocating is
                      > closer to socialism...

                      Even capitalist countries have laws to protect public safety.

                      > which hasn't exactly proven a successful societal model.

                      Actually, it has.  There is no pure capitalism in the world; every
                      country has some sort of public sector.  Even so-called failed states,
                      to the extent that they still function socially, have structures such as
                      the village and extended families.  Personally, I think the European
                      model, with a larger public sector financed by heavier taxation,
                      compares very favorably with the American model, with our threadbare
                      commons, impoverished underclass and high crime rates.  Our capitalism
                      needs a bigger dose of socialism.


                      Not that I want to go off on a political tangent here, but the reason European countries are economically weak compared to the US is precisely their socialist practices. The strongest economies are capitalistic. Hong Kong has no natural resources and should be an economic backwater, yet it's one of the wealthiest economies per capita because of its loose obstacles to doing business. All other things being equal, you have a choice between capitalism, economic strength, and inequality on one side, and socialism, economic weakness, and equality on the other. Fortunately, we have the full gamut of political ideologies available in this world. If you don't like the way the US does things, there are other countries you could try. But be aware: I've personally lived in more than half a dozen different countries and visited half a dozen more, and I've seen no model as successful as the US model, despite its myriad flaws. There's a reason people the world over think of the US as the "land of opportunity." Once you get out of college, I recommend you spend a few years overseas. I think your black-and-white, "I know what's best" ivory tower view of the world will change dramatically.

                      > presumably that is what you fear most, since unenhanced humans
                      > won't be as competitive.

                      No, no, no.  What I fear is your stupidity in believing that this is the
                      case.  You think "I'm going to become superman and win the big game."
                      It is this proto-fascist ideology that concerns me.  The reality is
                      this: your nanobones and cyborg eyes aren't going to make you one whit
                      more competitive than an "unaugmented" human being who has MORE MONEY
                      and MORE ACCESS TO TECHNOLOGY than you do.  It doesn't matter at all
                      whether the technology is implanted or not; in fact, implanting it can
                      only hamper its performance.  The point is that we have to get over this
                      idiotic obsession with competition, because we are all going to be
                      outcompeted by technology.  THE GAME IS OVER.


                      LOL! Again, you've missed the point entirely. Mark, I have no interest in using cybernetic implants or being made entirely machine for competitive purposes against other people! Why do you see me as scheming to take over the world? I suspect you have read far too many comic books. But I'm 39, Mark, and philosophically about as far away from a teenager as you can get. I really don't mind the idea that unaugmented but wealthier humans may have access to more powerful technology than myself. More power to them! My own interest in augmentation is much more mundane: there are certain activities which I enjoy (or would enjoy) doing but that put me at risk, and I would like to reduce that risk. But I prefer the concept of implanted technology over worn technology. It's as simple as that: preference. The game is not over, Mark, because I am not playing your game.

                      You would transfer the rights of humanity to technology, creating a
                      competitor to our species.  I am all in favor of our continuing to make
                      use of technology, but opposed to letting technology become an end in
                      itself, or worse, a self-interested force in competition with human
                      beings.


                      Actually, I would share the rights of humanity with human-like technology. That's a major difference. If the world were as ruthlessly competitive as you seem to think, Mark, the less physically strong women would be completely dominated by the more physically strong men (in fact, that's the case in some places, but not in the most advanced nations). I happen to believe that "human rights" could and should be shared with all forms of human-equivalent sentience, either encountered or manufactured.

                      Capitalist "property rights" are always enforced by as much military
                      brutality as needed in order to suppress any challenge.


                      As opposed to MORE military brutality than is needed in many other economic systems, including socialist ones (where there are any property rights at all).

                      > you unyieldingly maintain that we have a moral imperative to not
                      > allow humans to be modified. On what basis do you make this moral
                      > imperative?

                      Protecting humanity against mortal threats is a moral imperative.


                      Talk about overkill. Human modification does not necessarily imply any mortal threat to humanity. It could just as easily imply greater survival potential through diversity.


                      > If there is any moral imperative at all concerning modification of
                      > species (and let me make this perfectly clear: I personally do not
                      > believe there is ANY moral imperative), then nature shows us that
                      > evolutionary change is fundamental to life. Insisting on
                      > stagnation runs counter to nature,

                      Amazing.  You make an argument, claim that you do not believe it, then
                      repeat it as proof.  I take it as proof of what I said in the first
                      place: You are pre-Copernican in your belief that "nature" tells us what
                      is right and what is wrong.


                      Do you not understand the concept of a hypothetical argument? You should--this whole thread runs on the hypothetical position of our eventually having the technological capacity to create "transhumans" and AI. I make no claim that transhumanism, AI, or even nanotechnology will certainly come to pass, yet I argue that IF the technologies arrive, there are certain implications of those technologies that are worth considering. My above statement runs along the same lines: IF there is any moral imperative. I make no claim that there is a moral imperative; in fact, I even go so far as to say I don't believe in moral imperatives. I do not even know whether I believe nanotechnology will be as fully developed as many of us hope, but that doesn't negate my ability to argue about its consequences hypothetically. I'm surprised you have trouble understanding the very nature of this whole debate, Mark. I would recommend you take a course on debate.


                      > what is the reasoning behind your moral imperative for stagnation
                      > of the human species?

                      I DO NOT insist on "stagnation."  With our technology, our sciences, our
                      arts, our relationships, our expansion into the universe, etc., etc., we
                      have lots of growing and exploring and living and maturing to do.  What
                      I insist on is SURVIVAL of the human species.  And that what is NOT
                      human is NOT to be accepted as "human," and is NOT to be allowed to
                      usurp the future from humanity, and NO ONE is to be allowed to create
                      such a thing that would or even potentially could usurp the human
                      future.


                      You're off topic; we were talking about evolutionary stagnation, not technological stagnation. You insist that it is a moral imperative for the human species to remain genetically unchanging. That is the definition of evolutionary stagnation. The human lineage has comprised several slightly differing species in the past; what's wrong with that trend continuing? If there had been a Homo erectus with both your fanatical insistence on genetic purity and the means to enforce it, Homo sapiens never would have come to be.

                      > I wouldn't be so disparaging about competition--it is what has
                      > fostered the development of technology, after all

                      Cooperation, the opposite of competition, has fostered the development
                      of technology.  This is obviously so, since technology represents the
                      cooperative efforts of millions of people over hundreds and thousands of
                      years.  Competition has played a role as well, but one that is greatly
                      overstated, and very often a destructive role, as when, for example,
                      competing software companies refuse to accept each other's standards -
                      causing huge levels of waste.


                      As compared to the waste generated in socialist countries, where lack of competition results in inferior quality goods and a lack of choice? Just look at the products produced by socialist countries; there's little quality, little diversity, little choice. Competition BREEDS cooperation; the two work hand in hand in the development of technology. Too much of one or the other results in inefficiency and waste. Trumpeting cooperation while disparaging competition doesn't make a lot of sense.


                      > Eliminate competition and you're left with a species that
                      > accomplishes little or nothing.

                      An experiment never performed, and never to be performed; I don't mean
                      to say we should eliminate competition from human life, but the obsession
                      with competition is harmful.  We need to tone it down and recognize that
                      the game is over; it no longer matters who is the smartest or the
                      strongest because machines are stronger and smarter than we are.  We
                      have to learn how to value each other, and to compete in the spirit of
                      fun, but not to drive each other into poverty, death, or extinction.


                      I'll agree with the last sentence. But the competition game will never be over. If two people ever want the same job position, there will be competition. And being the smartest or the strongest will matter then. The cutthroat competition you're referring to IS being toned down, so I'm not sure what you're all upset about. Yesterday, losing out to the smartest or strongest might have meant you starved to death; today, losing gets you on unemployment; tomorrow, losing may well have no negative consequences.

                      Derek
                    • Mark Gubrud
                      Message 10 of 13 , May 14 5:47 PM
                        First of all, Derek, I'll admit it, you're wearing me out. I have other
                        things to do besides engage in an endless, repetitive wordplay duel with
                        someone who isn't at all interested in thinking about any of the
                        arguments that have been made. So I'll probably cut this off soon if
                        you don't.

                        Derek wrote:

                        > you said in no uncertain terms that a person who is partly made of
                        > machine is less human than someone who is 100% normal.

                        I did not say what you accused me of, namely that a person who uses an
                        artificial limb or other prosthetic device is "less human." Such a
                        person is trying to live as fully as possible. The prosthetic device is
                        not part of the person. It is not alive. It is something which enables
                        the person to regain otherwise lost capabilities. However, if you
                        regard the person plus prosthetic as constituting a system, then I
                        suppose that formally such a system would be less than fully human,
                        since part human and part not. The human part of the system, however,
                        is in no sense less than fully human.

                        > then you must also agree that a person with cybernetic augmentations
                        > is no less human either

                        The human part would be no less than human, but if you keep subtracting
                        more and more of a person, eventually you will no longer have a person.

                        > having a functionally superior artificial limb would go a long way
                        > to easing that profound sense of loss, wouldn't it?

                        I imagine it would make it easier to accept the fact, but it would
                        hardly erase the loss.

                        > They're willing to use steroids to sacrifice their sex lives and
                        > even shorten their full lifespans for greater physical performance.

                        Yeah, and I don't mind saying that's sick.

                        > Lopping off limbs to replace them with greater functionality
                        > really isn't much of a stretch.

                        And it would be just as sick.

                        > health" involves decay, sickness and crippling injuries.

                        No, health is the opposite of decay, sickness and crippling injuries.
                        When I used the words "normal life and health" I did not mean normal
                        death and illness.

                        > Would a person who had an artificial body be considered less than human?

                        If it is not a human body (I'll allow that a human body could be created
                        by artificial means), then yes... and it would not even be correct to
                        describe such a thing as a "person."

                        > A human brain kept alive in a box is NOT "properly" considered
                        > an object of fear and loathing.

                        Ugh.

                        > A "good" and intelligent mind, in a box or otherwise, is worthy
                        > of respect, rights and, yes, humanity.

                        Let's get back to talking about real things. Do you mean a brain in a
                        box? A disembodied human brain? God forbid such a thing should ever be
                        created, and if not God, the law.

                        > would you have thrown rocks at the elephant man?

                        Umm, no, why, would you?

                        > How is cybernetic enhancement indisputably ugly, fascist,
                        > anti-humanistic, covertly superstitious, and ignorant of the world
                        > and the human predicament as it truly is?

                        I think we've been over this.

                        > since you admit that technology is powerful and humanity is frail,
                        > that sounds like a viable reason to toughen up humanity.

                        In your ugly, fascistic, anti-humanistic and covertly superstitious way
                        of thinking. But how are you going to "toughen up humanity" to nuclear
                        attack? Or to attack by voracious self-replicating nanosystems? How
                        are you going to make "humanity" capable of numerical processing or
                        database searches at computer speeds? What are you going to do about AI
                        systems of megabrain capability?

                        What I keep telling you is this: Your superman fantasies are a joke!

                        The only way to protect humanity against the power of technology is to
                        keep humanity in control. This absolutely implies the need not to
                        "merge" humanity with technology, and not to create technological
                        systems which claim or are granted the rights of human beings.

                        > almost everyone alive wouldn't mind being at least a little bit
                        > of a superperson.

                        Almost everyone alive might want to remove some imperfections,
                        weaknesses, health problems, etc. But not to be something other than
                        human. Sure, some people might want to try fantasies of being a bird or
                        tiger or something, but I think virtual reality simulations, if properly
                        done, will prove very satisfying in this regard. And as I have said,
                        people can enjoy a lot of different physical experiences with various
                        kinds of outboard gear.

                        The really important thing will be to develop our minds, and if we do
                        this, we will understand the value of preserving our humanity intact.

                        > the reason European countries are economically weak compared to the
                        > US is precisely because of the socialist practices.

                        What do you mean by "economically weak?" The people enjoy a better
                        life. No doubt it's better for corporations to pay lower taxes and to
                        be able to extract more work for lower wages under less safe conditions
                        and with less regard to the environment, but I wouldn't equate the
                        competitive health of corporations with the economic health of the
                        nation.

                        > Hong Kong has no natural resources and should be an economic
                        > backwater, yet it's one of the wealthiest nations per capita
                        > because of its loose obstacles to doing business.

                        Hong Kong and the other "Asian tigers" draw on huge pools of desperate
                        but disciplined low-wage labor, first at home and later in the less
                        developed countries nearby.

                        > you have a choice between capitalism, economic strength, and
                        > inequality on one side, and socialism, economic weakness,
                        > and equality on the other.

                        The choice is not so stark. Some level of inequality is a necessary and
                        not-too-bad evil. Too much inequality produces economic contraction and
                        depression. The US would be better off with more socialism, more like
                        Europe or even Canada. That might not be good for certain US
                        corporations, but I couldn't care less.

                        > If you don't like the way the US does things, there are other
                        > countries you could try.

                        Who are you to say such a thing to me? It's my country, mate.

                        > There's a reason people the world over think of the US as the
                        > "land of opportunity."

                        Because they're dirt poor. They'll come here and work for a pittance
                        and be better off. So what?

                        > I have no interest in using cybernetic implants or being made
                        > entirely machine for competitive purposes against other people!

                        Perhaps not, at least not consciously, yet I will assert that this is
                        the main psychology and motivation underlying the movement you
                        represent.

                        > I would share the rights of humanity with human-like technology....
                        > I happen to believe that "human rights" could and should be shared
                        > with all forms of human-equivalent sentience, either encountered or
                        > manufactured.

                        The question is why would we manufacture such things, especially given
                        that "sharing" with them might mean letting them take most or all of
                        everything... in fact, it inevitably would, unless we maintained
                        control... which is all I am arguing for.

                        > You insist that it is a moral imperative for the human species to
                        > remain genetically unchanging. That is the definition of evolutionary
                        > stagnation.

                        I insist that it is a moral imperative to protect human survival, and
                        you asserted that accumulation of genetic modifications eventually makes
                        a former species extinct. I don't accept your paradigm in which human
                        survival is to be regarded as "evolutionary stagnation," because I do
                        not view "evolution" as an overarching moral framework, particularly not
                        in the biological sense. Our survival is meaningful; we are valuable
                        unto ourselves, not as just another link in some grander evolutionary
                        chain.

                        > waste generated in socialist countries, where lack of competition
                        > results in inferior quality goods and a lack of choice?

                        Again, I take my model of socialism from the European market-socialist
                        countries, where there is competition and lots of choice and the quality
                        of goods often compares favorably with what is produced here or
                        elsewhere.

                        > The cutthroat competition you're referring to IS being toned down,
                        > so I'm not sure what you're all upset about. Yesterday, losing out
                        > to the smartest or strongest might have meant you starved to death;
                        > today, losing gets you on unemployment; tomorrow, losing may well
                        > have no negative consequences.

                        I think it's been the case for a long time that you are more likely to
                        lose to the ruthless or lucky or well-connected than to the smartest or
                        strongest. I agree that over the long term the trend has been toward a
                        softening, but over the past two decades the trend has been in the
                        opposite direction, driven by a resurgence of "individualistic" and
                        hyper-competitive ideology, itself driven by a propaganda counterattack
                        from wealthy, Right-wing and corporate interests (plus some cooptation
                        of the counterculture).
                      • Ooo0001@aol.com
                        Message 11 of 13 , May 15 1:24 PM
                          In a message dated 5/14/2002 8:42:28 PM Pacific Daylight Time, mgubrud@... writes:


                          First of all, Derek, I'll admit it, you're wearing me out.  I have other
                          things to do besides engage in an endless, repetitive wordplay duel with
                          someone who isn't at all interested in thinking about any of the
                          arguments that have been made.  So I'll probably cut this off soon if
                          you don't.


                          On the contrary, I do indeed think carefully about the arguments that have been made--my in-depth responses should make that clear. The fact that I've rejected your positions is just indicative of your failure to convince me. That's okay--I've failed to convince you, too. However, if I weren't interested in the arguments, I wouldn't be debating these points. Yes, there is a fair amount of repetition on my part (likewise on yours), but only because you often don't give sufficient answers to support your positions, or you simply refuse to answer the most important questions without my asking them several times. Again, I DO want you to do your best to convince me; I figure that if anyone who supports your perspective can change my mind, it's someone with your tenaciousness. However, I find it heartening that the only evidence you have to support your positions appears to ultimately boil down to your own personal fears and hatreds coupled with a certain insular naivete about how the world works. I have no doubt that a few years of experience will broaden your perspective.


                          > you said in no uncertain terms that a person who is partly made of
                          > machine is less human than someone who is 100% normal.

                          I did not say what you accused me of, namely that a person who uses an
                          artificial limb or other prosthetic device is "less human." 


                          Actually, you did. Several times, in fact. Here's one of your quotes if you need a reminder:

                          "You mean less than fully human, or something part person, part not."

                          You said that in response to the human status of a cyborg. Well, a cyborg is just a high-tech version of someone with an artificial limb. As long as the mind retains its human personality, a cyborg will be considered no less human than you or I, no matter how much flesh has been replaced by machine. As well it should be, since a full cyborg is merely a further extension of the person with the artificial limb.

                          What if a person lost all his limbs and most organs in some accident, but could be saved by replacing the missing parts with cybernetic versions? That's using technology to save a person's life, something not far removed from using a mechanical heart to replace a failing natural heart--which is something you've already given your lofty seal of approval. Is it only "okay" to use artificial limbs and organs when they're to replace body parts that have been lost through accident? Leaving aside the argument that the purposeful removal of one's limbs is "sick," all such a decree would do is cause those who want to become cybernetic to stage limb-removing accidents, or to have the job done illegally and claim the limbs were lost to some accident.

                          Another area that seems to upset you is when an artificial limb or organ exceeds the functional capabilities of the replaced body part. Well, I have news for you: artificial replacements already exceed normal function in some ways. Titanium hip replacements are far stronger than normal bone. When cybernetic replacements become functionally more effective than natural body parts in all ways, are you going to insist that replacement parts be "dumbed down" or weakened to make them no more effective than normal flesh? How do you think such a cyborg would feel if someone else could wear a strength-enhancing glove, but his own cybernetics would have to be limited to the strength of common flesh?

                          The issue begins to get complicated, doesn't it? Unless you strip humans with artificial limbs of their human status, these issues will come up and your attempts to control the proliferation of cybernetic replacements will likely prove impossible. But good luck trying to get the world to regard someone who's missing a limb as being somehow not fully human.



                          Such a person is trying to live as fully as possible.  The prosthetic device is
                          not part of the person.  It is not alive.  It is something which enables
                          the person to regain otherwise lost capabilities.  However, if you
                          regard the person plus prosthetic as constituting a system, then I
                          suppose that formally such a system would be less than fully human,
                          since part human and part not.  The human part of the system, however,
                          is in no sense less than fully human.


                          Ah, very good! But the person-plus-prosthetic system isn't considered relevant when deciding whether or not a person is deserving of human status, right? So that leads into the next conclusion:



                          > then you must also agree that a person with cybernetic augmentations
                          > is no less human either

                          The human part would be no less than human, but if you keep subtracting
                          more and more of a person, eventually you will no longer have a person.


                          Ah, but what you're missing here is that as long as the mind is entirely intact--even if EVERY other body part is excised--you still have someone who is no less of a person than someone who is fully intact. It's only once you start removing the functions of the mind that you start having less and less of a person until you no longer have a person.

                          This is an important point because it de-emphasizes the importance of any part of the human body EXCEPT the mind. If the mind is preserved, the result is still human. That is the logic behind why a full cyborg should have no fewer rights than a normal human.

                          Now let's take it one more step: if the flesh isn't important when it comes to defining what is human, and the brain is flesh, then the brain is also unimportant when it comes to defining what is human. The only reason the brain has any importance at all is because it houses the mind, and without the mind you don't have something deserving of--or in need of--human rights. Again, I draw your attention to the disposable status of flatlined but otherwise living brains as proof that this is the case. So if we find some way to copy the entire function of the human brain into a purely machine substrate, then the ONE thing that defines a being as worthy of human rights is preserved. A machine mind is therefore deserving of the exact same human rights as you or I. You may not like the thought, but you can't argue with the logic.

                          Your solution is to preserve the status quo by banning such technologies. Well, it's not that simple. If you allow current artificial limbs and organs, then how can you ban advanced artificial limbs and organs? If you allow advanced artificial limbs and organs, then how can you ban cybernetic limbs and organs? If you allow cybernetic limbs and organs, how can you ban full body cybernetic replacements? And if you allow full body cybernetic replacements, then how can you ban machine mind copies? Where do you draw the line, and--more importantly--what makes you think your line drawing is any more valid than anyone else's line drawing? Profound change often sneaks up from many small, reasonable steps, and I suspect that is how machine minds will become fully accepted by society.


                          > They're willing to use steroids to sacrifice their sex lives and
                          > even shorten their full lifespans for greater physical performance.

                          Yeah, and I don't mind saying that's sick.
                           
                          > Lopping off limbs to replace them with greater functionality
                          > really isn't much of a stretch.

                          And it would be just as sick.


                          Whether or not it's sick is immaterial. The fact is that people do these things all the time, whether or not they're "sick," whether or not they're even legal. Many athletes use performance enhancing drugs even if they're illegal. Do you really think it's realistic to believe that banning future enhancement technologies will work?

                          > Would a person who had an artificial body be considered less than human?

                          If it is not a human body (I'll allow that a human body could be created
                          by artificial means), then yes... and it would not even be correct to
                          describe such a thing as a "person."


                          Ah, now either you're backtracking or I didn't make myself clear. Let me clarify: Would a person who had an artificial body but an otherwise normal human brain be considered less than human? You did admit above that a cyborg could still be considered a person. Although you did specify that the cyborg would be considered less and less of a person as more and more of him was replaced. Presumably you mean only his brain, since otherwise you again run into the problem of defining someone with an artificial limb being considered less than human. However, since societally and legally it is the mind, not the flesh, that is used to define what's deserving of human rights, then even a person with a fully artificial body (including the brain) would be considered worthy of full human rights.


                          > A human brain kept alive in a box is NOT "properly" considered
                          > an object of fear and loathing.

                          Ugh.

                          > A "good" and intelligent mind, in a box or otherwise, is worthy
                          > of respect, rights and, yes, humanity.

                          Let's get back to talking about real things.  Do you mean a brain in a
                          box?  A disembodied human brain?  God forbid such a thing should ever be
                          created, and if not God, the law.


                          Sorry, I thought you'd understand I was speaking metaphorically. A normal human brain in a fully cybernetic body (or some other life support system) is a "brain in a box."

                          > How is cybernetic enhancement indisputably ugly, fascist,
                          > anti-humanistic, covertly superstitious, and ignorant of the world
                          > and the human predicament as it truly is?

                          I think we've been over this.


                          No, we haven't. You make these statements without a logical argument to back your position. Just saying something is so doesn't make it so. You have to support your arguments for them to have any meaning in this discourse.

                          > since you admit that technology is powerful and humanity is frail,
                          > that sounds like a viable reason to toughen up humanity.

                          In your ugly, fascistic, anti-humanistic and covertly superstitious way
                          of thinking.  But how are you going to "toughen up humanity" to nuclear
                          attack?  Or to attack by voracious self-replicating nanosystems?  How
                          are you going to make "humanity" capable of numerical processing or
                          database searches at computer speeds?  What are you going to do about AI
                          systems of megabrain capability?


                          Your argument makes about as much sense as saying one shouldn't exercise because we'll never be as strong as an elephant. The answer is you probably can't toughen up a person to survive nuclear attack. But with the right technologies you can toughen up a person to survive a lot of otherwise fatal conditions. That's worthwhile enough.

                          As for making humanity capable of numerical processing, etc., that question has been addressed far more completely than I could in articles on the Singularity. The short answer is that we may have to merge partially or completely with our machines. Copying human minds to machine substrates would be a step closer to that eventuality, at least giving us the option of how complete to make the merge. As for what I'm going to do about AI systems of megabrain capability, I'm going to support their research. Even if they turn out to be undesirable, for whatever reason, I would like to know as much as possible about their potential risks and rewards well in advance of their being developed first by others in more secretive environments.



                          What I keep telling you is this: Your superman fantasies are a joke!


                          Then feel free to laugh all you want. But considering that guns and body armor and night-vision goggles and radio communications and helicopters make even soldiers today "supermen" compared to soldiers of centuries past, I think it is all but inevitable that soldiers of the future will be supermen compared to soldiers today. How will their devices make these "superman fantasies" any less real if they're implanted rather than worn? And if the use of military examples upsets you, feel free to substitute PCs and cell phones and the Internet and passenger vehicles instead. You and I are both supermen compared to people even 100 years ago. The average citizens of 30 years hence will be supermen compared with today. So what's the big deal?

                          I guess the joke is on you. ;-)


                          The only way to protect humanity against the power of technology is to
                          keep humanity in control.  This absolutely implies the need not to
                          "merge" humanity with technology, and not to create technological
                          systems which claim or are granted the rights of human beings.


                          Merging with technology and creating artificial life forms needn't imply that humanity will lose control of technology. It might--just like nuclear power might have and might still destroy our society--but it's not a given.

                          Almost everyone alive might want to remove some imperfections,
                          weaknesses, health problems, etc.  But not to be something other than
                          human.  Sure, some people might want to try fantasies of being a bird or
                          tiger or something, but I think virtual reality simulations, if properly
                          done, will prove very satisfying in this regard.  And as I have said,
                          people can enjoy a lot of different physical experiences with various
                          kinds of outboard gear.


                          Mark, I've never argued that virtual reality simulations wouldn't do a great job in that regard. I'm saying that it won't be enough for everyone. When the technologies arrive to allow marked personal enhancements...those enhancements will be made. It's the nature of our society whether you like it or not.


                          The really important thing will be to develop our minds, and if we do
                          this, we will understand the value of preserving our humanity intact.


                          You're forgetting that different people--educated people--define humanity differently. And the more educated the mind, the more broad and less black-and-white the definitions become.


                          > the reason European countries are economically weak compared to the
                          > US is precisely because of the socialist practices.

                          What do you mean by "economically weak?"  The people enjoy a better
                          life.  No doubt it's better for corporations to pay lower taxes and to
                          be able to extract more work for lower wages under less safe conditions
                          and with less regard to the environment, but I wouldn't equate the
                          competitive health of corporations with the economic health of the
                          nation.


                          <G> You've obviously not lived in other countries for any length of time. I don't mean playing tourist, but actually making a living in other countries for at least six months, preferably a year or more. If you had, your perspective on what life is like there would not be so naive. Do yourself a favor: as soon as you get the chance, spend at least six months in each of at least three different countries (you can even pick Canada, if you wish). I guarantee you will be in for a major broadening of perspective. Until then, there's not much point in discussing political systems with you (don't complain--at least it shortens the list of things to argue about). Unless and until you live in a system you venerate, you really have no idea what works and what doesn't.

                          > I have no interest in using cybernetic implants or being made
                          > entirely machine for competitive purposes against other people!

                          Perhaps not, at least not consciously, yet I will assert that this is
                          the main psychology and motivation underlying the movement you
                          represent.


                          Well, if you're going to make a claim contrary to what we state is our position, then you'll have to back it up with evidence for it to have any meaning. Otherwise this is just your problem, something you need to get over so you can move on.


                          > I would share the rights of humanity with human-like technology....
                          > I happen to believe that "human rights" could and should be shared
                          > with all forms of human-equivalent sentience, either encountered or
                          > manufactured.

                          The question is why would we manufacture such things, especially given
                          that "sharing" with them might mean letting them take most or all of
                          everything... in fact, it inevitably would, unless we maintained
                          control... which is all I am arguing for.


                          The same argument can be made for not having kids--hell, they'll take the world away from us and inherit all we've got, sure as shit! Yet most adults accept the sacrifice. And we rarely succeed in molding children precisely to our preferences. Machines, however, should be much more definable. I see no problem with passing on property to machines that think like we want them to.

                          I insist that it is a moral imperative to protect human survival, and
                          you asserted that accumulation of genetic modifications eventually makes
                          a former species extinct.  I don't accept your paradigm in which human
                          survival is to be regarded as "evolutionary stagnation," because I do
                          not view "evolution" as an overarching moral framework, particularly not
                          in the biological sense.  Our survival is meaningful; we are valuable
                          unto ourselves, not as just another link in some grander evolutionary
                          chain.


                          Hey, you don't have to like it, but the survival of humans as you define them IS evolutionary stagnation. That's true by definition. I'm not saying evolutionary stagnation is a bad thing, but neither is evolutionary dynamism. As I explained (which you seem to forget quite quickly), I don't attach a moral prerogative to either position. But I certainly don't think your status quo values should be imposed upon others with a desire to at least try improving themselves (however they choose to define that).

                          All these arguments boil down to the basic idea that you find the idea of modifying humans and creating intelligent machines, and then giving them the status of normal humans, deeply abhorrent, and your solution is to ban those technologies. Others, myself included, are at least interested in considering both options. Since you apparently don't support the banning of the underlying genetic engineering, nanotech, etc. technologies that would ultimately permit the development of both transhumans and AI (and most certainly couldn't stop them even if you wanted to), and considering that there are always nations and individuals who will pursue interests contrary to your positions once the underlying technologies become available, I think you will find your intents sidelined in short order.

                          If you don't believe me, take another look at the issue of cloning. Just today a person who intends to clone a person this year is speaking before the House. Despite laws banning human cloning, the government can't stop him, since there's always a way to work around banning laws. Your attempts to ban transhumanism and AI will very likely prove just as effective.

                          Derek
                        • soreff
                          Message 12 of 13 , May 15 4:19 PM
                            --- In nanotech@y..., Ooo0001@a... wrote:
                            > In a message dated 5/14/2002 8:42:28 PM Pacific Daylight Time,
                            > mgubrud@s... writes:
                            >
                            > > > I would share the rights of humanity with human-like technology....
                            > > > I happen to believe that "human rights" could and should be shared
                            > > > with all forms of human-equivalent sentience, either encountered or
                            > > > manufactured.
                            > >
                            > > The question is why would we manufacture such things, especially given
                            > > that "sharing" with them might mean letting them take most or all of
                            > > everything... in fact, it inevitably would, unless we maintained
                            > > control... which is all I am arguing for.
                            >
                            > The same argument can be made for not having kids--hell, they'll take the
                            > world away from us and inherit all we've got, sure as shit! Yet most adults
                            > accept the sacrifice. And we rarely succeed in molding children precisely to
                            > our preferences. Machines, however, should be much more definable. I see no
                            > problem with passing on property to machines that think like we want them to.

                            Excellent point, Derek. I'd regard a machine which contained a
                            copy of my mind as _vastly_ closer than a child which merely had
                            copies of some of my DNA.

                            Best wishes,
                            -Jeff