Re: [nanotech] I am the walrus
- Mark Gubrud wrote:
> On Thu, 30 Mar 2000, Samantha Atkins wrote:
> > I have a big problem labeling the real possibilities of transcending so
> > many of the limits that some associate with being human and cling to
> You have still not explained what it is that will "transcend." Does an
> airplane transcend the human inability to fly? Or do humans just ride
> in, and sometimes pilot, airplanes? Perhaps sometimes a pilot has the
> feeling of becoming one with his craft, but then perhaps someone flying
> on LSD has a feeling of being one with a tree. Is this transcendence?
> > being simply due to "greed, fear and perversity(!)".
> I was perhaps being a bit too glib. Greed: capitalist profit. Fear:
> nationalist militarism. Perversity: rejection of humanity.
You make pointless assertions based on views that are debatable and have
been debated without adequate answer from you. It is a very
questionable tactic that you employ.
> > I am greedy. Greedy for far more abundant life and living
> You can't live except as what you are. You exist only in the present,
> within a local region of spacetime. It is only evolutionary conditioning
> that makes you even care about tomorrow. But if what you want for
> tomorrow is that a functional copy of your brain should exist in a
> machine, not that your body should continue to live and be healthy, then
> you do not even want to be alive. You might as well be dead; indeed, you
> would be.
> > and working intelligence than evolution has dealt to our species.
> Humans normally strive to increase their intelligence. But when
> "transhumanists" talk about multiplying personal intelligence by
> astronomical factors, they are talking nonsense. You could never
> incorporate such huge information flows into your consciousness, and any
> entity that did could not possibly be "you" in any meaningful sense.
Says you. Again, you offer nothing but assertions. Yes, huge
information flows and the ability to handle them will necessitate other
changes in consciousness, but human beings are not static in any case.
That you change does not mean you are not you. It means that you
change. You are still you in every bit as meaningful a sense as you are
you when you go from being a fetus to a baby, baby to child, child to
adult. That these changes don't just happen naturally does not justify
an argument that they will make you not you.
> > I guess that makes me "perverted" eh?
> To reject humanity is perverse. However, I feel that you are merely
> misguided.
Why thank you, but I would rather be perverted, especially if your
misguided limitations on humanity are the standard.
> > > Human is US.
> > Human is therefore whatever we define it as. Don't limit it overmuch.
> No, I am referring to an objective fact. Not that "human" is what we
> choose to be, but that it is what we were given, it is what we are, and it
> is all that WE ever can be. The question that we are discussing here is
> really what will come after us, in the future. Will it be human, or not?
No, you are not referring to objective fact at all but to Mark Gubrud's
opinions on where the boundaries are.
> > > Interesting... good... worthwhile... TO WHOM?
> > To US. And to all the intelligences that are and will be that we become
> > or bring into being.
> Good that you begin by agreeing that the standard of goodness ought to be
> what is good to US, not some abstraction such as "evolution" or "filling
> the universe with optimally-efficient computronium" for godknowswhat
> purpose. Not so good that you then extend goodness-defining rights to
> other "intelligences," especially not if some of them might be dangerous
> rapacious machines. You seem to have some kind of notion of a "Peaceable
> Kingdom" inhabited by superintelligent furbies, augmented humans and
> perhaps some others gone native. Sounds nice enough, but it will not come
> about without strict laws to prevent the creation of self-replicating,
> bloodsucking furbies and limit the ambitions of augmented entrepreneurs.
Sly but ineffective. We need to evolve and change fairly quickly if we
are to survive (a larger "we" than you might believe is us). Stasis is
death, or at least serious regression. Again, this has been covered.
Again, I don't cast entrepreneurs as demons.
> > I care passionately about life. That is why I want
> > more of it quantitatively and qualitatively.
> If you really love life, you will not be dissatisfied with it. You
> might want more varied experiences, you might want better health, you
> might want not to have to face death. Medicine, and nanomedicine, may
> help you to avoid what you fear. But you can't escape death by committing
> suicide, even if you have created a clone of yourself.
Life (as a human) is about dissatisfaction and growth and change.
> > I am as human as you and I do not agree with some of your limits and
> > reasons. You seem at times to want to speak for all humanity or for
> > humanity as a reified concept all nicely laid out and delimited in your
> > mind.
> I refer to humanity as it is. Human beings are limited. Human beings
> create reasons, or perhaps provide them merely by their existence. I am
> human, and I love humanity. I don't want us to be destroyed by our own
> creations, or by our own hands, or by anything.
I refer to humanity as it can be and, with a bit of luck and lots of
work, will be. I am dissatisfied. I am against simply perpetuating the
status quo. To not be destroyed, imo, requires that we augment and
> > > > You can be just as human as you like and not ever die by old age.
> > >
> > > Okay, I like this idea. I don't want to face death, and I would be happy
> > > to have my aging process stopped or even reversed by nanotech or other
> > > means. This use of technology would be a wise choice for our species.
> > Great! Something we agree on.
> Yes, very good, and there will be people who will say it is against Nature
> to do this. Maybe they are right, but Nature is just another god of the
> cosmos, as illusory as the gods of "progress" or "greater intelligence."
> As I have said over and over, we have to find our guiding values within.
> > > Why would the existence of such a Xerox copy be better than having
> > > children the natural way? Again, I have to focus on the language you use
> > > to point out the hidden mysticism. A recording of "yourself"?
> > > "Incarnated"? Why is this better than "nothing at all"?
> > Who said better? Why not both/and?
> I assume here you're referring to the question about children. Okay, so
> you could have yourself Xeroxed and also have kids. But you missed my
> point, which was, what do you gain by leaving behind a Xerox copy that you
> do not gain by leaving behind natural children?
> > Why is existence better than not existence?
> No: If you die, how are YOU any better off if someone saved a copy?
> > Perhaps you should ask yourself why or if you have a wish to
> > become humus.
> I admit that it is irrational; it is biological. It really makes no
> difference to me tonight if, unbeknownst to me, a terrorist has planted a
> bomb in Washington, D.C. which tomorrow will kill me. I would rather know
> that this is not the case; again, it is irrational and biological. We all
> need to be able to live without constant fear of being killed. That's
> just the nature of the beast, of what we are. There's no particular
> justification for it in terms of quarks or the cosmos, although you can
> obviously explain its origin in terms of evolution. It also doesn't
> depend on any mystical beliefs, unlike the desire to escape death by
> "migrating to a new embodiment."
> > > Consciousness is an algorithm? I know what an algorithm is; you can write
> > > it down on paper. So, if you wrote out an appropriate algorithm, that
> > > would be a consciousness?
> > Many algorithms cannot meaningfully be written down sequentially because
> > there are too many interlocking components to conveniently capture
> I didn't say you had to write it "sequentially." You can write it as a
> bunch of object modules interconnected any way you like.
> > What is it that you use the pen and paper
> > to capture? The marked up wood pulp is obviously not the algorithm but
> > simply an expression or rendering of it.
> Okay. Actually, the marked up wood pulp is the only reality. I don't
> believe "the algorithm" exists, actually. I don't believe in Plato's
> garden of ideals. But if you do believe that algorithms exist, as implied
> by your invocation of them as the substance of the soul, then surely you
> have to admit that the algorithm is present when it is written up on
> paper. If it exists when it is rendered one way, to use your terms,
> then surely it must exist when it is rendered a different way... right?
Huh? If there were no algorithm, no method, no knowledge, then there
would be nothing to write down. This is not Platonism but a simple
acknowledgement of the reality of concepts, ideas, and knowledge. The
paper is not the algorithm but a representation of it. For some
algorithms, marks on paper might be a really poor representation. And
there is a difference between a static representation and an actually
operational implementation of an algorithm.
The point is that, as you say, the algorithm exists regardless of how it
is rendered. Therefore, to the extent consciousness is algorithmic, it
also can be rendered in multiple ways and, further, in executable form.
I have shown by this line of thought that consciousness can be
transferred without needing any mystical transmigrations.
> > It can be rendered in a lot of
> > different ways including, if it is computable or somehow runnable, in a
> > machine, whether meat or otherwise.
> Now you seem to recognize the need to actually have something active
> before you can claim the existence of something that deserves to be called
> "consciousness." So consciousness is not some abstraction, such as an
> algorithm, that can be separated from its "embodiment" or "rendering."
I can write down the code for a piece of software. It is the code, a
representation of an algorithm. But I can only run it by loading it on
an appropriate device. Your introduction of "some abstraction" is
really pointless here and an argument tactic. I won't play. It can be
as separable as software is from its machine embodiment, to be later
reloaded and rerun.
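The software comparison can be made concrete with a minimal sketch (my own illustration, not part of the original exchange; the `factorial` function is just an arbitrary stand-in algorithm). The same algorithm exists first as inert text, a static representation like marks on paper, and becomes an operational implementation only when loaded onto a device that can execute it:

```python
# Static representation: just characters, as inert as ink on wood pulp.
source = """
def factorial(n):
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result
"""

# "Loading it on an appropriate device": the same marks become an
# operational implementation once an interpreter executes them.
namespace = {}
exec(source, namespace)
factorial = namespace["factorial"]

print(factorial(5))  # prints 120
```

The same source string could equally be written to a file, printed and retyped, or run on a different machine; the algorithm survives each re-rendering, which is the point being argued.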
> > > I agree with the
> > > idea that it is possible, in principle (and nanotech suggests that it may
> > > someday be possible in practice), to COPY the pattern of function and
> > > interconnection of the human brain to some non-human system that would
> > > then function in a similar way. However, the community of enthusiasts for
> > > such a technology do not use the word "copy." They use a more seductive
> > > language which, I am trying to point out to you, harbors a kind of
> > > mysticism essentially identical to the ancient idea of "the spirit."
> > I can see why you might think that way although I still disagree about
> > it being mystical. Just because it reminds you of mysticism does not
> > make it so. But I will say the dreams of transcendence are at the heart
> > of mysticism, religion and science and much of technology also. Such
> > transcendent yearnings are deeply part of the human psyche.
> Very good, very true, very honest, except for your refusal to admit that I
> have unmasked the mysticism embedded in the language used to sell the
> ideas of "transhumanism."
I refuse to admit it because you certainly have not done so to my
satisfaction. Declining to admit something you are not convinced of is
not a "refusal."
> But let me focus on the latter part of your
> response. It is true that people have dreamed similar dreams of
> transcendence for ages. Indeed, "transhumanism" is not much more than a
> repackaging of the same old dreams, with the false promise that technology
> can make them real.
Not so false. Several types of transcendence of the human condition have
been made so real that we forget that to our ancestors they would be
considered nothing short of miraculous.
> But the threat from technology is real. And therefore
> it is very important that we recognize the falseness of these promises.
The promises are real. The consequences and possible gotchas are to be
worried about and dealt with.
> Unfortunately, that requires coming to terms with the essential absurdity
> of the human condition, which implies also recognizing the silliness of
> some of our ancient dreams.
Here I will not follow you. The dreams are not in the least absurd, and
I will follow them fully to their conclusions.
> But there are other ancient dreams that we
> can fulfill with technology. If we accept our inescapable humanity, if we
> learn to love ourselves and respect our place in the Universe, we can do
> away with poverty and disease, postpone death by billions of years, and
> live not as gods but as proud, happy, human beings in a realized paradise.
Some would argue (and have at every point of progress) that doing away
with poverty and disease, and especially postponing death, was dishonest
and an insult to the human condition. You have your limits. I have
mine. But don't pretend yours are the final word and that those who
disagree are "perverted".
> There is room in this scenario for all kinds of wonderful things, but
> there is no room for rapacious superintelligent egomonsters.
By your reckoning, there is no room for superintelligent humans of any
kind. Apparently you believe (a) that to become super-intelligent would
destroy some mysterious essence you believe essential to being human and
(b) that becoming super-intelligent would inevitably lead to
"rapacious egomonsters". But intelligence itself is neutral.
> > > > I would certainly want such a recording of me ready to be
> > > > reloaded in the event of my death.
> > >
> > > In the event of YOUR death. Very good. But why would you want this? Why
> > > would you care? Perhaps to look after your affairs. But we can probably
> > > get by without you. That might hurt your ego a bit, but it might help
> > > your soul to recognize its truth.
My death is one embodiment of the algorithms that make up me halted.
That does not mean another cannot be loaded and run.
> > Because I have things to do that I would like to see continue without
> > being stopped by death.
> But if YOU were dead, how would you see them?
This confuses an implementation of me with me, the set of algorithms,
knowledge and experience. They are not necessarily identical.
> > You can get by without me but why should you
> > need to? Why lose what skills and unique vantage points different
> > persons have gathered? It is really a waste.
> I agree there is something lost, but that is the human condition. It is
> really not so bad. You are the one who thinks we need your contribution.
It is deplorable and can and will be fixed. I think we desperately need
a lot of those who have been lost.
> > Now who is being mystical? What do you mean by the term "soul"?
> Look within yourself. I am speaking directly to you.
That is not an answer.
> I use the word for psychological effect. But you will only understand
> what I mean by this if you reread the two sentences above, and believe
> that they mean what they say.
> Everything that exists can be described, but description is not the only
> form of communication.
Nor is that. I am beginning to think I am talking to someone who
believes in some mysterious essence or soul. The vital animism within,
or some such.
> > > Maybe such a copying into a machine would be useful from the point of view
> > > of solving some hard problem or getting a large amount of work done. But
> > > how would you copy the generated experience back into the human brain?
> > > Face it, YOU can't "shift into computer being," and even if some amount of
> > > computer-generated information could be copied into your brain, you could
> > > never handle huge amounts of it.
> > I am not sure you would really need to copy the computer experience back
> > in order for it to be quite worthwhile.
> It would be needed in order to fulfill the promise that "I" can "shift
> into computer consciousness" and back again.
> > But to answer the question, I
> > assume that enough technology to do this at all would also allow
> > augmenting the human brain to receive a more efficient input of new
> > information/experience than can be done the old fashioned way.
> If it is only an I/O augmentation, you still cannot handle the
> information; there isn't enough capacity in your skull.
I can add capacity by replacing less efficient circuitry with more.
> > I expect
> > by the time we get to the requisite level of technology that sophisticated
> > mind/computer linkups will be commonplace.
> It is very unlikely that the human brain can usefully cope with
> significantly greater information flows than are already presented to us
> through the senses.
If you assume the brain will remain static then maybe, but that leaves
out the possibility of inputting information in a meaningful way more
densely through current channels. And why should the brain remain
static? It can be augmented in its hardware. That has already begun
with some of the current medical implants.
> > In short, the tech that allowed me to download consciousness,
> > intention, knowledge and some
> > measure of participation in the process into a computer will also
> > support uploading the results. It is a good bit more seamless than
> > anything we have today.
> You have still not explained how any technology can possibly do this, or
> even clarified what these words mean.
I have clarified and explained it all I intend to for this discussion.
I cannot shake the feeling you are being purposefully obtuse.
> If your notion is that you can Xerox your mind into a computer having a
> capacity say 10^6 times greater than that of your brain, run the machine
> for a while, then download the results back to your brain, then I say you
> could never download more than about 10^-6 of the results.
That does not follow even with your assumptions. The conclusions would
not necessarily be 10^6 times as voluminous as what the unaugmented human
> > I think that without significant evolution of human effective
> > intelligence the race is doomed to a major regression at best.
> I don't know why you think this. I see no sign of it. Elitists talk this
> way, but they mostly mean people get college degrees without reading the
> dusty old Great Books and they listen to music they like instead of Elgar.
> The truth is probably that more people than ever are reading even the
> dusty old books; the population is steadily more educated and plugged-in.
You don't? Huge segments of the population are refinding Jesus and
dropping out of being responsible, aware citizens in the US. There are
anti-rational and anti-science sentiments and movements in almost all of
the developed countries. The complexity of an increasingly
technological and high-speed world is such that most of the legislative,
executive and judicial powers of the world are frankly functioning with
vastly inadequate understandings of the relevant technical issues.
In the US, random tests of the population, even the college-educated
population, for scientific knowledge are disgustingly dismal, even on
knowledge of things most school children learn before they hit
junior high school. It is a bad joke to claim the people are more
educated and plugged-in. They may be more plugged-in to TV and the web,
but the diet they are consuming and regurgitating is pretty sad stuff
for the most part.
> > Our
> > current brain/mind configuration, mixed with our level of mortality and
> > the degree of learning/wisdom necessary to run/live in an increasingly
> > complex world, is already at the breaking point. We are not doing all
> > that very well.
> Information overload? Okay, why not get off the internet for a while.
> Take a walk in the park/woods/sand/whatever. Relax. Breathe in the air.
This does not change the fundamental equations of information needed to
understand one's world and participate well, or of the inability and/or
unwillingness of many to do so.
> > I am not convinced you can firmly draw a fence to keep computers within
> > or keep them firmly separated from us. To use them most effectively they
> > will more and more be merged into us.
> Maybe we don't need to use them "most effectively." In any case, we need
> to maintain boundaries and to remain in control, as human beings;
> otherwise we are in great peril.
My point is that we are in great peril in any case. We will not get out
of peril by any freezing of our capabilities and nature where they are.
> > So, one way to fight that is to make the information needed as public as
> > possible, WITH, somehow, safeguards against its misuse. Quite a job.
> > But I'm not sure anything else is workable and survivable.
> Corporations and governments are going to be able to afford much greater
> computer power than individuals. The twin threats of runaway economic
> competition and runaway military confrontation cannot be addressed by a
> network of hackers. It can only be addressed by CITIZENS, organizing
> themselves and using the tools of democracy (yes, including computers),
> in order to make policies and laws which will be enforced by government
> power. Anything less than this will fail.
As a hacker I quite disagree. Only hackers have enough computer
knowledge to have a prayer of mastering some of the technical challenges
afoot. CITIZENS, in general, are dumb as a post about the technology
around them. I do not see that being ruled by a bunch of ignoramuses is
a good thing at all. If the average citizens in the majority rule over
the rest of us, then in their great ignorance they will be the end of
us all.
> > Back up. Laws attempting to outlaw technology and lines of inquiry
> > mainly stop honest citizens and groups.
> You mean, when superintelligent bloodsucking furbies are outlawed, only
> outlaws will have superintelligent bloodsucking furbies.
In a way, yes. More precisely, when citizens are prohibited from
certain lines of research, the benevolent uses of the technology tend to
be suppressed and the malevolent and violent ones get developed by those
outside or "above" the law.
> > I don't believe in total
> > anarchy. But neither do I believe in passing laws, just because you are
> > scared of a possible outcome, that trample what I view as human rights to
> > question and explore such things and to build using them.
> So would you legalize home nuclear devices? Should it be every hobbyist's
> right to experiment with genetically-modified anthrax? Etc.
Don't be absurd. I said nothing of the kind.
> > There have to
> > be laws concerning liabilities and responsibilities and so on of course,
> > or some equivalent.
> So it's okay to endanger the planet, as long as you've got insurance?
I am getting tired of this argument style. Please save the rhetorical
comebacks for someone impressed by them.
> > Does that include the majority having the right to overrun the rights of
> > minority dissenters?
> "Rights" can be said to exist only within the context of some system of
> law; claims of the existence of "rights" contrary to prevailing laws can
> be understood only as claims of the existence of overriding laws, or else
> as attempts to persuade that the laws should be changed. No one can claim
> to have the right to break the law. And yes, in a democracy, laws are
> almost never made by universal consent. You may have various systems for
> making laws, but they almost never require that everyone agree.
I totally disagree, as did the framers of the Constitution. Rights are
natural, i.e., derived from the nature of human beings. They are not
created or bestowed by a government. They are protected by such bodies
in a sane society. The only legitimate laws are those that protect
these natural rights. Beyond that, the law is a curse rather than a
blessing.
So you support the right to force Socrates to drink hemlock, eh?
I find your notions of majoritarian dictatorship much more frightening
than super-intelligent computers.
> > Feeling threatened, potentially threatened and actually threatened are
> > very different things in the eyes of the law. You should not generally
> > prohibit technology that does not even exist. That would be extremely
> > arbitrary.
> If we can specify technologies that we expect may be created, I don't see
> why we can't regulate or prohibit them prior to their actual realization.
For the simple reason that there is no proof of harm, or sufficient
study of benefits and risks, on which to base such a blanket
prohibition. And you cannot specify what technologies are expected, not
in enough detail to make the law objective and non-arbitrary.
> > It will be really dicey trying to establish proper limits
> > that respect sufficiently all that must be considered.
> I did not say it would be trivial. But I don't think it would be too
> difficult. It is a good thing for people with your kind of awareness and
> enthusiasm to be working on.
It will be, imho, utterly impossible to do correctly.
> > If the machines are so powerful as to be able to easily wipe out
> > humanity then we are no threat and there is less need and incentive to
> > do so.
> If they are at some point in a position to execute a coup d'etat, it does
> not follow that they would always be in such a position.
Actually, with super-intelligent and self-evolving AI loose in the mix,
it certainly does follow that if they are ahead that much they will
continue to be.
> > I do not trust monkeys in my house but it is fine if they live
> > in the woods and especially if they do so far away. They are not enough
> > of a threat to need to wipe them all out.
> The monkeys are not going to get themselves organized, develop a powerful
> technology (perhaps with the aid of turncoat human advisors) and suddenly
> strike to take back their heritage.
Which heritage is that? Why should the machines even want much of what
is important to you and me? Sovereignty as the brightest beings on the
planet? That heritage? If we want to keep that, we need to continue to
grow and transform.
> The situation is different with machines of human-equivalent or greater
> intelligence. If we allow them to have autonomous intelligence, purposes,
> and freedom of action, then they will constitute a threat to us. If they
> allow us to have autonomy and freedom, we will be a threat to them.
This is an assumption. Intelligence must ultimately be autonomous to be
full intelligence. Intelligent, self-aware beings will have some rights
for the same reason we have them, by the nature of the types of beings
they are. Are we served by this xenophobia against our own creations,
and even against those of ourselves and our children who choose to
transform beyond our own comfort level? I suggest that this xenophobia
is a large part of the problem. I don't think we can all survive in any
sort of peace (however much is possible) without overcoming it and
learning some mutual tolerance and respect for rights.
> > If the machines are not so
> > powerful then fighting with humanity will be much more costly and will
> > waste valuable abilities, resources and time in warfare.
> Since they are machines, they don't care how many losses they suffer, as
> long as they win.
BULL. You are a machine. I am a machine. Do we care?
> > If they are
> > not so powerful then they would also most likely still benefit from some
> > type of trade and relating to human beings in some respects.
> Certainly not if they are of superhuman intelligence. Trade? ?
The second scenario places them not so far above us, or even necessarily
above us overall, at all. Yes, trade. There may well be things each
side has that the other needs and wants. It is a time-honored
alternative to war, especially to costly and miserable war. If we don't
mutually have things the other wants, then there is no real conflict
because we aren't after similar enough things.
> > The self-aggrandizement in both cases, if at all rationally applied, could
> > lead away from war.
> The motive of self-aggrandizement would imply that the machines want to
> put the available matter and energy into one configuration, not probably
> the one that humans would choose. But the imperative for war would not
> come from scarcity, but from mistrust. They would have no reason to trust
> us nor we them, and they, at least, would have no reason to seek a modus
> vivendi with us as long as they knew they could win.
Who says? It will take quite a bit of development before even
super-intelligent beings could do something with "all the available
matter and energy". And it is not at all clear that their interests and
goals would require any such thing for a very long time.
Mutual needs, respect for other beings' rights, and perhaps a high cost
of conflict are usually what trust (or at least non-aggression pacts and
some cooperation) is based upon.
I think your analysis is unimaginative.
> > > Unfortunately, here common usage goes against me. However, life that
> > > evolved on another planet would almost certainly be completely different
> > > at the molecular level. It is remarkable that all life on Earth has
> > > almost exactly the same basic biochemistry. Therefore this represents one
> > > class of phenomena which is unambiguously distinct from any other.
> > That isn't all that certain. Any life within certain temperature,
> > pressure, and so on conditions, would be likely to have a lot of
> > similarities. Its genetic structure is unlikely to be very similar at
> > all if there were no common ancestors. But in a common biosphere it is
> > not at all remarkable that all life would have the same basic
> > biochemistry.
> It is at most plausible that you'd see some of the same chemistry. In any
> case, there will be no ambiguity about making the distinction.
But it is useless for the purpose you intend. Your argument above is
based on serious misconceptions.
> > Since you have avoided giving a definition of life except what you point
> > to, it is difficult to argue the point. Definitions that simply restrict
> > the set to where you are comfortable, so that what you want inside is and
> > what you don't want inside isn't, are inadequate.
> And what is it that I point to? What is the word for it?
There is no word for it. It is not life, as that includes more than you
wish to include. Strictly biological earth life might come close to
what you point to.
> I am not the one trying to redefine anything. I am only insisting that
> words be used as they are commonly understood. That is as essential to
> clear communication as it is anathema to obfuscation.
As I pointed out, common understandings are based on assumptions that do
not hold in the world you would like to restrict. There is no clarity
if you ignore such points. You have continually acted as if there were a
clear line dividing the us that needs protecting from the them. But you
have not provided such clarity. The attempts so far have serious lacks.
Now you duck and say you simply rest on murky common usage?
> > You still haven't given any grounds why a self-replicating artificially
> > created being is not life.
> It's not life. Invent another word for it, if you think "self-replicating
> artificially created being" is too unwieldy.
Tell me what life is, if you know so well.
> Try this experiment: Say the word "life" to an unsuspecting person. Then
> ask if the image that came to mind was of a "self-replicating artificially
> created being."
I will bet that a replicating consumer of energy/resources is part of
the picture, if they attempt to abstract it to its essentials at all.
> > As it is commonly understood is not
> > adequate. It is precisely "common understandings" that must be called
> > into question and examined carefully, as the assumptions those common
> > understandings are built on become increasingly untrue.
> Not at the cost of erasing the distinction between alive and not.
Why? You cannot have such a distinction if you refuse to define what
life is, and your pretense that you can is unworkable and built on air.
> > > I would also be very happy to see
> > > advanced biotech/nanotech succeed in fully restoring damaged bodies.
> > How about enhancing functionality like vastly improved memory or powers
> > of concentration?
> As long as this doesn't result in the creation of an inhuman monster that
> threatens the human species. But again, I don't know what you mean by
> "vastly." Human memory and powers of concentration cannot be improved by
> large order-of-magnitude factors.

What makes you claim this? Given integrated short- and long-term memory
circuits, I certainly could improve human memory by at least a few orders
of magnitude. Concentration limits are in part a function of limits of
short-term memory and of how many distinct things the human mind can
hold in conscious focus at once. Better short-term memory would help a
lot, along with an expansion of whatever the equivalent of registers is in
human conscious awareness and some enhanced ability to filter out
distractions.
> > But enhancing mental processes would potentially put others without such
> > enhancements at a large disadvantage.
> Yes, precisely.

And so? Would you outlaw this? Some would outlaw naturally occurring
genius, or anything too far different from the norms, as an
"injustice". What is your take on this?
> > That "silly" internal cell phone
> > could be used to have instantaneous and always-available access to the
> > world's computers, knowledge banks and processing power - at least
> > enough of it to make for a compelling advantage. Also such persons
> > would have the ability to instantly communicate with anyone else they
> > chose who was available (and hadn't restricted communication at the
> > moment). This is a powerful social force with some potentially deep
> > implications. Already in Finland and other countries heavily saturated
> > by cell phones you see different patterns of behavior, like school
> > children moving in unison like schools of fish from event to event by
> > keeping in near-constant touch by cell phone and cell messaging.
> > Sometimes the effect is eerie. Like watching a hive mind at work. We
> > don't begin to see some of the implications and workings-out of something
> > that seems at first so silly.
> Well, as you see, you don't need an INTERNAL cell phone at all, at least
> not as long as you are only talking about a cell phone. That's why I said
> it was a silly idea. But in any case, your observations here are very
> keen. This does not yet look so serious, but you are seeing the signs of
> technological dehumanization, the destruction of a more uniquely human
> culture and its replacement by one dominated by an increasingly autonomous
> technology.

No, I am seeing the signs of human augmentation by technology. It is
you who see it as dehumanizing. I see it as an expansion of what it
means to be human, and I welcome it as such. Cultures change, grow,
evolve, die. It is part of the way of things. There is nothing so
sacred about this one, or about the current limits and boundaries, that
we must fight any and all really significant change to something
different. What autonomous technology? In the example, humans simply
gain an increasing power to communicate and share almost
instantaneously.
> > Sometimes wisdom will consist not of drawing a line but of letting the
> > force of the evolving experience of those concerned create what balance
> > it can. No central committee has enough processing and data-gathering
> > power in principle to prejudge many of these things fairly and
> > fruitfully. Erring on the side of caution has its own undesirable
> > consequences.
> We shouldn't outlaw cellphones. We should outlaw self-replicating
> superintelligent bloodsucking furbies. And things like that.

OK. But you keep claiming that things far from "superintelligent
bloodsucking furbies" are also inhuman, or at least dehumanizing threats,
and should possibly be outlawed.
> > An algorithm is not physical, regardless of whether it is written
> > down, encoded in neurons and synapses, in a computer or whatever. You
> > can take apart all the physicality of my computer and it will not
> > compute in that condition, but you will not have laid hands on the actual
> > software that makes the hardware do useful things.
> Only physical things exist. Or else you believe in something
> extraphysical, something outside physics. That is supernaturalism.

Very interesting. Is the quantum flux purely physical? What does its
physicality consist of? Are the directions for building a rocket
engine only physical? Is the music of Mozart only physical, and somehow
the notes on the paper affect the physicality of the orchestra to
produce the physical pressure waves on your ears? Everything is
reductionistically physical, but the physical level by itself does not
adequately express much that is very interesting in many domains of
considerable interest. A physical analysis of a ballet performance will
not give you one ounce of aesthetic appreciation or upliftment. The
reduction, while obvious, is singularly uninteresting. And a set of
ideas is not physical, regardless of being held in physical brains,
books, computers and so on. This does not necessitate that they exist
in some region of Pure Form at all.
> There is no need for a "hardware-software" dualism. Hardware exists,
> software does not. We speak as if there were software-like things that
> exist, but this is a psychological aspect of ourselves, which are, in
> turn, identical with our bodies, at any given time.

BULLSHIT. Without software, the hardware of your computer would be a
lump of metal and silicon, useful for nothing at all.
> I am not saying this is easy to understand. It is not even easy to state,
> as I believe I have managed to do here, without circular references.
> But I am reasonably confident that I do understand it myself, and quite
> certain that it is correct.

It is hard to understand because it is incorrect. What is it that is,
and that claims to be, certain? A bunch of chemical reactions that are
reasonably at peace with this set of interconnections and stimuli within
your brain-case? Yep, tremendously circular, and quite empty of meaning,
or even of room for any such concept as meaning or certainty.
> > By the danger criteria we should avoid having any children!
> No, that is unreasonable.

Sure it is. As unreasonable as saying we should not seek to evolve
beyond our current limits.
> > You do not think there is plenty of drudgery and needless limits left to
> > overcome? That much of the world does not still live in poverty? How
> > will you overcome some of these things for good without the very
> > advanced technology that is also dangerous?
> We in the rich countries don't do a whole lot of backbreaking work these
> days. We'd like better medicine, cheap, non-polluting energy, more
> bandwidth. Our Shopper's Warehouses are stocked to the ceiling and we
> prefer certified organic to "frankenfoods"; who cares about crop yields
> these days? It's hard to see a need for humanoid artificial intelligence
> and robots, let alone automated molecular assembler manufacturing. These
> latter developments represent more of a threat than an opportunity to most
> of us. Not everyone wants to work 80-hour weeks for IPO stock so they can
> cash out on automating their cohorts out of semi-skilled occupations.
> Most people would just like to raise their kids, take vacations, fix up
> the house.

So, would you condemn the not-rich parts of the world to never becoming
rich, or at least to remaining in back-breaking, belly-distending
poverty? Because without a lot more technology, that is precisely what
the outcome will be for them. Who cares about crop yields? We do,
that's who.
Unless you would like most of the arable land around you plowed under.
It is precisely because we have cared very much about crop yields that
there is open land in many countries and that we can actually manage to
feed this many people. Drop the green revolution advances and you will
care about crop yields far more quickly than you might imagine.
I find your lack of insight beyond your local supermarket shocking and
quite distasteful. We still destroy much more land than necessary to
feed ourselves. We pollute much more than we need to in order to produce
the goods our standard of living (as well as much lower ones) requires.
Higher tech would lessen those negatives and enable us to actually feed,
shelter, clothe and provide decent lives for the rest of the world
outside our little suburban oasis. It would allow us to use far, far
less non-renewable and polluting resources and to clean up most of the
pollution we have. It would allow us to finally get to space in a
meaningful way. The store of knowledge and our access to it would
expand greatly. Do you want to tell me that none of these things move
you at all or are worth working toward? If so then you are far more
alien to me than I thought.
> It's hard to see why the less-developed countries would need these future
> technologies in order to catch up with our level of comfort. Really, it
> is only the energy problem that cries out for attention.

Examine the actual resources available today and their cost in terms of
wealth, pollution and so on, and it will become very clear why they
cannot catch up without better technology. No, it is not only the
energy problem. It is also the physical resources, metals, plastics and
so on that would be required and simply are not available at our current
technological level.
> Nevertheless, I agree that the advanced technologies will be developed and
> applied, and they will have some other benefits in addition to longevity,
> energy, and bandwidth. They will also pose very serious challenges to
> social order and, if not adequately controlled, threats to our survival.

The social order needs to be challenged. Strongly. Too much of it is based
on old-world assumptions that are increasingly false, and on protecting
existing, antiquated concentrations of power and wealth.
> > I agree not every direction may be good for us. I guess my fundamental
> > disagreement is with the notion that we can prejudge which is which well
> > enough to restrict discovery and invention in the true problem areas. I
> > frankly don't think we are up to the task. And I have a suspicion it is
> > simply an intractable problem.
> Don't give up before trying. Again, I really believe you should try to
> think of yourself as someone who might be well-equipped for the task.

Of course we will try to guide where we can and to prohibit the most
dangerous and uncontrollable developments. But there is much
disagreement contained in this exchange about what is and is not for the
good of the species.

I really have very little interest in trying to set policy for the
future. I am more concerned with welcoming and guiding the developments
from the inside, as a technologist, and with the possible changes to
society and our views of ourselves that are necessary if we are to live
peacefully and well as the technological advances come with increasing
speed.
- On Sat, 1 Apr 2000, Samantha Atkins wrote:
> Human beings are not static in any case.
> That you change does not mean you are not you. It means that you
> change. You are still you in every bit as meaningful a sense as you are
> you when you go from being a fetus to a baby, baby to child, child to
> adult.

This notion of continuous identity is synthetic. It can be justified
unambiguously in the context of natural life, as you describe here. As
soon as you start talking about the possibility of making copies,
"uploads," etc., you destroy the basis of this clarity. "I am now" is a
clear, unambiguous, meaningful statement, but "I am the same who was and
who will be" is not so unambiguous or meaningful. Actually I am and will
be different. If you were to divide me in two and supply the missing
halves, it would be impossible to maintain the fiction of a single
continuous identity. Nor can this identity be extended to an "upload,"
since, if you make one copy, you can just as well make N copies, and
since, if you don't destroy the original, the original will go on as
before and not recognize the copies as "herself;" she will see them as
copies, nothing more.
> To not be destroyed, imo, requires that we augment and transform.

Transformation to nonhuman form => destruction of human form.
> > Okay. Actually, the marked-up wood pulp is the only reality. I don't
> > believe "the algorithm" exists, actually. I don't believe in Plato's
> > garden of ideals. But if you do believe that algorithms exist, as implied
> > by your invocation of them as the substance of the soul, then surely you
> > have to admit that the algorithm is present when it is written up on
> > paper. If it exists when it is rendered one way, to use your terms,
> > then surely it must exist when it is rendered a different way... right?
> Huh? If there were no algorithm, no method, no knowledge - then there
> would be nothing to write down. This is not Platonism but a simple
> acknowledgement of the reality of concepts, ideas, and knowledge.

The notion that "concepts, ideas, and knowledge" have a reality apart from
their physical representations (including within our brains) is Platonism.
> The point is that, as you say, the algorithm exists regardless of how it
> is rendered.

I didn't say this; I said that you said this, and it implies that if, as
you say, consciousness = algorithms, then any representation of the
algorithms must equal a consciousness. How do you get out of this?
> Therefore, to the extent consciousness is algorithmic, it
> also can be rendered in multiple ways and further, in executable form.
> I have shown by this line of thought that consciousness can be
> transferred without needing any mystical transmigrations.

You use "consciousness is algorithmic" to explain how "consciousness" is a
thing that "can be transferred." But your equation doesn't work. I
have already shown this one way; here's another: What if you have two
"renderings" of the same algorithm? Since it's the same algorithm, it
should be the same consciousness, right?
> > Now you seem to recognize the need to actually have something active
> > before you can claim the existence of something that deserves to be called
> > "consciousness." So consciousness is not some abstraction, such as an
> > algorithm, that can be separated from its "embodiment" or "rendering."
> I can write down the code for a piece of software. It is the code, a
> representation of an algorithm. But I can only run it by loading it on
> an appropriate device.

So if you copy the code for a consciousness algorithm, say, onto a hard
disk, then it only becomes a consciousness when you "run" it? What does
"running" it do to change the situation? You have charge states in RAM
capacitors and current states in CPU registers, and you have magnetic
states on your hard disk. So it's only a consciousness if the states are
charge and current states, not magnetic states? Or, let's say, if you
hook up a TV camera and a microphone so the computer can "see" and "hear,"
and a robot so it can "do something?" What if you only hook it up to a
simulator, that simulates the robot and feeds the computer simulated
sensory input? So now, all you have is charge states, current states, and
magnetic states! Is that a consciousness? Let's say the computer is
fully binary, and the program evolves deterministically. Now let's say we
have 10 of them running in lockstep. One consciousness or 10? Let's say
we let one of them get one step ahead of the others? How many
consciousnesses now? Feed each one a few bits of random data. Now how
many? Come on, this is all a lot of nonsense. There is no consciousness,
at least not as some kind of extra thing that comes into existence when
you "run" the appropriate program.
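The lockstep thought experiment can be made concrete in a few lines of code. This is a toy sketch only: the "program" is an arbitrary deterministic update rule, not a claim about real minds, and the specific function `step` is invented for illustration.

```python
import random

def step(state, sensory_input):
    # One deterministic update: the next state depends only on the
    # current state and the input, like a fully binary machine.
    return (state * 31 + sensory_input) % 2**32

# Ten copies of the same deterministic program, started identically and
# fed identical input, stay bit-for-bit identical through every step.
states = [42] * 10
for _ in range(1000):
    states = [step(s, 7) for s in states]
assert len(set(states)) == 1  # one state, however many boxes run it

# Feed each copy a few random bits and the runs diverge immediately.
rng = random.Random(0)
states = [step(s, rng.getrandbits(32)) for s in states]
print(len(set(states)))  # almost certainly 10 distinct states
```

Nothing in the machine changes character at the moment of divergence; only the bits differ, which is the point of the question about counting consciousnesses.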
> Several types of transcendence of the human condition have
> been made so real that we forget that to our ancestors they would be
> considered nothing short of miraculous.

We have made many improvements (as we judge them) in our condition. But
we are still human, and our ancestors would be appalled by the notion that
the generation most generously endowed would be so prepared to give up the
human condition itself.
> > There is room in this scenario for all kinds of wonderful things, but
> > there is no room for rapacious superintelligent egomonsters.
> By your reckoning, there is no room for superintelligent humans of any
> kind. Apparently you believe (a) that to become super-intelligent would
> destroy some mysterious essence you believe essential to being human and
> (b) that becoming super-intelligent would inevitably lead to
> "rapacious egomonsters". But intelligence itself is neutral.

a) It is impossible for humans to be much more intelligent than they are.
a1) There is no need for superhuman humanoid intelligence for any
scientific or engineering purpose. We have computers that can add and
multiply orders of magnitude faster than any human, and we may develop
machines that can do more complicated intellectual tasks. But there is no
need to make them humanoid in a way that would threaten our own survival.
b) "Uploading" a human intelligence to a supercomputer seems a likely way
to create a humanoid superhuman intelligence, or "rapacious egomonster,"
which would threaten the rest of us. So we shouldn't let anyone do it.
> > If it is only an I/O augmentation, you still cannot handle the
> > information; there isn't enough capacity in your skull.
> I can add capacity by replacing less efficient circuitry with more
> efficient circuitry.

Then it isn't you any more; as you have described it yourself, it is a
replacement.
> > It is very unlikely that the human brain can usefully cope with
> > significantly greater information flows than are already presented to us
> > through the senses.
> If you assume the brain will remain static then maybe, but it leaves out
> the possibility of inputting information in a meaningful way more
> densely through current channels. But why should the brain remain
> static? It can be augmented in its hardware. It has already begun with
> some of the current medical implants.

I said the human brain. Once again, the question is not whether it may be
possible to create monsters.
> > If your notion is that you can Xerox your mind into a computer having a
> > capacity say 10^6 times greater than that of your brain, run the machine
> > for a while, then download the results back to your brain, then I say you
> > could never download more than about 10^-6 of the results.
> That does not follow, even with your assumptions. The conclusions would
> not necessarily be 10^6 times as voluminous as the unaugmented human
> brain's capacity.

If all you want is the conclusions, just let the computer work them out
and tell you.
> Huge segments of the population are refinding Jesus and
> dropping out of being responsible, aware citizens in the US. There are
> anti-rational and anti-science sentiments and movements in almost all of
> the developed countries.

This is nothing new. But in any case, look again. You have bought into
an irrational and unscientific movement, and for that matter, one that
hardly promotes the idea of "being responsible aware citizens."
> The complexity of an increasingly
> technological and high-speed world is such that most of the legislative,
> executive and judicial powers of the world are frankly functioning with
> vastly inadequate understandings of the relevant technical issues.

This is true, because technology is moving so fast; but everyone,
including government leaders, is more aware of it than ever. They just
can't keep up!
> In the US, random tests of the population, even the college-educated
> population, for scientific knowledge are disgustingly dismal. Even on
> knowledge of things most school children learn before they even hit
> junior high school. It is a bad joke to claim the people are more
> educated and plugged-in. They may be more plugged-in to TV and the web,
> but the diet they are consuming and regurgitating is pretty sad stuff
> for the most part.

Sorry, I don't agree, and the statistics bear me out. This generation is,
on the whole, better educated than its parents, and they were better
educated than their parents, and so on. Partly this is just the fact that
more people are receiving more education than ever before; but this
expansion also is the source of the oft-bemoaned "decline." When you educate
more people, by definition you are "lowering the standards" of
competition; but from what I have seen, today's youth are the most
competitive generation yet.
> > > Back up. Laws attempting to outlaw technology and lines of inquiry
> > > mainly stop honest citizens and groups.
> > You mean, when superintelligent bloodsucking furbies are outlawed, only
> > outlaws will have superintelligent bloodsucking furbies.
> In a way, yes. More precisely, when citizens are prohibited from
> certain lines of research, the benevolent uses of the technology tend to
> be suppressed and the malevolent and violent ones get developed by those
> outside or "above" the law.

This is what police are for.
> > > I don't believe in total
> > > anarchy. But neither do I believe in passing laws, just because you are
> > > scared of a possible outcome, that trample what I view as human rights to
> > > question and explore such things and to build using them.
> > So would you legalize home nuclear devices? Should it be every hobbyist's
> > right to experiment with genetically-modified anthrax? Etc.
> Don't be absurd. I said nothing of the kind.

So you agree that if there are things for which to "explore... and to build
using them" poses a danger to the human community, then it is not
necessarily a human right to do so, but it could be outlawed?
> > If we can specify technologies that we expect may be created, I don't see
> > why we can't regulate or prohibit them prior to their actual realization.
> For the simple reason that there is no proof of harm, or sufficient study
> of benefits and risks, to base such a blanket prohibition on. And you
> cannot specify what technologies are expected.

So why bother having a nanotechnology list?
> > The situation is different with machines of human-equivalent or greater
> > intelligence. If we allow them to have autonomous intelligence, purposes,
> > and freedom of action, then they will constitute a threat to us. If they
> > allow us to have autonomy and freedom, we will be a threat to them.
> This is an assumption. Intelligence must ultimately be autonomous to be
> full intelligence.

Do you believe there is such a thing as "full intelligence"? An end-of-
the-line? THAT is an assumption.
> You have continually acted as if there were a
> clear line dividing the "us" that needs protecting from the "them." But you
> have not provided such clarity. The attempts so far have serious
> gaps. Now you duck and say you simply rest on murky common usage?

If we don't twist words, if you adhere to common usage, then I have no
fear that the majority of people will not be able to discern the clear
line dividing what is human from what is not, and from what threatens us.
> Tell me what life is if you know so well.

The word is understood.
> > As long as this doesn't result in the creation of an inhuman monster that
> > threatens the human species. But again, I don't know what you mean by
> > "vastly." Human memory and powers of concentration cannot be improved by
> > large order-of-magnitude factors.
> What makes you claim this? Given integrated short- and long-term memory
> circuits, I certainly could improve human memory by at least a few orders
> of magnitude. Concentration limits are in part a function of limits of
> short-term memory and of how many distinct things the human mind can
> hold in conscious focus at once. Better short-term memory would help a
> lot, along with an expansion of whatever the equivalent of registers is in
> human conscious awareness and some enhanced ability to filter out
> distractions.

You have the same problem as a person who accumulates vast files but
doesn't know where to find anything in them. Maybe you can provide a very
powerful information resource which offers suggestions to its user. But
there is no way to vastly increase the user's internal memory without
altering the user to such an extent that it would clearly no longer be
the same person.
> > > But enhancing mental processes would potentially put others without such
> > > enhancements at a large disadvantage.
> > Yes, precisely.
> And so? Would you outlaw this?

It will need to be regulated.
> Some would outlaw naturally occurring
> genius or anything too far different from the norms as being an
> "injustice". What is your take on this?

None of us is THAT much smarter than any other. Even if you believe that
IQ numbers are meaningful, you are talking about +- a factor of 2 for the
outliers. But really, I don't think people are born with genes for string
theory or learning a dozen languages. That comes from single-minded
effort. It's a lot "fairer" than letting Bill Gates, say, spend ten
billion to buy a supercomputer equal in raw capacity to 10 million brains,
"upload" the contents of his skull and set it to work figuring out how to
bilk the rest of us out of the OTHER half of our fortunes.
> > > An algorithm is not physical regardless of whether it is written
> > > down, encoded in neurons and synapses, in a computer or whatever. You
> > > can take apart all the physicality of my computer and it will not
> > > compute in that condition but you will not have laid hands on the actual
> > > software that makes the hardware do useful things.
> > Only physical things exist. Or else you believe in something
> > extraphysical, something outside physics. That is supernaturalism.
> Very interesting. Is the quantum flux purely physical?

Yes.
> What does its
> physicality consist of?

That is an unanswered question. The answer will not justify "uploading."
> Are the directions for building a rocket
> engine only physical? Is the music of Mozart only physical and somehow
> the notes on the paper affect the physicality of the orchestra to
> produce the physical pressure waves on your ears?

The directions for building the rocket and the music of Mozart are both
creations of human beings for the purpose of communicating to other human
beings. You can describe the orchestra playing as simply a physical
computing system reading notes and going through movements which produce
pressure waves etc. Or you can describe it as Mozart speaking to us
through this medium. The one and only, physical, then, Mozart.
> Everything is
> reductionistically physical but the physical level by itself does not
> adequately express much that is very interesting in many domains of
> considerable interest. A physical analysis of a ballet performance will
> not give you one ounce of aesthetic appreciation or upliftment.

Again, you have a form of communication. The dancer communicates with us
by moving her body; we appreciate both the visual spectacle and our own
internal simulation of the kinesthetic experience of dancing.
> The reduction, while
> obvious, is singularly uninteresting. And a set of ideas is not
> physical, regardless of being held in physical brains, books, computers
> and so on. This does not necessitate that they exist in some region of
> Pure Form at all.

The author communicates with us through the medium. The "ideas" don't
exist outside of a person's consciousness. You have paper and ink,
magnetic spots, whatever. These acquire meaning only in relation to a
person. And a person is a real, physical being.
> So, would you condemn the not-rich parts of the world to never become
> more rich, or at least not back-breaking and belly-distending poor?
> Because without a lot more technology that is precisely what the outcome
> will be for them. Who cares about crop yields? We do, that's who.

When you say "a lot more technology," what do you mean?
Superintelligence? Nanotechnology? I don't think these are needed at all
just in order to bring the rest of the world up to our standards. We
don't have them. Now, there is going to be an energy problem, and there
is a need for some new technology in this area. And no doubt it will be a
boon to have more productive crops; my point was just that we, in the rich
countries, seem to be surfeited. You would have understood this if you
had read carefully, instead of looking for an excuse to blow up.
> We still destroy much more land than necessary to
> feed ourselves. Pollute much more than we need to to produce the goods
> our standard of living (as well as much lower ones) requires. Higher
> tech would lessen those negatives.

Undoubtedly it could.
> and enable us to actually feed,
> shelter, clothe and provide decent lives for the rest of the world.

But it is not lack of technology that prevents this today. We could
"provide" a great deal to the underdeveloped world, but not under
capitalism. They could provide for themselves, without needing any more
advanced technology than we have; but they don't have the technology, at
least not integrated into their societies. It is not immediately clear
that more advanced technologies will do much to alter this situation.
> It would allow us to use far, far
> less non-renewable and polluting resources and to clean up most of the
> pollution we have. It would allow us to finally get to space in a
> meaningful way. The store of knowledge and our access to it would
> expand greatly. Do you want to tell me that none of these things move
> you at all or are worth working toward?

I don't disagree with any of this. Again, you did not read carefully.
I said that most people in the advanced countries really don't feel a
pressing need for these things, and they have good reason to view
advancing technology as a threat to their well-being. Oh sure, future
technologies could be used to do all sorts of wonderful things. But if
there is a great danger that, instead, it will be used to automate people
out of work, to set off a new, apocalyptic arms race, and then to create a
master race of ego-mad supercomputers, I think a lot of people may not be
too sure that's what they want.
> > It's hard to see why the less-developed countries would need these future
> > technologies in order to catch up with our level of comfort. Really, it
> > is only the energy problem that cries out for attention.
> Examine the actual resources available today and their cost in terms of
> wealth, pollution and so on and it will become very clear why they
> cannot catch up without better technology. No, it is not only the
> energy problem. It is also the physical resources, metals, plastics and
> so on that would be required and simply are not available at our current
> technological level.

Plastics are made of oil, a very small part of overall oil consumption.
And we could certainly get by with a lot less use of plastics in this
country. As for metals, we are getting by with less. I don't think there
is a need for super-technologies in order for everyone on the planet to
have health and comfort and opportunity. I don't think having
super-technologies will automatically give that to everyone. But still, I
am not saying we shouldn't develop new technologies. I am only saying we
should not develop them AT ANY COST. Rather, we need to think about what
we do and don't want to do with new technology. There is no reason to go
madly rushing into an unknown future with no attempt to exert control so
as to avoid unwanted outcomes. I don't think this means slowing
technology, either. I think it means not doing certain things; but the
things that are desirable do not have to be slowed.
> Of course we will try to guide where we can and to prohibit the most
> dangerous and uncontrollable developments.

Good.
> But there is much
> disagreement on what is and is not for the good of the species contained
> in this exchange.

That's the reason for having it.
- There is a question I have been thinking about for several years. I
still don't have an answer, so I will present it here for refutation.
Here's the hypothesis:
"The only protection against nanotechnology will be to make death impossible."
Any possible means of death will be a weakness that can be exploited
by a nanotechnological weapon to kill anyone, any group, or everyone.
The tools to create nanotechnology will at some time become so widely
available, that it is almost certain some small group or individual
(with a death wish) will want to take out a lot of other people with
them. Perhaps even the entire human race. Such people and groups are
not unknown in the world today, and are likely to be around for the
foreseeable future. Read "The Shoemaker" for a look inside the mind
of such a person.
If we make nanotechnological defences that protect against grey goo,
but not against volcanoes or tornadoes, then someone who hates the
human race will arrange to provide the world with all sorts of
calamities. All it takes is some re-programming; like a video game,
only real. If it's possible for anyone to die of a heart attack,
someone will find a way to make it happen to a lot of people. The
attack could come through the food or water supply, the air, ground,
or something I haven't even thought of yet.
It will be ludicrous for any government to even consider using
nanotechnology as a weapon, because all it will take is for it to
fall into the hands of someone who knows how to change a program loop
to read "do forever".
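To make the "do forever" point concrete, here is a purely illustrative toy
loop (all names hypothetical; real assembler control code would look nothing
like this). It shows how a single edit to one stopping condition is the
difference between a bounded replication process and an unbounded one:

```python
# Toy sketch only: bounded vs. unbounded replication.
# "replicate" and its parameter are hypothetical names for illustration.

def replicate(generations):
    """Double a population once per generation, stopping after a set count."""
    population = 1
    g = 0
    while g < generations:   # bounded: the guard stops replication
        population *= 2
        g += 1
    return population

# Changing the guard to "while True:" is the post's "do forever":
# the stopping condition vanishes, and growth is limited only by
# available matter and energy.

print(replicate(10))  # 1024 copies after ten doublings
```

The danger described above is precisely that the safe and the catastrophic
versions differ by one line.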
I look forward eagerly to the benefits nanotechnology can provide,
but the time to plan for all possible contingencies is now, before
the first assembler arrives. I am not blind to the dangers, but don't
believe that nanotechnology can be stopped except by a world-wide
"Big Brother". Even if that happened, could we believe that B.B.
would not develop it secretly as a weapon of control?
I hope someone can refute some of this, because a total loss of
freedom would be worse than death; at least to me.
To design the new structures of writing for screens is a profound
issue of literary structure. It is important to provide the best
literary structure that we can, for hypertext, as the literature of
tomorrow, determines in part the new structure of civilization.
Civilization is in large part about, and around, what is written.
This is what we call literature. Literature is an endless river,
connected, like water, in all directions. Document connections go
forward and backward in time, and sideways between documents.
Scholarship and fiction, political speeches and criticism,
advertising, journalism and technical reports-- all affect each other
and evolve in a constant flow of ideas and writings. ... Ted Nelson...
Jack Seay jackseay@...
- On Wed, 5 Apr 2000, Jack Seay wrote:
> "The only protection against nanotechnology will be to make death impossible."
This is impossible.
> The tools to create nanotechnology will at some time become so widely
> available, that it is almost certain some small group or individual
> (with a death wish) will want to take out a lot of other people with
> them. Perhaps even the entire human race.
Even if you assume that eventually everyone would have access to the means
to "create" or otherwise use nanotech for any purpose of his or her
choosing, which is not necessarily true, still it makes a big difference
when different capabilities become available and to whom. Take an extreme
case. If Drexler's "flexible engines of creation" which have human-style
intelligence, superhuman engineering capabilities and full access to
assembler-level production facilities, suddenly became available to each
individual on the planet, no doubt you'd have a big mess overnight. At
the other extreme, if the technology is tightly-controlled by a consortium
of democratic governments, if its benefits are distributed to the world's
population in what is perceived as a fair manner, if old grievances are
defused by the combination of rapidly expanding personal wealth, the
prospect of practical immortality, and opportunities for expansion into
space, etc., and if no nanotechnic confrontation between rival military
forces is allowed to develop, then I think we have a good chance of
avoiding any large-scale violence.
> Such persons or groups are
> not unknown in the world today, and are likely to be around for the
> foreseeable future. Read "The Shoemaker" for a look inside the mind
> of such a person.
I don't think there are that many people around who are so completely
filled with hate for no reason. Our propaganda projects this image in
order to delegitimize the grievances of people who use violence
against us, yet in many cases they or their people have either been
dispossessed or suffered violence at our hands. We are certainly creating
a lot of people in Iraq who have very clear reasons to hate the United
States, for example. Yet somehow yesterday's terrorist turns out to be
tomorrow's ally. A generation ago, it was the Palestinians who were
viewed as irrational terrorists; now the PLO is practically a US client
and most of us understand that they do have some legitimate grievances.
On the other hand, we recognize the legitimacy of Israel's right to live
in peace. If you try to make sense out of the world in terms of Good Guys
vs. Bad, you almost always oversimplify. I think if you subtract the
people who have legitimate issues that they believe put them at odds with
the "New World Order," you are left with very few people who really just
irrationally want to do harm to others.
Also note that most of the really large-scale evil is done by governments
(usually not democratic) and often in the name of anti-terrorism. In that
case, it is usually done to gain or keep power and privilege, not out of a
deep irrational hatred.
> If we make nanotechnological defences that protect against grey goo,
> but not against volcanoes or tornadoes, then someone who hates the
> human race will arrange to provide the world with all sorts of
> calamities. All it takes is some re-programming; like a video game,
> only real. If it's possible for anyone to die of a heart attack,
> someone will find a way to make it happen to a lot of people. The
> attack could come through the food or water supply, the air, ground,
> or something I haven't even thought of yet.
All of these scenarios can be countered. It will of course be necessary
to anticipate what malicious persons might do, and to deploy defenses.
> It will be ludicrous for any government to even consider using
> nanotechnology as a weapon, because all it will take is for it to
> fall into the hands of someone who knows how to change a program loop
> to read "do forever".
Unfortunately, it is very likely that governments will consider using
nanotech to make weapons. That is really the most critical danger.
> I look forward eagerly to the benefits nanotechnology can provide,
> but the time to plan for all possible contingencies is now, before
> the first assembler arrives.
Agreed.
> I am not blind to the dangers, but don't
> believe that nanotechnology can be stopped except by a world-wide
> "Big Brother". Even if that happened, could we believe that B.B.
> would not develop it secretly as a weapon of control?
We have had more than two centuries of democratic government in this
country, and have not adopted such measures of repressive control as were
used very effectively in fascist and communist societies. Our democracy
has broadened its base from landowning white men to almost anyone who
wishes to participate. Unfortunately, there is a tendency not to
participate, but perhaps the internet will help to reverse this trend.
Government officials are not above the law, thanks to oversight and
recordkeeping. It is hard to keep wrongdoing a secret. So instead of
viewing government only as "Big Brother," we should take seriously the
proposition that, in a democracy, the government is us.
>I think if you subtract the people who have legitimate issues that
>they believe put them at odds with the "New World Order," you are
>left with very few people who really just irrationally want to do
>harm to others.
Unfortunately, a "very few people" is all it would take to bring
things to a screeching halt, if they had access to nanotech weapons
or the means to produce them.
Even leaving aside the complete nihilists, many terrorist groups with
"legitimate issues" (a phrase I have more than a little difficulty
with, but never mind) wouldn't balk at causing huge amounts of damage
to their enemies, consequences be damned. The saving factor, in my
mind, is that such people seem to lack the capacity for creative and
innovative thought (car bomb the embassy? Blow up a school bus?
Knock down a commercial airliner? Come on, people, it's already been
done), whereas superseding the sort of adaptive, responsive defenses
that nanotech will make possible will require huge amounts of both.
In the future a particular act of terrorism might work once, and
might impact on many, many people's lives, but chances are it will
only work once before the defensive systems learn to deal with it.
And an act so big as to directly affect all of society, or even a
majority of it, seems unlikely in the extreme.
My greatest hope, though, is that a nanotech-based economy will
improve the global standard of living enough that terrorism will
become a thing of the past, that people will have too much to lose to
indulge in territorial, political, or religious strife. This may be
starry-eyed and naive of me, but one can always dream.
I am * shig@...
Shig the Unmentionable, *
and I have spoken. * http://www.pd.net/~shig/
A man with two watches can never be sure what time it is.
Neither can a man with one watch, but he thinks he's positive.
- The person with one watch isn't "positive" of the time, just
dependent on a single source for their information. Time is
a "Relative" quantity anyway and perspective ultimately is all that
matters. I for one don't think it is worth it to rely on "faith" in
this game of chance we are playing with Social Evolution, so how do
we stack the odds? Is "Fairness" relative too, and just who are
the "Good" as opposed to "Bad" Social Mechanics? kxs