
Re: nanomedicine...?

  • marilyn1mew
    Message 1 of 40, Apr 10 3:45 AM
      i know he was joking. so was i, hehehe.

      i'm not into religion or any other superstitions, but a technocult
      might be fun - especially when we have the nano virtual reality
      online. may i be a high priestess <-not a nun->?

      y'all are probably all phd's (SOOOO kool) or close to it, so it is
      easy for me to see that i'm way outgunned intellectually, but i bet i
      can kick your butt in chess. i'm really good.

      k, i'll be quiet.

      ^_^

      marilyn


      --- In nanotech@y..., Ooo0001@a... wrote:
      > In a message dated 4/9/2002 10:21:37 AM Pacific Daylight Time,
      > marilyn1mew@h... writes:
      >
      >
      > > 24,000 years? there are several noted futurists who say the
      > > 'singularity' will occur in 2035 and maybe as early as 2025. at that
      > > point in time we would become posthuman and be immortal and super
      > > intelligent. are they that wrong?
      > >
      >
      > Pssst. He was joking--referring to solving the rest of the world's
      > problems.
      > B-)
    • Mark Gubrud
      Message 40 of 40, Apr 12 6:35 PM
        Derek writes:

        > Sure, things could still go all to hell pretty quickly, but the fact
        > remains that it's less likely to occur now (certainly on a global
        > scale) than it was during the height of the Cold War.

        So what is your source for this "fact"? It seems to me you can make a
        pretty good argument either way. And the current situation seems
        extremely dangerous, given both the immediate global crisis (primarily,
        but not limited to, the Mideast) and the longer-term consequences of
        abandoning arms control and the international institutions of law and
        civil society.

        > Risk isn't zero by any means, but if zero risk is what you insist
        > upon,

        I don't. But we ought to think about risk in terms of a product of
        likelihood and magnitude. When it comes to a danger of the highest
        magnitude, we should not tolerate any unnecessary risk.
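
        As an illustration of the likelihood-times-magnitude point above, a
        minimal sketch in Python (the expected_risk helper and all numbers
        are purely hypothetical, not from the original post):

            # Risk as expected loss: likelihood of the event times the harm
            # it would cause. A very unlikely but catastrophic outcome can
            # still dominate a far more likely but smaller one.
            def expected_risk(likelihood: float, magnitude: float) -> float:
                return likelihood * magnitude

            # Hypothetical numbers, for illustration only:
            print(expected_risk(0.001, 1e9))  # 0.1% chance of a vast loss -> 1,000,000.0
            print(expected_risk(0.20, 1e5))   # 20% chance of a small loss ->    20,000.0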

        > I'm no hawk, but it wasn't arms control and international
        > security that brought down the Soviet Union.

        No, it was the internal process of decay of a moribund system and the
        demand from its people for change. However, I would credit arms control
        and the willingness to stand down from aggressive impulses with our
        avoidance of nuclear holocaust during the Cold War.

        > discontinuity of singularity depends on perspective.


        > > I think this notion of a "seed AI" is kind of cartoonish to
        > > begin with.
        >
        > Cartoonish, like flying machines, rockets to the moon, telephones you
        > carry in your pocket, and sending text messages to thousands of people
        > around the world ;-)

        No, cartoonish like Leonardo's designs for flying machines, like
        Tsiolkovsky's moon rockets, etc.

        > If you're referring to the Middle East situation, it's a mess that
        > could spread further, but it's a bit premature to predict global
        > holocaust just yet.

        I'm not predicting it, but observing its unacceptably high likelihood.
        If it were not "premature," there would be little point in doing so.