
Re: Book: Who speaks for the Earth?

  • bowmanthebard
    Jan 1, 2007
      There is an old religious idea that moral viciousness is a sort of
      factual error, and that the "higher" your mind is, the more you will
      see the "errors" of your wicked ways. According to this idea, lack of
      compassion is like an inability to do long division. Aliens who can
      travel through space must be able to do long division, the idea goes,
      so we can rely on them to be compassionate as well. Alas -- this
      religious idea is itself mistaken, and it exemplifies the fallacy of
      confusing "ought" and "is". It also exemplifies the mistake of
      supposing there is a "great chain of being" with the most intelligent,
      compassionate creatures at the top. In reality, there are cooperative
      strategies and uncooperative strategies, and both can work, regardless
      of intelligence.

      I'm not sure where the idea came from that all intelligent life runs
      the risk of self-extinction at the hands of its own technology.
      Perhaps from the "technology is evil" thread that runs through
      continental European thinking, from Rousseau to Heidegger of the
      woodcraft folk, the card-carrying (and badge-wearing) Nazi. Or maybe
      it was thought up by a couple of nerds waiting in line for the
      toilets at Woodstock?

      Wherever it came from, it supposes that the main problem is technology
      rather than tribalism. Humans have always killed each other in large
      numbers because we are a tribal species. As a rule of thumb, the more
      advanced our technology, and the less backward our civilization, the
      fewer humans are killed by tribal conflict. The highest rates of
      violent death are found among "noble savages" such as the Yanomamo.

      We have no idea whether alien life would be tribal like us. But for
      all we know, it might take the form of a "colony" like a Portuguese
      man o' war, with each individual working like a single brain cell. No
      such colony would be in danger of self-extinction. There is an
      indefinite range of possibilities, and we can reliably predict none of
      them.

      The most reasonable strategy is to assume the worst, but not to spend
      much money trying to forestall potential disasters that we're very
      uncertain about. It was completely crazy to spend extra cash assuming
      the best, thereby increasing the risk of the worst. It was a fine
      example of good sense overruled by 1960s fashion.

      Jeremy Bowman