
Complexity Threshold (Was: The moral questions)

  • David Cymbala
    Dec 11, 2001
      >Uwe Zdun writes:
      >
      >>>This seems to be pretty much the same argument as complexity theory as
      >>>applied to biology (see Stuart Kauffman's "At Home in the Universe"). The
      >>>basic argument here is that molecular systems "spring" into life in an
      >>>autocatalytic process. There is a critical threshold of complexity in
      >>>which life emerges.
      >
      >Jeff replied:
      >
      >I haven't yet read this book. Does Kauffman provide any lower limits for
      >the complexity threshold? This is where I feel we should be focusing more
      >simulation time. Can we, with either a physical system or a simulation,
      >duplicate this complexity threshold in a meaningful way?

      The book that "At Home in the Universe" is based on ("The Origins of Order")
      actually gives an answer to this. It has to do with the way that networks
      of control genes (not structural genes) must be structured. To come
      straight to the point, the critical band is when the number of controlling
      relationships between genes lies between the number of genes and twice
      that number.

      When the relationships between genes exceed twice the number of
      genes, there is an actual loss of coordinated complexity, since
      the control relationships tend toward non-convergent, chaotic patterns.
      Kauffman compares such systems to "gas", and identifies them
      as being in the "chaotic regime" of systems.

      When the number of relationships is less than the number of genes,
      the system tends to have isolated transitions that peter out quickly.
      Kauffman compares such systems to "solid", and identifies them
      as being in the "frozen regime" of systems.

      The space in the middle is where "complex" systems (Kauffman's term)
      emerge. Systems with significant "frozen" portions separating portions that
      "percolate" seem to be the best adapted to change, because they are stable
      but still explore variation, making "adaptive walks to the edge of chaos."

      Sounds like a good software development house, right? ;-)
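
      Jeff's question about reproducing the threshold in simulation maps pretty
      directly onto the random Boolean networks Kauffman uses to model those
      gene-control networks. Here's a rough Python sketch of that kind of
      experiment (my own toy version, nothing from the book; the network size,
      step count, and trial count are arbitrary): build networks with K random
      inputs per gene, flip one gene in a copy of the state, and watch whether
      the "damage" dies out or spreads.

      import random

      def random_boolean_network(n, k, rng):
          """Give each of the n genes k random inputs and a random Boolean rule."""
          inputs = [rng.sample(range(n), k) for _ in range(n)]
          rules = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
          return inputs, rules

      def step(state, inputs, rules):
          """Synchronously update every gene from its inputs."""
          new = []
          for ins, rule in zip(inputs, rules):
              index = 0
              for i in ins:
                  index = (index << 1) | state[i]
              new.append(rule[index])
          return new

      def mean_damage(n, k, steps=50, trials=20, seed=0):
          """Average Hamming distance between two runs whose initial states
          differ in a single gene, a crude measure of damage spreading."""
          rng = random.Random(seed)
          total = 0
          for _ in range(trials):
              inputs, rules = random_boolean_network(n, k, rng)
              a = [rng.randint(0, 1) for _ in range(n)]
              b = list(a)
              b[0] ^= 1  # perturb a single gene
              for _ in range(steps):
                  a = step(a, inputs, rules)
                  b = step(b, inputs, rules)
              total += sum(x != y for x, y in zip(a, b))
          return total / trials

      for k in (1, 2, 3, 4):
          print(f"K={k}: mean damage after 50 steps = {mean_damage(100, k):.1f}")

      With K=1 the flip usually dies out, which is the "frozen" regime described
      above; with K=3 or 4 it spreads through much of the network, the "chaotic"
      regime; K=2 sits in the narrow band in between.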

      The discussion of autocatalysis has to do with systems that
      transform "food" molecules into the very components the system itself
      is made of, thus ensuring a configuration that doesn't just fall apart or
      come to a standstill. He is talking about organic molecules
      like carbohydrates, amino acids, and nucleic acids, of course.

      I'm not sure what the exact analog would be for digital systems.
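
      The closest digital analog I can offer is the counting argument itself.
      In Kauffman's binary-polymer picture, the number of possible ligation
      reactions grows faster than the number of molecule species, so if every
      species has even a tiny, fixed chance of catalyzing any given reaction,
      the expected number of reactions each species catalyzes eventually climbs
      past one and a self-sustaining catalytic network becomes hard to avoid.
      A back-of-the-envelope sketch (the catalysis probability is an arbitrary
      illustration, not a number from the book):

      def species(max_len):
          """Distinct binary polymers of length 1..max_len."""
          return sum(2 ** L for L in range(1, max_len + 1))

      def ligations(max_len):
          """Ligation reactions a + b -> ab whose product fits within max_len:
          each product of length L can be formed by L - 1 different cuts."""
          return sum((L - 1) * 2 ** L for L in range(2, max_len + 1))

      P_CATALYSIS = 1e-6  # assumed chance a given species catalyzes a given reaction

      for L in range(5, 26, 5):
          m, r = species(L), ligations(L)
          print(f"max length {L:2d}: {m:>12,} species, {r:>14,} reactions, "
                f"~{P_CATALYSIS * r:.3f} catalyzed reactions per species")

      The exact probability doesn't matter much. The point is that reactions
      outgrow species by roughly a factor of the polymer length, so somewhere
      there is a threshold past which the set starts catalyzing its own formation.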

      He postulates that living systems arose from "supracritical"
      systems (rapidly permuting polymer combinations)
      that cooled down into "subcritical" systems (slowly permuting
      polymer combinations). What I have read about "synergistic"
      systems is that they go through a period of "noise" where
      they are basically moving through very high-dimensional state
      spaces until they find an attractor, and then they stabilize.
      This sounds like certain kinds of neural networks to me.
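
      For what it's worth, that "wander through a big state space, then lock
      into an attractor" behavior is easy to see in a Hopfield-style network,
      which is probably the kind of neural network I'm thinking of. A small
      sketch (again my own illustration, not something from the book; it needs
      numpy):

      import numpy as np

      rng = np.random.default_rng(0)
      N, P = 100, 3

      # Store P random +/-1 patterns with the Hebbian rule.
      patterns = rng.choice([-1, 1], size=(P, N))
      W = (patterns.T @ patterns).astype(float) / N
      np.fill_diagonal(W, 0.0)

      def settle(state, max_sweeps=50):
          """Asynchronous updates until no unit changes in a full sweep."""
          state = state.copy()
          for sweep in range(max_sweeps):
              changed = False
              for i in rng.permutation(N):
                  new = 1 if W[i] @ state >= 0 else -1
                  if new != state[i]:
                      state[i] = new
                      changed = True
              if not changed:
                  return state, sweep  # fixed-point attractor reached
          return state, max_sweeps

      # Corrupt a quarter of the first pattern and let the dynamics settle.
      noisy = patterns[0].copy()
      noisy[rng.choice(N, size=N // 4, replace=False)] *= -1
      final, sweeps = settle(noisy)
      print(f"settled after {sweeps} sweeps, "
            f"overlap with stored pattern = {final @ patterns[0] / N:+.2f}")

      From a noisy start it hops around for a few sweeps (the "noise" period),
      then falls into the nearest stored pattern, after which further updates
      change nothing.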

      The complexity threshold necessary for complex behavior
      seems to require a degree of self-referential
      structure, and then some kind of promotion and inhibition
      system. I've heard about fuzzy logic systems that use
      this kind of technique.
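
      As a tiny illustration of what such a promotion/inhibition rule system
      could look like (just my sketch, using the classic Zadeh fuzzy operators
      AND = min and NOT = 1 - x, not a model from Kauffman): gene A is promoted
      by an external signal but inhibited by B, while B is promoted by A.

      def fuzzy_and(x, y):
          return min(x, y)

      def fuzzy_not(x):
          return 1.0 - x

      def update(a, b, signal):
          # A is active to the degree that the signal is present AND B is not inhibiting it.
          new_a = fuzzy_and(signal, fuzzy_not(b))
          # B is active to the degree that A promotes it.
          new_b = a
          return new_a, new_b

      a, b = 0.0, 0.0
      for t in range(12):
          print(f"t={t:2d}  A={a:.2f}  B={b:.2f}")
          a, b = update(a, b, signal=0.9)

      The little loop is self-referential in exactly the sense above, and it
      quickly settles into a short repeating cycle: an attractor again, just
      expressed through graded promote/inhibit rules instead of crisp Boolean
      ones.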

      -David