
PS: [nanotech] toasted sentience

  • Eliezer S. Yudkowsky
    Message 1 of 1 , Dec 9, 2000
      Mark Gubrud wrote:
      > Spin-1/2 states such as [ 1+i , i-1 ] (normalization assumed) are pure
      > states. They have entropy zero. This is very easy to show:
      > S = - tr( R log(R) ); R = density operator
      > Suppose U is a unitary operator, and R = U P U+. Then
      > S = - tr( U P U+ log( U P U+ ) ).
      > Now, A = log( B ) means B = exp(A) = 1 + A + A^2/2 + ...
      > So, if A = log(R) = log( U P U+) then
      > P = U+ R U = U+ exp(A) U = U+ U + U+ A U + U+ A^2/2 U + ...
      > = 1 + U+ A U + (U+ A U)^2/2 + ... = exp( U+ A U )
      > therefore log(P) = log( U+ R U) = U+ log(R) U and
      > S = - tr( U P log(P) U+) = -tr( P log(P) ).
      > So entropy is invariant under a unitary transformation (basis rotation).
      > Now, given any normalized state vector, we can construct (Gram-Schmidt) a
      > complete basis which has this state vector as one of its basis vectors.
      > In this basis, the density matrix of the state will have only one nonzero
      > element, so the entropy is zero. If we first wrote the density matrix in
      > some other basis, we could find a unitary matrix which would bring us to
      > the basis in which the density matrix has only one nonzero element.
      > You constructed 2-particle states; if the particles are distinguishable,
      > the appropriate density operator will be a 4x4 matrix which, because you
      > specified a pure state, will have zero entropy at the start of your
      > unspecified process. If the particles are indistinguishable, you can
      > treat them as a statistical mixture described by a 2x2 density matrix of
      > nonzero entropy. In either case, any type of unitary evolution, such as
      > would be described by an interaction Hamiltonian, will preserve the
      > entropy. Measurement of the final state, which involves interaction with
      > uncontrolled environmental degrees of freedom, may increase the
      > entropy, but will never decrease it.
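      The two claims in the quoted derivation (a pure state has zero von
      Neumann entropy, and unitary evolution preserves entropy) are easy to
      check numerically. The following sketch, not part of the original
      thread, verifies both with NumPy, using the spin-1/2 state [1+i, i-1]
      from the quote and an arbitrary mixed state:

      ```python
      import numpy as np

      def von_neumann_entropy(rho):
          # S = -tr(rho log rho), computed from the eigenvalues of rho;
          # zero eigenvalues are dropped since x log x -> 0 as x -> 0.
          evals = np.linalg.eigvalsh(rho)
          evals = evals[evals > 1e-12]
          return -np.sum(evals * np.log(evals))

      # Pure spin-1/2 state [1+i, i-1], normalized as in the quote.
      psi = np.array([1 + 1j, -1 + 1j])
      psi = psi / np.linalg.norm(psi)
      rho_pure = np.outer(psi, psi.conj())
      # A pure state's density matrix has eigenvalues {1, 0}, so S = 0.
      assert abs(von_neumann_entropy(rho_pure)) < 1e-9

      # A mixed state with nonzero entropy.
      rho_mixed = np.diag([0.7, 0.3]).astype(complex)
      S_before = von_neumann_entropy(rho_mixed)

      # A random unitary U (QR decomposition of a random complex matrix),
      # standing in for any unitary evolution R -> U R U+.
      rng = np.random.default_rng(0)
      A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
      U, _ = np.linalg.qr(A)
      S_after = von_neumann_entropy(U @ rho_mixed @ U.conj().T)

      # Unitary conjugation permutes nothing: the spectrum of rho, and
      # hence the entropy, is unchanged.
      assert np.isclose(S_before, S_after)
      ```

      Since the entropy depends only on the spectrum of the density matrix,
      and a unitary conjugation preserves the spectrum, this is the
      numerical counterpart of the trace argument in the quote.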

      You've exceeded my physics training, so I can't reply to this statement.
      I can't say that I feel this conversation has achieved closure, since the
      parts I did understand seemed to say only that unitary evolution
      preserves entropy, which I already knew. But perhaps you provided a
      stunning refutation that simply exceeded my physical-sciences
      understanding. I'm afraid that I don't know enough physics to say that
      you've won, but it certainly does seem I've lost.

      -- -- -- -- --
      Eliezer S. Yudkowsky http://singinst.org/
      Research Fellow, Singularity Institute for Artificial Intelligence