[Computational Complexity] Favorite Theorems: Probabilistic Complexity
After we saw several randomized algorithms for primality, we needed a way to classify and analyze the power of probabilistic computation. The power of computational complexity comes not just from studying old models of computation but from taking new ones and finding ways to analyze their computational power as well.
Computational Complexity of Probabilistic Turing Machines, John Gill, STOC 1974, SICOMP 1977.
A Complexity Theoretic Approach to Randomness, Michael Sipser, STOC 1983.
Sipser's main result showed that BPP is contained in the fourth level of the polynomial-time hierarchy, and the paper also includes Gács's improvement putting BPP in Σ2∩Π2. More importantly, Sipser introduced new techniques into the complexity theorists' toolbox, including
- A new resource-bounded Kolmogorov distinguishing complexity, and
- Using Carter-Wegman hash functions to focus randomness, perhaps the first extractor.
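To see the flavor of the hashing idea, here is a minimal sketch of a Carter-Wegman style pairwise-independent hash family; the specific prime, range sizes, and the empirical check are illustrative choices of mine, not details from Sipser's paper.

```python
import random

# Sketch (parameters are illustrative): a Carter-Wegman pairwise-independent
# hash family h_{a,b}(x) = ((a*x + b) mod p) mod m. For any fixed pair of
# distinct inputs, a randomly drawn h collides on them with probability
# roughly 1/m, no matter how the inputs are structured.

P = 2_147_483_647  # a prime larger than the input universe

def make_hash(m, rng=random):
    """Draw one random function from the family, mapping [0, P) -> [0, m)."""
    a = rng.randrange(1, P)
    b = rng.randrange(0, P)
    return lambda x: ((a * x + b) % P) % m

# Empirically estimate the collision probability on one fixed pair.
rng = random.Random(0)
m = 64
trials = 20_000
collisions = sum(1 for _ in range(trials)
                 if (h := make_hash(m, rng))(12345) == h(67890))
print(collisions / trials)  # should land near 1/m = 0.015625
```

The point of such a family is that a small random seed (the pair a, b) behaves, for any fixed pair of inputs, like a fully random function, which is the sense in which hashing "focuses" randomness.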
How about that newer model of a quantum computer? Bernstein and Vazirani's paper plays the Gill role, formalizing efficient quantum computation and defining the basic classes like BQP. But while we have had some limited success in understanding the computational complexity of BQP, not only do we not know whether BQP is contained in the polynomial-time hierarchy, we have not yet developed great tools for understanding "quantumness" the way Sipser showed we can for randomness.
Posted by Lance to Computational Complexity at 6/22/2006 11:31:00 AM