
[Computational Complexity] Favorite Theorems: Primality Algorithms

April Edition

For many years the standard example of a probabilistic algorithm checked whether a number was prime.

Riemann's Hypothesis and Tests for Primality, Gary Miller, STOC 1975, JCSS 1976.

A Fast Monte-Carlo Test for Primality, Robert Solovay and Volker Strassen, SICOMP 1977.

Probabilistic Algorithm for Testing Primality, Michael Rabin, Journal of Number Theory, 1980.

Let n be an odd number with n ≠ r^q for r, q > 1 (that is, n is not a perfect power). Fermat's little theorem shows that for any a, 1 < a < n, if a^n ≢ a (mod n) then n is composite. But for a special set of composites called the Carmichael numbers, this test fails for every such a. Miller adds the condition that a^((n-1)/2^k) − 1 and n are relatively prime for every 2^k dividing n − 1, and shows that, assuming the Extended Riemann Hypothesis, every composite n has a witness a = O(log^2 n). This gives a polynomial-time (in the number of bits needed to express n) algorithm for primality assuming ERH. Rabin showed that one could instead choose random a's, giving a probabilistic algorithm with no unproven assumption; the resulting test is now called the Miller-Rabin primality test. Solovay and Strassen give a different test based on the Jacobi symbol.
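The two tests described above can be sketched in Python as follows. This is a minimal illustration, not an optimized implementation; the function names, the small-prime shortcut, and the default round counts are my own choices, not from the papers.

```python
import random

def miller_rabin(n, rounds=20):
    """Miller-Rabin probabilistic primality test: True means 'probably prime'."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):  # shortcut for tiny factors
        if n % p == 0:
            return n == p
    # Write n - 1 = 2^s * d with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a is a witness that n is composite
    return True

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0, via quadratic reciprocity."""
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:
            a //= 2
            if n % 8 in (3, 5):  # (2/n) = -1 when n = 3 or 5 (mod 8)
                result = -result
        a, n = n, a  # reciprocity: swap, flipping sign if both = 3 (mod 4)
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0  # 0 when gcd(a, n) > 1

def solovay_strassen(n, rounds=20):
    """Solovay-Strassen test: checks Euler's criterion a^((n-1)/2) = (a/n) (mod n)."""
    if n < 2 or n % 2 == 0:
        return n == 2
    for _ in range(rounds):
        a = random.randrange(2, n)
        j = jacobi(a, n) % n  # map -1 to n - 1
        if j == 0 or pow(a, (n - 1) // 2, n) != j:
            return False
    return True
```

For a Carmichael number such as 561 = 3·11·17, the plain Fermat test is fooled by every a coprime to n, but for each of these stronger tests at least half of all bases are witnesses, so a handful of random rounds exposes compositeness with overwhelming probability.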

Of course we now know that primality has a polynomial-time algorithm with no unproven assumptions. So why did I choose these superseded algorithms as favorite theorems? For one, they are not really obsolete: they remain considerably more efficient than Agrawal-Kayal-Saxena. But more importantly, they served as the motivation for the study of probabilistic computation, much like Shor's quantum factoring algorithm motivates quantum computing today. Without the study of probabilistic computation we would have had no modern cryptography, no derandomization results, no Toda's theorem, no interactive proofs and no probabilistically checkable proofs with their hardness-of-approximation consequences.

Posted by Lance to Computational Complexity at 5/19/2006 07:44:00 AM
