Factor Record

  • paulmillscv@yahoo.co.uk
    Message 1 of 12 , Apr 24, 2001
      Hi to all,
      The new Wilson-Mills SG primality test has an important
      corollary. A new world record for the existing SG (Sophie Germain)
      WR holders Underbakke, Jobling and Gallot.
      Namely, the larger member q = 2*p + 1 of the largest known Sophie
      Germain pair is also the largest known prime factor of a factorial
      closed form, possibly of any non-trivial closed form. This closed
      form is (2p)! + 1.

      Reminder: the Wilson-Mills primality test for Sophie Germain pairs is:
      if q and p are prime and q = 2*p + 1 then q divides (2p)! + 1.

      Conversely, if 2*p + 1 (= q) is a factor of (2p)! + 1 then 2*p + 1 is
      prime.
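      A minimal Python sketch of this check (illustration only; it verifies
      the divisibility for a few small Sophie Germain pairs by computing
      (2p)! mod q):

      def factorial_mod(n, m):
          # n! mod m by straightforward repeated multiplication
          r = 1
          for k in range(2, n + 1):
              r = (r * k) % m
          return r

      # small Sophie Germain primes p (i.e. 2p + 1 is also prime)
      for p in [2, 3, 5, 11, 23, 29, 41, 53, 83, 89]:
          q = 2 * p + 1
          assert (factorial_mod(2 * p, q) + 1) % q == 0, (p, q)
      print("q = 2p + 1 divides (2p)! + 1 for every pair tested")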

      Once again, thanks to Henri Lifchitz, the existing record holder,
      whose test was
      q divides 3^p - 1. But (2p)! + 1 is a larger number
      than 3^p - 1!

      Stare at this long enough and you will see that the `salient'
      feature of the Wilson-Mills test is that the Sophie Germain primes
      are `nestled' inside the test as (q,p). All you have to do then is
      substitute a real SG pair for (q,p) to obtain a factor ( or `lever' a
      factor ) from the closed form (2p)! + 1. Of course, a factor q
      can also be `levered' from the form 3^p - 1 via a Sophie Germain
      (q,p), which is Henri's test. An interesting question now is: what
      are the Sophie Germain `primality test forms' f(p),
      i.e. the forms f(p) which q factors? We now have two: 3^p - 1 and (2p)! + 1.

      So to conclude: the largest p of a Sophie Germain pair is
      109433307 * 2^66452 - 1 (from the Prime Pages), and so the largest
      known closed factorial form number with the largest known prime
      factor is
      (2 * 109433307 * 2^66452 - 2)! + 1 with factor
      q = 2 * 109433307 * 2^66452 - 1.

      Regards,
      Paul Mills,
      Kenilworth,
      England.
    • paulmillscv@yahoo.co.uk
      Message 2 of 12 , Apr 25, 2001
        Hi,
        Yves Gallot has pointed out that directly from Wilson's theorem
        (p-1)! == -1 mod p so that p divides (p-1)! + 1

        (p-1)! + 1 is a `closed factorial form', and the generic term I
        am probably looking for is an `irreducible' form (polynomial or
        otherwise).

        But as Yves pointed out, by substituting the largest known prime
        (Mersenne) for p we obtain the p / f(p) world record
        p / (p-1)! + 1. The new world record is that Wilson's theorem
        shows that (2^6972593 - 2)! + 1 is the largest `irreducible form'
        number for which we have a prime factor.
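        A minimal Python sketch of the underlying fact (illustration only,
        checked on small primes; the record-sized factorial itself is far
        too large to compute):

        from math import factorial

        for p in [3, 5, 7, 13, 31, 61, 127]:   # a few small primes
            # Wilson's theorem: p prime => p divides (p-1)! + 1
            assert (factorial(p - 1) + 1) % p == 0
        print("p divides (p-1)! + 1 for every prime tested")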

        I approached this problem from Henri Lifchitz's work where he
        showed that
        q / 3^p - 1. This is a different `factored form' because
        it is q / f(p) and q is distinct from p. So Henri's unsung
        world record, and now my world record, still stand too! In fact, if
        q / f(p) then either q = p or q is distinct from p, so there are just
        two world records for this factored form, held by the Mersenne (q = p)
        and Sophie Germain (q /= p) WR primes. Now the theory starts, of
        course.

        Incidentally, it is easy to prove that (2p)! + 1 > 3^p - 1 by
        induction.
        Also, I claim to have missed the simple p / f(p) because I was
        deflected by the nice formula
        (a + b)^p == a^p + b^p mod p

        This gives a large number of expressions for the factored form
        p / f(p),

        i.e. (2 + 1)^p == 2^p + 1 mod p => p / 3^p - 2^p - 1
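        A minimal Python sketch of this particular instance (illustration
        only; pow(a, p, p) computes a^p mod p):

        for p in [3, 5, 7, 11, 101, 1009]:   # a few primes
            assert (pow(3, p, p) - pow(2, p, p) - 1) % p == 0
        print("p divides 3^p - 2^p - 1 for every prime tested")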



        regards,
        Paul Mills
        Kenilworth
        England.
      • d.broadhurst@open.ac.uk
        Message 3 of 12 , Apr 25, 2001
          Paul Mills wrote:
          > by substituting the largest known prime (Mersenne) for p
          > we obtain the p / f(p) World record p / (p-1)! + 1
          Rules are unclear. Why not
          f(p)=(p-1)! + (p+1)^p
          or for that matter
          f(p)=(p-1)! + (p^p^p...^p^p^p+1)^p^p^p....^p^p^p
          with as many exponentiations as one cares to type?
          David
        • paulmillscv@yahoo.co.uk
          Message 4 of 12 , Apr 25, 2001
            Hi to all,
            I conclude the discovery of irreducible a priori `factorised'
            closed forms with a bit more theory and an interesting question.


            We already have that q ( = 2*p + 1) / (2p)! + 1.
            Now if we look for primes of the form q = 3*p + 1 then we can use
            Wilson's theorem to obtain (q-1)! == -1 mod q or
            (3p)! + 1 == 0 mod q or q / (3p)! + 1
            Similarly we can look for primes of the form q = X*p + 1, and
            the new `factored' form is q / (X*p)! + 1. We can even look
            for primes of the form
            q = X*p + Y, with X and Y integers. Then the factored form is
            q / (X*p + (Y-1))! + 1.
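            A minimal Python sketch of this generalisation (illustration only;
            is_prime is naive trial division, and only small X, Y, p are tried):

            from math import factorial

            def is_prime(n):
                if n < 2:
                    return False
                d = 2
                while d * d <= n:
                    if n % d == 0:
                        return False
                    d += 1
                return True

            p = 11
            for X in range(1, 7):
                for Y in range(1, 7):
                    q = X * p + Y
                    if is_prime(q):
                        # Wilson: q prime => q divides (q-1)! + 1 = (X*p + Y - 1)! + 1
                        assert (factorial(X * p + Y - 1) + 1) % q == 0
            print("every prime q = X*p + Y tried divides (X*p + Y - 1)! + 1")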

            So we now have a `large' number of new primes to look for, of the
            form q = X*p + Y,
            and each new record prime of this form (Generalised Sophie Germain)
            will enable a still larger number to be factored `a priori.'

            My question is this: can anyone prove there are in fact an
            infinite number of primes of the form q = 2*p + 1 (Sophie
            Germain), or indeed of the form q = X*p + Y?

            From David Burton's book, if p and q are twin primes there is a
            formula,
            p*q / 4((p-1)! + 1) + p, which is a `doubly factored' form
            p*q / f(p)
            with p and q twin primes. So Phil Carmody and David Underbakke
            can claim a new world record factorisation by plugging their record
            twin into this form!
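            A minimal Python sketch of this twin-prime form (Clement's
            criterion), checked on a few small twin pairs; illustration only:

            from math import factorial

            for p in [3, 5, 11, 17, 29, 41, 59, 71]:   # p with p + 2 also prime
                q = p + 2
                assert (4 * (factorial(p - 1) + 1) + p) % (p * q) == 0
            print("p*q divides 4((p-1)! + 1) + p for every twin pair tested")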

            Finally, to end on a lighter note, and also with another question
            and possibly a new standard: H.E. Rose writes in his famous book
            (p. 12), `For a typical 500 digit integer it would take more than a
            lifetime to find its factors.' My question is, how long would it
            take to factor ANY random 500 digit integer today? Perhaps this can
            be called the `Rose500' benchmark in honour of H.E. Rose.

            Regards,
            Paul Mills,
            Kenilworth,
            England.
          • d.broadhurst@open.ac.uk
            Message 5 of 12 , Apr 25, 2001
              Paul Mills wrote:

              > I conclude the discovery of irreducible
              > a priori `factorised' closed forms

              You didn't explain what is wrong with
              this reductio ad nauseam:

              > f(p)=(p-1)! + (p^p^p...^p^p^p+1)^p^p^p....^p^p^p
              > with as many exponentiations as one cares to type?

              Off list, you confirmed that it is "irreducible"
              but made the following strange claim:

              > we don't know if f(p) has a factor by an
              > a priori method or theorem.

              But we do!
              Proof: -1 mod p + 1 mod p = 0 mod p

              David
            • Jack Brennen
              Message 6 of 12 , Apr 25, 2001
                Paul,

                You asked how long it would take to factor a random 500 digit integer
                today? Sadly, the answer appears to remain "more than a lifetime."

                Two methods come to mind: ECM would successfully factor such a number
                if all but one of the prime factors are "small" (less than about 50
                digits). This will not happen very often.

                The other method (for general numbers): MPQS is a powerful method for
                use with general numbers (those not of the form a^n+b, with small a & b).
                MPQS has a running time of the order O(exp(sqrt(log(n)*log(log(n))))),
                where n is the number to be factored. I'm currently running an MPQS
                program on an 88-digit number; I expect the program to take around
                54 hours on my 450 MHz PC. Run the numbers: a 500-digit number will
                take about 8*10^24 times as long, or about 5*10^22 years!
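                A minimal Python sketch of that back-of-envelope scaling
                (illustration only; it assumes the bare heuristic
                L(n) = exp(sqrt(ln n * ln ln n)) with no constant factors,
                which is all the estimate above uses):

                from math import exp, log, sqrt

                def L(digits):
                    ln_n = digits * log(10)            # natural log of a `digits'-digit number
                    return exp(sqrt(ln_n * log(ln_n)))

                ratio = L(500) / L(88)
                print(ratio)                           # roughly 8e24
                print(ratio * 54 / (24 * 365.25))      # 54 hours scaled up: ~5e22 years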

                There are other problems with using MPQS to factor such a number;
                the size of the sieve array created would be greater than any computer
                could hope to handle (probably greater than all of the computer memory
                in all the computers in the world combined).

                I'm sure that some can point to examples of numbers in the 500-digit
                range which have been completely factored; many examples do exist --
                however, these numbers either have a special form permitting algebraic
                factorization (such as 10^540-1), or they satisfy the requirements
                above for an ECM factorization (only one large prime factor), such as:

                10^499+47 == 3 * 29 * 2938069 * P491



                Jack
              • Paul Leyland
                Message 7 of 12 , Apr 26, 2001
                  > You asked how long it would take to factor a random 500
                  > digit integer today? Sadly, the answer appears to remain
                  > "more than a lifetime."

                  I didn't see the question (are messages being dropped by the list?), but
                  your conclusion seems reasonably sound.

                  > Two methods come to mind: ECM would successfully factor
                  > such a number if all but one of the prime factors are "small"
                  > (less than about 50 digits). This will not happen very often.

                  Agreed.

                  > The other method (for general numbers): MPQS is a powerful
                  > method for use with general numbers (those not of the form
                  > a^n+b, with small a & b). MPQS has a running time of the
                  > order O(exp(sqrt(log(n)*log(log(n))))), where n is the number
                  > to be factored. I'm currently running an MPQS program on an
                  > 88-digit number; I expect the program to take around 54 hours
                  > on my 450 MHz PC. Run the numbers: a 500-digit number will
                  > take about 8*10^24 times as long, or about 5*10^22 years!

                  Reasonable as far as it goes, but it goes off at a tangent!

                  Already we have a much better algorithm than MPQS. It's the general
                  number field sieve. The crossover point where GNFS becomes faster than
                  MPQS depends strongly on the quality of the implementation but seems to
                  lie somewhere in the range 100-140 digits.

                  > There are other problems with using MPQS to factor such a
                  > number; the size of the sieve array created would be greater
                  > than any computer could hope to handle (probably greater than
                  > all of the computer memory in all the computers in the world
                  > combined).

                  This also applies to GNFS, though a more stringent restriction in
                  practice would be the linear algebra phase.

                  All the above (both postings) assumes that there is no theoretical
                  breakthrough. A reasonably sound assumption, perhaps.

                  Approaching the question from a different direction: we can fit some
                  sort of plausible equation to the experimental data over the last few
                  decades since the development of sub-exponential factoring algorithms.
                  One such equation I have to hand is Y = 1928 + 13*D^(1/3), where Y is the
                  year of first factoring a D decimal digit integer. Sorry I can't give
                  credit to the originator, but I don't have the reference equally to hand.

                  IF you're willing to make a completely wild prediction with very little
                  justification for its accuracy, plugging D=500 into this equation gives
                  2031 as the date when a 500-digit hard integer will be first factored.
                  I'm no spring chicken, but I have reasonable hopes that 2031 will be in
                  my lifetime!
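                  The arithmetic behind that prediction, as a one-line Python
                  check (illustration only):

                  print(1928 + 13 * 500 ** (1.0 / 3.0))   # about 2031.2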


                  Paul
                • Barubary
                  Message 8 of 12 , Apr 26, 2001
                    > IF you're willing to make a completely wild prediction with very little
                    > justification for its accuracy, plugging D=500 into this equation gives
                    > 2031 as the date when a 500-digit hard integer will be first factored.
                    > I'm no spring chicken, but I have reasonable hopes that 2031 will be in
                    > my lifetime!

                    Actually, it's probably pretty accurate. Quantum computers should actually
                    start existing in 20-30 years, and clearly factoring a huge number is a
                    great test of their power.

                    -- Barubary
                  • Jack Brennen
                    Message 9 of 12 , Apr 26, 2001
                      > I didn't see the question (are messages being dropped by the list?),
                      > but your conclusion seems reasonably sound.

                      Mea culpa. I split off from the "Factor Record" thread by changing
                      the subject line. You may have skipped over the original post by
                      Paul Mills.

                      > Already we have a much better algorithm than MPQS. It's the general
                      > number field sieve. The crossover point where GNFS becomes faster
                      > than MPQS depends strongly on the quality of the implementation but
                      > seems to lie somewhere in the range 100-140 digits.

                      You are correct, my fellow factorer...

                      I went back and re-read the applicable section of Cohen; he estimates
                      a crossover point around 130 digits.

                      My experience with NFS is close to nil; my factoring of general-form
                      composites only ranges up to about 90 digits, so I have used
                      everything up to and including MPQS.

                      > All the above (both postings) assumes that there is no theoretical
                      > breakthrough. A reasonably sound assumption, perhaps.

                      I have to agree that a major mathematical breakthrough in factoring
                      methods needs to occur to see a 500-digit general purpose factoring
                      algorithm which executes in a reasonable time.


                      Jack
                    • paulmillscv@yahoo.co.uk
                      Message 10 of 12 , Apr 27, 2001
                        Hi to all,
                        Not one to miss the chance of a world record, David Broadhurst has
                        improved on
                        the p / f(p) by using Wilson's theorem and Fermat's theorem.
                        Well done!

                        from Wilson, (p-1)! == -1 mod p, and from Fermat, (p+1)^p
                        == p + 1 mod p

                        we can add these equations and get (p-1)! + (p+1)^p == 0 mod p
                        which is nice. So p / (p-1)! + (p+1)^p is the record form at
                        the moment. The English school is doing well.
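                        A minimal Python sketch of this combined form (illustration
                        only, small primes):

                        from math import factorial

                        for p in [3, 5, 7, 11, 13]:
                            # Wilson gives (p-1)! == -1, Fermat gives (p+1)^p == 1 (mod p)
                            assert (factorial(p - 1) + pow(p + 1, p, p)) % p == 0
                        print("p divides (p-1)! + (p+1)^p for every prime tested")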

                        >You didn't explain what is wrong with
                        >this reductio ad nauseam:

                        > f(p)=(p-1)! + (p^p^p...^p^p^p+1)^p^p^p....^p^p^p
                        > with as many exponentiations as one cares to type?

                        Yes, I see it now,
                        (p + 1)^p == p + 1 mod p
                        (p^p + 1)^p == p^p + 1 mod p etc

                        So we have a strange object, an open `closed' form. The closure
                        here being only in the logical order, and semi-open in the algebraic
                        order (extended via the theorem). I did say the theory starts.


                        Of course you can only use 1 theorem once in any form, no repeated
                        applications etc. Forms must be irreducible in the logical as well
                        as algebraic order. Once again the point of this is that we can
                        now produce ever larger numbers of which we know the factor.
                        There is an error in my previous posting: q = 3*p + 1 cannot be prime
                        for q and p both odd primes (q would be even). If q = k*p + m in a
                        Generalised Sophie Germain then k and m must have opposite parity.

                        regards,

                        Paul Mills
                      • d.broadhurst@open.ac.uk
                        Message 11 of 12 , Apr 27, 2001
                          Paul Mills wrote

                          > David Broadhurst has improved ...

                          No. I merely reduced your idea to absurdity:

                          > Of course you can only use 1 theorem once in any form,

                          Try to imagine a mathematician saying: of course this proof of
                          divisibility is invalid because it uses Fermat's theorem twice.

                          Difficult to get more absurd than that.

                          Please don't try :-)

                          David
                        • MICHAEL HARTLEY
                          Message 12 of 12 , May 1, 2001
                            > Yes, I see it now,
                            > (p + 1)^p == p + 1 mod p
                            > (p^p + 1)^p == p^p + 1 mod p etc
                            >
                            > So we have a strange object, an open `closed' form. The closure
                            > here being only in the logical order, and semi-open in the algebraic
                            > order (extended via the theorem). I did say the theory starts.
                            >
                            >
                            > Of course you can only use 1 theorem once in any form, no repeated
                            > applications etc. Forms must be irreducible in the logical as well
                            > as algebraic order. Once again the point of this is that we can

                            I'm not sure that you can clearly define "irreducible in the logical
                            order", or what you mean by using 1 theorem only once. After all,
                            I'm sure that most of the methods we use actually use certain
                            theorems many times over. Example:

                            Theorem: for any integers x and y, x*y = y*x.

                            Does that mean they are discounted?

                            To my mind, every theorem is reducible in a logical sense, except
                            of course, the axioms of the logical system...

                            Yours, Mike H...



                            > now produce ever larger numbers of which we know the factor.
                            > There is an error in my previous posting: q = 3*p + 1 cannot be prime
                            > for q and p both odd primes (q would be even). If q = k*p + m in a
                            > Generalised Sophie Germain then k and m must have opposite parity.
                            >
                            > regards,
                            >
                            > Paul Mills


                            Michael Hartley : Michael.Hartley@...
                            Head, Department of Information Technology,
                            Sepang Institute of Technology
                            +---Q-u-o-t-a-b-l-e---Q-u-o-t-e----------------------------------
                            "For centuries, people thought the moon was made of green cheese.
                            Then the astronauts found that the moon is really a big hard
                            rock. That's what happens to cheese when you leave it out."
                            --Anon