
Prime density question

  • jbohanon3
    Message 1 of 10 , Jan 7, 2005
      I'm sure most of us have seen the proof that there is no largest
      prime: take one plus the product of any list of primes, and that
      number is either itself prime or divisible by some prime not on
      the list.

      I am curious about what happens if you start with the prime 2 and
      follow that argument repeatedly: you get that 2 and 3 are primes,
      then that 7 is prime, then that 2*3*7+1=43 is prime. Then
      2*3*7*43+1=1807 has prime divisors 13 and 139. Then
      2*3*7*13*43*139+1=3263443 is prime, etc. My question is: does
      this process eventually pick up every prime number?
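      The process can be sketched in a few lines of Python (an illustration added here, using trial-division factoring, which is adequate for the first few steps):

```python
# Sketch of the process described above: start from {2}, and at each step
# adjoin every prime factor of (product of all known primes) + 1.

def prime_factors(n):
    """Return the set of prime divisors of n by trial division."""
    factors = set()
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.add(d)
            n //= d
        d += 1
    if n > 1:
        factors.add(n)
    return factors

def run_process(steps):
    """Run the 'product of all known primes, plus one' process."""
    known = {2}
    for _ in range(steps):
        prod = 1
        for p in known:
            prod *= p
        known |= prime_factors(prod + 1)
    return known

print(sorted(run_process(4)))   # [2, 3, 7, 13, 43, 139]
```

      After four steps this reproduces the primes listed above; the fifth step adds 3263443.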
    • mikeoakes2@aol.com
      Message 2 of 10 , Jan 8, 2005
        In a message dated 08/01/2005 06:12:34 GMT Standard Time,
        jbohanon3@... writes:

        >I am curious to know if you start with the prime 2, then follow that
        >argument, you get that 2 and 3 are primes. Then that 7 is prime.
        >Then that 2*3*7+1=43 is prime. Then that 2*3*7*43+1=1807 has prime
        >divisors 13 and 139. Then 2*3*7*13*43*139+1=3263443 is prime, etc.
        >My question is, does this process pick up all prime numbers
        >eventually?

        No.

        For one thing, the first term of your series 2,3,7,43,... is = +2 mod 5, the
        second is = -2 mod 5.
        So it is straightforward to prove by induction that if n is odd, the nth
        term of your series is = +2 mod 5, else it is = -2 mod 5.

        So the prime 5 never appears.
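        A quick numerical check of this induction (as I read it, it applies to the one-new-term-per-step recurrence t_{n+1} = t_1*...*t_n + 1, which is what the series 2, 3, 7, 43, ... follows):

```python
# Check that with t1 = 2 and t_{n+1} = t1*...*tn + 1, the terms alternate
# between +2 and -2 mod 5, so no term is ever divisible by 5.

terms = [2]
prod = 2
for _ in range(9):
    t = prod + 1
    terms.append(t)
    prod *= t

residues = [t % 5 for t in terms]
print(terms[:5])      # [2, 3, 7, 43, 1807]
print(residues[:6])   # [2, 3, 2, 3, 2, 3]  (3 == -2 mod 5)
```

        Note that the full process in message 1 can adjoin several primes per step (e.g. both 13 and 139), so the residues of the individual prime factors need not follow this pattern; the point is that no term of the product-plus-one series is divisible by 5.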

        -Mike Oakes


      • Jens Kruse Andersen
        Message 3 of 10 , Jan 8, 2005
          Mike Oakes wrote:

          > jbohanon3@... writes:
          >
          > >Then that 2*3*7+1=43 is prime. Then that 2*3*7*43+1=1807 has prime
          > >divisors 13 and 139. Then 2*3*7*13*43*139+1=3263443 is prime, etc.

          > For one thing, the first term of your series 2,3,7,43,... is = +2 mod 5,
          > the second is = -2 mod 5.
          > So it is straightforward to prove by induction that if n is odd, the nth
          > term of your series is = +2 mod 5, else it is = -2 mod 5.
          >
          > So the prime 5 never appears.

          What if we get a squareful number?
          Suppose, for argument's sake, that 2*3*7+1 were 4693, which is -2 mod 5
          as required.
          4693 = 13*19^2. If we only use one 19 (no rule was stated but it seems
          natural) then the next number would be 2*3*7*13*19+1 = 10375 = 5^3*83.

          --
          Jens Kruse Andersen
        • Robert
          Message 4 of 10 , Apr 22, 2006
            I am curious about the fact that, at least for small p, the sum of
            three consecutive primes is prime more often than one would expect
            from 1/log x. This also appears to be the case for the sum of 5
            consecutive primes.

            For example, taking all possible sums of three consecutive primes
            among the first 5003 primes, there are about 20% more primes than
            1/log x would predict.

            In groups of 500.

            Primes/Expected primes/% over expected
            196 146.5624722 33.73%
            143 118.4366003 20.74%
            139 111.5387222 24.62%
            114 107.5275346 6.02%
            139 104.7286242 32.72%
            108 102.6263338 5.24%
            120 100.9222961 18.90%
            121 99.53457675 21.57%
            109 98.33987502 10.84%
            113 97.3056928 16.13%

            It may also be the case that such sums of three consecutive primes
            which are composite have fewer factors than expected (i.e. they are
            not smooth), but I do not have algorithms for checking this.

            Maybe someone has already investigated this, as the sequence of primes
            of this form is listed in Sloane's id:A034962

            Regards

            Robert Smith
          • Phil Carmody
            Message 5 of 10 , Apr 22, 2006
              --- Robert <rw.smith@...> wrote:
              > I am curious as to the fact that, at least for small p, the chance of
              > sum of three consecutive primes should be more prime than one might
              > expect from 1/logx. This appears to also be the case for the sum of 5
              > consecutive primes.
              >
              > For example, take all the possible sums of three consecutive primes
              > for the first 5003 primes, then there are 20% more primes than
              > expected through 1/logx
              >
              > In groups of 500.
              >
              > Primes/Expected primes/% over expected
              > 196 146.5624722 33.73%
              > 143 118.4366003 20.74%
              > 139 111.5387222 24.62%
              > 114 107.5275346 6.02%
              > 139 104.7286242 32.72%
              > 108 102.6263338 5.24%
              > 120 100.9222961 18.90%
              > 121 99.53457675 21.57%
              > 109 98.33987502 10.84%
              > 113 97.3056928 16.13%

              1/1 of them are not divisible by 2, rather than 1/2 of arbitrary numbers.
              Density boost of (1/1)/(1/2) = 2/1, or +100%

              6/8 of them are not divisible by 3, rather than 2/3 of arbitrary numbers.
              Density boost of (6/8)/(2/3) = 9/8, or +12.5%

              52/64 of them are not divisible by 5, rather than 4/5 of arbitrary numbers.
              Density boost of (52/64)/(4/5) = 65/64, or +1.6%

              Hypothesis - for each prime p there's a ((p-1)^3+1)/(p-1)^3 density boost.

              Limit ~= 2.30

              I assume that you've measured the density of primes in odd numbers rather than
              all numbers, and would therefore measure a 15% net density boost.
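              The residue counts behind these boosts can be verified by brute-force enumeration of nonzero residue triples mod p (a sketch added here; the short prime list is just enough for the partial product to settle near 2.30):

```python
# For an odd prime p, count triples (a, b, c) of nonzero residues mod p
# whose sum is nonzero mod p, and compare the resulting density boost
# against the hypothesised ((p-1)^3 + 1)/(p-1)^3.

from itertools import product

def boost(p):
    """Ratio of the 'sum coprime to p' density among nonzero-residue
    triples to the density 1 - 1/p for arbitrary numbers."""
    triples = list(product(range(1, p), repeat=3))
    good = sum(1 for t in triples if sum(t) % p != 0)
    return (good / len(triples)) / (1 - 1 / p)

for p in (3, 5, 7):
    assert abs(boost(p) - ((p - 1)**3 + 1) / (p - 1)**3) < 1e-12

# Partial product of the boosts; the factor 2 for p = 2 reflects that
# sums of three odd primes are always odd.
limit = 2.0
for p in (3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47):
    limit *= ((p - 1)**3 + 1) / (p - 1)**3
print(round(limit, 2))   # 2.3
```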

              Phil

              () ASCII ribbon campaign () Hopeless ribbon campaign
              /\ against HTML mail /\ against gratuitous bloodshed

              [stolen with permission from Daniel B. Cristofani]

              __________________________________________________
              Do You Yahoo!?
              Tired of spam? Yahoo! Mail has the best spam protection around
              http://mail.yahoo.com
            • Robert
              Message 6 of 10 , Apr 22, 2006
                --- In primenumbers@yahoogroups.com, Phil Carmody <thefatphil@...> wrote:
                >
                > --- Robert <rw.smith@...> wrote:
                > > I am curious as to the fact that, at least for small p, the chance of
                > > sum of three consecutive primes should be more prime than one might
                > > expect from 1/logx. This appears to also be the case for the sum of 5
                > > consecutive primes.

                >
                > I assume that you've measured the density of primes in odd numbers
                > rather than all numbers, and would therefore measure a 15% nett
                > density boost.
                >
                > Phil

                Thanks for the reply Phil. I summed individual 1/logx for each x,
                where x is the sum of 3 consecutive primes.

                Regards

                Robert
              • Phil Carmody
                Message 7 of 10 , Apr 22, 2006
                  Grrrr, I'll try again. Firefox really is the second biggest pile
                  of rubbish ever written...

                  --- Robert <rw.smith@...> wrote:
                  > --- In primenumbers@yahoogroups.com, Phil Carmody <thefatphil@...> wrote:
                  > > --- Robert <rw.smith@...> wrote:
                  > > > I am curious as to the fact that, at least for small p, the chance of
                  > > > sum of three consecutive primes should be more prime than one might
                  > > > expect from 1/logx. This appears to also be the case for the sum of 5
                  > > > consecutive primes.
                  >
                  > > I assume that you've measured the density of primes in odd numbers
                  > > rather than
                  > > all numbers, and would therefore measure a 15% nett density boost.
                  >
                  > Thanks for the reply Phil. I summed individual 1/logx for each x,
                  > where x is the sum of 3 consecutive primes.

                  Odd. (I decided to skip very small primes in case they were skewing things):

                  lsum=0.; psum=1; oop=nextprime(100); op=nextprime(oop+1); p=nextprime(op+1);
                  while(p<10^6,
                    oop=op; op=p; p=nextprime(p+1);
                    if(isprime(p+op+oop), psum++);
                    lsum += 1/log(p+op+oop)
                  );
                  [lsum, psum]

                  [5721.3223, 15095]
                  ? 15095/5721.3223
                  2.6383761

                  The ratio seems to be dropping (I also checked to 10^7), but not at a
                  rate where I'd feel convinced it would eventually reach 2.30.

                  I was genuinely expecting something closer to 2.30.
                  It's odd for me to be 15% out when it comes to density heuristics.
                  Can Jack/Decio/Jens/Paul/Chris/David/... spot any obvious error?
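                  For comparison, the same computation can be redone with a plain sieve in Python (a re-run added here; edge handling may differ from the GP loop by a triple or two, so the counts need not match exactly):

```python
# Sieve primes up to 10**6, take sums of three consecutive primes
# (skipping primes <= 100, as in the GP script above), sum 1/log(s),
# and count how often the sum s is itself prime.

from math import log

LIMIT = 10**6
# The sums reach just under 3 * LIMIT, so sieve far enough to test them.
N = 3 * LIMIT + 100
sieve = bytearray([1]) * (N + 1)
sieve[0] = sieve[1] = 0
for i in range(2, int(N**0.5) + 1):
    if sieve[i]:
        sieve[i*i::i] = bytearray((N - i*i) // i + 1)

primes = [i for i in range(101, LIMIT) if sieve[i]]
lsum = 0.0
psum = 0
for a, b, c in zip(primes, primes[1:], primes[2:]):
    s = a + b + c
    lsum += 1 / log(s)
    if sieve[s]:
        psum += 1
print(psum, lsum, psum / lsum)
```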


                  Phil

                • Jens Kruse Andersen
                  Message 8 of 10 , Apr 22, 2006
                    Robert wrote:
                    > Primes/Expected primes/% over expected
                    > 196 146.5624722 33.73%

                    > I summed individual 1/logx for each x,
                    > where x is the sum of 3 consecutive primes.

                    You must have used base-10 logs. It should be the natural log,
                    giving expected primes = 63.65 = 146.56/log 10.
                    (Incidentally, log 10 = 2.3026 is close to Phil's limit below,
                    but that is purely a coincidence.)

                    Phil Carmody wrote:
                    > Hypothesis - for each prime p there's a ((p-1)^3+1)/(p-1)^3 density boost.
                    >
                    > Limit ~= 2.30

                    I agree with this formula and limit.

                    > 2.6383761
                    >
                    > The ratio seems to be dropping (I also checked to 10^7), but not at a
                    > rate where I'd feel convinced it would eventually reach 2.30.
                    >
                    > I was genuinely expecting something closer to 2.30.
                    > It's odd for me to be 15% out when it comes to density heuristics.
                    > Can Jack/Decio/Jens/Paul/Chris/David/... spot any obvious error?

                    Not obvious. Experimentation indicates the 15% is mainly
                    caused by the factor 3.
                    I computed how often the sum of 3 consecutive prp's was not divisible
                    by 3 in the first 10000 cases after 10^d, for d = 0, 5, 10, ..., 100:

                    (01:31) gp > forstep(d=0,100,5,N=10^d;s=0;p=nextprime(N);q=nextprime(p+1);
                    for(i=1,10000,r=nextprime(q+1);if((p+q+r)%3!=0,s++);p=q;q=r);
                    print(d" "s))
                    0 8537
                    5 8416
                    10 8079
                    15 7748
                    20 7727
                    25 7863
                    30 7709
                    35 7615
                    40 7626
                    45 7672
                    50 7646
                    55 7589
                    60 7579
                    65 7540
                    70 7594
                    75 7490
                    80 7727
                    85 7576
                    90 7580
                    95 7594
                    100 7504

                    Let's call d=80 a random fluctuation in a limited data sample.
                    It looks plausible that the ratio is dropping towards
                    7500/10000 = 3/4 as Phil expects.
                    8416/7500 = 1.12 is not that far from Phil's measured 15% error
                    for small primes.
                    And factors above 3 may also have a special effect for small numbers.

                    I guess this special effect is caused by larger prime gaps "not getting
                    their chance" to occur when there has already been a prime.
                    Example: Prime gaps 10 and 20 probably have the same asymptotic frequency,
                    because they have the same prime factors (*). But 10 is significantly more
                    common among small numbers because there has more often been a
                    prime before p+20 is reached.

                    Maybe a good heuristic for our problem on small numbers would have
                    to compute an expected probability of every relatively small gap size
                    (or maybe every combination of 2 consecutive gap sizes) between
                    primes of the examined size. I haven't done that.

                    (*) The expected frequency of a prime gap depends on its prime factors.
                    Example: If p>3 is prime then p+6 never has factor 3. This means gap 6 is
                    probably twice as common as gap 4 and gap 8.
                    Heuristic exercise I recently tried: what is the smallest even gap divisible by
                    3 which is probably less common than one of its even neighbours not divisible
                    by 3?
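                    A quick empirical illustration of (*), counting gap frequencies among primes below 10^6 (gap 6 comes out well ahead of gaps 4 and 8, though the asymptotic factor of about 2 is approached only slowly, in line with the rest of this thread):

```python
# Count prime gap frequencies below 10**6 and compare gap 6, which has
# the extra factor 3, against its neighbours gap 4 and gap 8.

N = 10**6
sieve = bytearray([1]) * (N + 1)
sieve[0] = sieve[1] = 0
for i in range(2, int(N**0.5) + 1):
    if sieve[i]:
        sieve[i*i::i] = bytearray((N - i*i) // i + 1)

primes = [i for i in range(2, N + 1) if sieve[i]]
gaps = {}
for p, q in zip(primes, primes[1:]):
    gaps[q - p] = gaps.get(q - p, 0) + 1
print(gaps[4], gaps[6], gaps[8])
```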

                    --
                    Jens Kruse Andersen
                  • Phil Carmody
                    Message 9 of 10 , Apr 23, 2006
                      --- Jens Kruse Andersen <jens.k.a@...> wrote:
                      > > The ratio seems to be dropping (I also checked to 10^7), but not at a
                      > > rate where I'd feel convinced it would eventually reach 2.30.
                      > >
                      > > I was genuinely expecting something closer to 2.30.
                      > > It's odd for me to be 15% out when it comes to density heuristics.
                      > > Can Jack/Decio/Jens/Paul/Chris/David/... spot any obvious error?
                      >
                      > Not obvious. Experimentation indicates the 15% is mainly
                      > caused by the factor 3.
                      > I computed how often the sum of 3 consecutive prp's was not divisible
                      > by 3 in the first 10000 cases after 10^d, for d = 0, 5, 10, ..., 100:

                      Wise move.

                      > I guess this special effect is caused by larger prime gaps "not getting
                      > their chance" to occur when there has already been a prime.

                      Yup. Basically, as a physicist said offlist, at the smaller values, the
                      primes are so close that they still /know/ about each other.

                      I too later looked at the distribution of consecutive {1,1,1} and {2,2,2} mod
                      3, and at small sizes (<10^20) the skew avoiding them is huge
                      e.g.
                      ranges of size 10^7 from 10^d onwards had the following ratios:
                      d=8 (50752+50860)/(50752+69029+81813+69195+69030+81979+69196+50860)=0.18752
                      d=18 (26404+26389)/(26404+30359+33265+30538+30359+33443+30538+26389)=0.21879
                      d=80 (6419+6460)/(6419+6815+7045+6791+6816+7020+6791+6460) =0.23781

                      The asymptotic ratio, according to my original proposed model, is 0.25. It is
                      not unbelievable that the values converge there. But the terribly small size of
                      log(n) seems to make the primes know about each other for a very long time.
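                      A small-scale version of the same count, for consecutive prime triples just above 10^6 (added here for illustration; the fraction of {1,1,1} and {2,2,2} patterns comes out well below the heuristic 0.25, consistent with the d=8 figure above):

```python
# Distribution of (p1, p2, p3) mod 3 over consecutive prime triples in
# [10**6, 2*10**6], and the fraction landing on {1,1,1} or {2,2,2}
# (exactly the patterns whose sums are divisible by 3).

N = 2 * 10**6
sieve = bytearray([1]) * (N + 1)
sieve[0] = sieve[1] = 0
for i in range(2, int(N**0.5) + 1):
    if sieve[i]:
        sieve[i*i::i] = bytearray((N - i*i) // i + 1)

primes = [i for i in range(10**6, N + 1) if sieve[i]]
counts = {}
for t in zip(primes, primes[1:], primes[2:]):
    key = tuple(p % 3 for p in t)
    counts[key] = counts.get(key, 0) + 1

total = sum(counts.values())
bad = counts.get((1, 1, 1), 0) + counts.get((2, 2, 2), 0)
print(bad / total)   # well below the asymptotic 0.25
```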

                      My error was thinking that because the expression that generated the limit
                      converged so quickly, the behaviour of the primes should converge there
                      just as quickly. One simply couldn't be more wrong. It appears that the primes
                      are terribly laid back, and see no rush to conform to predicted patterns until
                      they're good and ready. Good on them!

                      Phil

                    • Phil Carmody
                      Message 10 of 10 , Apr 23, 2006
                        --- Phil Carmody <thefatphil@...> wrote:
                        > ranges of size 10^7 from 10^d onwards had the following ratios:
                        > d=8 (50752+50860)/(50752+69029+81813+69195+69030+81979+69196+50860)=0.18752
                        > d=18 (26404+26389)/(26404+30359+33265+30538+30359+33443+30538+26389)=0.21879
                        > d=80 (6419+6460)/(6419+6815+7045+6791+6816+7020+6791+6460) =0.23781

                        The off-list physicist volunteered 0.243 at d=120 which is a very positive
                        hint that 0.25 is in the sights. Alas my d=180 run has just completed:
                        d=180: (324+240)/(324+309+315+292+309+299+293+240) = 0.2369

                        Which is a step in the wrong direction.

                        Here's the script I run - I just change the value of st in the first line:
                        <<<
                        ? vc=vector(3^3); st=10^180; ran=10^6; p2=precprime(st); p1=precprime(p2-1);
                        ? while((p3=nextprime(p2+1))<st+ran, nc++; vc[(p1%3)*9+(p2%3)*3+(p3%3)]++; p1=p2; p2=p3);
                        vc
                        [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 324, 309, 0, 315, 292, 0, 0, 0, 0, 309,
                        299, 0, 293, 240, 0]
                        ? (324+240)/(324+309+315+292+309+299+293+240)
                        564/2381
                        ? 564/2381.
                        0.23687526
                        >>>

                        Perhaps others could select random sizes and compare results?

                        Phil
