Phil Carmody wrote:

>> However, if we assume that n=10^7 is accurate for each of them,
>> there would be about a 50% chance of finding a prime for one of
>> the 7 candidate k values with n < 14675000.
>>
>> That's the good news.

>
> That's quite optimistic. Maybe the one just found was that one!

First, I realize reading this that I'm being way too precise. With
the unknown depth of search, and the inaccuracy of the Proth weight
values, I should probably have just said n < 15,000,000.

In any case, with the remaining 7 k values, if they've been
completely searched up to n < A, we have about a 50% chance of
finding another prime with n < 1.5*A. A very quick justification
which is very close to being mathematically "correct":

The remaining 7 k values should produce an aggregate total of about
1.25 primes per "octave" (A < n < 2*A), and the distribution should
be very Poisson-like.

To get a 50% chance of a hit in a Poisson distribution, we need an
expectation of log(2) ≈ 0.693 primes. At 1.25 primes per octave,
that takes 0.693/1.25 ≈ 0.55 octaves, or A < n < 2^0.55*A ≈ 1.47*A.
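The arithmetic above can be checked with a short sketch (1.25 primes
per octave is the aggregate rate quoted earlier; an "octave" is one
doubling of n):

```python
import math

# Aggregate expected primes per octave (A < n < 2*A) for the
# 7 remaining k values, as estimated from their Proth weights.
primes_per_octave = 1.25

# For a Poisson process, P(at least one prime) = 1 - exp(-mu).
# Setting that to 50% gives mu = log(2).
mu_needed = math.log(2)                  # ~0.693 expected primes

# Octaves needed to accumulate that expectation.
octaves = mu_needed / primes_per_octave  # ~0.55 octaves

# Corresponding multiplier on the search depth A.
multiplier = 2 ** octaves                # ~1.47, i.e. A < n < 1.47*A

print(round(octaves, 2), round(multiplier, 2))  # 0.55 1.47
```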

>
> Remind me to never ignorantly cross you, Jack ;-)

If you read the forum link where Louie originally trashed my math,
he admitted very quickly afterward that he did make a mistake and
that I was probably "in the ballpark"...

>
>> Still, even a single test in the n > 10^12 range is beyond our
>> reasonable capabilities today -- we're just not ready to do
>> modular arithmetic on terabit numbers.

>
> We're most of the way there. Compared with those flipping iron rings at least.
> Doing it 10^12 times I think will be a harder target.

Keeping even a single terabit number in high-speed RAM is far beyond
the capability of the vast majority of computers in existence -- that's
my point. Clearly we have the capability to build such hardware, but
the whole point of SoB and other cooperative computing projects is to
use inexpensive, commonly available PC-like devices, and those are
still many years away from being able to hold even a single terabit
number in RAM.

>
> Ah, probably the single most pervasive snippet of utter wrongness
> I've seen various people throw around on their project fora is
> that getting rid of the dense ones is best. Of course, that's
> the worst possible situation; you want to get rid of the most
> difficult numbers sooner rather than later. I've tried telling
> them that, but most just didn't seem to grok the concept.
>

To put things in perspective, the toughest two k values have an
aggregate expectation of 0.15 primes per octave. In very rough
numbers, that means an expectation of finding 1 prime between these
two k values as we push n from 10^7 to 10^9 (about 6.6 octaves).
And note of course that 1 prime between those two k values won't
resolve the conjecture.
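Under the same Poisson model, the rough numbers for the two toughest
k values work out as follows (0.15 primes per octave is the aggregate
figure above; the ~63% chance of at least one prime is a consequence
of the model, not a claim from the thread):

```python
import math

# Aggregate expected primes per octave for the two toughest k values.
primes_per_octave = 0.15

# Octaves (doublings of n) between n = 10^7 and n = 10^9.
octaves = math.log2(1e9 / 1e7)      # ~6.64 octaves

# Expected number of primes across that range.
mu = primes_per_octave * octaves    # ~1.0 primes

# Poisson chance of seeing at least one prime in that range.
p_at_least_one = 1 - math.exp(-mu)  # ~0.63

print(round(octaves, 1), round(mu, 2), round(p_at_least_one, 2))  # 6.6 1.0 0.63
```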