## Estimating log (B^pi(B))

Message 1 of 19 , Jan 30, 2003
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

Hello there,

In the course of my investigation on the complexity of p-1 factoring, I found
out I had made a mistake when estimating the least common multiple of all
numbers up to B. It turns out that an upper bound for this is B^pi(B). Since
the cost of a modular exponentiation (in modular multiplies) is asymptotic to
the logarithm of the exponent, the cost for modular exponentiation here is
log(B^pi(B)) = pi(B) * log B ~ (B/log B) * log B = B, by the prime number
theorem. And this is actually an
excellent estimate:

```
? a=0; B=400000000; logB=log(B); forprime(p=2, B, logp=log(p); a=a+floor(logB/logp)*logp); print(a);
400002778.0577266417502590976
```

This is as far as PARI/GP will go to the best of my knowledge (with the
forprime command at least). Still, it's a very small error.
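The same check can be rerun at a smaller bound in a few lines of Python (a sketch; `log_lcm_up_to` is an illustrative helper name, and the sum it computes is Chebyshev's psi(B), asymptotic to B by the prime number theorem):

```python
import math

def log_lcm_up_to(B):
    # log lcm(1..B) = sum over primes p <= B of floor(log B / log p) * log p,
    # since each prime p appears in the lcm with exponent floor(log_p B)
    is_prime = bytearray([1]) * (B + 1)
    is_prime[0:2] = b"\x00\x00"
    for i in range(2, int(B**0.5) + 1):
        if is_prime[i]:
            is_prime[i * i :: i] = bytearray(len(range(i * i, B + 1, i)))
    logB = math.log(B)
    return sum(math.floor(logB / math.log(p)) * math.log(p)
               for p in range(2, B + 1) if is_prime[p])

B = 10**6
print(log_lcm_up_to(B) / B)  # close to 1, as the estimate predicts
```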

In other good news, B*(log (p-1)/log B)^(log (p-1)/log B), which is the cost
of a modular exponentiation to the exponent lcm (x <= B) divided by the
probability that the number p-1 is B-smooth, has a local minimum. I couldn't
locate it algebraically, but numerically the minimizing B is ~1200 for
p-1 = 10^10, or ~8100 for p-1 = 10^15.
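A crude grid search reproduces these minima numerically (a sketch assuming the cost model above, with u = log(p-1)/log B and the smoothness probability approximated by u^-u):

```python
import math

def cost(B, P):
    # stage-1 work (~B modular multiplies) divided by the rough smoothness
    # probability u^-u, where u = log(P)/log(B) and P stands for p-1
    u = math.log(P) / math.log(B)
    return B * u**u

# crude grid searches for the minimizing B
best10 = min(range(100, 20001, 10), key=lambda B: cost(B, 10**10))
best15 = min(range(1000, 40001, 50), key=lambda B: cost(B, 10**15))
print(best10, best15)  # near 1200 and 8100 respectively
```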

Now I need to calculate the cost of doing stage 2. Any good estimates for
the sum of 1/p, taken over the primes p in a finite interval?

Thanks,

Décio
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.1 (GNU/Linux)

iD8DBQE+OYt5ce3VljctsGsRAvyOAJ4687I3DkVsIlY/wB3Lb1rn/EMxFwCeKIlX
Bm2N7F44H6tBmy05eOL42uk=
=lyuu
-----END PGP SIGNATURE-----
Message 2 of 19 , Jan 30, 2003

I'm thinking:

By the mean-value theorem, with f(x)=ln(x), there exists a point c in the closed interval [a,b] such that 1/c = (ln(b)-ln(a))/(b-a); if the interval is [1,n], then (ln(n)-ln(1))/(n-1) = 1/c -->

--> 1/n <= ln(n)/(n-1) <= 1 --> (n-1)/n <= ln(n) <= n-1
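A quick numeric spot-check of that bound (a throwaway Python sketch):

```python
import math

# spot-check (n-1)/n <= ln(n) <= n-1, the mean-value-theorem bound above
for n in [2, 10, 1000, 10**6]:
    assert (n - 1) / n <= math.log(n) <= n - 1
print("bound holds on the samples")
```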

Also

The product of the primes up to the nth prime is greater than 2^n; (n>1)

(p_n)# > 2^(n) --> PRODUCT_n(1/p_k) < 2^(-n)

then, applying logarithms to both sides, we have that the sum of the logarithms of the inverses of the primes up to the nth prime is less than -n*ln(2);

SUM(ln(1/p_k)) < -n·ln 2

Now we have the first inequality,

if ln(m) = ln(1/p) then (m-1)/m = p/(p-1) = (1-1/p) <= ln(1/p) <= 1/(p-1)

so, SUM(1-1/p) <= SUM(ln(1/p)) < -n*ln2 -->

--> SUM(1) - SUM(1/p) < -n*ln2 --> -SUM(1/p) < -(n*ln2 + SUM(1)) -->

--> SUM(1/p) > n*ln2 + SUM(1) where n is counting the primes, not the natural numbers, in fact there are n sums, and so

SUM(1/p(n)) > n*(ln2 + 1) = 1.693n ( p(n) is the n-th prime)

I've surely committed a mistake somewhere, but I can't find it... try to catch it and maybe then this reasoning will be of some utility (although when n increases, 2^n is a worse lower bound, but you can use as base a greater prime you know has already been used in the product).

Jose Brox

PS: I apologize in case I have said something stupid. Justifications: novelty, Discrete Signal Processing exam tomorrow :-S
----- Original Message -----
From: Décio Luiz Gazzoni Filho
Sent: Thursday, January 30, 2003 9:30 PM

Unsubscribe by an email to: primenumbers-unsubscribe@yahoogroups.com
The Prime Pages : http://www.primepages.org/

Message 3 of 19 , Jan 30, 2003
> Any good estimates to the value of the summation of 1/p,
> for p prime, over a finite interval?
sum(p=p1,p2,1/p) for p2>>p1>>1 is given by PNT as
approximately log(log(p2)/log(p1))
since integral dx/(x*log(x))=log(log(x)).
If the lower limit is small and the upper large,
do a finite sum from p1 to p_big and then add the integral
from p_big to p2.
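A Python sketch comparing the finite sum against this estimate (the sieve helper is illustrative):

```python
import math

def primes_up_to(n):
    # simple sieve of Eratosthenes
    is_p = bytearray([1]) * (n + 1)
    is_p[0:2] = b"\x00\x00"
    for i in range(2, int(n**0.5) + 1):
        if is_p[i]:
            is_p[i * i :: i] = bytearray(len(range(i * i, n + 1, i)))
    return [p for p in range(2, n + 1) if is_p[p]]

p1, p2 = 1000, 10**6
exact = sum(1.0 / p for p in primes_up_to(p2) if p > p1)
approx = math.log(math.log(p2) / math.log(p1))
print(exact, approx)  # the two agree to within about 1% here
```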
David
Message 4 of 19 , Jan 30, 2003

On Thursday 30 January 2003 21:29, David Broadhurst wrote:
> > Any good estimates to the value of the summation of 1/p,
> > for p prime, over a finite interval?
>
> sum(p=p1,p2,1/p) for p2>>p1>>1 is given by PNT as
> approximately log(log(p2)/log(p1))
> since integral dx/(x*log(x))=log(log(x)).
> If the lower limit is small and the upper large,
> do a finite sum from p1 to p_big and then add the integral
> from p_big to p2.
> David
>

I plugged some values and I realized there was something wrong. Not with your
estimate, of course, but rather with the approach I tried.

I guess that in order to estimate the probability that p-1 is divisible by
exactly one prime in the interval (B, B'] (say these primes are p_k,
p_(k+1), ..., p_j), I need to calculate

prob(p_k | p-1)*prob(p_(k+1) !| p-1)*...*prob(p_j !| p-1)
+ prob(p_(k+1) | p-1)*prob(p_k !| p-1)*prob(p_(k+2) !| p-1)*...*prob(p_j !| p-1)
+ ...
+ prob(p_j | p-1)*prob(p_k !| p-1)*...*prob(p_(j-1) !| p-1),

where q | p-1 means q divides p-1 and q !| p-1 means it does not. This
doesn't seem particularly easy to evaluate.

I'd rather approximate it as a continuous variable and find a suitable
probability density function to estimate the probability that p-1 is
divisible by a prime in that interval. Anyone care to help? I'm sure this is
very intimately connected to the PNT.

Thanks

Décio
Message 5 of 19 , Jan 31, 2003
--- Décio Luiz Gazzoni Filho <decio@...> wrote:
> I guess in order to estimate the probability that p-1 is divisible by one
> single prime in the interval (B, B'] (say these primes are p_k, p_(k+1), ...,
> p_j), then I need to calculate prob(p_k | p-1)*prob(p_(k+1) !|
> p-1)*...*prob(p_j !| p-1) + prob(p_(k+1) | p-1)*prob(p_k !| p-1)*prob(p_(k+2)
> !| p-1)*...*prob(p_j !| p-1) + ... + prob(p_j | p-1)*prob(p_k !|
> p-1)*...*prob(p_(j-1) !| p-1). This doesn't seem particularly easy to
> evaluate.
>
> I'd rather approximate it as a continuous variable and find a suitable
> probability density function to estimate the probability that p-1 is
> divisible by a prime in that interval. Anyone care to help? I'm sure this is
> very intimately connected to the PNT.

Dickman's theorem: (Riesel, Thm. 5.5)
The probability of a number N, chosen at random, having a prime factor
between N^a and N^(a+a.d) is approximately = d, independent of the magnitude
of a, if d is small.

So set N^a = B, N^(a.d)=B'/B.

=> d=log(B'/B)/log(B)

(logs in any base, obviously, as you're taking a ratio of 2 of them)
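As a numeric illustration of the formula (a sketch; the choice of a 200-digit N with a 40-to-44-digit prime factor is just an example):

```python
import math

def dickman_prob(B, Bprime):
    # P(random N has a prime factor in (B, B']) ~ d = log(B'/B)/log(B),
    # per the rearrangement above; only trustworthy when d is small
    return math.log(Bprime / B) / math.log(B)

# e.g. a prime factor of 40 to 44 digits in a 200-digit number:
# B = 10^40 = N^0.2, B' = 10^44, giving d = 4/40 = 0.1
print(dickman_prob(10.0**40, 10.0**44))
```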

Phil

=====
Is that free as in Willy or as in bird?

__________________________________________________
Do you Yahoo!?
http://mailplus.yahoo.com
Message 6 of 19 , Jan 31, 2003
> The probability of a number N, chosen at random,
> having a prime factor between N^a and N^(a+a.d)
> is approximately = d, independent of the magnitude
> of a, if d is small.

Well I guess you should also demand that a < 0.5
else there is _no_ prime factor :-)

How small does a have to be for us to use this?

Suppose that a=0.2 is small enough. Now set d=0.1.
Then a 200 digit number has a 10% chance
of having a prime factor between 40 and 44 digits,
if we can use Dickman.

This is the sort of range that has been probed
by ECM effort on Cunningham cofactors.

Is this 10% estimate consistent with Cunningham
experience accumulated by Paul Leyland and company
when using ECM prior to contemplating SNFS?

David
Message 7 of 19 , Jan 31, 2003
> > The probability of a number N, chosen at random,
> > having a prime factor between N^a and N^(a+a.d)
> > is approximately = d, independent of the magnitude
> > of a, if d is small.
>
> Well I guess you should also demand that a < 0.5
> else there is _no_ prime factor :-)

Eep, no! /Your/ number isn't chosen at random, you know it's even.
Once you've divided it by 2, however it effectively becomes random again
(within the limits of what's relevant here, he says waving hands).

> How small does a have to be for us to use this?
>
> Suppose that a=0.2 is small enough. Now set d=0.1.
> Then a 200 digit number has a 10% chance
> of having a prime factor between 40 and 44 digits,
> if we can use Dickman.
>
> This is the sort of range that has been probed
> by ECM effort on Cunningham cofactors.

Eep, no! None of the Cunningham numbers are chosen at random!
Cofactors in particular will be much less random, for the reasons you
raise above.

I'd give something like the Home Prime sequences and the Champernowne-based songs
and dances a better chance of providing random data. (Remember that these won't
be completely random either; at least 2 and 5 will be anti-gimmes, and
possibly 3 as well will have some skew. I wouldn't bet against 11 being
weird either.)

> Is this 10% estimate consistent with Cunningham
> experience accumlated by Paul Leyland and company
> when using ECM prior to contemplating SNFS?

It's hard to think of a corpus that would be entirely devoid of bias to be
honest...

However, I'm not claiming that there isn't an exact analogue for numbers
with restrictive divisibility properties. However, Dickman says _nothing_
about such numbers, and is therefore undemonstrable/unrefutable using them.

... brilliant numbers. If someone has kept a full record of all the numbers
that were in the range they tested for brilliantness, then that would be a
fair corpus, surely? I'm assuming they weren't sieved terribly deeply, and
therefore only the medium sized factors would need to have been recorded, as
the small ones can be reproduced easily (hey, I'll offer to find all the
small factors to test my new factoring algorithm!). Sure they're smaller
than 200 digits by a factor of 2, but they're about as unbiased as you can
get, and there's a fair number of them.

Anyone with such a collection?

Phil

Message 8 of 19 , Jan 31, 2003
Phil:

> Once you've divided it by 2, however it effectively
> becomes random again

The Carmody sieving doctrine seems to assert that when
you have divided anything (where possible) by primes
less than a trillion then you ain't got anything special
left, except a number with no prime factor less than a trillion:-)

However I was very well aware that the Cunningham cofactors
are special numbers, which is why I asked the *open* question:
does Dickman square with the data?

Puzzle: How did I exploit Dickman to gain a factor of
2 speedup for finding a recently posted prime? (And yes, I
do [unfortunately] have enough data to support the
factor of 2 speedup.)

David
Message 9 of 19 , Jan 31, 2003
> Phil:
>
> > Once you've divided it by 2, however it effectively
> > becomes random again
>
> The Carmody sieving doctrine seems to assert that when
> you have divided anything (where possible) by primes
> less then a trillion then you ain't got anything special
> left, except a number with no prime factor less than a trillion:-)

I would quite broad-brush it like that. I'd say that I know a factor of 60
from a factor of one-plus-itsy-bitsy though, and I don't tend to worry about
the itsy-bitsies.

> However I was very well aware that the Cunningham cofactors
> are special numbers, which is why I asked the *open* question:
> does Dickman square with the data?

I reckon there's a perpendicular merger of Dickman and Dirichlet waiting to
be performed. i.e. I don't reckon there's much room for misbehaviour.
(Not full Dirichlet - just 1(n) and maybe -1(n). Aside - was there a name
for this restricted theory, which I've been told drops out far more easily
than full Dirichlet?)

> Puzzle: How did I exploit Dickman to gain a factor of
> 2 speedup for finding a recently posted prime? (And yes, I
> do [unfortunately] have enough data to support the
> factor of 2 speedup.)

I find it hard to imagine using Dickman for that purpose; what was the
particular number you were thinking of? And I genuinely mean that: like the
landing party on Star Trek that discovers some bizarre alien artefact, I
wouldn't know how to use Dickman for that purpose - where should I hold it,
in which direction should I point it, and what button should I press?!?!
I'm just glad I'm not wearing red...

Phil

Message 10 of 19 , Jan 31, 2003
> what button should I press?!?!
the {1,2,12} button or
the {1,2,15} button or
the {1,4,6} button
but not many others :-)
Message 11 of 19 , Jan 31, 2003
> > what button should I press?!?!
> the {1,2,12} button or
> the {1,2,15} button or
> the {1,4,6} button
> but not many others :-)

Woh! I can't see Dickman in that at all. What am I missing?

Phil

Message 12 of 19 , Jan 31, 2003
> Woh! I can't see Dickman in that at all.
> What am I missing?

Let me backtrack a bit, to Aurifeuille.

Some time ago, Bouk and I were trying
to develop a theory of Generalized Lucas
Aurifeuillian parts: the generalization of

http://www.utm.edu/research/primes/lists/top20/LucasA.html

from Lucas(1,-1,n) to Lucas(p,q,n).
[Actually with q=+/-1 in our case.]

Before we articulated the full theory
we needed an empirical diagnostic that an
Aurifeuillian factorization exists, in situations where
the numbers were too big to completely factorize.

Dickman (informally construed, with all the usual
caveats) provided it. If N=A*B exists as an algebraic
factorization of N, and we have taken out primes
up to (more or less) T1 digits and ask ECM for primes between
(more or less) T1 and T2 digits then Dickman suggests
(don't eek me, I only said suggests) that we will
get them twice as fast, compared with when there
isn't such a factorization. Anyway that turned
out to be a tolerable diagnostic which enabled
us to guess the relationship between (p,q,n) that
gives an Aurifeuillian factorization.

Now this also suggests that the seeds [Markus: sic!]

{1,2,12}
{1,2,15}
{1,4,6}

will repay people like him who use ECM to get
4-Carmichaels [without using CRT, which first
I and then, more ably, you used].

They need ECM because when the 3-seed has
produced primes {p,q,r} then the 4th Carmichael
factor 1+(p*q*r-1)/d is gotten by hacking primes out of
(p*q*r-1)/lcm(p-1,q-1,r-1) and combining them
to make d such that the 4th factor is also prime.

Now what's special about those 3 buttons?

```
? print(factor((1+x)*(1+2*x)*(1+12*x)-1))
[x, 1; 4*x + 3, 1; 6*x + 5, 1]

? print(factor((1+x)*(1+2*x)*(1+15*x)-1))
[x, 1; 3*x + 2, 1; 10*x + 9, 1]

? print(factor((1+x)*(1+4*x)*(1+6*x)-1))
[x, 1; 2*x + 1, 1; 12*x + 11, 1]
```
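The three factorizations can be double-checked without PARI by testing each identity at many integer points (a Python sketch; two cubics that agree at more than three points are identical):

```python
# verify the three factor(...) identities shown above
def check(seed, factors):
    a, b, c = seed
    for x in range(1, 200):
        lhs = (1 + a * x) * (1 + b * x) * (1 + c * x) - 1
        rhs = 1
        for m, k in factors:
            rhs *= m * x + k
        assert lhs == rhs, (seed, x)

check((1, 2, 12), [(1, 0), (4, 3), (6, 5)])    # x*(4x+3)*(6x+5)
check((1, 2, 15), [(1, 0), (3, 2), (10, 9)])   # x*(3x+2)*(10x+9)
check((1, 4, 6),  [(1, 0), (2, 1), (12, 11)])  # x*(2x+1)*(12x+11)
print("all three identities check out")
```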

But Markus is stuck with his NewPgen output and so
can't use CRT. So I thought it would be useful
to give {1,2,12} a proof in principle by beating the
4-Carmichael 4th factor record.

In fact I did it twice over:

2763260532*((1591638066432*3061#+19)^2-1)*3061#/
286610757607008951353515277095+1 3909 p44 03
4-Carmichael factor (4)

757849549*((72753556704*3061#+19)^2-1)*3061#/
56731664597567983567442428202862519351615138+1 3891 p44 03
4-Carmichael factor (4)

finding ECM factors at about twice
the rate for this seed as compared with
the other seeds in my plantpot.

OK, it's a trivial [and suspect] use of Dickman.

But it was funny that you should mention him
just after I had commented on another such seed
{1,2,15}. But Markus can't use that
Dickman-enhanced seed, since he
can't get x=1 mod 5 in

```
? print(factor((1+x)*(1+2*x)*(1+15*x)-1))
[x, 1; 3*x + 2, 1; 10*x + 9, 1]
```

because he used primorial mode in NewPgen.

David
Message 13 of 19 , Jan 31, 2003
Oh, I just thought of a nice grace note.
One can use algebraically factorizing
seeds to find a 4-Carm with
D+D+D+2*D digits,
better than your
D+D+D+D
but short of
D+D+D+3*D.
This piggy-in-the-middle method would enable one to use
both CRT and BLS, whereas you went for the small-pig mode
of action to allow BLS.
Bet that idea isn't in your 40 pages of Carmichael notes:-)
David
Message 14 of 19 , Jan 31, 2003
> Oh, I just thought of a nice grace note.
> One can use algebraically factorizing
> seeds to find a 4-Carm with
> D+D+D+2*D digits,
> better than your
> D+D+D+D
> but short of
> D+D+D+3*D.
> This piggy-in-middle method would enable one to use
> both CRT and BLS, whereas you went for the small pig
> much of action to allow BLS.
> Bet that idea isn't in your 40-pages of Carmichael notes:-)
> David

4-Carms were a long time back. I don't believe I expended more than about 2
pages on them. I spent a while writing the code that optimally placed factors
into the different slots, but that's a different matter!

Phil

Message 15 of 19 , Jan 31, 2003
> > Woh! I can't see Dickman in that at all.
> > What am I missing?
>
> Let me backtrack a bit, to Aurifeuille.
...
> (more or less) T1 and T2 digits then Dickman suggests
> (don't eek me, I only said sugests) that we will
> get them twice as fast, compared with when there
> isn't such a factorization.

Spot on. I like it. That's a completely sideways way of looking at things!
I like it!

> Now what's special about those 3 buttons?
>
> ? print(factor((1+x)*(1+2*x)*(1+12*x)-1))
> [x, 1; 4*x + 3, 1; 6*x + 5, 1]
>
> ? print(factor((1+x)*(1+2*x)*(1+15*x)-1))
> [x, 1; 3*x + 2, 1; 10*x + 9, 1]
>
> ? print(factor((1+x)*(1+4*x)*(1+6*x)-1))
> [x, 1; 2*x + 1, 1; 12*x + 11, 1]
>

Speak for yourself. I used {1,2,12} and {1,4,6} /precisely/ for that reason.
I think one of my results was a {1,4,6} wasn't it?

> But Markus is stuck with his NewPgen output and so
> can't use CRT. So I thought it would be useful
> to give {1,2,12} a proof in principle by beating the
> 4-Carmichael 4th factor record.
>
> In fact I did it twice over:
>
> 2763260532*((1591638066432*3061#+19)^2-1)*3061#/
> 286610757607008951353515277095+1 3909 p44 03
> 4-Carmichael factor (4)
>
> 757849549*((72753556704*3061#+19)^2-1)*3061#/
> 56731664597567983567442428202862519351615138+1 3891 p44 03
> 4-Carmichael factor (4)
>
> finding ECM factors at about twice
> the rate for this seed as compared with
> the other seeds in my plantpot.
>
> OK, it's a trivial [and suspect] use of Dickman.

Indeed it is.
It's a great idea for trying to squeeze out factors quicker, certainly.

I'm surprised it works, because the factors need to be of a particular
residue in order to make the thing carmichael (like your lucky 2s in the
3-carmichael).

There's the Saarinen DLOG trick one can use. However, he's not published
that yet. (It's one of the tricks that helped him generate the multi-
billion-digit Carmichael so easily last month.)

You seem to have claimed a *2 speedup over the naive technique? I think I
claimed either a *4 or *6 speedup. However, as you have noted elsewhere I
become constipated with CRT. (Not every element in my speedup contributed a
linear factorisation, and therefore caused doping-bloat.)

Phil

Message 16 of 19 , Feb 1, 2003
Phil:

> That's a completely sideways way of looking at things!
> I like it!

Thank ye, kind sir.

> I think one of my results was a {1,4,6} wasn't it?

Yep. I said you didn't need to bother, not that you
didn't bother.

> I'm surprised it works, because the factors need to
> be of a particular residue in order to make the thing
> carmichael

\begin{explanation}

lcm{1,2,12}=3*4

We get the 3 from 4*x+3, since
x=0 mod 3, because of the primorial.

We get the 4 from (1+1/d).
All possible d's are odd.
So half of the combinations of ECM primes work.
But we double these using the 5 from 6*x+5,
since x=0 mod 5, because of the primorial.

So {1,2,12} is as rich as {1,2,3}.
Except that it's twice as fast
as {1,2,3} for ECMing.

\end{explanation}
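The two congruence claims in the explanation are easy to spot-check (a sketch, with multiples of 15 standing in for the primorial conditions x = 0 mod 3 and x = 0 mod 5):

```python
# with x a multiple of 15: 3 divides 4x+3 and 5 divides 6x+5,
# as claimed for the {1,2,12} seed above
for x in range(0, 15000, 15):
    assert (4 * x + 3) % 3 == 0
    assert (6 * x + 5) % 5 == 0
print("residue claims hold")
```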
Message 17 of 19 , Feb 1, 2003
Given the {1,2,15} seed, or the {1,2,12} seed, etc., couldn't we check for a
4th element using ECM, and a 5th element (e) such that we can form a 5-carm
of the form N = e*1*2*15*ECMFactor?

Carms of that form

5,7,13,193,439,9637697
5,7,13,193,499,2657,42829
5,19,37,547,1009,3211937
7,19,37,541,601,58741
7,79,157,2341,3541,9181
13,19,37,541,631,2689
13,19,37,541,739,811,1231
13,19,37,541,1009,2311
17,37,73,1093,2081,65521
17,37,73,1093,93809

With my dataset I will have 4 billion possible e values and 350 titanic
seeds of whatever 3-part seed is picked. Is it possible to create a
carm by extending a "seed" at both ends, i.e. creating a custom formula for
each e*1*2*15 which is then ECM'd, to get a prime that
makes e*1*2*15*ECMFactor PRP?

Markus

>But Markus is stuck with his NewPgen output and so
>can't use CRT. So I thought it would be useful
>to give {1,2,12} a proof in principle by beating the
>4-Carmichael 4th factor record.
>
>In fact I did it twice over:
>
>2763260532*((1591638066432*3061#+19)^2-1)*3061#/
>286610757607008951353515277095+1 3909 p44 03
>4-Carmichael factor (4)
>
>757849549*((72753556704*3061#+19)^2-1)*3061#/
>56731664597567983567442428202862519351615138+1 3891 p44 03
>4-Carmichael factor (4)
>
>finding ECM factors at about twice
>the rate for this seed as compared with
>the other seeds in my plantpot.
>
>OK, it's a trivial [and suspect] use of Dickman.
>
>But it was funny that you should mention him
>just after I had commented on another such seed
>{1,2,15}. But Markus can't use that
>Dickman-enhanced seed, since he
>can't get x=1 mod 5 in
>
>? print(factor((1+x)*(1+2*x)*(1+15*x)-1))
>[x, 1; 3*x + 2, 1; 10*x + 9, 1]
>
>because he used primorial mode in NewPgen.
>
>David
Message 18 of 19 , Feb 1, 2003
Phil Carmody wrote:
>
> ... brilliant numbers. If someone has kept a full record of all the numbers
> that were in the range they tested for brilliantness, then that would be a
> fair corpus, surely? I'm assuming they weren't sieved terribly deeply, and
> therefore only the medium sized factors would need to have been recorded, as
> the small ones can be reproduced easily (hey, I'll offer to find all the
> small factors to test my new factoring algorithm!). Sure they're smaller
> than 200 digits by a factor of 2, but they're about as unbiased as you can
> get, and there's a fair number of them.
>
> Anyone with such a collection?
>
> Phil

Phil,

I have been keeping such a list and was going to offer up all my
findings once I have found my next brilliant number. The program I used
to do the "sieving" was Miracl's factor.exe (slightly modified by me)
and it does some ecm to remove up to 25 digit factors. So, would my
list be too sparse to be useful? I didn't keep any of the sieved
numbers, but everything that fell through I have nfsx or ppsiqs
factorizations of.

I was also thinking of giving the timings for the work thus far, and
maybe also giving a list of the primes that were found in this range.
Would either of the last two pieces of information be useful to anyone?
I'm still compiling the information and still waiting for the next
brilliant, but as soon as both are ready I'll e-mail the list.

-David C.
• ... I think that what you have so far cleaved/cleft/cloven is rather impressive and that the more you document it the better. Since the files are presumably
Message 19 of 19 , Feb 1, 2003