Recently, while working on something else, I invented as a side effect a new(?) algorithm to find factors of an integer N.

Specifically, I believe I could prove that it will always find a nontrivial factor of N in time bounded by O(N^0.251) steps, each step an arithmetic operation on O(log N)-digit numbers. Really, the runtime is O(N^(1/4) * (log N)^c) for some constant c, but I have not carefully worked out the log factors.
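For scale (the algorithm itself is not shown in this post): the naive deterministic, rigorous baseline is trial division, which takes O(N^(1/2)) steps in the worst case. A minimal sketch, with a function name of my own choosing:

```python
def trial_division(n):
    """Smallest nontrivial factor of n by trial division.
    Deterministic and rigorous, but O(n^(1/2)) steps worst case --
    the baseline an O(n^(1/4)) deterministic bound would beat."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n is prime: no nontrivial factor exists
```

For a semiprime N = p*q with p ≈ q ≈ sqrt(N), this loops about N^(1/2) times, so an N^(1/4)-type deterministic bound is a genuine square-root speedup over it.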

Question: Is this new & interesting?

This factoring algorithm is clearly a lot slower asymptotically than the best methods (quadratic sieve, ECM, number field sieve) and comparable in time bound to Pollard's rho method (though I suspect my algorithm will usually perform worse than Pollard rho in practice), so my runtime is unexciting. BUT my algorithm is deterministic and rigorous. I think those better algorithms are all randomized and/or have nonrigorous (heuristic) time bounds.
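For comparison, the randomized baseline mentioned above, Pollard's rho, heuristically finds a factor of N in expected ~O(N^(1/4)) steps but with no rigorous worst-case guarantee. A minimal sketch (Floyd cycle detection; names and the retry strategy are my own illustration):

```python
from math import gcd

def pollard_rho(n, c=1):
    """Pollard's rho: heuristically ~O(n^(1/4)) expected steps to find
    a nontrivial factor of composite n. Neither deterministic nor
    rigorously bounded -- the contrast with a deterministic method."""
    if n % 2 == 0:
        return 2
    f = lambda v: (v * v + c) % n  # pseudorandom iteration map
    x = y = 2
    d = 1
    while d == 1:
        x = f(x)        # tortoise: one step
        y = f(f(y))     # hare: two steps
        d = gcd(abs(x - y), n)
    if d == n:
        return pollard_rho(n, c + 1)  # cycle failed; retry with new constant
    return d
```

The expected cost comes from the birthday paradox: values mod the smallest prime factor p collide after about sqrt(p) ≤ N^(1/4) iterations, but this is a heuristic average-case argument, not a proof.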

So in that (not very interesting) sense, maybe I have a new record. Or maybe I do not even have that. You tell me, please.