9

This is essentially a Bertrand's postulate version for twin primes. I am interested in both an explicit example and large lower bounds for it because of this answer of mine. In the comments below the answer, it is shown that there is no such $n$ below $8\times 10^{15}$.

An efficient algorithm would be as follows: take an initial point $m$ for which Bertrand's postulate for twin primes is true (say, $13$). Find the greatest twin prime $p\lt 2m$. The new initial point is $p$. Iterate.
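A sketch of this iteration in Python (my own code, not anything from the thread); the covering-interval bookkeeping is the observation that a twin pair $(p, p+2)$ certifies the postulate for every $n$ in $[(p+3)/2,\, p-1]$, and `is_prime` is a deterministic Miller–Rabin correct below $3.3\times 10^{24}$:

```python
# Sketch of the iteration in the question (hypothetical code, not from
# the thread).  A twin pair (p, p+2) certifies the postulate for every
# n in [(p+3)/2, p-1], so each pair found extends the verified range.

def is_prime(n: int) -> bool:
    """Deterministic Miller-Rabin, valid for n < 3.3e24."""
    if n < 2:
        return False
    bases = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in bases:
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def verify_up_to(limit: int) -> int:
    """Verify twin-prime Bertrand for all n in [10, m], returning the
    first m >= limit reached (n = 6..9 are easy to check by hand)."""
    m = 16                       # base case: (17, 19) covers n in [10, 16]
    while m < limit:
        p = 2 * m - 1            # largest p with (p+3)/2 <= m+1
        while p > m + 1 and not (is_prime(p) and is_prime(p + 2)):
            p -= 2
        if p <= m + 1:           # no twin pair found: counterexample near m
            return m
        m = p - 1                # (p, p+2) covers n up to p-1
    return m
```

Each step roughly doubles the verified bound, which is why the searches reported in the answers below reach hundreds of digits so quickly.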

An explicit example of such an $n$ would exhibit a very large gap between twin primes, of size $\approx n$. Although it seems quite unlikely that such an $n$ exists, a proof remains out of reach, so I am interested in a computational effort.

Do we know $n\gt 5$ with no twin prime $n\lt q \lt 2n$? If not, what's the best known lower bound?

Ian Mateus
  • 7,561

2 Answers

5

We can't yet prove that there are infinitely many twin primes, so we certainly can't prove that there's a twin prime in (n, 2n). But it's surely true. Indeed, between n and 2n we expect there to be about $Cn/\log^2 n$ twin primes for some positive constant $C$.
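For concreteness, here is a quick empirical check of that heuristic (my own sketch; by the Hardy–Littlewood conjecture the constant would be $C \approx 2C_2 \approx 1.32$, where $C_2 \approx 0.6602$ is the twin prime constant):

```python
# Count twin pairs (q, q+2) with n < q < q+2 < 2n via a sieve, and
# compare with the heuristic 2*C2*n/log(n)^2.  My own sketch, not
# code from the thread.
from math import log

def twin_pairs_between(n: int) -> int:
    limit = 2 * n
    prime = bytearray([1]) * (limit + 1)
    prime[0] = prime[1] = 0
    for i in range(2, int(limit ** 0.5) + 1):
        if prime[i]:
            prime[i * i :: i] = bytearray(len(range(i * i, limit + 1, i)))
    return sum(1 for q in range(n + 1, limit - 2) if prime[q] and prime[q + 2])

n = 10**5
count = twin_pairs_between(n)
estimate = 2 * 0.6601618 * n / log(n) ** 2
print(count, round(estimate))     # same order of magnitude
```

The agreement is already within a modest constant factor at $n = 10^5$, as the heuristic predicts.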

If you look at the worst case known, consider A113275: Kourbatov found that the twin prime 1121784847637957 is followed by a longer gap before the next twin prime than any smaller twin prime, yet that gap is just 23382. That's a lot smaller than the roughly 1121784847637957 you'd need to make twin-prime Bertrand fail!

Using the same approach as Michael Stocker I was able to check that there are no exceptions up to $10^{120}.$

Edit: I extended the range to $10^{262}$ and proved all the primalities. Time required was a few hours.

Charles
  • 32,999
  • 1
$+1$. The $10^{120}$ was computed assuming Michael's result or unconditionally? Here is the relevant arXiv paper. – Ian Mateus Feb 04 '14 at 15:54
  • @IanMateus: Unconditionally, starting from 13. – Charles Feb 04 '14 at 15:59
This shows that we can rearrange $1, 2, 3,\ldots, n$ such that the sum of any two adjacent numbers is prime, for every $n$ up to $10^{165}$. Nice! – Ian Mateus Feb 04 '14 at 16:31
  • 1
    @IanMateus: Indeed -- see https://oeis.org/A103839 – Charles Feb 04 '14 at 16:47
  • @Charles You have done some large calculations, $10^{262}$. I wanted to connect with you for help regarding https://math.stackexchange.com/questions/4423531/do-prime-of-the-form-4k1-ever-lead-the-greatest-prime-factor-race which will also require large calculations. In case you are interested, let me know how to connect (email or otherwise) – Nilotpal Sinha Nov 05 '22 at 09:29
1

You can get very far with very little effort. I think an example might make your algorithm clearer. If we have a twin prime $p, p+2$, then for every $n$ in the interval $[((p+2)+1)/2,\, p-1]$ there is (at least) one twin prime in $[n, 2n]$, namely $p, p+2$.

We know that 17 and 19 form a twin prime pair, so every $n$ in $[10,16]$ satisfies "Bertrand's postulate for twin primes". The next twin prime we want to find should generate an interval that starts no later than 17, so we want $((p+2)+1)/2 \lt 18$. Now we look for the biggest twin prime satisfying that inequality. We find the twin prime $29, 31$ and the interval $[16,28]$ --- $41,43$; $[22,40]$ --- $71,73$; $[37,70]$ --- $137,139$; $[70,136]$ --- ......

I let my PC compute for only a few minutes without much optimization and got to the 96-digit twin prime 65401729995203484466533471060061363152550275078004047522007054662124597261449032607389217013721 (+2)

  • Did you use any probabilistic prime test in the code? They are probably not safe for such large numbers. – Ian Mateus Feb 04 '14 at 13:44
  • @IanMateus: Considering that a primality proof of a number that size takes 200 milliseconds, Michael probably didn't have to use a PrP test. – Charles Feb 04 '14 at 15:17
  • 1
    @Charles if true, that's impressive! Only $5$ orders of magnitude from a googol. – Ian Mateus Feb 04 '14 at 15:22
  • @IanMateus: That's how long it took my aging PC to prove the primality of the number 654...721 in the answer. – Charles Feb 04 '14 at 15:29
  • I used Java's BigInteger isProbablePrime function. It does not give 100% certainty, but you can make the probability very high. Numbers of that size can be tested for certain primality very easily too. – Michael Stocker Feb 04 '14 at 15:52
  • @IanMateus While it's true that 'probably prime' is not 'certainly prime', your assertion of 'not safe for such large numbers' may be based on a misconception; for these algorithms, the size of the number has nothing to do with the primality probability generated by the test. – Steven Stadnicki Feb 04 '14 at 16:56
  • @StevenStadnicki this might be the case, I'm not quite familiar with such tests. I mean, up to a certain bound, we know such a probabilistic test always gives the right answer (or we already have a compiled table of the pseudoprimes up to a certain limit). For example, in this code there is a huge "beware!" telling us the probability of a pseudoprime is $(1/4)^{30}$. – Ian Mateus Feb 04 '14 at 16:58
  • @IanMateus That's true (sort of; that's actually the upper bound on the probability, but close enough) - but you're not going to be testing $2^{60}$ numbers, or anywhere close to it, and birthday paradoxes don't apply here. That's actually an interesting question in its own right - I'm not aware of any number that someone found probabilistically prime by the standard algorithm that was not in fact actually prime. – Steven Stadnicki Feb 04 '14 at 17:17
  • The funny thing is, the reverse is true: the tests are less reliable on small numbers than on large. If you pick large random numbers with a reasonable probable-primality test the failure rate drops with the size of numbers you examine. See, for example, Damgård-Landrock-Pomerance 1993. – Charles Feb 04 '14 at 19:49
  • Yes, there are some numbers which could pass 30 rounds of Miller-Rabin with nearly $1/4^{30}$ probability (one in a million trillion); one such example is 4279607175691. But these numbers can be small or large and there's no significant difference between the probabilities for small or large. The difference is that the higher up you look, the rarer these bad cases become. For ten-digit numbers they're reasonably common; for 10,000-digit numbers they're vanishingly rare. – Charles Feb 04 '14 at 19:52
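To make these comments concrete, here is a minimal single-round Miller–Rabin (my own sketch, not the thread's code) run on $2047 = 23 \cdot 89$, the smallest strong pseudoprime to base 2 -- it fools base 2, and a second independent base exposes it, which is why the per-round error bound stacks multiplicatively regardless of the number's size:

```python
# One round of Miller-Rabin to witness a (tiny) known failure case.
def mr_round(n: int, a: int) -> bool:
    """Return True if n passes one Miller-Rabin round to base a."""
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    x = pow(a, d, n)
    if x in (1, n - 1):
        return True
    for _ in range(s - 1):
        x = x * x % n
        if x == n - 1:
            return True
    return False

print(mr_round(2047, 2))   # True  -- 2047 = 23 * 89 fools base 2
print(mr_round(2047, 3))   # False -- base 3 exposes it
```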