I just read this "news" report about Stanford mathematicians who discovered a pattern in prime numbers: https://www.nature.com/news/peculiar-pattern-found-in-random-prime-numbers-1.19550 According to this report (which is admittedly not the original research paper), it sounds like the claim is merely that consecutive primes share a units digit less often than chance would suggest.
My first thought was "why would one expect otherwise?". If one merely makes a list of "potential primes" (namely, any number with at least two digits that ends in 1, 3, 7, or 9) and then assigns each of them an independent probability of being prime, then one would expect the units digits of consecutive primes to be equal slightly less often than chance.
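Just to make that toy model concrete, here is a quick Monte Carlo sketch of it (my own illustration, not anything from the paper; the success probability of 0.3 and the cutoff of two million are arbitrary choices):

```python
import random

# Toy model sketch (my own illustration, not the paper's method):
# every number >= 10 ending in 1, 3, 7 or 9 is a "potential prime",
# and each is independently declared "prime" with probability P_PRIME.
random.seed(0)
P_PRIME = 0.3                      # arbitrary choice for illustration
CANDIDATE_DIGITS = (1, 3, 7, 9)

fake_primes = [n for n in range(10, 2_000_000)
               if n % 10 in CANDIDATE_DIGITS and random.random() < P_PRIME]

# Fraction of consecutive "primes" whose units digits agree.
same = sum(a % 10 == b % 10 for a, b in zip(fake_primes, fake_primes[1:]))
print(same / (len(fake_primes) - 1))
```

With those (arbitrary) numbers the repeat frequency should come out around 0.135, well below the naive 1/4 one would get if the units digits of consecutive "primes" were uncorrelated.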
To see what I mean, let's re-cast this as the following problem: Suppose I roll an ordinary 6-sided die many, many times, say 1000 times. I number the rolls 1-1000 in a list, and I put a check mark next to every roll on which a "1" comes up. The roll numbers next to two consecutive check marks share a units digit exactly when the gap between those check marks is a multiple of 10, and the probability of that is less than 1/10. Why, you ask? Because the gap between consecutive check marks (the number of rolls it takes to get the next "1") follows a geometric distribution, whose probabilities are monotone decreasing. Therefore, the probabilities that, after rolling a 1, it will take 1, 2, 3, ..., 9 additional rolls to roll the next "1" are each greater than the probability that it will take 10 rolls. Similarly, the probabilities that it will take any of 11-19 rolls are each greater than the probability that it will take 20 rolls, and so on. So the probability of the gap being exactly 10, 20, 30, etc. rolls is less than 1/10. In fact, it's less than 0.04.
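That 0.04 figure is easy to confirm numerically; here's a short sketch (the closed form just sums the geometric series over gaps of 10, 20, 30, ...):

```python
# Die model: each roll is a "success" (a 1) with probability 1/6, and two
# check marks land on roll numbers with the same units digit exactly when
# the gap between them is a multiple of 10.
p = 1 / 6
q = 1 - p

# Closed form: sum over k >= 1 of q**(10*k - 1) * p
exact = p * q**9 / (1 - q**10)

# Sanity check: sum the series directly out to very long gaps.
series = sum(q**(g - 1) * p for g in range(10, 5000, 10))

print(exact, series)   # both ~0.0385, noticeably below the naive 1/10
```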
Of course, the probability that a number N is prime decreases as N increases. However, the above argument doesn't depend on the particular value of that probability, only on the success probabilities of nearby trials not changing much. Since this probability is proportional to 1/ln(N), which varies very slowly indeed, that condition should hold.
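To put a number on "varies very slowly", here's a rough check (my own, taking the typical gap near N to be about ln(N)) of how much 1/ln(N) changes across one such gap:

```python
from math import log

# How much does 1/ln(N) change across one typical prime gap (~ ln(N))?
for N in (10**4, 10**6, 10**9, 10**12):
    gap = log(N)                        # typical spacing between primes near N
    rel_change = abs(1/log(N + gap) - 1/log(N)) / (1/log(N))
    print(f"N = 1e{round(log(N, 10))}: relative change ~ {rel_change:.1e}")
```

The relative change works out to be on the order of 1/N, so to an excellent approximation consecutive trials really do have the same success probability.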
However, what I think is going on here is that one only expects a significant bias when the average gap between primes is less than or about 10. In fact, for lists of small primes (e.g. primes less than 400), I've checked some cases, and the above argument (using the observed fraction of potential primes in that range that ARE prime, as if that probability were uniform) predicts the chance of sharing a units digit remarkably well. Above about 22,000 (roughly e^10), the average gap between primes exceeds 10, but they looked at the first billion primes. Such a small fraction of those are less than 22,000 that the bias should be negligible over that sample. Yet they still see biases of more than a few percent.
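For reference, the kind of check I mean looks roughly like this (my own reconstruction: treat every two-digit-or-larger number below the cutoff ending in 1, 3, 7, or 9 as a candidate, use the observed fraction of candidates that are prime as a uniform success probability, and note that consecutive primes share a units digit exactly when the candidate gap is a multiple of 4):

```python
def primes_below(limit):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * limit
    sieve[0:2] = [False, False]
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i*i::i] = [False] * len(sieve[i*i::i])
    return [n for n, is_p in enumerate(sieve) if is_p]

LIMIT = 400
primes = [p for p in primes_below(LIMIT) if p >= 10]   # two-digit-and-up primes

# Observed: how often do consecutive primes share a units digit?
same = sum(a % 10 == b % 10 for a, b in zip(primes, primes[1:]))
observed = same / (len(primes) - 1)

# Geometric-model prediction: every candidate (ends in 1, 3, 7, 9) is "prime"
# with one uniform probability p, and consecutive primes share a units digit
# iff the candidate gap is a multiple of 4.
candidates = [n for n in range(10, LIMIT) if n % 10 in (1, 3, 7, 9)]
p = len(primes) / len(candidates)
predicted = p * (1 - p)**3 / (1 - (1 - p)**4)

print(f"observed:  {observed:.3f}")
print(f"predicted: {predicted:.3f}")
```

With LIMIT = 400 the two values land within about a percentage point of each other, which is the kind of agreement I mean by "remarkably well".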
So, I suspect that the authors of the original paper are actually claiming that the bias decays substantially more slowly than the geometric-series argument predicts, for some rigorous sense of "substantially more slowly", and that this subtlety was lost in the general-audience reporting. I don't know enough number theory to guess what that sense might be, but does anyone here understand what they are actually claiming?