
This might be elementary (or obviously wrong) for mathematicians. I am an engineer.

Since we write numbers using finite strings of symbols (not necessarily digits - even formulas are finite strings of symbols), does that mean that there are far more real numbers which cannot be written in any way than real numbers which can be?

Is it true that we cannot even think in any useful way about most real numbers, other than knowing that they exist?

Bill Dubuque
  • 282,220
  • 4
    Perhaps you are thinking of Computable Numbers. – lulu Oct 13 '24 at 11:25
  • 2
    @lulu: I don't think that's quite right. There are well-defined real numbers that are not computable $-$ Chaitin's constant, for example. – TonyK Oct 13 '24 at 11:33
  • 1
    that cannot be expressed? - this depends on your definition of an expression. $1/3$ is expressible with $3$ symbols, but its decimal representation needs infinitely many digits. What about $\pi$? It can be defined as the smallest positive root of $\sin(x)=0$. Would you say it cannot be expressed? This is unclear. – Dietrich Burde Oct 13 '24 at 11:34
  • 7
    @DietrichBurde: whatever your definition of expression, most real numbers will not be expressible. I think the question is a good one. – TonyK Oct 13 '24 at 11:36
  • @TonyK I don't see why this should be true. It will certainly need more details on how to understand the question. Call a real number expressible if it is the root of a polynomial with real coefficients. Then $f(x)=x-a$ will do for all $a$. – Dietrich Burde Oct 13 '24 at 11:38
  • 8
    @DietrichBurde: That is disingenuous. The number of expressions is countable, however you define "expression". Unless you can provide a counterexample? – TonyK Oct 13 '24 at 11:40
  • Any time you write a number, you use a finite string of symbols from a finite alphabet. When you say "the positive solution of equation x*x=2" (meaning square root of 2), you have used a finite string of symbols from whatever is on your keyboard. That makes the set of numbers you can express with your keyboard countable. This can be extended to any set of symbols you might use. –  Oct 13 '24 at 11:41
  • @DietrichBurde There is a Wikipedia page on the topic of "definable numbers", which the OP may be interested in. As it points out, "Different choices of a formal language or its interpretation give rise to different notions of definability... Because formal languages can have only countably many formulas, every notion of definable numbers has at most countably many definable real numbers. However, by Cantor's diagonal argument, there are uncountably many real numbers, so almost every real number is undefinable." – Theo Bendit Oct 13 '24 at 11:42
  • @Censoredtoprotecttheguilty This definition is... problematic. Refer to the Berry paradox for an example of how this kind of definition can be undone without proper guardrails. – Theo Bendit Oct 13 '24 at 11:45
  • @TonyK Not sure what point you are making. I think the OP is speaking of the notion that "most" real numbers lack a definition by explicit formula or algorithm. I entirely agree that this lack does not make them useless, irrelevant, nor even undefinable. – lulu Oct 13 '24 at 11:58
  • 6
    If you want to do a deep dive, this JDH answer presents many subtleties about the question: https://mathoverflow.net/questions/44102/is-the-analysis-as-taught-in-universities-in-fact-the-analysis-of-definable-numb#44129 – Lavender Oct 13 '24 at 11:58
  • 1
    @lulu No. I am saying that there are more real numbers than formulas, so many (actually most) real numbers cannot be expressed using any finite string of symbols. And because we cannot express them, we cannot think about them in any meaningful way. –  Oct 13 '24 at 12:29
  • Well, that's the same thing, and it is not correct. There are meaningfully defined uncomputable numbers. – lulu Oct 13 '24 at 13:44
  • Consider Graham's number. – Integreek Oct 14 '24 at 05:03
  • @MathGuy: Graham's number is computable. – TonyK Nov 07 '24 at 23:23
  • @TonyK are you referring to computing the exact value of Graham's number? I don't think it's possible, given the enormous magnitude of even $g_1$. I have seen that its last $400$ digits can be computed, but not the entire number. – Integreek Nov 08 '24 at 07:02
  • @MathGuy: The word "computable" has a technical meaning in this context. It doesn't mean that you can generate it on a real physical computer. – TonyK Nov 08 '24 at 11:09

3 Answers

13

Yes, in a sense this is a consequence of the fact that the real numbers are uncountable whereas there will only ever be countably many strings in any finite alphabet.
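To make the countability claim concrete, here is a minimal Python sketch (the function name and the binary alphabet are just illustrative choices, not anything from the question) that enumerates every finite string over a fixed finite alphabet, pairing each string with a natural number:

```python
from itertools import count, product

ALPHABET = "01"  # any finite alphabet works; binary keeps the example short

def all_strings():
    """Enumerate every finite string over ALPHABET: first by length, then lexicographically."""
    for length in count(0):
        for chars in product(ALPHABET, repeat=length):
            yield "".join(chars)

# The first few terms of the enumeration:
gen = all_strings()
print([next(gen) for _ in range(7)])  # ['', '0', '1', '00', '01', '10', '11']
```

Every string appears at some finite position, so the strings are in bijection with $\mathbb{N}$; swapping in a larger (finite) alphabet changes nothing essential.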

However, this has to be qualified somewhat, because otherwise you run into various paradoxes. The most relevant one that comes to mind is the Berry paradox, which shows that one cannot naively use natural-language descriptions of numbers.

So whenever you talk about definability, you have to discuss it by reference to a particular formal language - for example, first order Peano arithmetic, or ZFC set theory, or otherwise come up with notions of what it means to define a number such as computability (which is a bit more restrictive but is in the same spirit).

In effectively any formal language we can come up with that includes the real numbers and consists of finite strings over a fixed alphabet (except in trivial and unrealisable cases, such as a language with uncountably many symbols), there will be real numbers not expressible in that language.

This largely assumes we are working in a theory in which the real numbers are genuinely uncountable. There are externally countable models of the real numbers which think they are uncountable internally, but I don’t imagine that’s what you mean with the question.

Edit: the link to MathOverflow by Lavender below gives more information about the subtleties of this concept. It really only makes sense to talk about this outside of the formal theory you are considering, and only if you have a given model in mind which genuinely is uncountable. There are countable models of ZFC, for example - mathematical "universes" which obey all of the axioms of the now fairly standard foundational system for mathematics - in which every object, including every real number, can be seen from outside to be definable. The axioms don't know which model you're working in, so unless you step outside the formal system, this whole concept falls apart a bit.

Nethesis
  • 4,244
  • Even in the case of the last paragraph, I suspect the language itself cannot express all the real numbers it thinks exists, even though there are only countably many of them. That goes beyond any knowledge of model theory I think I have, though. – Nethesis Oct 13 '24 at 11:45
  • 6
    your suspicion is false: see https://mathoverflow.net/questions/44102/is-the-analysis-as-taught-in-universities-in-fact-the-analysis-of-definable-numb#44129 – Lavender Oct 13 '24 at 12:00
  • Well then I’m very glad I didn’t state it as a fact, thank you for the link! – Nethesis Oct 13 '24 at 12:15
  • 6
    I’ve also answered something similar before: https://math.stackexchange.com/a/4852973/465145. In short, from outside, in a suitably powerful metatheory you can always prove the collection of (parameter-free) definable objects is countable. But countability from outside and countability inside the model are completely distinct from each other, and neither implies the other. In particular, $\mathbb{R}$ is uncountable in-universe, but it can still be pointwise definable - there are models of ZFC in which every real number is definable. – David Gao Oct 13 '24 at 21:54
  • What if we take all numbers that can be expressed as a sum of a countable subset of the countable set of all possible expressions? Then there are $2^{\aleph_0}$ such numbers, which equals $\mathfrak{c}$. I didn't understand half of your answer, so maybe this question is dumb. – hans Oct 14 '24 at 15:28
  • 1
    @hans You are either writing finite sums, in which case the total number of such expressions is countable, or you are writing infinite sums - most of which you will not be able to specify given a finite number of characters (to circle back to the above). – Nethesis Oct 14 '24 at 17:45
4

This is trickier than it sounds.

One is tempted to say that, for any notion of "expressing" or "describing" something, the descriptions can be represented as finite objects from a finite alphabet, so there are only countably many descriptions. Each description specifies at most one object, so there are only countably many describable objects. In particular, since there are uncountably many real numbers, there should be real numbers that are not describable (in fact, many such numbers).

But this only works if the notion of "describing" is itself definable.

It's reasonable to say that a set is expressible or describable if it's first-order definable without parameters. (Note that first-order definability is not itself definable in set theory.)

Then we have:

(1) There is a transitive model of ZFC in which every set is definable without parameters (for example, the minimal transitive model of set theory — the smallest $L_\alpha$ that satisfies ZFC). In particular, in this model, every real number is definable without parameters.

(2) There is a transitive model of ZFC in which there are only countably-many real numbers that are definable without parameters (for example, $\mathrm V_\kappa$ where $\kappa$ is a strongly inaccessible cardinal). In such a model, there are real numbers that are not definable.

Example (1) above requires the existence of a transitive model of ZFC; this assumption is clearly necessary.

Example (2) above requires the existence of a strongly inaccessible cardinal; this assumption can be reduced in strength.

If you believe in a Platonist universe that satisfies ZFC, that's presumably like (2), with only countably many definable real numbers (and therefore with real numbers that are not definable). But this isn't provable, or even expressible, in ZFC.

  • 1
    +1. In particular (AIUI) all of the following claims are (or can be) true: (1) All first-order formulas can be encoded as finite strings of symbols from a finite alphabet. (2) There are only countably many such strings. (3) There are uncountably many real numbers. (4) There is a model of ZFC in which everything, including every real number, is uniquely definable by a first-order formula. (The "trick", of course, is that the function mapping finite symbol strings to the real numbers defined by the first-order formulas they encode is not first-order definable, and thus not part of the model.) – Ilmari Karonen Oct 14 '24 at 07:47
1

Yes, it is true that there are real numbers that cannot be expressed.

The set of real numbers is infinite, and it can be shown that this infinity is of a different kind from the infinity of finite strings of symbols we use to represent numbers. For example, although we can write rational numbers (such as fractions) using a finite number of symbols, the set of rational numbers is countable, meaning that they can be enumerated.
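The enumerability of the rationals can even be made explicit. As one illustration (this particular recurrence is the Calkin-Wilf enumeration, chosen here for brevity, not something from the question), a few lines of Python visit every positive rational exactly once:

```python
from fractions import Fraction
from math import floor

def next_rational(q):
    # Calkin-Wilf successor: starting from 1, repeatedly applying this
    # visits every positive rational exactly once.
    return 1 / (2 * floor(q) - q + 1)

q = Fraction(1, 1)
seq = []
for _ in range(6):
    seq.append(str(q))
    q = next_rational(q)
print(seq)  # ['1', '1/2', '2', '1/3', '3/2', '2/3']
```

Since every positive rational shows up at some finite position, the rationals can be matched up with the natural numbers.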

In contrast, real numbers, which include both rationals and irrationals, are uncountable. This can be shown through Cantor's diagonal argument, which shows that a one-to-one correspondence cannot be established between natural numbers and real numbers. Therefore, there are many more real numbers that cannot be expressed as finite strings of symbols, and it can be argued that most real numbers cannot be described or written in any way.
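Cantor's diagonal construction can itself be sketched in code. Assuming a hypothetical function `digit_table(i, j)` giving the $j$-th decimal digit of the $i$-th real in a claimed enumeration of $[0, 1)$ (the names and the toy table below are illustrative only), we can produce digits of a number missing from the list:

```python
def diagonal_digit(d):
    """Pick a digit different from d, avoiding 0 and 9 to dodge the 0.999... = 1.000... ambiguity."""
    return 5 if d != 5 else 4

def cantor_diagonal(digit_table, n):
    """Return the first n decimal digits of a real in [0, 1) that differs
    from the i-th listed real in its i-th digit, for every i < n."""
    return [diagonal_digit(digit_table(i, i)) for i in range(n)]

# Toy "enumeration": the i-th real has every digit equal to i mod 10.
table = lambda i, j: i % 10
print(cantor_diagonal(table, 10))  # [5, 5, 5, 5, 5, 4, 5, 5, 5, 5]
```

The constructed number disagrees with the $i$-th listed real in digit $i$, so no enumeration can cover all of $[0, 1)$.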

Furthermore, most real numbers are irrational, and indeed transcendental, so they do not have simple algebraic representations. Often we can only affirm their existence; we cannot fully grasp them in terms of representation or description.

I apologize if some parts of the text are unclear; my English is not fluent.

Felix.S
  • 21
  • This explains essentially the same concepts as in Nethesis' answer but using much simpler language, which might actually be more useful given the OP's stated background. I don't understand why this was downvoted. – quarague Oct 14 '24 at 07:04
  • 2
    "Therefore, there are many more real numbers that cannot be expressed as finite strings of symbols, and it can be argued..." this conclusion (and follow-up) doesn't follow from the fact that there are more reals than rationals. Nethesis' answer explains why this is wrong. (And see JDH's answer on MO) – Burnsba Oct 14 '24 at 13:53