Although some people find it counterintuitive, the decimal expansions $0.999\dots$ and $1$ represent the same Real number.
Informal proofs
These "proofs" can help give insight, but be careful; a similar technique can "prove" that . They work in this case because the [-series] corresponding to is [-absolutely_convergent].
The real numbers are [-dense], which means that if , there must be some number in between. But there's no decimal expansion that could represent a number in between and .
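To see how the caveat above bites, here is a sketch of the same multiply-and-subtract manipulation applied to a series that is not convergent (this is an illustration, not one of the proofs above):

$$\begin{align} x &= 1 + 2 + 4 + 8 + \dots \newline 2x &= 2 + 4 + 8 + 16 + \dots \newline x - 2x &= 1 \newline x &= -1 \end{align}$$

Every step mirrors the first proof, but the conclusion is absurd. The manipulation is only justified when the underlying series converges absolutely, as the series for $0.999\dots$ does.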
Formal proof
This is a more formal version of the first informal proof, using the definition of Decimal notation.
%%hidden(Show proof): $0.999\dots$ is the decimal expansion where every digit after the decimal point is a $9$. By definition, it is the value of the series $\sum_{k=1}^{\infty} 9 \cdot 10^{-k}$. This value is in turn defined as the [-limit] of the sequence $\left( \sum_{k=1}^{n} 9 \cdot 10^{-k} \right)_{n \in \mathbb{N}}$. Let $a_n$ denote the $n$th term of this sequence. I claim the limit is $1$. To prove this, we have to show that for any $\varepsilon > 0$, there is some $N$ such that for every $n > N$, $|1 - a_n| < \varepsilon$.
Let's prove by induction that $1 - a_n = 10^{-n}$. Since $a_0$ is the sum of $0$ terms, $a_0 = 0$, so $1 - a_0 = 1 = 10^{-0}$. If $1 - a_i = 10^{-i}$, then
$$\begin{align} 1 - a_{i+1} &= 1 - (a_i + 9 \cdot 10^{-(i+1)}) \newline &= 1 - a_i - 9 \cdot 10^{-(i+1)} \newline &= 10^{-i} - 9 \cdot 10^{-(i+1)} \newline &= 10 \cdot 10^{-(i+1)} - 9 \cdot 10^{-(i+1)} \newline &= 10^{-(i+1)} \end{align}$$
So $1 - a_n = 10^{-n}$ for all $n$. What remains to be shown is that $10^{-n}$ eventually gets (and stays) arbitrarily small; this is true by the [archimedean_property] and because $10^{-n}$ is monotonically decreasing. %%
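To make the final step concrete, here is one worked instance of the $\varepsilon$-$N$ argument, reusing the partial sums $a_n = 0.99\dots9$ ($n$ nines) from the proof above (an illustration, not part of the original proof): for $\varepsilon = 0.001$ we can take $N = 3$, since

$$|1 - a_n| = 10^{-n} \le 10^{-4} < 0.001 = \varepsilon \quad \text{for every } n > 3.$$

In general, any $N$ with $10^{-N} \le \varepsilon$ works, for example $N = \lceil \log_{10}(1/\varepsilon) \rceil$.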
Arguments against
These arguments are used to try to refute the claim that $0.999\dots = 1$. They're flawed, since they claim to prove a false conclusion.
- $0.999\dots$ and $1$ have different digits, so they can't be the same. In particular, $0.999\dots$ starts "$0.$", so it must be less than $1$.
%%hidden(Why is this wrong?): Decimal expansions and real numbers are different objects. Decimal expansions are a nice way to represent real numbers, but there's no reason different decimal expansions have to represent different real numbers. %%
- If two numbers are the same, their difference must be $0$. But $1 - 0.999\dots = 0.000\dots1 \neq 0$.
%%hidden(Why is this wrong?): Decimal expansions go on infinitely, but no farther. $0.000\dots1$ doesn't represent a real number, because the $1$ is supposed to come after infinitely many $0$s, but each digit has to be a finite distance from the decimal point. If you had to pick a real number for $0.000\dots1$ to represent, it would be $0$. %%
- $0.999\dots$ is the limit of the sequence $0.9, 0.99, 0.999, \dots$ Since each term in this sequence is less than $1$, the limit must also be less than $1$. (Or "the sequence can never reach $1$.")
%%hidden(Why is this wrong?): The sequence gets arbitrarily close to $1$, so its limit is $1$. It doesn't matter that all of the terms are less than $1$; the limit of a sequence doesn't have to be a term of the sequence. %%
- In the first proof, when you subtract $0.999\dots$ from $9.999\dots$, you don't get $9$. There's an extra digit left over; just as $9.99 - 0.999 = 8.991$, $9.999\dots - 0.999\dots = 8.999\dots1$.
%%hidden(Why is this wrong?): There are infinitely many $9$s in $0.999\dots$, so when you shift it over a digit there are still just as many. And the "decimal expansion" $8.999\dots1$ doesn't make sense, because it has infinitely many digits and then a $1$. %%
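A direct computation complements these rebuttals: the series defining $0.999\dots$ is a geometric series with ratio $10^{-1}$, so its value can be found with the geometric series formula (a standard fact, not an argument made above):

$$0.999\dots = \sum_{k=1}^{\infty} 9 \cdot 10^{-k} = 9 \cdot \frac{10^{-1}}{1 - 10^{-1}} = 9 \cdot \frac{1}{9} = 1$$

Because the ratio has absolute value less than $1$, the series converges absolutely, which is exactly the condition that licenses the informal manipulations at the top of the page.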