0.999...=1

https://arbital.com/p/5r7

by Dylan Hendrickson Aug 3 2016 updated Aug 4 2016

No, it's not "infinitesimally far" from 1 or anything like that. 0.999... and 1 are literally the same number.


Although some people find it counterintuitive, the decimal expansions $0.999\dots$ and $1$ represent the same Real number.

Informal proofs

These "proofs" can help give insight, but be careful; a similar technique can "prove" that . They work in this case because the [-series] corresponding to is [-absolutely_convergent].

Formal proof

This is a more formal version of the first informal proof, using the definition of Decimal notation.

%%hidden(Show proof): $0.999\dots$ is the decimal expansion where every digit after the decimal point is a $9$. By definition, it is the value of the series $\sum_{i=1}^{\infty} 9 \cdot 10^{-i}$. This value is in turn defined as the [-limit] of the sequence $\left(\sum_{i=1}^{n} 9 \cdot 10^{-i}\right)_{n \in \mathbb{N}}$. Let $a_n$ denote the $n$th term of this sequence. I claim the limit is $1$. To prove this, we have to show that for any $\epsilon > 0$, there is some $N$ such that for every $n > N$, $|1 - a_n| < \epsilon$.

Let's prove by induction that $1 - a_n = 10^{-n}$. Since $a_0$ is the sum of $0$ terms, $a_0 = 0$, so $1 - a_0 = 1 = 10^{-0}$. If $1 - a_i = 10^{-i}$, then

\begin{align} 1 - a_{i+1} &= 1 - (a_i + 9 \cdot 10^{-(i+1)}) \newline &= 1 - a_i - 9 \cdot 10^{-(i+1)} \newline &= 10^{-i} - 9 \cdot 10^{-(i+1)} \newline &= 10 \cdot 10^{-(i+1)} - 9 \cdot 10^{-(i+1)} \newline &= 10^{-(i+1)} \end{align}

So $1 - a_n = 10^{-n}$ for all $n$. What remains to be shown is that $10^{-n}$ eventually gets (and stays) arbitrarily small; this is true by the [archimedean_property] and because $10^{-n}$ is monotonically decreasing. %%
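To see how the last step plays out concretely (a sketch; the particular choice of $N$ is just one convenient option): given $\epsilon > 0$, the Archimedean property provides an $N$ with $10^{-N} < \epsilon$, and then for every $n > N$

$$|1 - a_n| = 10^{-n} < 10^{-N} < \epsilon.$$

For instance, for $\epsilon = 0.001$ one can take $N = 4$: every $a_n$ with $n > 4$ satisfies $|1 - a_n| = 10^{-n} \le 10^{-5} < 0.001$.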

Arguments against

These arguments are used to try to refute the claim that $0.999\dots = 1$. They're flawed, since they claim to prove a false conclusion.

"$0.999\dots$ and $1$ are different decimal expansions, so they can't be the same number."

%%hidden(Why is this wrong?): Decimal expansions and real numbers are different objects. Decimal expansions are a nice way to represent real numbers, but there's no reason different decimal expansions have to represent different real numbers. %%

"$1 - 0.999\dots$ equals $0.000\dots001$, which is bigger than $0$, so $0.999\dots$ falls short of $1$ by a tiny amount."

%%hidden(Why is this wrong?): Decimal expansions go on infinitely, but no farther. $0.000\dots001$ doesn't represent a real number, because the $1$ is supposed to come after infinitely many $0$s, but each digit has to be a finite distance from the decimal point. If you had to pick a real number for $0.000\dots001$ to represent, it would be $0$. %%
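To spell this rebuttal out a little (a sketch of one natural interpretation): if "$0.000\dots001$" denotes anything at all, it should be the limit of the sequence $0.1, 0.01, 0.001, \dots$, and

$$\lim_{n \to \infty} 10^{-n} = 0,$$

so the only real number it could reasonably name is $0$; there is no positive gap left between $0.999\dots$ and $1$.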

"The sequence $0.9, 0.99, 0.999, \dots$ gets closer and closer to $1$ but never actually reaches it."

%%hidden(Why is this wrong?): The sequence $0.9, 0.99, 0.999, \dots$ gets arbitrarily close to $1$, so its limit is $1$. It doesn't matter that all of the terms are less than $1$. %%

"Multiplying $0.999\dots$ by $10$ shifts the $9$s over a digit, leaving $9.999\dots0$ with one fewer $9$ after the decimal point, so the subtraction trick doesn't really work."

%%hidden(Why is this wrong?): There are infinitely many $9$s in $0.999\dots$, so when you shift it over a digit there are still the same amount. And the "decimal expansion" $9.999\dots0$ doesn't make sense, because it has infinitely many digits and then a $0$. %%
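For readers who want the digit shift made precise at the level of series (a sketch, using only term-by-term manipulation of the convergent series from the formal proof):

\begin{align} 10 \cdot \sum_{i=1}^{\infty} 9 \cdot 10^{-i} &= \sum_{i=1}^{\infty} 9 \cdot 10^{-(i-1)} \newline &= 9 + \sum_{i=1}^{\infty} 9 \cdot 10^{-i} \end{align}

So if $x = 0.999\dots$, then $10x = 9 + x$, which forces $x = 1$; no trailing $0$ ever appears.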


Comments

Eric Rogstad

These arguments are used to try to refute the claim that $0.999\dots = 1$. They're flawed, since they claim to prove a false conclusion.

If these are included I think it would be good to also include explanations of why each one is wrong.