Specifically, if a number $~$x$~$ is $~$n$~$ digits long (in decimal notation), then its logarithm (base 10) is between $~$n-1$~$ and $~$n$~$. This follows directly from the definition of the logarithm: $~$\log_{10}(x)$~$ is the number of times you have to multiply 1 by 10 to get $~$x$~$, and each new digit lets you write down ten times as many numbers. In other words, with one digit you can write down any one of ten different things (0-9); with two digits, any one of a hundred different things (00-99); with three digits, any one of a thousand different things (000-999); and in general, each additional digit lets you write down ten times as many things. Thus, the number of digits you need to write $~$x$~$ is close to the number of times you have to multiply 1 by 10 to get $~$x$~$. The only difference is that, when computing logs, you multiply 1 by 10 exactly as many times as it takes to get $~$x$~$, which might require multiplying by 10 a fraction of a time (if $~$x$~$ is not a power of 10), whereas the number of digits in the base 10 representation of $~$x$~$ is always a whole number.
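The relationship above is easy to check numerically. Here is a small Python sketch (the function name `digit_count` is my own) comparing the decimal digit count of a few numbers against their base-10 logarithms:

```python
import math

def digit_count(x: int) -> int:
    """Number of digits in the decimal representation of a positive integer."""
    return len(str(x))

for x in [7, 10, 99, 100, 12345]:
    n = digit_count(x)
    log = math.log10(x)
    # For an n-digit number, log10(x) always lies between n-1 and n;
    # it equals n-1 exactly when x is a power of 10.
    assert n - 1 <= log <= n
    print(f"{x}: {n} digits, log10 = {log:.3f}")
```

Note that the logarithm lands on a whole number only for exact powers of 10 (e.g. $~$\log_{10}(100) = 2$~$), while the digit count is a whole number for every $~$x$~$, which is the gap the paragraph describes.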
