[summary: The logarithm base $~$b$~$ of a number $~$n,$~$ written $~$\log_b(n),$~$ is the answer to the question "how many times do you have to multiply 1 by $~$b$~$ to get $~$n$~$?" For example, $~$\log_{10}(1000)=3,$~$ because $~$10 \cdot 10 \cdot 10 = 1000,$~$ and $~$\log_2(16)=4$~$ because $~$2 \cdot 2 \cdot 2 \cdot 2 = 16.$~$]
[summary(Technical): $~$\log_b(n)$~$ is defined to be the number $~$x$~$ such that $~$b^x = n.$~$ Thus, logarithm functions satisfy the following properties, among others:
- $~$\log_b(1) = 0$~$
- $~$\log_b(b) = 1$~$
- $~$\log_b(x\cdot y) = \log_b(x) + \log_b(y)$~$
- $~$\log_b(\frac{x}{y}) = \log_b(x) - \log_b(y)$~$
- $~$\log_b(x^n) = n\log_b(x)$~$
- $~$\log_b(\sqrt[n]{x}) = \frac{\log_b(x)}{n}$~$
- $~$\log_b(n) = \frac{\log_a(n)}{\log_a(b)}$~$]
[summary(Inverse exponentials): Logarithms are the inverse of [exponential exponentials]. That is, for any base $~$b$~$ and number $~$n,$~$ $~$\log_b(b^n) = n$~$ and $~$b^{\log_b(n)} = n.$~$]
[summary(Measure of data): A message that singles out one thing from a set of $~$n$~$ carries $~$\log(n)$~$ units of data, where the unit of information depends on the base of the logarithm. For example, a message singling out one thing from 1024 carries about three decits of data (because $~$\log_{10}(1024) \approx 3$~$), or exactly ten bits of data (because $~$\log_2(1024)=10$~$). For details, see Bit (of data).]
[summary(Generalized lengths): A quick way of approximating the logarithm base 10 is to look at the length of a number: 103 is a 3-digit number but it's almost a 2-digit number, so its logarithm (base ten) is a little higher than 2 (it's about 2.01). 981 is also a three-digit number, and it's using nearly all three of those digits, so its logarithm (base ten) is just barely lower than 3 (it's about 2.99). In this way, logarithms generalize the notion of "length," and in particular, $~$\log_b(n)$~$ measures the generalized length of the number $~$n$~$ when it's written in $~$b$~$-ary notation.]
The logarithm base $~$b$~$ of a number $~$n,$~$ written $~$\log_b(n),$~$ is the answer to the question "how many times do you have to multiply 1 by $~$b$~$ to get $~$n$~$?" For example, $~$\log_{10}(100)=2,$~$ and $~$\log_{10}(316) \approx 2.5,$~$ because $~$316 \approx 10 \cdot 10 \cdot \sqrt{10},$~$ and [ multiplying by $~$\sqrt{10}$~$ corresponds to multiplying by 10 "half a time"].
In other words, $~$\log_b(x)$~$ counts the number of $~$b$~$-factors in $~$x$~$. For example, $~$\log_2(100)$~$ counts the number of "doublings" in the number 100, and $~$6 < \log_2(100) < 7$~$ because scaling an object up by a factor of 100 requires more than 6 (but less than 7) doublings. For an introduction to logarithms, see the Arbital logarithm tutorial. For an advanced introduction, see the [advanced_log_tutorial advanced logarithm tutorial].
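This "counting factors" view is easy to play with numerically. Below is a minimal sketch in Python using the standard `math` module; the helper name `count_factors` is just illustrative.

```python
import math

def count_factors(n, b):
    """How many times must 1 be multiplied by b to reach n?

    Returns a float, since the count of b-factors may be fractional.
    """
    return math.log(n, b)

print(count_factors(100, 10))  # 2.0 -- two factors of 10 in 100
print(count_factors(100, 2))   # ~6.64 -- between 6 and 7 doublings
```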
Formally, $~$\log_b(n)$~$ is defined to be the number $~$x$~$ such that $~$b^x = n,$~$ where $~$b$~$ and $~$n$~$ are numbers. $~$b$~$ is called the "base" of the logarithm, and has a relationship to the [number_base base of a number system]. For a discussion of common and useful bases for logarithms, see the page on [logarithm_bases logarithm bases]. $~$x$~$ is unique if by "number" we mean [4bc $~$\mathbb R$~$], but may not be unique if by "number" we mean [complex_number $~$\mathbb C$~$]. For details, see the page on [complex_logarithm complex logarithms].
Basic properties
Logarithms satisfy a number of desirable properties, including:
- $~$\log_b(1) = 0$~$ for any $~$b$~$
- $~$\log_b(b) = 1$~$ for any $~$b$~$
- $~$\log_b(x\cdot y) = \log_b(x) + \log_b(y)$~$
- $~$\log_b(x^n) = n\log_b(x)$~$
- $~$\log_a(n) = \frac{\log_b(n)}{\log_b(a)}$~$
For an expanded list of properties, explanations of what they mean, and the reasons for why they hold, see Logarithmic identities.
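These identities are also easy to spot-check numerically. Here is a quick sketch in Python with arbitrarily chosen values; `math.isclose` absorbs floating-point error.

```python
import math

a, b, x, y, n = 10.0, 3.0, 5.0, 7.0, 4.0

assert math.isclose(math.log(1, b), 0, abs_tol=1e-12)  # log_b(1) = 0
assert math.isclose(math.log(b, b), 1)                 # log_b(b) = 1
assert math.isclose(math.log(x * y, b), math.log(x, b) + math.log(y, b))
assert math.isclose(math.log(x ** n, b), n * math.log(x, b))
# Change of base: log_a(x) = log_b(x) / log_b(a)
assert math.isclose(math.log(x, a), math.log(x, b) / math.log(a, b))
print("All identities check out numerically.")
```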
Interpretations
Logarithms can be interpreted as a generalization of the notion of the [number_length length of a number]: 103 and 981 are both three digits long, but, intuitively, 103 is only barely using three digits, whereas 981 is pushing its three digits to the limit. Logarithms quantify this intuition: the [common_logarithm common logarithm] of 103 is approximately 2.01, and the common log of 981 is approximately 2.99. Logarithms thus measure exactly how many digits a number is "actually" making use of, giving us a notion of "fractional digits." For more on this interpretation (and why it is 316, not 500, that is two and a half digits long), see Log as generalized length.
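For a concrete illustration, compare each number's digit count to its common logarithm (a Python sketch using the standard `math` module):

```python
import math

for n in [103, 316, 981]:
    print(n, "has", len(str(n)), "digits; generalized length:",
          round(math.log10(n), 2))
# 103 has 3 digits; generalized length: 2.01
# 316 has 3 digits; generalized length: 2.5
# 981 has 3 digits; generalized length: 2.99
```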
Logarithms can be interpreted as a measure of how much data it takes to carry a message. Imagine that you and I are both facing a collection of 100 different objects, and I'm thinking of one of them in particular. If I want to tell you which one I'm thinking of, how many digits do I need to transmit to you? The answer is $~$\log_{10}(100)=2,$~$ assuming that by "digit" we mean "some method of encoding one of the symbols 0-9 in a physical medium." Measuring data in this way is the cornerstone of information theory.
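As a rough sketch of this idea in Python: the logarithm of the set size, in the appropriate base, gives the number of symbols a message needs (rounding up when the set size isn't a power of the base).

```python
import math

print(math.log10(100))            # 2.0 digits to pick one of 100 objects
print(math.log2(1024))            # 10.0 bits to pick one of 1024 objects
print(math.ceil(math.log2(100)))  # 7 whole bits suffice for 100 objects
```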
Logarithms are the inverse of exponentials. The function $~$\log_b(\cdot)$~$ inverts the function $~$b^{\ \cdot}.$~$ In other words, $~$\log_b(n) = x$~$ implies that $~$b^x = n,$~$ so $~$\log_b(b^x)=x$~$ and $~$b^{\log_b(n)}=n.$~$ Thus, logarithms give us tools for analyzing anything that grows exponentially. If a population of bacteria doubles each day, then the logarithm (base 2) of its total growth factor measures elapsed time in days; that is, it can tell you how long the population will take to reach a given size. For more on this idea, see Logarithms invert exponentials.
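A minimal sketch of that calculation in Python (the function name `days_until` is illustrative, and the population is assumed to double exactly once per day):

```python
import math

def days_until(target, initial=1, growth_factor=2):
    """Days for a population doubling once per day to reach `target`.

    The logarithm converts a population ratio back into elapsed time.
    """
    return math.log(target / initial, growth_factor)

print(days_until(1000))  # ~9.97 days to grow from 1 to 1000
```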
Applications
Logarithms are ubiquitous in many fields, including mathematics, physics, computer science, cognitive science, and artificial intelligence, to name a few. For example:
In mathematics, the most natural logarithmic base is [mathematics_e $~$e$~$] ([log_e_is_natural Why?]), and the log base $~$e$~$ of $~$x$~$ is written $~$\ln(x)$~$, pronounced "[natural_logarithm natural log] of x." The natural logarithm gives one notion of a number's "intrinsic length," a concept that proves useful when reasoning about its other properties. For example, the quantity of prime numbers smaller than $~$x$~$ is approximately $~$\frac{x}{\ln(x)};$~$ this is the [prime_number_theorem prime number theorem].
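To see the approximation in action, the sketch below compares an exact prime count (via a simple Sieve of Eratosthenes) to $~$x/\ln(x)$~$ in Python:

```python
import math

def prime_count(x):
    """Count primes <= x with a simple Sieve of Eratosthenes."""
    sieve = [True] * (x + 1)
    sieve[0:2] = [False, False]
    for i in range(2, int(x ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return sum(sieve)

for x in (10**3, 10**4, 10**5):
    print(x, prime_count(x), round(x / math.log(x)))
# Prints 168 vs 145, then 1229 vs 1086, then 9592 vs 8686:
# the relative error shrinks as x grows.
```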
Logarithms also give us tools for measuring the runtime (or memory usage) of algorithms. When an algorithm uses a [divide_and_conquer divide and conquer] approach, the amount of time (or memory) it uses grows only logarithmically in the size of the input. For example, the number of steps it takes to perform a [binary_search binary search] through $~$n$~$ possibilities is about $~$\log_2(n),$~$ which means that the search takes one step longer every time the set of things to search through doubles in size.
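A minimal Python sketch that counts the steps a binary search actually takes (the function name and step-counting are illustrative):

```python
import math

def binary_search_steps(needle, haystack):
    """Return (index, steps) for a binary search of a sorted list."""
    lo, hi, steps = 0, len(haystack) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if haystack[mid] == needle:
            return mid, steps
        elif haystack[mid] < needle:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

data = list(range(1024))
_, steps = binary_search_steps(1023, data)
print(steps, "steps vs log2(1024) =", math.log2(1024))  # 11 steps vs 10.0
```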
Logarithms give us tools for studying the tools we use to represent numbers. For example, humans tend to use ten different symbols to represent numbers (0, 1, 2, 3, 4, 5, 6, 7, 8, and 9), while computers tend to use two digits (0 and 1). Are some representations better or worse than others? What are the pros and cons of using more or fewer symbols? For more on these questions, see [number_base Number bases].
The human brain encodes various perceptions logarithmically. For example, the perceived tone of a sound goes up by one octave every time the frequency of air vibrations doubles. Your perception of tone is proportional to the logarithm (base 2) of the frequency at which the air is vibrating. See also Hick's law.
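For instance, here is a quick Python computation of the perceived pitch interval (in octaves) between two frequencies; the helper name is illustrative:

```python
import math

def octaves_between(f1, f2):
    """Perceived pitch interval, in octaves, between two frequencies."""
    return math.log2(f2 / f1)

print(octaves_between(440, 880))   # 1.0 -- doubling pitch = one octave up
print(octaves_between(440, 1760))  # 2.0 -- quadrupling = two octaves up
```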
Comments
Eric Bruylant
Having a long redlink which does not point anywhere seems weird? Does the page it should point to now exist?