3.1. Probability

Probability is a concept that is central to quantum theory. This appendix summarizes some essentials about probability.

3.1.1. Definition

A very general definition of probability goes as follows: Probability is a number between 0 and 1 that quantifies our uncertainty, or our level of confidence, about something. The number 0 indicates we are fully confident that this something is false, and the number 1 indicates that we are fully confident that this something is true.

Note

Here is an example: Consider the statement “It is raining right now in Paris.” Since it is either raining or not raining in Paris right now, this statement is either true or false. But we don’t know, since we don’t have any information. Therefore, we say “the probability that it is raining in Paris now is 50%”. With this we mean that, given that we have no relevant information, we are neither confident that it is raining nor confident that it is not raining. If we had more information, such as today’s weather forecast for Paris, we would assign a different probability, maybe less or maybe more than 50%. With access to a webcam in Paris (even more information), our probability would likely become either 0% or 100%.

Often, even having all available information is not sufficient to give us complete confidence. This is the situation in quantum theory. When we have a particle in a box in its ground state, we cannot say with certainty where inside the box we will find the particle.

Quantum theory makes probabilistic predictions about experiments that can be repeated many times (such as measurements on single atoms and molecules). This allows us to check our probabilities against the statistics of the measurement outcomes.
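As a minimal numerical sketch of this last point, assuming a hypothetical measurement with three made-up outcomes and made-up probabilities, one can simulate many repetitions and compare the observed relative frequencies with the assigned probabilities:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical measurement with three possible outcomes and assumed probabilities
outcomes = np.array([-1.0, 0.0, 1.0])
probabilities = np.array([0.25, 0.5, 0.25])

# Simulate many repetitions of the measurement
samples = rng.choice(outcomes, size=100_000, p=probabilities)

# Compare observed relative frequencies with the assigned probabilities
for x, p in zip(outcomes, probabilities):
    frequency = np.mean(samples == x)
    print(f"outcome {x:+.1f}: assigned probability {p:.2f}, observed frequency {frequency:.3f}")
```

For a large number of repetitions the observed frequencies approach the assigned probabilities, which is exactly the kind of check described above.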

3.1.2. The mean

Let’s assume that repeating a measurement yields a set of \(N\) outcomes \(x_i\). The mean of this set of outcomes is

(3.38)\[\langle x\rangle = \frac{1}{N} \left(x_1+x_2+\cdots+x_N\right) = \frac{1}{N}\sum_{i=1}^N x_i\]
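As a quick numerical sketch (with made-up outcome values), this is simply the arithmetic mean:

```python
import numpy as np

# Hypothetical set of N = 6 measurement outcomes (illustrative values only)
x = np.array([1.2, 0.8, 1.1, 0.9, 1.0, 1.3])

mean = x.sum() / len(x)      # (1/N) * (x_1 + x_2 + ... + x_N)
print(mean, np.mean(x))      # the two expressions give the same number
```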

In a more general fashion, we can write the mean as

(3.39)\[\langle x\rangle = \frac{\sum x_i P(x_i)}{\sum P(x_i)}\]

where \(P(x_i)\) is the number of times the value \(x_i\) was observed. If \(x\) is a continuous variable with probability distribution \(P(x)\), then the mean (expectation value) is

(3.40)\[\langle x\rangle = \frac{\int x P(x)\mathrm{d}x}{\int P(x)\mathrm{d}x}\]

This generalizes to the calculation of the mean of other quantities. For example, the mean of \(x^2\) is

(3.41)\[\langle x^2\rangle = \frac{\int x^2 P(x)\mathrm{d}x}{\int P(x)\mathrm{d}x}\]
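As a numerical sketch of these two expressions, assuming a deliberately unnormalized Gaussian distribution with made-up parameters, the integrals can be evaluated on a grid:

```python
import numpy as np

# Assumed, deliberately unnormalized distribution P(x) (illustrative choice)
mu, s = 2.0, 0.5
P = lambda x: 3.0 * np.exp(-(x - mu)**2 / (2 * s**2))

# Evaluate the integrals numerically on a fine grid
x = np.linspace(mu - 10*s, mu + 10*s, 100_001)
norm    = np.trapz(P(x), x)                 # integral of P(x) dx
mean_x  = np.trapz(x * P(x), x) / norm      # mean of x
mean_x2 = np.trapz(x**2 * P(x), x) / norm   # mean of x^2

print(mean_x)    # close to 2.0 (= mu)
print(mean_x2)   # close to mu^2 + s^2 = 4.25
```

Notice that the division by the normalization integral makes the result independent of the overall scale of \(P(x)\).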

3.1.3. The standard deviation

Probability distributions are typically not concentrated at a single value but are spread out over a range of values. There are several ways to quantify this spread numerically. One of them is the standard deviation. For a continuous probability distribution \(P(x)\) over a variable \(x\), the standard deviation of \(x\) is defined in terms of its square, the variance, as

(3.42)\[\sigma^2_x = \int (x-\langle x\rangle)^2P(x)\mathrm{d}x\]

The integral runs over the entire range of \(x\); here \(P(x)\) is assumed to be normalized such that \(\int P(x)\mathrm{d}x = 1\).
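Continuing the Gaussian sketch from above (same assumed parameters), the variance can be evaluated numerically once \(P(x)\) is normalized; for a Gaussian, the resulting standard deviation should reproduce the width parameter:

```python
import numpy as np

mu, s = 2.0, 0.5
x = np.linspace(mu - 10*s, mu + 10*s, 100_001)

# Normalize P(x) so that its integral equals 1, as assumed in the variance formula
P = np.exp(-(x - mu)**2 / (2 * s**2))
P /= np.trapz(P, x)

mean_x = np.trapz(x * P, x)                  # mean of x
var_x  = np.trapz((x - mean_x)**2 * P, x)    # variance of x

print(np.sqrt(var_x))   # close to 0.5 (= s, the Gaussian width)
```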