In general, an event can happen "almost surely", even if the probability space in question includes outcomes which do not belong to the event—as the following examples illustrate.
===Throwing a dart===
An example is the situation of throwing a dart at a unit square (a square with an area of 1) so that the dart always hits an exact point in the square, in such a way that each point in the square is equally likely to be hit. Since the square has area 1, the probability that the dart will hit any particular subregion of the square is equal to the area of that subregion. For example, the probability that the dart will hit the right half of the square is 0.5, since the right half has area 0.5.

The probability of the dart hitting exactly a point on the diagonals of the unit square is 0, since the diagonals of the square have area 0. That is, the dart will almost never land on a diagonal (equivalently, it will almost surely not land on a diagonal), even though the set of points on the diagonals is not empty, and a point on a diagonal is no less possible than any other point.
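The dart example can be sketched with a small Monte Carlo simulation. This is only an illustration (the function and variable names are invented for this sketch): the fraction of darts landing in the right half tracks that region's area, while a dart landing exactly on a diagonal is, in practice, never observed, reflecting the fact that the diagonals have area 0.

```python
import random

def on_diagonal(x, y):
    # A point of the unit square lies on a diagonal iff y == x or y == 1 - x.
    # For a uniformly random point this happens with probability 0, and with
    # floating-point coordinates exact equality essentially never occurs.
    return y == x or y == 1 - x

def simulate(throws=1_000_000, seed=0):
    rng = random.Random(seed)
    right_half = 0
    diagonal = 0
    for _ in range(throws):
        x, y = rng.random(), rng.random()  # uniform point in the unit square
        if x >= 0.5:
            right_half += 1
        if on_diagonal(x, y):
            diagonal += 1
    return right_half / throws, diagonal / throws

right_frac, diag_frac = simulate()
print(right_frac)  # close to 0.5, the area of the right half
print(diag_frac)   # 0.0 in practice: the diagonals have area 0
```

Note that the empty observed frequency does not mean landing on a diagonal is impossible, only that it happens almost never.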
===Tossing a coin repeatedly===
Another example is tossing a (possibly biased) coin, which corresponds to the probability space (\{H,T\}, 2^{\{H,T\}}, P), where the event \{H\} occurs if a head is flipped, and \{T\} if a tail is flipped. For this particular coin, it is assumed that the probability of flipping a head is P(H) = p \in (0,1), from which it follows that the complement event, that of flipping a tail, has probability P(T) = 1 - p.

An experiment is conducted in which the coin is tossed repeatedly, with outcomes X_1, X_2, \ldots, under the assumption that each flip's outcome is independent of all the others (i.e., the flips are independent and identically distributed, or i.i.d.). In this case, any infinite sequence of heads and tails is a possible outcome of the experiment. However, any particular infinite sequence of heads and tails has probability 0 of being the exact outcome of the (infinite) experiment. This is because the i.i.d. assumption implies that the probability of flipping all heads over n flips is simply P(X_i = H,\ i=1,2,\dots,n) = \left(P(X_1 = H)\right)^n = p^n. Letting n \rightarrow \infty yields 0, since p \in (0,1) by assumption. The result is the same no matter how much the coin is biased towards heads, so long as p is strictly between 0 and 1. (In fact, the same result holds even in non-standard analysis, where infinitesimal probabilities are allowed.) Moreover, the event "the sequence of tosses contains at least one T" also happens almost surely (i.e., with probability 1). If, instead of an infinite number of flips, flipping stops after some finite time, say 1,000,000 flips, then the probability of getting an all-heads sequence, p^{1,000,000}, would no longer be 0, while the probability of getting at least one tails, 1 - p^{1,000,000}, would no longer be 1 (i.e., the event is no longer almost sure).

==Asymptotically almost surely==