Probability density function The
probability density function of the continuous uniform distribution is f(x) = \begin{cases} \dfrac{1}{b-a} & \text{for } a \le x \le b, \\[8pt] 0 & \text{for } x < a \text{ or } x > b. \end{cases} The values of f(x) at the two boundaries a and b are usually unimportant, because they do not alter the value of \int_c^d f(x) \, dx over any interval [c,d], nor of \int_a^b x f(x) \, dx, nor of any higher moment. Sometimes they are chosen to be zero, and sometimes chosen to be \tfrac{1}{b-a} . The latter is appropriate in the context of estimation by the method of
maximum likelihood. In the context of
Fourier analysis, one may take the value of f(a) or f(b) to be \tfrac{1}{2(b-a)} , because then the inverse transform of many
integral transforms of this uniform function will yield back the function itself, rather than a function which is equal "
almost everywhere", i.e. except on a set of points with zero
measure. Also, it is consistent with the
sign function, which has no such ambiguity. Any probability density function integrates to 1, so the probability density function of the continuous uniform distribution is graphically portrayed as a rectangle where {{tmath|b-a}} is the base length and {{tmath|\tfrac{1}{b-a} }} is the height. As the base length increases, the height (the density at any particular value within the distribution boundaries) decreases. In terms of mean \mu and variance \sigma ^2 , the probability density function of the continuous uniform distribution is f(x) = \begin{cases} \dfrac{1}{2 \sigma \sqrt{3}} & \text{for } - \sigma \sqrt{3} \le x - \mu \le \sigma \sqrt{3} , \\[2pt] 0 & \text{otherwise} . \end{cases}
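The rectangle picture above can be checked numerically. A minimal sketch in plain Python (the function name and sample endpoints are illustrative, not from the source):

```python
def uniform_pdf(x, a, b):
    """Density of U(a, b): 1/(b-a) inside [a, b], 0 outside."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

# The density is a rectangle with base b - a and height 1/(b - a),
# so base times height is 1, as required of any probability density.
a, b = 2.0, 10.0
area = (b - a) * uniform_pdf((a + b) / 2, a, b)
print(area)  # 1.0
```

As the sketch suggests, stretching the base (increasing b - a) necessarily lowers the height so that the area stays at 1.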
Cumulative distribution function The
cumulative distribution function of the continuous uniform distribution is: F(x) = \begin{cases} 0 & \text{for } x < a, \\[8pt] \dfrac{x-a}{b-a} & \text{for } a \le x \le b, \\[8pt] 1 & \text{for } x > b. \end{cases} Its inverse is: F^{-1} (p) = a + p (b - a) \quad \text{ for } 0 \le p \le 1. In terms of mean \mu and variance \sigma ^2 , the cumulative distribution function of the continuous uniform distribution is: F(x) = \begin{cases} 0 & \text{for } x - \mu < - \sigma \sqrt{3} , \\[8pt] \dfrac{1}{2} \left( \dfrac{x - \mu}{\sigma \sqrt{3}} + 1 \right) & \text{for } - \sigma \sqrt{3} \le x - \mu \le \sigma \sqrt{3} , \\[8pt] 1 & \text{for } x - \mu > \sigma \sqrt{3} ; \end{cases} its inverse is: F^{-1} (p) = \sigma \sqrt{3} (2p-1) + \mu \quad \text{ for } 0 \le p \le 1.
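The CDF and its inverse can be sketched directly from the piecewise formulas above (plain Python; function names are illustrative):

```python
def uniform_cdf(x, a, b):
    """CDF of U(a, b): 0 below a, linear on [a, b], 1 above b."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)

def uniform_inv_cdf(p, a, b):
    """Quantile function: F^{-1}(p) = a + p*(b - a) for 0 <= p <= 1."""
    return a + p * (b - a)

# The two functions are inverses of each other on [a, b]:
a, b = 0.0, 23.0
p = uniform_cdf(11.5, a, b)   # 0.5
x = uniform_inv_cdf(p, a, b)  # 11.5, round trip
```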
Example 1. Using the continuous uniform distribution function For a
random variable X \sim U(0,23), find \Pr(2 < X < 18): \Pr(2 < X < 18) = (18-2) \cdot \frac{1}{23-0} = \frac{16}{23} . In a graphical representation of the continuous uniform distribution function [f(x) \text{ vs } x], the area under the curve within the specified bounds, displaying the probability, is a rectangle. For the specific example above, the base would be {{tmath|16}} and the height would be {{tmath|\tfrac{1}{23} .}}
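The interval probability above is just "overlap length times height," which a short sketch can confirm (plain Python; the helper name is illustrative):

```python
def prob_between(lo, hi, a, b):
    """P(lo < X < hi) for X ~ U(a, b): overlap length times 1/(b - a)."""
    lo, hi = max(lo, a), min(hi, b)
    return max(hi - lo, 0.0) / (b - a)

# Example 1 above: X ~ U(0, 23), P(2 < X < 18) = 16/23
print(prob_between(2, 18, 0.0, 23.0))
```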
Example 2. Using the continuous uniform distribution function (conditional) For a random variable X \sim U(0,23), find \Pr(X > 12 \mid X > 8): \Pr(X > 12 \mid X > 8) = (23-12) \cdot \frac{1}{23-8} = \frac{11}{15}. The example above is a
conditional probability case for the continuous uniform distribution: given that X > 8 is true, what is the probability that X > 12? Conditional probability changes the sample space, so a new interval length, b - a' = 15, has to be calculated, where b = 23 and a' = 8.
Moment-generating function The moment-generating function of the continuous uniform distribution is M_X = \operatorname{E}\left[ e^{tX} \right] = \int_a^b e^{tx} \frac{dx}{b-a} = \frac{ e^{tb} - e^{ta} }{t(b-a)} = \frac{B^t - A^t}{t(b-a)} , where B = e^b and A = e^a, from which we may calculate the
raw moments m_k : m_1 = \frac{a+b}{2} , m_2 = \frac{a^2+ab+b^2}{3} , m_k = \frac{ \sum_{i=0}^k a^i b^{k-i} }{k+1} . For a random variable following the continuous uniform distribution, the
expected value is m_1 = \tfrac{a+b}{2} , and the
variance is m_2 - m_1 ^2 = \tfrac{(b-a)^2}{12} . For the special case a = -b, the probability density function of the continuous uniform distribution is: f(x) = \begin{cases} \frac{1}{2b} & \text{for } -b \le x \le b, \\[8pt] 0 & \text{otherwise} ; \end{cases} the moment-generating function reduces to the simple form: M_X = \frac{ \sinh bt }{bt} .
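The raw-moment formula above can be checked against the stated mean and variance. A minimal sketch in plain Python (function name and sample endpoints are illustrative):

```python
def raw_moment(k, a, b):
    """k-th raw moment of U(a, b): sum_{i=0}^{k} a^i * b^(k-i), over k+1."""
    return sum(a**i * b**(k - i) for i in range(k + 1)) / (k + 1)

a, b = 2.0, 10.0
mean = raw_moment(1, a, b)             # (a + b)/2 = 6.0
var = raw_moment(2, a, b) - mean**2    # (b - a)^2 / 12 = 16/3
```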
Cumulant-generating function For n \ge 2, the n-th
cumulant of the continuous uniform distribution on the interval {{tmath|[- \tfrac{1}{2} , \tfrac{1}{2}]}} is \tfrac{B_n}{n} , where B_n is the n-th
Bernoulli number.
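For instance, the second cumulant of U(- \tfrac{1}{2}, \tfrac{1}{2}) is its variance, (b-a)^2/12 = 1/12, which indeed equals B_2/2 = (1/6)/2. A sketch with exact fractions, using the standard Bernoulli-number recurrence \sum_{k=0}^{m} \binom{m+1}{k} B_k = 0 (plain Python; names are illustrative):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """n-th Bernoulli number (convention B_1 = -1/2) via the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / Fraction(m + 1))
    return B[n]

# Second cumulant of U(-1/2, 1/2) equals the variance 1/12:
print(bernoulli(2) / 2)  # prints 1/12
```

Note that the n ≥ 2 restriction matters: the first cumulant (the mean) of this centered distribution is 0, not B_1/1.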
Standard uniform distribution The continuous uniform distribution with parameters a = 0 and b = 1, i.e. U(0,1), is called the
standard uniform distribution. One interesting property of the standard uniform distribution is that if u_1 has a standard uniform distribution, then so does 1 - u_1 . This property can be used for generating
antithetic variates, among other things. The standard uniform distribution is also the basis of the
inversion method, in which it is used to generate
random numbers for any other continuous distribution. If u_1 is a uniform random number with standard uniform distribution, i.e. with U(0,1), then x = F^{-1} (u_1) generates a random number x from any continuous distribution with the specified
cumulative distribution function F.
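As a sketch of the inversion method (plain Python; the exponential distribution is used purely as an illustration), standard uniform draws can be turned into draws from \text{Exp}(\lambda) by applying its inverse CDF, F^{-1}(u) = -\ln(1-u)/\lambda:

```python
import math
import random

def exponential_via_inversion(lam):
    """Draw from Exp(lam) by applying its inverse CDF to a U(0,1) draw."""
    u = random.random()              # u ~ U(0, 1)
    return -math.log(1.0 - u) / lam  # F^{-1}(u) for the exponential CDF

random.seed(0)
samples = [exponential_via_inversion(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to the exponential mean 1/lam = 0.5
```

The same recipe works for any continuous distribution whose inverse CDF can be evaluated; only the formula inside `exponential_via_inversion` changes.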
Relationship to other functions As long as the same conventions are followed at the transition points, the probability density function of the continuous uniform distribution may also be expressed in terms of the
Heaviside step function as: f(x) = \frac{ \operatorname{H} (x-a) - \operatorname{H} (x-b) }{b-a} , or in terms of the
rectangle function as: f(x) = \frac{1}{b-a} \ \operatorname{rect} \left( \frac{x - \frac{a+b}{2}}{b-a} \right) . There is no ambiguity at the transition point of the
sign function. Using the half-maximum convention at the transition points, the continuous uniform distribution may be expressed in terms of the sign function as: f(x) = \frac{ \sgn{(x-a)} - \sgn{(x-b)} }{2(b-a)} . == Properties ==