Gamma Distributions

Gamma distributions often occur when we want to know the probability for the waiting time until the $k$th success occurs, given that the mean rate of success per unit of time is $\lambda$. In this respect, the gamma distribution is related to the exponential distribution in the same way that the negative binomial distribution is related to the geometric distribution. Gamma distributions are always defined on the interval   $[0,\infty)$.

The Formulas

If $X$ has a gamma distribution over the interval   $[0,\infty)$,   with parameters $k$ and $\lambda$, then the following formulas will apply.

\begin{align} f(x) &= \dfrac{\lambda^k}{\Gamma(k)} x^{k-1} e^{-\lambda x} \\ M(t) &= \left( \dfrac{\lambda}{\lambda - t} \right)^k, \quad t < \lambda \\ E(X) &= \dfrac{k}{\lambda} \\ Var(X) &= \dfrac{k}{\lambda^2} \end{align}
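These formulas can be spot-checked numerically. Below is a quick sketch using SciPy, which parametrizes the gamma distribution by a shape $a = k$ and a scale equal to $1/\lambda$; the parameter values are illustrative.

```python
# Quick numerical check of the mean and variance formulas.
# SciPy's gamma uses shape a = k and scale = 1/lambda (scale is the
# reciprocal of the rate), so the rate must be inverted.
from scipy.stats import gamma

k, lam = 3, 16  # illustrative shape and rate

mean, var = gamma.stats(a=k, scale=1/lam, moments="mv")
print(mean, k / lam)     # both 0.1875
print(var, k / lam**2)   # both 0.01171875
```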

To use the gamma distribution it helps to recall a few facts about the gamma function: in particular,   $\Gamma(k+1) = k\,\Gamma(k)$,   and   $\Gamma(k) = (k-1)!$   whenever $k$ is a positive integer.

Time Until the kth Occurrence

Suppose that accidents occur at a particular intersection at a frequency of 16 accidents per year. What is the probability that the third accident will occur before the first month elapses?

The random variable in this problem is the waiting time $T$ until the third accident, and we want the probability that this time falls within the first month. That is, we need to find   $P \left( 0 < T < \dfrac{1}{12} \right)$.

The problem gave the accident frequency as 16 per year, so   $\lambda = 16$.   For the third accident, we will use   $k = 3$,   which gives the following PDF on the interval   $[0, \infty)$.

\begin{equation} f(x) = \dfrac{16^3}{\Gamma(3)} x^{3-1} e^{-16x} = \dfrac{16^3}{2!} x^2 e^{-16x} = 2048 x^2 e^{-16x} \end{equation}

The CDF can be found from this PDF through repeated integration by parts. We obtain   $F_T (x) = 1 - e^{-16x}(128x^2 + 16x + 1)$.   Therefore, the probability we seek is

\begin{align} P \left( T < \dfrac{1}{12} \right) &= F \left( \dfrac{1}{12} \right) \\ &= 1 - e^{-16(1/12)} \left( 128 \left(\dfrac{1}{12} \right)^2 + 16 \left(\dfrac{1}{12}\right) + 1 \right) \\ &= 1 - \dfrac{29}{9} e^{-4/3} \approx 0.1506 \end{align}

Therefore, there is a 15.06% probability that the third accident will occur in the first month. This would be equivalent to the probability of having at least three accidents in that first month.
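As a sketch of this equivalence, we can compute the probability both ways with SciPy (shape $a = k$, scale $1/\lambda$); both routes should give about 0.1506.

```python
# Computing the example two ways: the gamma CDF for the waiting time,
# and the complementary Poisson CDF for "at least 3 accidents in a month".
from scipy.stats import gamma, poisson

lam, k, t = 16, 3, 1/12

p_gamma = gamma.cdf(t, a=k, scale=1/lam)        # P(T < 1/12)
p_poisson = 1 - poisson.cdf(k - 1, mu=lam * t)  # P(X >= 3), mean 16/12

print(p_gamma, p_poisson)  # both approximately 0.1506
```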

Derivation of the Formulas

Recall that the PMF of the Poisson distribution is   $P(X=x) = \dfrac{\lambda^x e^{-\lambda}}{x!}$,   where $X$ is the number of successes in one unit of time, and $\lambda$ is the mean number of successes per unit time. Since the successes in a Poisson process occur independently, the number of successes in $t$ units of time is also Poisson distributed, with mean $\lambda t$.

From this result, we can obtain the probability of fewer than $k$ successes in $t$ units of time by summing the cases $x = 0$ through $k-1$ of the Poisson formula. We also note that having fewer than $k$ successes in $t$ time units is equivalent to waiting longer than $t$ time units for the $k$th success. We therefore obtain

\begin{equation} P( X < k) = P(T > t) = \sum\limits_{x=0}^{k-1} \dfrac{(\lambda t)^x e^{-\lambda t}}{x!} \end{equation}

Then the CDF for the waiting time is

\begin{equation} F_T(t) = P(T \le t) = 1 - \sum\limits_{x=0}^{k-1} \dfrac{(\lambda t)^x e^{-\lambda t}}{x!} \end{equation}

We can take the derivative of this result with respect to the variable $t$ to obtain the PDF.

\begin{align} f_T (t) &= \lambda e^{-\lambda t} - \sum\limits_{x=1}^{k-1} \dfrac{\lambda x (\lambda t)^{x-1} e^{-\lambda t} + (\lambda t)^x (-\lambda) e^{-\lambda t}}{x!} \\ &= \lambda e^{-\lambda t} - e^{-\lambda t} \sum\limits_{x=1}^{k-1} \left[ \dfrac{\lambda x (\lambda t)^{x-1}}{x!} - \dfrac{\lambda (\lambda t)^x}{x!} \right] \\ &= \lambda e^{-\lambda t} - e^{-\lambda t} \left[ \sum\limits_{x=1}^{k-1} \dfrac{\lambda (\lambda t)^{x-1}}{(x-1)!} - \sum\limits_{x=2}^k \dfrac{\lambda (\lambda t)^{x-1}}{(x-1)!} \right] \\ &= \lambda e^{-\lambda t} - e^{-\lambda t} \left[ \lambda - \dfrac{\lambda (\lambda t)^{k-1}}{(k-1)!} \right] \\ &= \dfrac{\lambda (\lambda t)^{k-1}}{(k-1)!} e^{-\lambda t} \\ &= \dfrac{\lambda^k}{\Gamma(k)} t^{k-1} e^{-\lambda t} \end{align}

Changing the variable $t$ to $x$ will produce the PDF formula given above.
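The derivation can also be spot-checked symbolically. The sketch below uses SymPy to differentiate the Poisson-sum form of the CDF for a fixed small shape (the value $k = 3$ is illustrative) and confirms that the gamma PDF comes out.

```python
# Differentiating F_T(t) = 1 - sum_{x=0}^{k-1} (lam t)^x e^(-lam t) / x!
# with respect to t, for the concrete case k = 3.
import sympy as sp

t, lam = sp.symbols("t lamda", positive=True)
k = 3  # illustrative integer shape

F = 1 - sum((lam * t)**x * sp.exp(-lam * t) / sp.factorial(x) for x in range(k))
f = sp.simplify(sp.diff(F, t))
print(f)  # lamda**3*t**2*exp(-lamda*t)/2, i.e. lam^k t^(k-1) e^(-lam t)/Gamma(k)
```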

The moment generating function $M(t)$ can be found by evaluating $E(e^{tX})$.

\begin{align} M(t) &= E(e^{tX}) = \int_0^\infty e^{tx} \dfrac{\lambda^k}{\Gamma(k)} x^{k-1} e^{-\lambda x} \,\mathrm{d}x \\ &= \int_0^\infty \dfrac{\lambda^k}{\Gamma(k)} x^{k-1} e^{(t-\lambda)x} \,\mathrm{d}x \end{align}

By making the substitution   $y = (\lambda - t)x$,   which is valid for   $t < \lambda$,   we can transform this integral into one that can be recognized.

\begin{align} M(t) &= \int_0^\infty \dfrac{\lambda^k}{\Gamma(k)} \left( \dfrac{y}{\lambda - t} \right)^{k-1} e^{-y} \dfrac{1}{\lambda - t} \,\mathrm{d}y \\ &= \dfrac{\lambda^k}{\Gamma(k) (\lambda - t)^k} \int_0^\infty y^{k-1} e^{-y} \, \mathrm{d}y \\ &= \dfrac{\lambda^k}{\Gamma(k) (\lambda - t)^k} \Gamma(k) \\ &= \left( \dfrac{\lambda}{\lambda - t} \right)^k \end{align}
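As a numerical sanity check of this closed form, we can integrate $e^{tx} f(x)$ directly; the values of $k$, $\lambda$, and $t$ below are illustrative, with $t < \lambda$ as required.

```python
# Comparing E(e^{tX}) computed by numerical integration against the
# closed-form MGF (lam/(lam - t))^k; requires t < lam for convergence.
import numpy as np
from scipy.integrate import quad
from scipy.stats import gamma

k, lam, t = 3.0, 16.0, 2.0  # illustrative, with t < lam

integrand = lambda x: np.exp(t * x) * gamma.pdf(x, a=k, scale=1/lam)
numeric, _ = quad(integrand, 0, np.inf)
closed_form = (lam / (lam - t))**k

print(numeric, closed_form)  # both approximately 1.4927
```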

To obtain the expected value, $E(X)$, we rewrite the moment generating function as   $M(t) = \lambda^k (\lambda - t)^{-k}$,   and then obtain the first derivative (with respect to $t$),   $M'(t) = k \lambda^k (\lambda-t)^{-k-1}$.   Evaluating this at   $t=0$,   we find

\begin{equation} E(X) = M'(0) = k \lambda^k \lambda^{-k-1} = \dfrac{k}{\lambda} \end{equation}

And we find the value $E(X^2)$ from the second derivative of the moment generating function,   $M''(t) = k(k+1) \lambda^k (\lambda-t)^{-k-2}$.   Evaluating this at   $t = 0$,   we find

\begin{align} E(X^2) &= M''(0) = k(k+1) \lambda^k \lambda^{-k-2} = \dfrac{k(k+1)}{\lambda^2} \\ Var(X) &= E(X^2) - (E(X))^2 = \dfrac{k(k+1)}{\lambda^2} - \left( \dfrac{k}{\lambda} \right)^2 = \dfrac{k}{\lambda^2} \end{align}

Therefore, the standard deviation of a gamma distribution is given by   $\sigma_X = \dfrac{\sqrt{k}}{\lambda}$.
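The moment calculations above can be reproduced symbolically as well; the following SymPy sketch differentiates the MGF and recovers $E(X) = k/\lambda$ and $Var(X) = k/\lambda^2$.

```python
# Recovering the mean and variance from derivatives of the MGF at t = 0.
import sympy as sp

t, k, lam = sp.symbols("t k lamda", positive=True)
M = (lam / (lam - t))**k

EX = sp.diff(M, t).subs(t, 0)       # first moment E(X)
EX2 = sp.diff(M, t, 2).subs(t, 0)   # second moment E(X^2)

print(sp.simplify(EX))              # k/lamda
print(sp.simplify(EX2 - EX**2))     # k/lamda**2
```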

Sums of Independent Gamma Distributions

Suppose the random variable $X$ has a gamma distribution with parameters $k_x$ and $\lambda$, and the random variable $Y$ has a gamma distribution with parameters $k_y$ and $\lambda$. Note that the parameter $\lambda$ is common, but $k$ is not. If these random variables are independent, then their sum $X + Y$ will also have a gamma distribution, with parameters   $k_x + k_y$   and $\lambda$.

The proof of the statement follows immediately from the moment generating functions. We have   $M_X(t) = \left( \dfrac{\lambda}{\lambda - t} \right)^{k_x}$   and   $M_Y(t) = \left( \dfrac{\lambda}{\lambda - t} \right)^{k_y}$.   When two random variables are independent, the moment generating function of the sum is the product of the moment generating functions. Therefore we have

\begin{equation} M_{X+Y}(t) = \left( \dfrac{\lambda}{\lambda - t} \right)^{k_x} \left( \dfrac{\lambda}{\lambda - t} \right)^{k_y} = \left( \dfrac{\lambda}{\lambda - t} \right)^{k_x+k_y} \end{equation}

Since moment generating functions are unique, and this is the moment generating function of a gamma distribution with parameters   $k_x + k_y$   and $\lambda$, the sum $X + Y$ must have that gamma distribution.
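A simulation sketch of this closure property (parameter values illustrative): draw independent gamma samples with a common rate and compare the moments of their sum against the predicted gamma distribution.

```python
# Simulating X ~ Gamma(kx, lam) and Y ~ Gamma(ky, lam) independently and
# comparing the sample moments of X + Y with those of Gamma(kx + ky, lam).
import numpy as np

rng = np.random.default_rng(0)
kx, ky, lam, n = 2.0, 3.5, 4.0, 200_000

x = rng.gamma(shape=kx, scale=1/lam, size=n)  # NumPy also uses scale = 1/rate
y = rng.gamma(shape=ky, scale=1/lam, size=n)
s = x + y

print(s.mean(), (kx + ky) / lam)     # both approximately 1.375
print(s.var(), (kx + ky) / lam**2)   # both approximately 0.34375
```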

Special Cases

An exponential distribution with rate $\lambda$ is a gamma distribution with   $k = 1$,   and a chi-square distribution with $n$ degrees of freedom is a gamma distribution with   $k = n/2$   and   $\lambda = 1/2$.   Therefore, the sum of two independent exponential random variables with a common rate $\lambda$ has a gamma distribution with parameters $2$ and $\lambda$, and the sum of two independent chi-square random variables with $n_1$ and $n_2$ degrees of freedom has a chi-square distribution with   $n_1 + n_2$   degrees of freedom.
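For instance, the chi-square case can be seen directly from the densities: a chi-square PDF with $n$ degrees of freedom matches a gamma PDF with shape $n/2$ and rate $1/2$, which is scale 2 in SciPy's parametrization. The values below are illustrative.

```python
# A chi-square distribution with n degrees of freedom is a gamma
# distribution with k = n/2 and lam = 1/2 (scale = 2 in SciPy).
import numpy as np
from scipy.stats import chi2, gamma

n = 5
xs = np.linspace(0.5, 10, 5)

print(chi2.pdf(xs, df=n))
print(gamma.pdf(xs, a=n/2, scale=2))  # identical values
```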